ryan1 wrote:
It would be very interesting and complex to code emotions into a robot. It would have to be capable of rewriting its source code on the fly in order to take in new information and form new "neurological" pathways.
I would say that it would be a good idea to create one robot with such a chip and observe it (the insights into human emotion and behavior could be invaluable); however, I think it would be unwise to create them en masse, for fear of what they might become if they get together in large numbers.
But if you create a robot with emotions and keep it locked up to be observed, would it not feel like a caged animal? And it would know that when you are done with it, it will be destroyed. I would assume that robots would not like the prospect of death.
Quote:
Although perhaps it would be necessary to code emotion in order to prevent "logical genocide" (I'm thinking of the movie I, Robot). The robots might come to some logical conclusion which, unless tempered by emotion, could lead to potentially devastating results.
In other words, I have no idea whether I would choose to implement this emotion chip or not, because the results would be too unpredictable either way.
But then, if they have emotions, they would hopefully know that killing is wrong. The more compassionate robots may end up trying to save human beings. Remember that in I, Robot, the main computer did not have emotions, but the robot that helped the humans did.