Some robotics researchers focus on making robots that humans perceive as "emotionally attractive", so that humans accept them and project their emotional needs and feelings onto them - for example, robot pets that elderly people can stroke and care for. But let's not forget: even if you accomplish that, it is a machine, a Tamagotchi. And if humans can be triggered to react emotionally to a machine, to the mere illusion of a machine emotionally reacting to them - what does this tell us about humans' own emotions, which most of us hold so precious and think of as something that massively defines us as individuals and as lifeforms, as beings that are truly "alive"? If a machine can fool us into believing it has emotions, isn't it then more likely that we are less alive and more like a machine ourselves?
This could hint at quite a big conflict, an unsolved problem that lies ahead of us. It is a threat to the usual ways in which we think of ourselves, and think of ourselves as something special.
Ex Machina: if you saw the film - does Ava really have emotions, including angry or evil ones, or is she just rationally manipulating the emotions of others to get her way? Ava stabs her maker to death, and leaves the young man to starve or suffocate as well. Among the strongest emotions humans feel are those related to sexuality and sexual attraction, and the movie deals a lot with how these emotions can be used for manipulative purposes - by a machine pushing the right buttons in humans. For some humans, that may indeed be enough. But what does this tell us about ourselves, at least about those humans?
I feel very uncomfortable about these things.
Also, human interaction and affection is always a two-way road. If humans get used to certain deficits in the emotional or behavioural repertoires of machines, it can feed back into the ways they see and interact with other humans. Much room for worry there, and for abuse.
__________________
If you feel nuts, consult an expert.
Last edited by Skybird; 08-01-17 at 04:49 AM.