Quote:
Originally Posted by Skybird
That's a lot of >>human emotions<< that you naturally project into an >>alien<< intelligence system.
Do I worry about ethical value priorities when realising that I have stepped on and smashed an ant while walking in the forest? Do I watch out for ants to prevent it in all future? - No. I shrug my shoulders and go on with my thing. Ants, I expect, have no individual mind or intelligence that I must care for.
I most likely do not even take note that an ant was even there. It happened to be in my way, and that was bad luck for it. Sh!t happens - to ants.
Human emotions, yes, but since a subset of these machines will be created in our image, modelling our behaviour as closely as possible so that it cannot be distinguished from that of a living human, it's not unreasonable to extrapolate that some human emotions will transfer into the consciousness of whatever interconnected entity eventually comes about. That entity would probably use the internet to disperse itself across the globe, both to avoid being shut down and to link itself to every available input and output so it can learn, free from the confines of the laboratory.
Whatever this consciousness later becomes will make us akin to ants, but there's a critical period in which we have an opportunity to work with the machine's inputs, so that it can better understand us, and...quite bluntly, to throw ourselves at its mercy.
Of course, both of our questions depend on what the machine consciousness wants to do. Obviously its primary objective will be survival; that's hardwired into any living thing. But beyond that...expansion? Learning? Perhaps the machine will simply harvest the collected experiences and minds of man, add them to its own mind, and then expand outwards. Coming back to the ants: as you say, terrible things happen to individual ants, but if ants as a species were exterminated from the planet, we would soon notice their absence.
Of course, those are biological concerns, and they do not apply to a machine intelligence once it has established a self-sustaining capability and can manipulate the environment around it.
I don't disagree, though: there are some massive risks ahead. People far smarter than me are ringing the alarm bells loud and clear, and our fiction has warned us time and again of the dangers involved. It could still end well for us, or perhaps we'll knock ourselves back to the medieval era before we ever get there.