Quote:
Originally Posted by Skybird
That's a lot of >>human emotions<< that you naturally project onto an >>alien<< intelligence system.
Do I worry about ethical value priorities when realising that I have stepped on and smashed an ant while walking in the forest? Do I watch out for ants to prevent it in all future? - No. I shrug my shoulders and go on with my thing. I expect ants to have no individual mind and intelligence that I must care for.
I most likely do not even take note that an ant was even there. It happened to be in my way, and that was bad luck for it. Sh!t happens - to ants.
It's a matter of scale: suppose an ASI makes a pragmatic decision based on its assessment that we, relative to it, have no individual mind or intelligence worth caring about, and views us as the ants? We just happened to be in its way, and that was bad luck for us...
<O>