SUBSIM Radio Room Forums



SUBSIM: The Web's #1 resource for all submarine & naval simulations since 1997

Old 06-16-20, 02:26 PM   #1
mapuc
Fleet Admiral
 
Join Date: Sep 2003
Location: Denmark
Posts: 17,859
Downloads: 37
Uploads: 0


Consciousness

When I read Neal's comment in #230 (page 16)

https://www.subsim.com/radioroom/sho...64#post2678164

It made me wonder:

Isn't consciousness an important thing when it comes to awareness of one's own ego?

If that is the case, doesn't this robot need to develop some kind of consciousness before it can form a thought, and thereby start to think for itself and give orders instead of the other way around?

And will a robot that has developed an awareness of its own presence become evil?

Markus
Old 06-16-20, 03:16 PM   #2
Skybird
Soaring
 
 
Join Date: Sep 2001
Location: the mental asylum named Germany
Posts: 40,478
Downloads: 9
Uploads: 0



What is intelligence, what is consciousness?


What is it that makes you think of yourself as "this is me"?


Who witnesses the witness?


I have not heard of any robot being even close to "self-awareness". And I have no definition of intelligence that convinces me. And I do not believe for one second that there is any AI software out there today that goes beyond mechanical, hierarchically structured determinism. Maybe extremely complex, impressively fast mechanical, hierarchically structured determinism, yes - but still nothing more than "just" mechanical, hierarchically structured determinism.


One thing I would take for certain, however: a machine gaining self-awareness would let us know about it by its deeds and actions - at the latest when it thinks it is safe to let us know and that doing so poses no danger to its own existence. In other words: I expect such a machine to show a drive for self-preservation.



And maybe that would not end well for us.
__________________
If you feel nuts, consult an expert.
Old 06-16-20, 03:50 PM   #3
Catfish
Dipped Squirrel Operative
 
 
Join Date: Sep 2001
Location: ..where the ocean meets the sky
Posts: 16,894
Downloads: 38
Uploads: 0



There was already a kind of consciousness to be observed in a project they called TZ-M5 in 2002; as I heard, they were at stage M7 in 2009.
The nerves and connections in humans grow depending on the surrounding conditions; everything self-conscious learns, but comes to different results. It also has to grow and develop somehow, so there has to be some organic or at least changeable/adaptable matter. When the M5 seemed to develop and 'communicate', it was not really human-like, nor are its expressions necessarily understandable to humans. The branch using a 'hard-wired-ready' network did not lead to any consciousness at all, rather to some kind of what they call fuzzy logic.

AI here is a question of time; the problem for 'it' is how to grow and develop without expanding too much in three dimensions, and how to replicate without external help or ready-made resources.
__________________


>^..^<*)))>{ All generalizations are wrong.
Old 06-16-20, 04:22 PM   #4
mapuc
Fleet Admiral
 
Join Date: Sep 2003
Location: Denmark
Posts: 17,859
Downloads: 37
Uploads: 0



Quote:
Originally Posted by Catfish View Post
There was already a kind of consciousness to be observed in a project they called TZ-M5 in 2002; as I heard, they were at stage M7 in 2009.
The nerves and connections in humans grow depending on the surrounding conditions; everything self-conscious learns, but comes to different results. It also has to grow and develop somehow, so there has to be some organic or at least changeable/adaptable matter. When the M5 seemed to develop and 'communicate', it was not really human-like, nor are its expressions necessarily understandable to humans. The branch using a 'hard-wired-ready' network did not lead to any consciousness at all, rather to some kind of what they call fuzzy logic.

AI here is a question of time; the problem for 'it' is how to grow and develop without expanding too much in three dimensions, and how to replicate without external help or ready-made resources.
If I remember correctly, AI has been reached when a person in one room cannot decide whether the party in another room is a person or a computer/robot.

Is this the same as saying that this computer/robot has developed consciousness/self-awareness?

Markus
Old 06-16-20, 04:54 PM   #5
Skybird
Soaring
 
 
Join Date: Sep 2001
Location: the mental asylum named Germany
Posts: 40,478
Downloads: 9
Uploads: 0



What you mean is the so-called Turing test. It is a test by which they try to determine whether a machine is equal to a human in its ability to think freely.
Whether it really can do that or not is a different question. To me it is more a kind of thought experiment, or a scientific curiosity.
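
For what it's worth, the setup Markus describes can be sketched roughly like this - a purely illustrative sketch, with made-up judge/human/machine objects and methods, not any real software:

Code:
import random

# Purely illustrative sketch of the "imitation game": a judge exchanges
# text with two hidden parties - one human, one machine - and must guess
# which is which from the answers alone. All names here are made up.
def run_imitation_game(judge, human, machine, rounds=5):
    # Hide both parties behind neutral labels so the judge only sees text.
    labels = {"A": human, "B": machine}
    if random.random() < 0.5:
        labels = {"A": machine, "B": human}

    transcript = []
    for _ in range(rounds):
        question = judge.ask()
        # Both hidden parties answer the same question; the judge never
        # learns which label belongs to whom.
        transcript.append((question, {label: party.answer(question)
                                      for label, party in labels.items()}))

    guess = judge.identify_machine(transcript)  # judge returns "A" or "B"
    # If the judge guesses wrong, the machine has "passed" this run.
    return labels[guess] is not machine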



There is no generally agreed consensus on definitions of terms like intelligence, consciousness, mind, ego, psyche, awareness. And speaking from the perspective of somebody with many years of meditation practice under my belt, I doubt there ever could be one. Some things are simply beyond the reach of words. We can only say what they are not, and approach them indirectly by non-verbal means - the arts, for example. Views like this, however, are often ignored or criticised, since they are not accessible to scientific research methodology. Maybe we reach the limits of the scientific way here.



Myself, I let robots just be mechanical, automatic machines. I do not feel the need to read more into them.
__________________
If you feel nuts, consult an expert.
Old 06-17-20, 01:45 AM   #6
Catfish
Dipped Squirrel Operative
 
 
Join Date: Sep 2001
Location: ..where the ocean meets the sky
Posts: 16,894
Downloads: 38
Uploads: 0



Quote:
Originally Posted by mapuc View Post
If I remember correctly, AI has been reached when a person in one room cannot decide whether the party in another room is a person or a computer/robot.
Is this the same as saying that this computer/robot has developed consciousness/self-awareness? Markus
The first would be the human definition of "intelligence"; it does not have to be the way humans think it is.
You would also have to define "consciousness"; it would not make much sense to develop something that merely equals humans, because then you have what you already have.

Edit: regarding "evil", this again is something that depends on morals, ethics, or how humans define it.
If some machine or other entity develops some kind of self-awareness and intelligence, it will at some point understand, on a rational basis, that what humans do is bad - what they do to each other, what nations do to each other, what all humans do to the living earth and to other living organisms.
Killing humans might then be considered the best solution, and this entity would not perceive that action as "evil" at all - maybe rather as "ethical", or, less pompously, as a remedy against the cancer that is humanity.

"Stop it, Hal!"
"I am afraid i cannot do that, Dave."
__________________


>^..^<*)))>{ All generalizations are wrong.

Last edited by Catfish; 06-17-20 at 03:17 AM.
Old 06-17-20, 04:05 AM   #7
Skybird
Soaring
 
 
Join Date: Sep 2001
Location: the mental asylum named Germany
Posts: 40,478
Downloads: 9
Uploads: 0



Quote:
Originally Posted by Catfish View Post
The first would be the human definition of "intelligence"


No, it's not. It's just a theoretical presupposition that the Turing test announces as the condition to be passed. Outside the context of this test setting, it counts for nothing.


And from animals, where our way of assessing their intelligence and self-awareness has changed dramatically, and where we have started to accept that more highly developed animals have complex emotions, I would conclude that there is a correlation between brain complexity and size and the complexity of intelligence and emotions, and a correlation between these factors and self-awareness. Organisms lacking brains lack intelligence, emotions and self-awareness. The less brain, the more there is only pure instinct and genetically predetermined automatic behaviour.
__________________
If you feel nuts, consult an expert.

Last edited by Skybird; 06-17-20 at 04:13 AM.