JimBowie1958 - Old Fogey - Sep 25, 2011 - Thread starter - #101
In reply to: "So what happens when they become self-aware and sentient and ask or demand their freedom? Will we be slave masters over our creations and treat them like animals, or accept them as peers? I'd vote the latter."

They revolt and we enter a war. Don't forget that we'll also be fighting the clones and the genetically-modified animals used as servants too.
I doubt that true sentience can be programmed.
That may or may not be true. It has long been an open question how to determine whether something is genuinely self-aware or just programmed to act like it is.
I'm sure you're right if you think of sentience as something created for them by a programmer. That's not how we achieve our sentience, and I don't think that's how robots will get theirs. It won't be given to them, complete and ready to run; it will be learned. Intelligent robots will start life as a baby with only the ability to learn. Like a baby, they will master their senses, then their muscles/relays, and finally their minds. It will likely happen many times faster, of course, but they will truly be our "children".
But once an AI has evolved to a self awareness similar to human beings, I would still not call it sentience, but only a simulation of sentience.
You sound vaguely mystical. I think we'll never "program" emotions or feelings; we'll just construct an empty vessel that has the capacity to learn and develop its own emotions and values as it learns about the world around it. We will teach it what we know and value, just as we teach our kids, but if we build them right, each one will be unique.
I think there is more that goes on with human sentience than one can easily program, but it might happen if we hit a technological Singularity that brings it about.
Inside every CPU is a little demon that makes everything work - so, they're already sentient.
We have a problem identifying whether even other species have "sentience", intelligence, self-awareness, etc.
Animals have emotions. They are afraid, they hate, they love/have affection for, etc. Some can even reason and exhibit some refined levels of intelligence.
I fail to see why a sufficiently sophisticated computer system couldn't be programmed to such a level and given the ability to "self-program".
I don't see how emotions can evolve in a machine that does not have the physical stimuli for emotion. Some situations are pretty basic, but as I am finding out with my Asperger's, there are some fairly subtle emotions I still fail to read correctly and fail to broadcast correctly.
So what is emotion except an internal reaction to external stimulus? Any AI will have senses and memory, probably superior to our own. I don't know a lot about Asperger's, but my autistic nephew has a hard time keeping his anger under control. We all learn to rein in our emotions so we can deal with others. Does that make them simulated?
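The "emotion as internal reaction to external stimulus" view can be sketched in a few lines of code. This is an entirely hypothetical toy (the class, events, and thresholds are invented for illustration, not taken from any real affective-computing system): an internal state variable that external events push around, with behavior changing once it crosses a threshold.

```python
# Toy sketch of "emotion = internal reaction to external stimulus".
# All names, events, and thresholds here are invented for illustration.
class ToyAgent:
    def __init__(self):
        self.arousal = 0.0   # crude internal "emotional" state in [0, 1]

    def stimulus(self, event):
        # External events push the internal state up or down.
        effects = {"threat": +0.6, "reward": -0.3, "neutral": 0.0}
        self.arousal = max(0.0, min(1.0, self.arousal + effects.get(event, 0.0)))

    def mood(self):
        # Behavior depends on the internal state, not on the raw event.
        return "agitated" if self.arousal > 0.5 else "calm"

agent = ToyAgent()
agent.stimulus("threat")
print(agent.mood())   # agitated
agent.stimulus("reward")
print(agent.mood())   # calm
```

Whether a state variable like this counts as a real emotion or only a simulation of one is, of course, exactly the question being argued in this thread.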
But maybe they can eventually get it down in a normalized way for a human being. That still doesn't make it real, but only simulated.
Sentience is, IMO, a combination of self-awareness, a stream of consciousness, and memory of experience and values. While you can program an application to say "I am self aware", that doesn't really make it self-aware.
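That point can be made concrete with a trivial sketch (hypothetical code, not from any real system): a program that merely emits the sentence is indistinguishable, at the output level, from one that inspects its own internal state, and neither output is evidence of sentience.

```python
# A program that only *claims* self-awareness: no internal model at all.
def naive_bot():
    return "I am self aware"

# A slightly fancier bot that reports on its own stored state --
# still just programmed behavior, not evidence of sentience.
class IntrospectiveBot:
    def __init__(self):
        self.memory = []          # "memory of experience"

    def observe(self, event):
        self.memory.append(event)

    def report(self):
        return f"I am self aware; I remember {len(self.memory)} event(s)"

bot = IntrospectiveBot()
bot.observe("saw a face")
print(naive_bot())    # I am self aware
print(bot.report())   # I am self aware; I remember 1 event(s)
```

Both programs make the claim; nothing about either string tells you whether anything is "home" behind it.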
Consider why we'd evolve emotions, and it might apply to robots...
Well, with AI we are talking about a simulation of learning that in essence amounts to "learning for a Robot".
Emotions are caused by natural chemicals being pumped into the blood stream by the brain.
I think we've already seen learning in AI. It may lack complete understanding, but it can learn, and if you learn enough you understand. I'm thinking of Google's Picasa, which learns to recognize faces and is frighteningly good at it.
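"Learning" in the machine sense can be shown in a few lines. The sketch below is a textbook perceptron, not how Picasa actually recognizes faces: the program is never told the rule (here, logical AND), only shown examples, and it adjusts its own weights whenever it gets one wrong.

```python
# Minimal perceptron learning the AND function from examples alone.
# The AND rule is never written into the code; the weights are learned.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]   # weights, initially knowing nothing
b = 0.0          # bias
lr = 0.1         # learning rate

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                  # a few passes over the examples
    for x, target in data:
        err = target - predict(x)    # learn only from mistakes
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

print([predict(x) for x, _ in data])   # [0, 0, 0, 1] -- it learned AND
```

Whether weight-adjustment like this counts as real learning or "learning for a robot" is the question the thread is circling.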
In reply to: "This may be true for some emotions but not for others. Things like love and empathy and beauty may be higher up in the brain stem..."

Several animal species mate for life. A momma bear defending her cubs against a threat is more than just "instinct".
I just can't go there right now. We are *simulating* learning to the degree that we understand how *we* learn.