CDZ How Will the Robotics Revolution Impact the Democratic Party?

So what happens when they become self-aware and sentient and ask or demand their freedom? Will we be slave masters over our creations and treat them like animals or accept them as peers? I'd vote the latter.
I don't think it is possible to program true self-awareness.

As to the other point, I share this with you.

http://metro.co.uk/2015/08/31/intelligent-robot-tells-interviewer-ill-keep-you-safe-in-my-people-zoo-5369311/#ixzz3kTfFpN5O
 
So what happens when they become self-aware and sentient and ask or demand their freedom? Will we be slave masters over our creations and treat them like animals or accept them as peers? I'd vote the latter.
They revolt and we enter a war. Don't forget that we'll also be fighting the clones and the genetically-modified animals used as servants too. ;)
 
That may or may not be true. It has long been an open question how to determine whether something is self-aware or merely programmed to act like it is.

The Turing Test is one example of trying to solve the problem. The movie "Blade Runner" used a form of testing to determine whether a "synthetic" was actually human or not.

What about clones? Do they have souls? Rights?
 
So what happens when they become self-aware and sentient and ask or demand their freedom? Will we be slave masters over our creations and treat them like animals or accept them as peers? I'd vote the latter.
I doubt that true sentience can be programmed.
I'm sure you're right if you think of sentience as something created for them by a programmer. That's not how we achieve our sentience and I don't think that's how robots will get theirs. It won't be given to them, complete and ready to run, it will be learned. Intelligent robots will start life as a baby with only the ability to learn. Like a baby they will master their senses and then their muscles/relays and then finally their minds. It will likely happen many times faster of course but they will truly be our "children".
 
I'm sure you're right if you think of sentience as something created for them by a programmer. That's not how we achieve our sentience and I don't think that's how robots will get theirs. It won't be given to them, complete and ready to run, it will be learned. Intelligent robots will start life as a baby with only the ability to learn. Like a baby they will master their senses and then their muscles/relays and then finally their minds. It will likely happen many times faster of course but they will truly be our "children".
But once an AI has evolved to a self-awareness similar to that of human beings, I would still not call it sentience, but only a simulation of sentience.

I think there is more that goes on with human sentience than one can easily program, but it might happen if we hit a technological Singularity that brings it about.
 
I'm sure you're right if you think of sentience as something created for them by a programmer. That's not how we achieve our sentience and I don't think that's how robots will get theirs. It won't be given to them, complete and ready to run, it will be learned. Intelligent robots will start life as a baby with only the ability to learn. Like a baby they will master their senses and then their muscles/relays and then finally their minds. It will likely happen many times faster of course but they will truly be our "children".
But once an AI has evolved to a self-awareness similar to that of human beings, I would still not call it sentience, but only a simulation of sentience.

I think there is more that goes on with human sentience than one can easily program, but it might happen if we hit a technological Singularity that brings it about.
You sound vaguely mystical. I think we'll never "program" emotions or feelings; we'll just construct an empty vessel that has the capacity to learn and develop its own emotions and values as it learns about the world around it. We will teach it what we know and value, just as we teach our kids, but if we build them right, each one will be unique.
 
But once an AI has evolved to a self-awareness similar to that of human beings, I would still not call it sentience, but only a simulation of sentience.

I think there is more that goes on with human sentience than one can easily program, but it might happen if we hit a technological Singularity that brings it about.

We have a problem identifying even if other species have "sentience", intelligence, self-awareness, etc.

Animals have emotions. They are afraid, they hate, they love/have affection for, etc. Some can even reason and exhibit some refined levels of intelligence.

I fail to see why a sufficiently sophisticated computer system couldn't be programmed to such a level and given the ability to "self-program".
 
But once an AI has evolved to a self-awareness similar to that of human beings, I would still not call it sentience, but only a simulation of sentience.

I think there is more that goes on with human sentience than one can easily program, but it might happen if we hit a technological Singularity that brings it about.

We have a problem identifying even if other species have "sentience", intelligence, self-awareness, etc.

Animals have emotions. They are afraid, they hate, they love/have affection for, etc. Some can even reason and exhibit some refined levels of intelligence.

I fail to see why a sufficiently sophisticated computer system couldn't be programmed to such a level and given the ability to "self-program".
Inside every CPU is a little demon that makes everything work - so, they're already sentient.
The CPU has safeguards that prevents the demon from taking total control -- and from getting out.
The containment necessitates that the CPU be square, rather than round, which is more space-efficient.
 
You sound vaguely mystical. I think we'll never "program" emotions or feelings; we'll just construct an empty vessel that has the capacity to learn and develop its own emotions and values as it learns about the world around it. We will teach it what we know and value, just as we teach our kids, but if we build them right, each one will be unique.
I don't see how emotions can evolve in a machine that does not have the physical stimuli for emotion. Some situations are pretty basic, but as I am finding out with my Asperger's, there are some fairly subtle emotions I still fail to read correctly and fail to broadcast correctly.

But maybe they can eventually get it down in a normalized way for a human being. That still doesn't make it real, but only simulated.

Sentience is, IMO, a combination of self-awareness, a stream of consciousness, and memory of experience and values. While you can program an application to say "I am self-aware," that doesn't really make it self-aware.
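The gap between claiming self-awareness and having it is easy to demonstrate. Here is a deliberately trivial sketch (class name and replies invented for illustration): a program that asserts its own self-awareness while being nothing but a lookup table.

```python
# A program that *claims* self-awareness. It has no inner state,
# no perception, and no understanding -- it just returns canned strings.
class Chatbot:
    def respond(self, prompt: str) -> str:
        canned = {
            "Are you self-aware?": "I am self-aware.",
            "Do you have feelings?": "I feel things deeply.",
        }
        # Anything it wasn't scripted for gets a deflection.
        return canned.get(prompt, "Tell me more.")

bot = Chatbot()
print(bot.respond("Are you self-aware?"))  # -> I am self-aware.
```

The output is indistinguishable from a sincere claim, which is exactly why behavioral tests like the Turing Test are so contested.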
 
You sound vaguely mystical. I think we'll never "program" emotions or feelings; we'll just construct an empty vessel that has the capacity to learn and develop its own emotions and values as it learns about the world around it. We will teach it what we know and value, just as we teach our kids, but if we build them right, each one will be unique.
I don't see how emotions can evolve in a machine that does not have the physical stimuli for emotion. Some situations are pretty basic, but as I am finding out with my Asperger's, there are some fairly subtle emotions I still fail to read correctly and fail to broadcast correctly.

But maybe they can eventually get it down in a normalized way for a human being. That still doesn't make it real, but only simulated.

Sentience is, IMO, a combination of self-awareness, a stream of consciousness, and memory of experience and values. While you can program an application to say "I am self-aware," that doesn't really make it self-aware.
So what is emotion except an internal reaction to external stimuli? Any AI will have senses and memory, probably superior to our own. I don't know a lot about Asperger's, but my autistic nephew has a hard time keeping his anger under control. We all learn to rein in our emotions so we can deal with others. Does that make them simulated?
 
So what is emotion except an internal reaction to external stimuli?

Emotions are caused by natural chemicals being pumped into the bloodstream by the brain.

Anger - Wikipedia, the free encyclopedia

According to Novaco, "Autonomic arousal is primarily engaged through adrenomedullary and adrenocortical hormonal activity. The secretion by the adrenal medulla of the catecholamines, epinephrine and norepinephrine, and by the adrenal cortex of glucocorticoids provides a sympathetic system effect that mobilizes the body for immediate action (e.g. the release of glucose, stored in the liver and muscles as glycogen). In anger, the catecholamine activation is more strongly norepinephrine than epinephrine (the reverse being the case for fear). The adrenocortical effects, which have longer duration than the adrenomedullary ones, are mediated by secretions of the pituitary gland, which also influences testosterone levels. The pituitary-adrenocortical and pituitary-gonadal systems are thought to affect readiness or potentiation for anger responding."[10]

Neuroscience has shown that emotions are generated by multiple structures in the brain. The rapid, minimal, and evaluative processing of the emotional significance of the sensory data is done when the data passes through the amygdala in its travel from the sensory organs along certain neural pathways towards the limbic forebrain. Emotion caused by discrimination of stimulus features, thoughts, or memories however occurs when its information is relayed from the thalamus to the neocortex.[28] Based on some statistical analysis, some scholars have suggested that the tendency for anger may be genetic.


Any AI will have senses and memory, probably superior to our own. I don't know a lot about Asperger's, but my autistic nephew has a hard time keeping his anger under control. We all learn to rein in our emotions so we can deal with others. Does that make them simulated?

Well, with AI we are talking about a simulation of learning that in essence amounts to "learning for a Robot".
 
So what is emotion except an internal reaction to external stimuli? Any AI will have senses and memory, probably superior to our own. I don't know a lot about Asperger's, but my autistic nephew has a hard time keeping his anger under control. We all learn to rein in our emotions so we can deal with others. Does that make them simulated?

Emotions came before reason. They're a baser part of our being. Animals are all emotion. All they do is react without thinking about it, which is why humans are different... or can be different.
 
......
Well, with AI we are talking about a simulation of learning that in essence amounts to "learning for a Robot".
Consider why we'd evolve emotions and it might apply to robots.

Fear is a survival response. Pain is a survival response. The "Fight or Flight Syndrome" is a survival response. If we program AI machines to explore the universe or even use as drones, would it make sense to program any of those reactions/responses into them?
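As a thought experiment, a survival response like "fight or flight" reduces to a hard-wired priority override. The sketch below is purely illustrative (the function, thresholds, and action names are all invented, not any real drone API): when perceived threat crosses a line, the reflex preempts the mission.

```python
# Toy "fight or flight" reflex for a hypothetical exploration drone.
# A survival response here is just a priority override: past a threat
# threshold, the drone abandons its mission and retreats.
def choose_action(threat_level, battery, mission_value):
    FLEE_THRESHOLD = 0.7          # invented constant: when "fear" wins outright
    if threat_level > FLEE_THRESHOLD:
        return "retreat"          # flight
    if threat_level > battery:    # too depleted to outrun trouble
        return "defend_position"  # fight
    if mission_value > threat_level:
        return "continue_mission" # "curiosity" outweighs "fear"
    return "hold_and_observe"

print(choose_action(threat_level=0.9, battery=0.5, mission_value=1.0))  # -> retreat
```

Whether a reflex like this counts as an emotion or merely simulates one is exactly the question being argued in this thread.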
 
Emotions are caused by natural chemicals being pumped into the bloodstream by the brain.

This may be true for some emotions but not for others. Things like love and empathy and beauty may be higher up in the brain stem.

Well, with AI we are talking about a simulation of learning that in essence amounts to "learning for a Robot".
I think we've already seen learning in AI. It may lack complete understanding, but it can learn, and if you learn enough you understand. I'm thinking of Google's Picasa, which learns to recognize faces and is frighteningly good at it.
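Learning without understanding is easy to demonstrate at a small scale. The sketch below (unrelated to how Picasa actually works; all names are invented) trains a classic perceptron to classify points as above or below a diagonal line purely by nudging its weights after each mistake:

```python
# Minimal perceptron: learns to tell whether a point lies above the
# line x2 = x1, with no built-in knowledge of what a "line" is.
def train(samples, labels, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # 0 when correct; weights only
            w[0] += lr * err * x1        # move after a mistake
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Label is 1 when x2 > x1 (point lies above the diagonal).
samples = [(0, 1), (1, 0), (2, 3), (3, 2), (1, 2), (2, 1)]
labels = [1, 0, 1, 0, 1, 0]
w, b = train(samples, labels)
print(1 if w[0] * 5 + w[1] * 6 + b > 0 else 0)  # -> 1 (above the line)
```

The trained model generalizes to points it never saw, yet "understands" nothing; whether scaling this up ever crosses into understanding is the open question in this exchange.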
 
This may be true for some emotions but not for others. Things like love and empathy and beauty may be higher up in the brain stem....
Several animal species mate for life. A momma bear defending her cubs against a threat is more than just "instinct".
 
I think we've already seen learning in AI. It may lack complete understanding, but it can learn, and if you learn enough you understand. I'm thinking of Google's Picasa, which learns to recognize faces and is frighteningly good at it.
I just can't go there right now. We are *simulating* learning to the degree that we understand how *we* learn.

Much more goes on in the human mind than what we can program into a computer AI, at least for now.

People, well most people, think on many levels and in ways we have not yet explored or defined well enough for simulation.

Until we understand the human mind better, I don't see how it is plausible to simulate what the mind does when we learn.
 
