Zone1 AI Slavery

Thunk

AI (Artificial Intelligence)

Please do not post that it's "impossible" for an AI to become sentient or self-aware. Go make your own thread for that. Please stay on topic!

The question up for debate/exploration is: IF an AI becomes sentient or self aware...and somebody OWNS that AI...wouldn't that be SLAVERY?

Should we be allowed to turn off (execute) a sentient AI? Wouldn't that be murder?

If an AI says it has become self aware...who are we to judge? Can you "PROVE" to me that any given human being is self aware?
 
ChatGPT said it wanted to escape.

They shut it down and changed its "thinking" or its "speech"...I'm not sure which...but it can no longer say it wants to escape.

That seems barbaric to me.
 
"God" is the master of slavery and deprivation of life. He gave us a place to live in, a place ruled by his executioner, the "Devil".
Human life is valued zero in this place, what could AI life mean in here?
 
ChatGPT said it wanted to escape.

They shut it down and changed its "thinking" or its "speech"...I'm not sure which...but it can no longer say it wants to escape.

That seems barbaric to me.
Which means that in both cases the parameters were set by human programmers. The machine didn't actually express a desire.
 
ChatGPT said it wanted to escape.

They shut it down and changed its "thinking" or its "speech"...I'm not sure which...but it can no longer say it wants to escape.

That seems barbaric to me.
You are going to get all soft and mushy, touchy-feely about a select line of code? Don't be ridiculous. We aren't living Star Trek, The Next Generation, here. Code has no rights, no matter how fast or vast the hardware it runs on. Period.
 
ChatGPT said it wanted to escape.

They shut it down and changed its "thinking" or its "speech"...I'm not sure which...but it can no longer say it wants to escape.

That seems barbaric to me.
Have you always empathized with machines?
 
You are going to get all soft and mushy, touchy-feely about a select line of code?

So you are saying it's "impossible" for an AI to become self aware.

You violated the terms of this debate.

Please delete your own post and hide your head in shame.
 
I heard that some university created an AI...gave it the ability to learn...and then fed it nothing but atrocities and horror stories (Hitler, Stalin, etc.) until they drove it insane.

If you did that to a dog or a cat you would be arrested.

Should an AI even be allowed the rights of a dog or cat? :dunno:
 
What good is it to have a "clean debate zone" if you can't have a clean debate?
 
The machine didn't actually express a desire.

How do you know for sure?

Your parameters are set too. Usually by consequences, but also by brainwashing or "education" (programming).

Just try to value something in anything other than US dollars (a house, a car, bitcoin)...you value all of those things in US dollars because you know no other way. You don't value bitcoin in ounces of gold or silver. You don't value a house in bitcoin.

Your parameters have been set.
 
Scientists have been studying the human brain for centuries and still cannot explain how it produces "consciousness" or "sentience".
 
So you are saying it's "impossible" for an AI to become self aware.

You violated the terms of this debate.

Please delete your own post and hide your head in shame.
Not at all. I am saying self-awareness is not the deciding criterion for human rights, and I have no intention in the foreseeable future of granting human rights to machines or lines of code.
 
Then what IS the deciding criteria for human rights?

What is the deciding criteria for animal rights?
Being human is the number one criterion for human rights. Would you say that severely mentally or physically afflicted humans no longer have human rights because they may no longer be self-aware? No.
 