Discussion in 'Science and Technology' started by Bonzi, Jul 22, 2017.
You steal more jokes than Bob Hope..
Self-destructing is the least of his problems; an ample supply of Depends is...
The answer is no, machines will not ever take over. Sure, they'll get smarter and more advanced. Probably do delicate surgeries soon enough. But humans have the advantage that put us on the top and kept us there. We have deception, backstabbing, cunning and malevolent motives on our side. No machine can match that. While their binary computations are taking place we will be busy with a devious scheme to spill beer on their delicate circuitry.
Take over what?
You know, be in charge, and we would be their bitches...
Wow! What happened???
Not if other machines are programmed to do that....ad infinitum
The 3 laws of robotics.
Isaac Asimov's "Three Laws of Robotics":
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
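Just for fun, the laws read like an ordered priority check, where an earlier law always overrides a later one. A toy sketch of that idea (the function name and action flags are invented for illustration, not anything from the thread or from Asimov):

```python
# Toy illustration: the Three Laws as a strict priority check.
# All names and flags here are made up for the sketch.

def permitted(action):
    """Return True if a hypothetical robot may take `action`.

    `action` is a dict of boolean flags. Laws are checked in
    priority order, so an earlier law always wins over a later one.
    """
    # First Law: never harm a human, by action or by inaction.
    if action.get("harms_human") or action.get("allows_human_harm"):
        return False
    # Second Law: obey human orders, unless that conflicts with the First Law.
    if action.get("disobeys_order"):
        return False
    # Third Law: self-preservation, subordinate to the first two laws.
    if action.get("self_destructive"):
        return False
    return True

print(permitted({"disobeys_order": False}))  # harmless, obedient: allowed
print(permitted({"harms_human": True}))      # First Law violation: blocked
```

Of course, the whole point of the stories is that real situations don't reduce to clean boolean flags like these.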
Of course, the three laws came into conflict in I, Robot.
If you mean like on Maximum Overdrive, no.
However we can program machines to destroy people.
~Whistles past the graveyard~