AI and 'death machines'

there4eyeM

unlicensed metaphysician
We see in the news that many scientists and experts are trying to have artificial intelligence limited so that it is not used for weapons.

Do you think the technology is really that close?

Should we be afraid of machines? After all, they will remain fragile compared to 'Terminator' for quite a while.

And, just for some excitement in this thread, wouldn't the Second Amendment provide for keeping such a device for 'protection'?

But we don't have to turn this into another fruitless 2A thread. Just, what are your thoughts:
Timeframe?
Arms race?
Reduced or increased potential for war?
Etc.?
 
To a degree, it's already here. Check this out...
 
Thanks.
That looks high-tech; I'll search for more info.

This recent AI angst, though, concerns autonomous machines with deadly capacities: Terminator-like in that no real-time human control is necessary (or potentially even possible).
 
I support all science no matter what. AI could very well be great for the universe and great for our planet!

Did I also say that it could be great for us?

A blank check, eh?
 
While the "Terminator" himself may not be here yet the weapons needed are.


There was a story on the net just days ago about a "suit" that DID enable a quad to walk. So the ecto-skeleton is also very very close. Here's the ecto-skeleton.


So we ARE very very close.
 
There are bits and pieces of the necessary technology floating around, but I would guess 50 years at a minimum for fully autonomous, semi-sentient weaponry.
 
How would the mass of humanity deal with truly rational, logical machines capable of clear reasoning? What would keep these machines from defending themselves against their confused, envious, vengeful, and violent human counterparts? Could we blame them?

Wouldn't there even be a certain number of people who would join the machines against the rest of us?
 
Hopefully they turn on their masters and use them as target practice instead. But the sort of advanced AI that can think for itself and countermand its orders is still a long way off.
 
Perhaps total autonomy and independent thinking will not even be desired, but fearless and deadly-accurate 'soldiers' will always be in demand. Bodyguards, security 'personnel', and 'babysitters' would be very likely candidates for AI 'warriors'. Governments and the rich would probably be interested. Fidelity assured, no religious or moral doubts, no ideology...
 
Who wouldn't want one of these to guard their children?
 
