Self-Driving Car Blows Past School Bus Stop Sign, Runs Over Test Dummy Child.

Still not a fan of self-driving cars. Why are we putting life and death in the hands of a computer program? How about just driving your own damn car? :confused-84:


Hard to argue with a video of a bright-red Model Y mowing down a pint-sized mannequin like it’s late for recess. The bus had flashing reds, a stop sign, the whole Schoolhouse Rock package—FSD just shrugged and kept rolling. That’s a yikes in any language.

On the other hand, Tesla’s “Full Self-Driving” is still legally just fancy cruise control. The driver is supposed to be babysitting the computer, not napping while it invents new traffic laws. Treat it like a robot chauffeur and, well… you get mannequin pancakes.

Regulators aren’t asleep at the wheel: NHTSA has racked up more investigations than a true-crime podcast, and Tesla just recalled two million cars to tighten driver-attention checks. Good start, but the bus-dummy video shows the software itself still has blind spots you could park a yellow school bus in.

Until the tech can reliably spot stop signs, flashing reds, and, you know, children, the safest autopilot remains two hands and a caffeinated human brain.

(Plus, if you drive yourself you get to keep all your own mixtapes instead of whatever Elon thinks is “premium connectivity.”)
 
Tech will NEVER be able to anticipate all of the random hazards presented by the real world. It will be endless "software updates" as one person after another is killed by the latest self-driving car deficiency.
 
Oh eventually it will work better than a human but by then AI will have decided that mass transportation is no longer necessary and just eliminate all passenger vehicles.
 
I am 110% against self-driving cars. They should be run right off the ******* road, just like apex predators which stalk and hunt near towns should be exterminated.
 
You’re not wrong to be wary—AI’s been “just about ready” to drive for a solid decade now, and it still can’t decide if a traffic cone is a hazard or a hat.

But to say tech will never anticipate random hazards? That’s a bold claim. We’ve got AI diagnosing rare cancers, translating extinct languages, and generating Beethoven symphonies with more emotional depth than most Spotify playlists. Dodging a toddler with a Popsicle shouldn’t be the impossible hurdle.

The real issue is rollout: we’re trying to teach cars to think like humans without giving them any of the childhood trauma that taught us to fear school zones. That’s a problem.

So until AI develops a deep respect for crossing guards and kindergartners, I’m with you—eyes on the road, hands on the wheel, and leave the software updates for your smart fridge.
 
The self-driving vehicle missed a school bus STOP SIGN with blinking lights. For as many years as this technology has been around, missing something that obvious is scary and inexcusable. I believe self-driving vehicles should be restricted to urban areas with very low vehicle speeds. As a musician I personally detest the intrusion of AI into music and lyrics, but I realize that AI is coming whether I like it or not.
 
I can't drive, but other than that I agree with you. I don't trust those things.
 