Tesla driver that crashed into police car was warned by Autopilot 150 times

1srelluc

On February 27, 2021, a 2019 Tesla Model X that was reportedly on Autopilot crashed into a police vehicle at 54 mph. Five police officers who were conducting a routine traffic stop were injured as a result of the crash. The driver who had been pulled over by the police was also injured.

Reports have indicated that the driver of the Model X was intoxicated at the time of the incident. Nevertheless, the five police officers who were injured have filed legal action against the electric vehicle maker. The lawsuit alleges that Tesla has not done enough to address issues with its Autopilot driver-assist system, and the officers are seeking between $1 million and $20 million in damages for injuries and permanent disabilities.


Tesla should play the cops' own game and claim QI from the lawsuit. :laughing0301:

Tesla should face some culpability for making dumbasses believe "autopilot" in a car works..... It's either autopilot or it isn't..... And it isn't.

Besides that, you would think after so many warnings the car's AP would just slow down, pull over, and shut off.
 
might be why they also call it "driver assist"

 
I don't get it... could the autopilot see two minutes into the future?
Even at one warning per second, 150 warnings would be two and a half minutes of warnings.
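To put numbers on it, here's a quick back-of-the-envelope sketch; the warning intervals are just assumptions, since the article doesn't say how often Autopilot nags the driver:

```python
# Back-of-the-envelope: how long 150 unanswered warnings would take at a
# few assumed warning rates (the rates are assumptions, not from the article).
WARNING_COUNT = 150

for seconds_between_warnings in (1, 5, 10):
    total_seconds = WARNING_COUNT * seconds_between_warnings
    print(f"1 warning every {seconds_between_warnings} s "
          f"-> {total_seconds / 60:.1f} minutes of ignored warnings")

# Output:
# 1 warning every 1 s -> 2.5 minutes of ignored warnings
# 1 warning every 5 s -> 12.5 minutes of ignored warnings
# 1 warning every 10 s -> 25.0 minutes of ignored warnings
```

Either way you slice it, that's a long stretch of a driver not responding.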
 
Which is it? Either the car can drive on autopilot and be safe or it can't.

You CAN get a ticket for driving on autopilot, especially if you are sleeping or reading a book; we have seen many examples of this on YouTube and on the news. People were outraged that it was happening and even said how unsafe it was. Yet there are people who swear by its safety and say that they don't need to pay attention while on autopilot.

Or, they can say that autopilot is a driver-assist program that must be used only when the driver is able to quickly take control of the car if needed (which is what Tesla says).

You can't have it both ways y'all. Instead of suing Tesla, they should be suing the driver of the vehicle.

They're only going after Tesla because Tesla has deeper pockets than the driver.
 
do me a favor and replace the words auto pilot with AR-15 and tell me if you have the same opinion??

is it the tool or the person at fault??
 
AP will never be viable unless manufacturers are shielded from lawsuits....You know, like the clot-shot companies. ;)
 
Hell, that ship has sailed.....Everyone knows ARs grow little hands and feet, break out of the safe, load themselves and then go on killing sprees.....Haven't you been keeping up with the MSN? ;)
 
great spin you got there,,

they should dump auto pilot and call it what it is,, driver assist,, problem solved,,
 

Right. Where I feel Tesla's culpability lies is this: when autopilot issues several warnings (much less 150 of them) and gets no response from the driver, it should pull the car over and stop. For all the AP knew, the driver might have been having a heart attack.
 
if it can stop on a dime for a pedestrian it should be able to stop for no driver,,

but in the end the driver holds responsibility,,
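
For what it's worth, that "warn, then slow down, then pull over" idea is basically a simple escalation policy. Here's a rough Python sketch of it; the thresholds and names are made up for illustration and have nothing to do with Tesla's actual code:

```python
# Hypothetical driver-monitoring escalation policy: warn first, then reduce
# speed, then pull over and stop if the driver never responds.
# Thresholds are invented for illustration only.
from dataclasses import dataclass

WARN_LIMIT = 3        # unanswered warnings before the car starts slowing down
PULL_OVER_LIMIT = 10  # unanswered warnings before pulling over and stopping

@dataclass
class MonitorState:
    unanswered_warnings: int = 0

def on_attention_check(state: MonitorState, driver_responded: bool) -> str:
    """Return the action to take after each periodic attention check."""
    if driver_responded:
        state.unanswered_warnings = 0
        return "continue"
    state.unanswered_warnings += 1
    if state.unanswered_warnings >= PULL_OVER_LIMIT:
        return "pull_over_and_stop"
    if state.unanswered_warnings >= WARN_LIMIT:
        return "slow_down_and_warn"
    return "warn"

# Example: a driver who never responds reaches "pull_over_and_stop"
# after 10 checks, long before warning number 150.
state = MonitorState()
for check in range(1, 151):
    action = on_attention_check(state, driver_responded=False)
    if action == "pull_over_and_stop":
        print(f"check {check}: {action}")
        break
```

The point of the sketch is just that an unresponsive driver would hit the pull-over step after a handful of checks, nowhere near 150.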
 
Anyone trusting the software on driver assist is insane in my book. With the amount of technical glitches in every brand of vehicle I will certainly never trust them.
Add to that, most sensors become useless in inclement weather and can't detect black ice.

Want a vehicle you don't need to drive?

Take the bus.
 
No but a couple years ago I drove all the way across Texas which I bet is farther. As you've probably guessed I'm just old school.

I drove down to san antonio once,, didnt seem as bad as kansas,,

On this debate, I gotta give it to Progressive, sorry Mike.

When I ran the MEPS in Amarillo, I had to take a trip down to San Antonio about once every couple of months. The Navy gave me a gov't vehicle to drive, and it was around 550 miles each way. But, the drive wasn't that bad, as there were mountains and things to look at along the way.

I've also driven across Kansas (had to do that a couple of times when going on leave), and I can tell you that driving across that state (as well as a couple of others) IS BORING AS HELL. Roads are straight, flat, and there is nothing there to see. I hate driving in areas like that as they seem to wear you out more than if you have roads with curves, hills and the occasional town.

Even driving through Montana, as remote and unpopulated as it is, is preferable to driving through Kansas. At least there, you have some hills and the occasional creek or river. And, if you're on the western half, there are beautiful places to check out (if you don't mind driving along roads with drop-offs of several hundred feet; Rogers Pass is a good example).

Kansas is boring as hell to drive through.
 
Well, this just goes to prove that the old adage is true: "You can't idiot-proof the world."
 
So what I would like to know is: what is the status of an intoxicated driver who can't legally operate a motor vehicle but is behind the wheel of a "self-driving" car? Is he legally operating the vehicle that he routed into the stopped police vehicles? Is the vehicle legally under his control, or should he just have called an Uber?
 
