bripat9643
Diamond Member
- Apr 1, 2011
It's actually very close. These cars have driven hundreds of thousands of miles on our roads, and this is the first time any human has been harmed. Passengers in driverless cars aren't supposed to have to watch the road. I, for one, welcome our new self-driving overlords.
"The victim did not come out of nowhere. She's moving on a dark road, but it's an open road, so Lidar (laser) and radar should have detected and classified her" as a human, said Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles.
Smith said the video may not show the complete picture, but "this is strongly suggestive of multiple failures of Uber and its system, its automated system, and its safety driver."
Experts: Uber self-driving system should have spotted woman
Even though the video camera doesn't show the woman crossing the street until it's too late, I suspect the driver's naked eye, which is more sensitive than the camera, could have seen her crossing the street sooner. But the Hispanic felon in the driverless car wasn't watching the road.
I also suspect there's something about the scene that foiled the lidar, even though I would have expected the driverless car to be able to "see" into the darkness.
Either way, the woman crossing the road is at fault for crossing in the dark in front of an oncoming car that had the right of way.
And you validate my point. Technology is nowhere near being able to mimic the decision-making process of the human mind.