Science & Technology
Crashes Involving Tesla's Driver-Assistance System Are Far More Numerous Than Originally Reported
2023-06-11
[PJ] Tesla's Autopilot system is far more dangerous than previously reported. In June 2019, regulators reported only 217 crashes involving the driver-assistance system, with three fatalities.

But a recent analysis by the Washington Post shows that since 2019 there have been 736 crashes directly linked to Autopilot, including 17 fatalities. These figures have raised questions about how safe the driver-assistance system really is.

Tesla CEO Elon Musk insists the system is safe and that he has a "moral obligation" to deploy it.

"At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people," Musk said last year. "Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know — or their state does."

But former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, isn’t so sure.

"Tesla is having more severe — and fatal — crashes than people in a normal data set," she said in response to the figures analyzed by the Post. She said that one likely cause is the expanded rollout over the past year and a half of Full Self-Driving. This brings driver assistance to city and residential streets. "The fact that ... anybody and everybody can have it. ... Is it reasonable to expect that might be leading to increased accident rates? Sure, absolutely."
Posted by: Besoeker

#5  It would be nice to type an address into an onboard computer then sit back and let the car take you there. Too bad. It would seem the vaunted AI is not yet ready for prime time.
Posted by: Abu Uluque   2023-06-11 18:47  

#4  You can feed the AI all the training data you want. There will always be new corner cases, or combinations thereof, that the AI can't handle. Even if/when emergent properties arise in the AI, any fault will now be the liability of the car maker, not the driver. The lawyers will have a field day.
Posted by: Enver Slager8035   2023-06-11 16:17  

#3  Yeah. It's like nobody has bothered to research why some things that people across a broad range of the intelligence spectrum can do with relative ease still can't be done by even powerful computing hardware.

PhD dissertation in there somewhere...
Posted by: M. Murcek   2023-06-11 13:25  

#2  Which is genuinely too bad. Some of us older people aren't that far away from having to give up our driver's licenses, and self-driving cars would be a godsend.
Posted by: Tom   2023-06-11 13:15  

#1  Why would Big Tech throttle the news of the failure of Big Tech?
Posted by: Super Hose   2023-06-11 07:47  
