Tesla “FSD and Autopilot” Crash More Than All Other Driver-Assist Products Combined

The big reveal from a new Washington Post report is that Tesla engineering produces a higher frequency of more severe crashes than any other car brand, and the trend is quickly getting worse.

Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.

“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post.

This is precisely what I’ve been saying here and in conference presentations since 2016. I’m reminded of the graphic I made a while ago comparing Tesla deaths with other electric cars on the road.

Source: tesladeaths.com

And here’s a detail far too often overlooked: Tesla drivers are significantly overconfident in their car’s ability to drive itself because of what Tesla’s CEO tells them to think. There’s no other explanation for why they do things like this:

Authorities said Yee had fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands: Autopilot disables the functions if steering pressure is not applied after an extended amount of time.

Yee apparently really, really wanted to believe that Elon Musk wasn’t lying about car safety. He removed the required human oversight from his Tesla (oversight that is far too easy to circumvent in the first place) because the con artist in charge of Tesla somehow convinced him it made sense to significantly raise the chances of a dangerous crash.

And then, very predictably, Yee crashed.

Worse, also predictably, he caused serious harm.

Yee’s “trick” meant his Tesla crashed at high speed through bright flashing red stop signs and into a child.

Tesla has quickly produced literally the worst safety record in car industry history, pairing the least realistic vision with the least capability, while falsely claiming to make the safest car on the road.

What a lie. Without fraud there would be no Tesla.

In his 2013 book “When Robots Kill: Artificial Intelligence Under Criminal Law,” Gabriel Hallevy famously predicted this problem could end up in one of at least three liability decisions:

  1. Direct liability: Tesla’s car acted with its own will, volition, or control. The robot itself would be guilty of both the action and the intent to commit it.
  2. Natural probable consequence: The car’s action wasn’t intentional, but was “reasonably foreseen” by Tesla. Tesla is charged with negligence, and the car is considered innocent.
  3. Perpetration via another: The car was used by Tesla to intentionally do harm. Tesla is guilty with intent, and the car is considered innocent.

Waymo Robot “correctly identified” Dog Before Killing It

Incidents involving robots have been exploding as they are deployed, exactly the opposite of what the robot manufacturers promise. Here’s a new tragic case, in which Waymo uses language that seems rather… unaware.

A Waymo spokesperson confirmed the incident details and said the company sends sincere condolences to the dog owner. “The investigation is ongoing, however, the initial review confirmed that the system correctly identified the dog….”

Condolences.

Perhaps if Waymo hadn’t correctly identified the dog, that dog might have lived?

It’s hard to know what Waymo thinks it is doing by saying robots in public are aware of what they kill.

The robots are turning in worse and worse results, indicating they might be learning how to be very bad at their job.

Monthly reported incidents involving Waymo driverless operations have increased six-fold this year, including instances where they interfered with emergency services, according to a comment letter from SF officials. City data also says that reported incidents involving driverless Waymo and Cruise vehicles more than tripled from 24 to 87 between January and April.

Next step: watch someone simply change the Waymo algorithm (or the robot itself) from safe to unsafe mode, and any dog identified on city streets or sidewalks is quickly killed.

Call it a robot going into Thomas Edison mode: Edison infamously paid children cash to find and kidnap dogs (neighbors’ pets, for example) for him to kill.

The Known Unknowns of Why Tesla Drivers Keep Dying

Early in the morning, a Tesla traveling on an Interstate veered suddenly before crashing at high speed and catching fire.

Again.

The Oregon State Police responded to a single-vehicle crash on Interstate 5, near milepost 33, in Jackson County, around 3:30 a.m. on June 5. Police say the investigation indicated a black Tesla Model S, driven by Shawn Douglas Kroll, 29, of Oakley, was traveling northbound on I-5 when for an “unknown reason” the car drifted off of the roadway and onto the shoulder. The car drove through a fence, struck a tree, and caught fire.

Unknown reason.

Was the driver asleep?

Probably.

Did the car maker dangerously encourage drivers to go to sleep behind the wheel?

Definitely.

Elon Musk: Tesla drivers can sleep behind the wheel ‘next year’ [by the end of 2020].

Did the car maker fraudulently say that the car could drive itself?

Definitely.

In 2014, Elon Musk continued to promise at least 90 percent self-driving by year’s end…. In 2015, Autopilot was fully rolled out to Model S drivers and Musk was promising the software would be able to handle freeways and simple roads in a matter of months.

Wrong. Dead wrong. Every year for ten years now, since 2013, always a lie.

Some things are well known. What’s the unknown part, really?

Is it who is at fault for the deaths of over thirty people? Should we fault the robot for killing these people, or should we fault the company for making fatally defective robots sold on the false premise that they keep people safe?

Related: Tesla Autopilot Accused of Trying to Kill Its Owner.


Update June 9: In other news, Mercedes just received Level 3 autonomous driving authorization from California. That’s a capability far, far ahead of anything Tesla has ever produced (Tesla infamously fails even at Level 2). For perspective, the best available driverless technology is being authorized in 2023 under these conditions:

…California DMV is placing some serious restrictions on the Merc system. It can only operate during the day on certain limited roads, and only at speeds of up to 40 miles per hour.

In other words, there seems to be an unknown… yet very high probability that Tesla knowingly killed Shawn Douglas Kroll.