Another “Veered” Tesla Crash: Off-Duty Officer Killed

Early-morning Tesla crashes suggest the car drifts off the road while the driver sleeps. Let’s call it “veered.” This is exactly the problem Tesla claimed to investors, many years ago, that it would solve, eliminating deaths.

Instead, Tesla seems to be getting much worse. And deaths are rising quickly.

Recently I read about crashes in Oregon and Michigan. Now California is in the news.

Dhanoa was the driver of a Tesla Y that crashed about 5:30 a.m. Tuesday on Millerton Road east of Auberry Road when he allowed it to drift and strike a culvert, the California Highway Patrol said. The Tesla rolled at least once, CHP said.

The driver, an off-duty police officer, was thrown from the Tesla.

American Truck Drivers Dying From Lack of Seatbelt Use

Federal safety regulators have highlighted that a huge percentage of deaths involve drivers who resist wearing seatbelts.

“One thing we’re seeing in trucking and other sectors is that seat belt usage is going down,” said Polly Trottenberg, deputy secretary of the U.S. Department of Transportation, speaking at a DOT safety forum last week. “And when we look at the fatality numbers they are extraordinarily disproportionately people who are unbelted.”

In 2021, 64% of truck drivers killed in crashes of large trucks were not wearing a seat belt, according to the latest data compiled by the National Highway Traffic Safety Administration. That compares with 44% in 2019 and 59% in 2020.

Some speculate that the decline in seatbelt use is related to political propaganda, as political operatives have tried to whip truckers into an angry, anti-regulation suicide lobby.

…while the [anti-safety] protests are generating a lot of noise and attention, the eruption actually points up a counterintuitive fact: The… far right is weak and ineffectual, especially when it comes to pandemic restrictions.

Seatbelt disuse could be a follow-on effect of vaccination disinformation. It doesn’t generate noise and attention; it gives a fraudulent yet heightened sense of control, because there is zero feedback or resistance… before sudden death.

Remember when seatbelts were controversial?

Tesla “FSD and Autopilot” Crash More Than All Other Driver-Assist Products Combined

The big reveal from a new Washington Post report is that Tesla engineering leads to a higher frequency of more severe crashes than any other car brand, and it’s quickly getting worse.

Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.

“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post.

This is precisely what I’ve been saying here and in conference presentations since 2016. I’m reminded of the graphic I made a while ago comparing Tesla deaths with those from other electric cars on the road.

Source: tesladeaths.com

And here’s a detail far too often overlooked: Tesla drivers are significantly overconfident in their car’s ability to drive itself because of what Tesla’s CEO tells them to think. There’s no other explanation for why they do things like this:

Authorities said Yee had fixed weights to the steering wheel to trick Autopilot into registering the presence of a driver’s hands: Autopilot disables the functions if steering pressure is not applied after an extended amount of time.

Yee apparently really, really wanted to believe that Elon Musk wasn’t just a liar about car safety. Yee removed the required human oversight from his Tesla because (aside from it being far too easy to circumvent) he somehow was convinced by the con artist in charge of Tesla that it made sense to significantly increase the chances of a dangerous crash.
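To see just how easy that circumvention is, here is a minimal sketch of a torque-only presence check. This is a hypothetical reconstruction based only on the behavior described above, not Tesla’s actual implementation (which isn’t public); the threshold, timeout, and function names are all assumptions for illustration:

```python
# Hypothetical torque-only "hands on wheel" check, sketched from the
# behavior described by authorities. NOT Tesla's actual code; the
# threshold and timeout values below are assumed for illustration.

TORQUE_THRESHOLD_NM = 0.3  # assumed minimum steering torque read as "hands on"
NAG_TIMEOUT_S = 30.0       # assumed gap without torque before Autopilot disables

def hands_on_wheel(measured_torque_nm: float) -> bool:
    # A weight clamped off-center exerts constant gravitational torque,
    # so a torque-only check cannot tell a weight from a resting hand.
    return abs(measured_torque_nm) >= TORQUE_THRESHOLD_NM

def autopilot_should_disable(seconds_since_last_torque: float) -> bool:
    # Disables only after a long gap in sensed torque. A fixed weight
    # supplies torque continuously, so this timeout never trips.
    return seconds_since_last_torque >= NAG_TIMEOUT_S

# With a weight on the wheel: torque always clears the threshold, the
# no-torque timer never accumulates, and Autopilot never disengages.
assert hands_on_wheel(0.4)                # the weight reads as a hand
assert not autopilot_should_disable(0.0)  # so the timeout never fires
```

Any system that reduces “driver attention” to constant steering pressure invites exactly this kind of defeat device, which is why critics have long argued torque sensing alone is weaker than camera-based driver monitoring.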

And then, very predictably, Yee crashed.

Worse, also predictably, he caused serious harm.

Yee’s “trick” meant his Tesla crashed at high speed through bright flashing red stop signs and into a child.

Tesla has quickly produced literally the worst safety record in car industry history, the least realistic vision with the least capability, while falsely claiming to make the safest car on the road.

What a lie. Without fraud there would be no Tesla.

Gabriel Hallevy, in his 2013 book “When Robots Kill: Artificial Intelligence Under Criminal Law,” famously predicted this problem could end in one of at least three legal outcomes.

  1. Direct liability: Tesla’s car acted with will, volition, or control. The robot would be guilty of the action and intent to commit it.
  2. Natural probable consequence: The car’s action wasn’t intentional, but was “reasonably foreseen” by Tesla. Tesla is charged with negligence, and the car is considered innocent.
  3. Perpetrator via another: The car was used by Tesla to intentionally do harm. Tesla is guilty with intent, and the car is considered innocent.

Waymo Robot “correctly identified” Dog Before Killing It

Incidents involving robots have been exploding as they get deployed, exactly the opposite of what the robot manufacturers promise. Here’s a tragic new case, where Waymo uses language that seems rather… unaware.

A Waymo spokesperson confirmed the incident details and said the company sends sincere condolences to the dog owner. “The investigation is ongoing, however, the initial review confirmed that the system correctly identified the dog….”

Condolences.

Perhaps if Waymo hadn’t correctly identified the dog, that dog might have lived?

It’s hard to know what Waymo thinks it is doing by announcing that its robots in public are aware of what they kill.

The robots are turning in worse and worse results, indicating they might be learning how to be very bad at their job.

Monthly reported incidents involving Waymo driverless operations have increased six-fold this year, including instances where they interfered with emergency services, according to a comment letter from SF officials. City data also says that reported incidents involving driverless Waymo and Cruise vehicles more than tripled from 24 to 87 between January and April.

Next step: watch someone simply flip the Waymo algorithm (or the robot itself) from safe mode to unsafe mode, and any dogs identified on city streets or sidewalks are quickly killed.

Call it a robot going into Thomas Edison mode: he infamously paid children cash to find and kidnap dogs (e.g., neighbors’ pets) for him to kill.