Tesla Safety Negligence Finally Goes to Court: “Sore Thumb” of American Roads

The quality of Tesla vehicles has been notoriously bad for years and has been trending worse, which should be little surprise given how poorly the company treats human life (from its workers and its customers to anyone in or around its products).

Now top experts in automobile safety, who are finally getting some attention, aren’t mincing words about the danger a Tesla poses to everyone on the road.

“Tesla sticks out like a sore thumb,” said David Friedman, who was deputy and acting administrator of NHTSA from 2013 to 2015. “And it has for years.” [Heidi King, a deputy and acting administrator of NHTSA during the Trump administration added] “I really dislike a lot of what Tesla has done, and at the top of the list in bright, bold letters, is Elon Musk’s habit of making false public claims… visionary exaggerations about a consumer product can be very, very dangerous.”

Liar, liar: Elon Musk’s customers are literally dying in fires.

One of the reasons Musk has become an obvious “sore thumb” of safety is his bully mindset of doing harm: do wrong until someone can afford to stop him in court.

“In the US, things are legal by default,” Musk said.

A public automobile company showing intent to commit crimes unless someone catches them is the worst possible statement a CEO can make.

“Things” are not simply legal by default.

To put it another way, in the US cannibalism is legal by default. So is Elon Musk’s next business idea going to be grinding the rising number of his dead customers into hamburger? Something being technically legal DOES NOT mean you won’t be convicted of a related crime.

“We essentially have the Wild West on our roads right now,” Jennifer Homendy, the chair of the NTSB, said in an interview. She describes Tesla’s deployment of features marketed as Autopilot and Full Self-Driving as artificial-intelligence experiments using untrained operators of 5,000-pound vehicles. “It is a disaster waiting to happen.”

The Wild West killed a LOT of innocent people, especially because of men like Stanford, when you think about it. I mean, Silas Soule was a very notable exception who became more like the American rule, but only much later.

But I digress. Tesla is not a disaster waiting to happen; it already happened!

Let’s play spot the disaster. Here are the death rate stats for electric cars.

Source: tesladeaths.com

I warned very loudly about the disaster we are now in, for at least six years prior. My 2016 BSidesLV keynote presentation on Tesla deaths was literally called “Great Disasters of Machine Learning“.

Elon Musk long ago signaled disaster as his business model, and I saw it right away when the first road death was reported on April 2, 2013.

Tesla was leaving Laguna Beach and veered into oncoming traffic

Veering across lines into oncoming traffic is not “legal by default,” yet Tesla seems to believe it a profitable business model for America, given its vehicles have become notorious for doing exactly that.

On April 8, 2022 (nearly TEN YEARS later), we see the same safety failure repeat.

Little remains of a Tesla and its driver in 2022 after it veered yet again into oncoming traffic

Things may change, however, given that a court is finally going to help Tesla owners see just how many unsafe “things are legal by default”.

A US federal judge’s ruling paves the way for a trial in July, the first time Tesla will face a jury in litigation over a car crash. The electric car-maker faces a flurry of lawsuits over a spate of accidents… Barrett Riley, 18, was at the wheel of his father’s Model S when he lost control and veered into a concrete wall of a house in Fort Lauderdale. The car was engulfed in flames. Riley and his friend in the passenger seat were both killed. The father, James Riley, alleged in a lawsuit that Tesla was negligent for removing a speed-limiting device from the car after his wife had asked for it to be installed. The after-market device was designed to cap the car’s speed at 85mph. The family also argued that Barrett could have survived the impact of the crash but lost his life because of the intense fire, which the suit attributes to a defective design in the battery.

Defaults give an interesting framing for this court case.

Why was the default top speed so far above any legal limit? The family tried to set a safe mode by asking Tesla to enable its built-in speed limiter (“loaner” mode with an 85 mph max). Allegedly Tesla later removed the setting, overriding the parents’ explicit request, which led directly to the predictable death of their child.

Tesla’s argument for why it intentionally disobeyed the parents was… because it could. A toddler-level mentality about safety, if not a conspiratorial one. When parties A and B come to a service provider with conflicting requests, Tesla very clearly took sides: serving the (reckless) one and not the (safer, wiser, legal) other.

Two footnotes may also be worth adding.

First, this Tesla was operating with two unrepaired recalls at the time of its crash; unrelated to the cause of death, yet still evidence that Tesla is not on top of safety.

Second, the car repeatedly re-ignited. It was on fire when police arrived. It then caught on fire again when it was put on a tow truck. It then caught on fire again when it was put on a second tow truck. And it then caught on fire again when it was unloaded from the second tow truck. That’s significantly worse “rush to market” thinking than even the Pinto disaster.

The lawsuits brought by injured people and their survivors uncovered how the company rushed the Pinto through production and onto the market. […] Ford officials decided to manufacture the car even though Ford owned the patent on a much safer gas tank. Did anyone go to Mr. Iacocca and tell him the gas tank was unsafe? “Hell no,” replied an engineer who worked on the Pinto. “That person would have been fired. Safety wasn’t a popular subject around Ford in those days. With Lee it was taboo.” As Lee Iacocca was then fond of saying, “Safety doesn’t sell.”

Does anyone really want to buy a sore thumb?

Mercedes Issues “Stop Driving” to 300,000 SUV Owners: Complete Brake Failure

Mercedes in the wild

The 2006-2012 ML, GL, and R-Class have a moisture-related corrosion issue with the brakes, which can result in total failure.

…brake force support might be reduced, leading to an increase in the brake pedal forces required to decelerate the vehicle and/or to a potentially increased stopping distance. In rare cases of very severe corrosion, it might be possible that a strong or hard braking application may cause mechanical damage in the brake booster, whereby the connection between the brake pedal and brake system may fail. In such a very rare case, it would not be possible to decelerate the vehicle via the brake pedal.

Not possible to decelerate the vehicle via the brake pedal.

I believe that officially means these road bathtubs should be classified as boats instead of cars?

The issue is so serious that Mercedes says drivers should immediately call, and a tow truck will come take the vehicle to be repaired.

MBUSA is advising affected customers to stop driving their vehicles. MBUSA will also offer complimentary towing to owners of affected vehicles to attend the workshop.

I suppose what’s hidden in the details is how Mercedes took a single report and extensively researched the causes until it arrived at a decision to recall vehicles as old as 16 years. Consumer Reports tells the story:

The automaker began its investigation in July 2021 after a report of a customer from outside the U.S. experiencing reduced braking during a stop. After conducting numerous field studies and tests, including discovery of a single similar situation in the U.S., Mercedes-Benz informed the National Highway Traffic Safety Administration of the recall on May 5, 2022.

That’s an impressive response narrative.

Mercedes shows a duty of care completely opposite to the negligence of Tesla’s “false and reckless” management (notorious for failures to stop, harming people and property), as witnessed yet again just last week.

A Columbus police crash report states that the driver of the Tesla, 63-year-old Frantz Jules, told police that he was unable to slow the vehicle as it hit speeds of 70 mph on a Downtown highway, so he exited and smashed into the center.

Jules told police he was driving on Ohio Route 315 when he “lost control of his brakes and was unable to stop,” according to the police report. He exited Route 315 at the Neil Avenue exit, which leads directly onto Vine Street toward a T-intersection and traffic light at North High Street — with the convention center directly in its path.

Three witnesses to the crash, one of whom was stopped at the red light at North High Street, told police that the driver of the Tesla appeared to speed up in order to beat a red light. They also said it did not appear he applied brakes before the building was hit.

Lost control of his brakes and was unable to stop… or sped up to run a red light, or BOTH? Tesla likely doesn’t care and will spend its time trying to find ways to avoid responsibility.

Is Your Robot Vacuum Gathering Dirt About You?

Someone passed a joke by me today that tried to make light of the fact (pun intended) that “smart” electronics like phones and televisions might be collecting your private conversations.

The joke was not to worry, since “your vacuum cleaner has been gathering dirt on you for years”.

My humorless response was two-fold.

1) The evil maid is long-standing canon in threat modeling. Of course any “cleaning” device you allow in should be in the category of services that could abuse narrowly-defined access grants to violate confidentiality. Pro-tip: evil maid does NOT mean physical access only.

2) Laser-guided robot vacuums have long been known to be a vector for acoustic monitoring, as researchers demonstrated two years ago. Any device with light detection and ranging (lidar) sensors could be manipulated for sound collection, despite having no microphone. The researchers’ “LidarPhone” attack used AI to identify speech (spoken numbers) with 90% accuracy. It also identified speech from television shows, using a minute’s worth of recording, with more than 90% accuracy.
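For the technically curious, here is a minimal, hypothetical sketch of what a LidarPhone-style pipeline could look like. It is my illustration only, not the researchers’ code: the sampling rate, the toy random traces, the spectrogram features, and the simple SVM classifier are all assumptions standing in for whatever the published work actually used.

```python
# Hypothetical sketch of a LidarPhone-style acoustic side channel, NOT the
# researchers' actual code. Assumes we already captured a stream of lidar
# reflection-intensity samples while the sensor was aimed at an object
# vibrating from nearby speech (e.g., a trash can next to a speaker).

import numpy as np
from scipy.signal import spectrogram
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

SAMPLE_RATE_HZ = 2000  # assumed effective sampling rate of the repurposed lidar


def intensity_to_features(intensity: np.ndarray) -> np.ndarray:
    """Turn a 1-D lidar intensity trace into a flattened log-spectrogram."""
    intensity = intensity - intensity.mean()  # drop DC offset, keep vibration
    _, _, sxx = spectrogram(intensity, fs=SAMPLE_RATE_HZ, nperseg=256, noverlap=128)
    return np.log1p(sxx).flatten()


# Toy stand-in data: in a real attack each trace would be one second of lidar
# intensity captured while a digit 0-9 was spoken nearby; here it is just noise.
rng = np.random.default_rng(0)
traces = rng.normal(size=(200, SAMPLE_RATE_HZ))  # 200 one-second traces
labels = rng.integers(0, 10, size=200)           # which digit was "spoken"

features = np.array([intensity_to_features(t) for t in traces])
x_train, x_test, y_train, y_test = train_test_split(features, labels, test_size=0.25)

# A basic SVM stands in for whatever model the published work used.
clf = SVC(kernel="rbf").fit(x_train, y_train)
print(f"digit classification accuracy: {clf.score(x_test, y_test):.2f}")
```

On random noise the accuracy will of course sit near chance (about 10%); the point is only the shape of the pipeline: speech shakes household objects, the lidar reads those tiny intensity fluctuations, and a classifier turns them back into labels, no microphone required.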

Movie Review: Escape from Mogadishu

This 2021 WellGo USA film has several interesting twists.

Dramatically constructed based on a true story: as civil war rages in Mogadishu, rival North and South Korean diplomats are left trapped. With no aid from either government, their only shot at survival may require uniting with bitter adversaries to escape.

An obvious way people are made happier is when they have enough trust to build connections and be more social (even misery enjoys company). That’s the underlying wisdom of this true story.

I found it particularly interesting that the North Koreans are depicted as competent, professional, and coldly rational or calculating. The South Koreans are depicted as the opposite: incompetent, unprofessional, and driven mostly by emotion or moral feeling. I’ve reflected on this before, given another movie from South Korea.

It’s also completely different from how Americans typically portray the two sides (trying to frame North Koreans as incompetent and emotional), which also reminds me of a presentation I gave called “Dar-win or Lose“: the Cuban Missile Crisis gives critical insight into why Big Data Platforms are doomed (led by coldly rational management instead of moral feelings).