Category Archives: Security

“Driving a Tesla” Cited as the Sole Factor in Potential 40-Year Jail Sentence for Vehicular Homicide

A horrible homicide case in Florida has a very important buried lede. The prosecution says the accused didn’t have a motive and wasn’t under the influence of anything; he just unfortunately decided to drive a Tesla. That’s literally the whole case. He stepped into the Tesla and it killed two people. Now he faces up to a 40-year jail sentence after pleading guilty.

Mongan was driving a friend’s Tesla with four passengers inside at about 10:15 p.m. on Sept. 3, 2021, when he accelerated on Manning Road, Florida Highway Patrol troopers said. The road has a speed limit of 30 mph. Mongan ran past a stop sign at a T intersection with Hermosa Drive and launched off a grassy embankment, crashing [at 116 mph] through a vinyl fence and into a home at 1498 Caird Way. One of the passengers, Travis Meisman of Odessa, was killed. Meisman owned the Tesla that Mongan was driving. […] The crash also killed a 69-year-old woman inside the home, Donna Rein, and her dog, Lily. […] Mongan had not been drinking or doing drugs prior to the crash and was not intoxicated at the time, according to Pinellas-Pasco State Attorney Bruce Bartlett. “I can’t say I’ve ever encountered this type of situation,” said Bartlett, who has been a prosecutor for more than 40 years. “Usually, on the part of the defendant, he’s intoxicated, on drugs or something … and they then drive drunk and hit and kill somebody,” Bartlett said. “It’s kind of like you assume the risk.” […] The state attorney said Mongan was not acting out of maliciousness, but still carries criminal responsibility for the crash. “The guy is very remorseful…” Bartlett said.

It was a Tesla Model S Plaid, which has been heavily marketed as having 1,000 hp and tuned to go 0-60 mph in just 2 seconds, covering a distance of only about 100 feet. It’s pretty easy to see how Mongan, without being intoxicated by anything other than Tesla advertising, punched the accelerator and killed Meisman within seconds.
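
A rough back-of-envelope check shows why that 0-60 figure implies so little road distance. This is a minimal sketch, assuming perfectly constant acceleration (which a real launch only approximates):

```python
# Sketch: distance covered during an advertised 0-60 mph, 2-second launch,
# assuming constant acceleration for simplicity.
MPH_TO_FPS = 5280 / 3600            # 1 mph = ~1.467 ft/s

v_final = 60 * MPH_TO_FPS           # 88 ft/s
t = 2.0                             # advertised 0-60 time, in seconds
a = v_final / t                     # ~44 ft/s^2, roughly 1.4 g
d = 0.5 * a * t**2                  # distance covered during the launch

print(f"acceleration ~{a:.0f} ft/s^2 (~{a / 32.2:.1f} g), distance ~{d:.0f} ft")
# -> acceleration ~44 ft/s^2 (~1.4 g), distance ~88 ft
```

In other words, a driver who holds the pedal down is near highway speed within roughly the width of a single house lot on a 30 mph residential street.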

View of the T intersection in 2020. Source: Google Maps
View of the T intersection in 2022. Source: Google Maps

You would think having more people inside a Tesla would bring some moderation or lower the risk, and that Meisman telling Mongan to drive his car would have a moderating effect too. Yet in Tesla fatalities I’m noticing a very prominent “showboat” factor: something incredibly unsafe and stupid done in relation to public messaging by the Tesla CEO (e.g. accelerating as rapidly as possible on public roads, going to sleep on public roads).

Incidentally, a 116 mph crash into a Florida home in a 30 mph zone was also big news in 2018, after parents had purchased a Tesla for their son based on its advertised safety.

…Barrett Riley hit triple digit speeds at 6:46 p.m. May 8, 2018, with Monserratt in the front passenger seat and another teen in a back seat. He blew through the 1300 block of Fort Lauderdale’s Seabreeze Boulevard, a 30-mph zone approaching a curve with a 25-mph advised speed. Barrett Riley lost control of the Tesla, which smacked the wall in front of a home twice, burst into flames and crashed into a light pole across the street. He was going 116 mph three seconds before impact.

New Book Illustrates How a Black U.S. Soldier Single-Handedly Killed Six Nazis

The most interesting twist in this story might be how the eventual Medal of Honor recipient was denied work in the post-WWII military… because he had fought against the Japanese and against fascists in Spain before the war.

For his actions, Carter was originally awarded a Bronze Star, Distinguished Service Cross and a Purple Heart, according to Army records.

Upon returning stateside, Carter hoped to continue his military career, but was ruled to be ineligible because of his previous ties to the Chinese and Spanish conflicts.

Carter passed away in 1963. For decades, he was counted among the hundreds of Black service members excluded from Medal of Honor recognition. That was fixed in 1997.

Ties to the conflicts? He fought against Japanese aggression. He fought against fascism in Spain. He was a successful soldier way ahead of his time, which you’d think would have earned him promotions, not exclusion.

Has the highly ceremonious medal, more than three decades after he died, done enough?

Illustration of Carter in action.

The service and awards of California-born Edward Carter Jr. definitely need more exposure. An Association of the United States Army graphic novel is a great idea, but I’m thinking more about VR and an immersive experience: not just how he single-handedly outwitted and killed six Nazis, but also how he experienced U.S. racism and discrimination for decades after.

FTC notice on AI: Tesla “false or unsubstantiated” claims are illegal

In the wake of a Tesla engineer testifying that his CEO allegedly ordered criminally false and unsubstantiated “driverless” claims (planned deception), the FTC is now warning everyone that the tactic was and still is illegal.

…the fact is that some products with AI claims might not even work as advertised in the first place. In some cases, this lack of efficacy may exist regardless of what other harm the products might cause. Marketers should know that — for FTC enforcement purposes — false or unsubstantiated claims about a product’s efficacy are our bread and butter. […] Are you exaggerating what your AI product can do? Or even claiming it can do something beyond the current capability of any AI or automated technology? For example, we’re not yet living in the realm of science fiction, where computers can generally make trustworthy predictions of human behavior. Your performance claims would be deceptive if they lack scientific support or if they apply only to certain types of users or under certain conditions.

We know with absolute certainty that Tesla’s claims in November 2016 were dishonest: their “AI” did not work as advertised.

The Tesla advertisement (video) required multiple takes to have a car follow a pre-mapped route without driver intervention. It was unquestionably a staged video that depended on certain conditions, which were never disclosed to buyers.

Had they presented a vision of the future, with sufficient warnings about the reality gap, that would be one thing. Instead, Tesla’s official 2017 report to the California DMV (Disengagement of Autonomous Mode) revealed its “self-driving” tests in all of 2016 covered only 550 miles and suffered 168 disengagement events (a failure every three miles, on average). And they didn’t even really test on California public roads.
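
The disengagement rate is easy to verify from those two reported numbers. A minimal sketch (the per-mile framing is mine, derived only from the 550 miles and 168 events cited above):

```python
# Sketch: average miles between disengagements in Tesla's reported 2016 DMV testing.
miles_driven = 550        # total autonomous-mode test miles reported for 2016
disengagements = 168      # reported disengagement events over the same period

print(f"~{miles_driven / disengagements:.1f} miles per disengagement")
# -> ~3.3 miles per disengagement
```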

Such a dismal result should have been the actual video message, because ALL of those heavily curated miles went into making a promotional video that claimed the exact opposite.

Tesla and especially its CEO plainly branded their video with a grossly misleading claim that the human driver was there only for “legal purposes” (as if implying laws are a nuisance, a theme that has resurfaced recently with Tesla’s latest AI tragically ignoring stop signs and yellow lines).

The Tesla marketing claims were and still are absolutely false: the human was in a required safety role, there to take over as the system (very frequently) disengaged at high risk.

This disconnect is so bad that the claims still do not hold up seven years later, as evidenced by a massive recall.

Tesla’s false advertising increasingly seems directly implicated in widespread societal harms, including loss of life (e.g. customers who believed Tesla’s “legal purposes” lie, among many others, increased fatality risk across society).

Source: Tesladeaths.com

Pilot thought his unresponsive flight instructor was joking. The instructor had died on takeoff

A pilot showed up at his club under pressure to get in the air. Weather conditions were not ideal that June day in 2022, but he knew he had to put in flight time, so he asked a flight instructor to come along as copilot. A UK safety bulletin has published the details of what happened next.

The pilot recalled that shortly after takeoff from Runway 28 the instructor’s head rolled back. The pilot knew the instructor well and thought he was just pretending to take a nap whilst the pilot flew the circuit, so he did not think anything was wrong at this stage. He proceeded to fly the aircraft round the circuit. As he turned onto base leg the instructor slumped over with his head resting on the pilot’s shoulder. The pilot still thought the instructor was just joking with him and continued to fly the approach. He landed normally on Runway 28 and started to taxi back to the apron. However, the instructor was still resting on his shoulder and was not responding, and the pilot realised something was wrong.

It’s an odd story, and a sad one for sure, touching on trust issues in safety operations.