Tesla FSD 12.5 Worst Version Yet? Driver Intervention Required or Death

A video has quickly surfaced from Twitter user @MissJilianne demonstrating that Tesla FSD 12.5 is blind to emergency vehicles' flashing lights.

That’s a problem the Tesla CEO promised in 2016 would be fixed right away. And it’s just one example of the kind of failure that forces driver intervention.

AMCI Testing’s evaluation of Tesla FSD exposed how often human intervention was required for safe operation. In fact, its drivers had to intervene over 75 times during the evaluation, an average of once every 13 miles (implying roughly 1,000 miles of test driving).

“With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a compact of trust between the technology and the public. When this technology is offered, the public is largely unaware of the caveats (such as ‘monitor’ or ‘supervise’), and the tech is considered empirically foolproof. Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue, as proven in the test results,” said David Stokols, CEO of AMCI Testing’s parent company, AMCI Global.

The 12.5 release is so bad that other drivers are pointing out interventions are increasing, now estimated at nearly one every ten miles, or… death.

When errors occur, they are occasionally sudden, dramatic, and dangerous; in those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident—or possibly a fatality.

Don’t be like this guy, barely able to go 20 miles on FSD without needing a new pair of underwear:

…I had to take over and complete the turn faster to make sure the upcoming car didn’t crash into me. Up until that point, I was pretty impressed by FSD v12.5.

Up until I was dead, I was alive? No kidding.

This is far worse than where things were in 2016, not only because so many more people today are in imminent danger, but also because the snake-oil system is rapidly declining.

The number of miles between critical disengagements for FSD v11.4.9 stood at 109, orders of magnitude short of what driverless operation requires. The latest version, 12.5, is down to 81 miles per safety-critical disengagement.

And now it’s reportedly approaching 10, which is like not having driverless capability at all. Think about how much worse the system has to perform to fall from roughly 100 miles to 10 miles between critical disengagements: a tenfold increase in the rate of dangerous failures.
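To make that scale concrete, here is a minimal sketch in plain Python. The per-version figures are the ones cited above; note the ~10-mile number is an anecdotal driver estimate, not a measured fleet average.

```python
# Convert reported miles-per-critical-disengagement into expected
# interventions over a fixed distance. Figures are those cited above;
# the 10-mile estimate is anecdotal, not a measured fleet average.
MILES_DRIVEN = 1_000

miles_per_critical_disengagement = {
    "FSD v11.4.9": 109,
    "FSD v12.5 (tracked)": 81,
    "FSD v12.5 (anecdotal estimate)": 10,
}

for version, rate in miles_per_critical_disengagement.items():
    expected = MILES_DRIVEN / rate
    print(f"{version}: ~{expected:.0f} critical interventions per {MILES_DRIVEN:,} miles")
```

At 109 miles per event, a 1,000-mile month of driving means about nine safety-critical interventions; at 10 miles per event, it means about a hundred. Each one is a moment where a distracted driver may not save themselves.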

…as many Tesla FSD drivers say, sometimes Full Self-Driving takes one step forward and two steps backward…

In related safety engineering news, Twitter has become even more blind to imminent harm than Tesla.

Canada: Tesla Kills Three in Stop Sign Crash

Canadian news coverage of Tesla crashes is typically sparse on details, yet it does list the fatalities.

On Monday morning, at around 9:40 a.m., emergency services responded to reports of a serious collision between a Tesla and a Dodge Ram at the intersection of Hessen Strasse and Greenwood Hill Road.

Police say the driver of the Tesla, a 25-year-old Cambridge male, was pronounced dead at the scene; a 30-year-old Kitchener male passenger was also pronounced dead at the scene; and a 23-year-old Kitchener female passenger was transported to a local hospital, where she was pronounced dead.

Given the case information so far, three people dying in this crash indicates the Tesla ignored the stop sign on Autopilot and was hit broadside. The trees create a blind spot for the truck, such that a Tesla running the sign would be hit at full speed.

Source: Google Maps

Update: The poor family of one victim seeks public donations to help return her body for burial in India.

Another Tesla in a Wrong-Way Crash Highlights a Dangerous AI Safety Flaw

An uptick in reports of Teslas crashing head-on into wrong-way drivers may indicate a serious vulnerability in the company's AI.

At least two tragic cases (Sept 10 and 18) have now been followed by a third just days later. News reports say the huge, wide-open, multi-lane bridge was mostly empty except for one car going the wrong way. Somehow Tesla's AI couldn't figure out how to see and avoid the lone compact car driving toward it.

At 2:02 a.m., CHP received a call regarding a white Honda driving in the wrong direction, westbound on Interstate 80 near Treasure Island. A few minutes later, the Honda crashed into a Tesla on the Alameda/San Francisco county line, just east of Main Tower.

The tell here is 2 a.m. on the Bay Bridge just east of Main Tower: wide-open, empty lanes that are extremely well lit by the brand-new bridge design. Notably, the Tesla driver was likely asleep or otherwise not paying attention, and is lucky to be alive, unlike the recent head-on Tesla crash victims in Michigan and Alabama who were killed when using on-ramps.

In other words, if all three cases prove to be Tesla AI mishandling oncoming traffic, then the company is literally getting people killed just to discover what we have reported clearly since at least 2016: its software isn't ready for public roads.