A video quickly surfaced from Twitter user @MissJilianne demonstrating that Tesla FSD 12.5 is blind to emergency vehicles' flashing lights.
That’s a problem the Tesla CEO promised in 2016 would be fixed right away, and it’s just one example of the kind of intervention required.
AMCI Testing’s evaluation of Tesla FSD exposed how often human intervention was required for safe operation. In fact, our drivers had to intervene over 75 times during the evaluation; an average of once every 13 miles. “With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a compact of trust between the technology and the public. When this technology is offered, the public is largely unaware of the caveats (such as ‘monitor’ or ‘supervise’), and the tech is considered empirically foolproof. Getting close to foolproof, yet falling short, creates an insidious and unsafe operator-complacency issue, as proven in the test results,” said David Stokols, CEO of AMCI Testing’s parent company, AMCI Global.
The 12.5 release is so bad that other drivers report the intervention rate is increasing: now roughly one every ten miles, or… death.
When errors occur, they are occasionally sudden, dramatic, and dangerous; in those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident, or possibly a fatality.
Don’t be like this guy, barely able to go 20 miles on FSD without needing a new pair of underwear:
…I had to take over and complete the turn faster to make sure the upcoming car didn’t crash into me. Up until that point, I was pretty impressed by FSD v12.5.
Up until I was dead, I was alive? No kidding.
This is far worse than where things were in 2016, not only because so many more people are in imminent danger today, but also because the snake-oil system is rapidly declining.
The number of miles between each critical disengagement for FSD v11.4.9 stood at 109, which was orders of magnitude away from driverless operation. The latest 12.5 release is at 81 miles per safety-critical disengagement.
And now it’s approaching 10, which is like not having driverless capability at all. Think about how much things have to degrade to drop from over 100 miles to 10 between critical disengagements.
…as many Tesla FSD drivers say, sometimes Full Self-Driving takes one step forward and two steps backward…
In related safety engineering news, Twitter has become even more blind to imminent harm than Tesla: