Ukraine Exposes Russian Drones Running on Elon Musk’s Starlink

Ukrainian news suggests this day was inevitable, given how Starlink has been known to facilitate and support Russian military operations.

The enemy may need Starlink to transmit intelligence information, for example data collected by radio-technical means on the positions of Ukrainian air defenses, or imagery if the drone is equipped with cameras. And thanks to Starlink, the Russian Federation can use the Shahed as an extremely long-range loitering munition with reconnaissance capability, due to an additional camera, and even the ability to hit moving targets.

Photos of a downed Russian drone show a Starlink logo, along with identification numbers.

Tesla FSD 12.5 Worst Version Yet? Driver Intervention Required or Death

A video quickly surfaced from Twitter user @MissJilianne demonstrating that Tesla FSD 12.5 is blind to emergency vehicles' flashing lights.

That’s a problem the Tesla CEO promised in 2016 would be fixed right away. And it’s just one example of the type of intervention still required.

AMCI Testing’s evaluation of Tesla FSD exposed how often human intervention was required for safe operation: “In fact, our drivers had to intervene over 75 times during the evaluation; an average of once every 13 miles. With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a compact of trust between the technology and the public. When this technology is offered, the public is largely unaware of the caveats (such as monitor or supervise) and the tech is considered empirically foolproof. Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results,” said David Stokols, CEO of AMCI Testing’s parent company, AMCI Global.

The 12.5 release is so bad that other drivers report interventions becoming more frequent, now needed roughly every ten miles or… death.

When errors occur, they are occasionally sudden, dramatic, and dangerous; in those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident—or possibly a fatality.

Don’t be like this guy, barely able to go 20 miles on FSD without needing a new pair of underwear:

…I had to take over and complete the turn faster to make sure the upcoming car didn’t crash into me. Up until that point, I was pretty impressed by FSD v12.5.

Up until I was dead, I was alive? No kidding.

This is far worse than where things stood in 2016, not only because so many more people are in imminent danger today, but also because the snake-oil system is rapidly declining in quality.

The number of miles between each critical disengagement for FSD v11.4.9 stood at 109, which was orders of magnitude away from driverless operation. The latest 12.5 release stands at 81 miles per safety-critical disengagement.

And by driver reports it’s approaching 10, which is like not having driverless capability at all. Think about how much worse things have to get to drop from roughly 100 miles between critical disengagements to 10.

…as many Tesla FSD drivers say, sometimes Full Self-Driving takes one step forward and two steps backward…

In related safety engineering news, Twitter has become even more blind to imminent harm than Tesla:

Canada Tesla Crash Kills Three at Stop Sign

Canadian news coverage of Tesla crashes is typically sparse on details, yet it does list the fatalities.

On Monday morning, at around 9:40am, emergency services responded to reports of a serious collision between a Tesla and a Dodge Ram at the intersection of Hessen Strasse and Greenwood Hill Road.

Police say the driver of the Tesla, a 25-year-old Cambridge male, was pronounced dead at the scene; a 30-year-old Kitchener male passenger was pronounced dead at the scene; and a 23-year-old Kitchener female passenger was transported to a local hospital and pronounced dead.

Three people dying in this crash, given the case information so far, suggests Tesla Autopilot ignored the stop sign and the car was hit broadside. Notably, the trees create a blind spot for the truck, such that a Tesla running the sign would be hit at full speed.

Source: Google maps

Update: The grieving family of one victim seeks public donations to help return her body to India for burial.