Tesla refers to its robotic software as FSD, and police in Seattle have just released a statement that the infamously dangerous robot allowed on public roads has murdered an innocent man.
After the crash in a suburban area about 15 miles (24 kilometers) northeast of Seattle, the driver told a trooper that he was using Tesla’s Autopilot system and looked at his cellphone while the Tesla was moving.
“The next thing he knew there was a bang and the vehicle lurched forward as it accelerated and collided with the motorcycle in front of him,” the trooper wrote in a probable-cause document.
Experts have condemned the Tesla system as a fraud.
“Unless you have data showing that the driver never has to supervise the automation, then there’s no basis for claiming they’re going to be acceptably safe,” [Phil Koopman, a professor at Carnegie Mellon University who studies autonomous vehicle safety], said.
This comes right after the WSJ reported that Tesla intentionally made its robot designs less safe, deceiving investors and putting the public in harm's way… similar to how Twitter had its safety controls turned off and its experts removed when it was taken over.
And this is exactly what I warned about back in 2016: public algorithmic slaughter by Tesla. Too few people took that early warning seriously enough.
Without fraud, there would be no Tesla.
Too bad Ralph Nader is 90. Just two years ago, "Nader called FSD 'one of the most dangerous and irresponsible actions by a car company in decades.'"
Nader was a fierce advocate for consumer safety, as I'm sure you are aware. If he were still in his prime, I'm sure he'd be tearing into Tesla, which is also "Unsafe at Any Speed."
This video by FortNine is sadly still true. https://www.youtube.com/watch?v=yRdzIs4FJJg
Without radar, FSD physically can't reliably judge the distance to a motorcycle at night.
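To see why, here is a minimal sketch of the geometry. A camera-only system that ranges by apparent size relies on the pinhole relation distance = focal_length × real_width / pixel_width, so the estimate is only as good as the assumed real-world size of the object. At night a motorcycle may present little more than a taillight, and if vision assumes car-like dimensions the range estimate inflates. The focal length and pixel widths below are illustrative assumptions, not Tesla's actual values; radar and lidar, by contrast, measure range directly.

```python
# Hedged sketch: monocular distance estimation via the pinhole camera model.
# distance = focal_length_px * real_width_m / apparent_width_px
# The estimate scales linearly with the ASSUMED real-world width, so a
# misclassified object yields a proportionally wrong range.

def pinhole_distance(focal_px: float, assumed_width_m: float, pixel_width: float) -> float:
    """Estimated range in meters; only valid if assumed_width_m is correct."""
    return focal_px * assumed_width_m / pixel_width

FOCAL_PX = 1400.0   # illustrative focal length in pixels (assumption)
PIXELS = 20.0       # apparent width of the object in the image (assumption)

# The same 20-pixel blob ranged as a 0.8 m-wide motorcycle vs. a
# 1.9 m-wide car gives very different answers:
d_moto = pinhole_distance(FOCAL_PX, 0.8, PIXELS)   # 56 m
d_car = pinhole_distance(FOCAL_PX, 1.9, PIXELS)    # 133 m
print(f"as motorcycle: {d_moto:.0f} m, as car: {d_car:.0f} m")
```

The point: misjudging the object's class more than doubles the perceived distance, which is consistent with reports of Teslas closing on motorcycles at night as if they were far-away cars.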