NZ Tesla Owner Convicted for “Autopilot” Dangerous Driving

As someone who watches the constant drumbeat of Tesla owners losing court battles almost as fast as their passengers lose their lives, I have noticed certain names frequently top the list.

My suspicion is that some cultures are more susceptible to a particular aspect of Tesla fraud. They believe that wasteful overspending on a cheap car brand (substandard parts that cost Tesla $4 to make are sold for $900) elevates their sense of privilege. They then intentionally use the bogus “token” vehicle to flout traffic laws, as well as the laws of physics.

Notably, the accused in this NZ news story attempted to argue the absolutely weakest possible defense:

Singh contended he wasn’t sleeping, but still appealed the dangerous driving conviction on the basis that he can’t have been driving…

He argued he was not, not driving (a double negative, meaning driving), but also that he was not driving. He didn’t claim to be innocent so much as argue that he should be allowed to be a criminal.

Driving while not driving?

As if anyone should be allowed to exist in an “untouchable” criminal state of … X.

And not surprisingly, the courts declared his bunk a fool’s gambit.

High Court judge Justice Matthew Downs disagreed in a decision released last month.

Singh’s cause for conviction wasn’t whether he fell asleep or not, but that he … had not seen what was happening around him in the car, which he did not, Downs said.

Singh also appealed on the grounds that he couldn’t have been guilty of failing to stop for police when he didn’t know he was meant to stop, he wasn’t avoiding stopping, and when he did notice the lights and sirens, he pulled over.

Once again, Downs disagreed.

“Mr Singh’s failure to stop was clearly due to his own fault.”

To be fair, Tesla didn’t see what was happening around the car either. Tesla is an abject failure of engineering. A total fraud, claiming to have driverless cars based on “vision”, yet blind to high-visibility emergency vehicles with flashing lights SINCE 2016.

It’s unfortunate the courts still aren’t ready to hold Tesla accountable for telling customers they can sleep while operating a car.

Related: California just passed a law that says autonomous cars (road robots) must immediately obey police orders.

Tesla Cybertruck Drops a FIFTH Recall Notice: Cameras Take 8 Seconds to Display

Some say there is an unofficial sixth recall related to engine failure.

While we wait for that to percolate into public regulations, let alone the many other unofficial failures of the Cybertruck, here’s the official fifth recall notice this year alone:

Vehicles with rearview cameras in the U.S. must display an image within two seconds of turning on, the NHTSA said, and some of the Cybertrucks failed to display an image for up to eight seconds. Tesla received 45 warranty claims and four field reports that may be related to the defect…

An eight-second delay. Forty-five claims. So many Cybertrucks have now crashed, it’s a wonder there are nearly fifty of these clown cars still operating.

Now think about how Tesla pumps its marketing with “fastest launch time in a straight line” claims, counting seconds. They constantly talk about each second as if it were the biggest measure of success imaginable.

Zero to 60 in how many seconds? Never mind, because the Cybertruck can’t even get its display cameras up and running within two seconds.

In other words, it will display a tree six seconds after it has crashed into it.

An eight second delay!

Unbelievable. How bad is Tesla engineering that it can’t even meet basic safety regulations on a brand-new car?
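The recall turns on a simple timing rule: the rearview image must appear within two seconds of the start of the backing event. Here is a minimal sketch of that pass/fail check. The event names (`reverse`, `first_frame`) and the sample timestamps are invented for illustration, not anything from Tesla’s actual systems:

```python
# Hypothetical sketch of the NHTSA rearview timing check the recall concerns.
# Only the 2.0-second deadline comes from the recall notice; everything else
# here (event names, log format, sample data) is an assumption.

REARVIEW_DEADLINE_S = 2.0  # rearview image must appear within 2 seconds

def first_frame_latency(events):
    """Given (timestamp, event) pairs, return seconds from entering
    'reverse' to the 'first_frame' shown, or None if no frame appeared."""
    t_reverse = t_frame = None
    for t, name in events:
        if name == "reverse" and t_reverse is None:
            t_reverse = t
        elif name == "first_frame" and t_frame is None:
            t_frame = t
    if t_reverse is None or t_frame is None:
        return None
    return t_frame - t_reverse

def compliant(events):
    """True only if a frame appeared, and appeared within the deadline."""
    latency = first_frame_latency(events)
    return latency is not None and latency <= REARVIEW_DEADLINE_S

# An eight-second boot, as described in the recall, fails the check:
slow_boot = [(0.0, "reverse"), (8.0, "first_frame")]
```

The point of the sketch is how low the bar is: a single subtraction against a fixed deadline, which the recalled trucks still missed by a factor of four.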

CA Passes AB 1777 Requiring Autonomous Cars to Immediately Follow Police Orders

There have been numerous instances of road robot algorithms written so poorly that they end up blocking everyone else (causing a denial of service). In the earliest case, nearly a decade ago, Google engineers never bothered to encode basic Mountain View traffic laws, and their robots were pulled over for nuisance driving (too slow).

In the latest case, Waymo stopped perpendicular to San Francisco traffic, on Nob Hill just outside the Fairmont, dangerously blocking a Vice Presidential motorcade.

California has thus finally passed a new emergency requirement (don’t call it a backdoor): a traffic authority can issue a direct command to an entire robot fleet to vacate a space.

The bill would, commencing July 1, 2026, authorize an emergency response official to issue an emergency geofencing message, as defined, to a manufacturer and would require a manufacturer to direct its fleet to leave or avoid the area identified within 2 minutes of receiving an emergency geofencing message, as specified.

Now the obvious question is how strong the integrity checks are on that message bus (no pun intended), because I know a lot of people who thought dropping orange cones to “geofence” robots was already a great idea.
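AB 1777 doesn’t specify a wire format, so to make the integrity concern concrete, here is a minimal sketch of the kind of check such a message bus would need before any “vacate” order reaches a fleet. The field names, the shared-key HMAC scheme, and the freshness window are all assumptions for illustration, not anything from the bill:

```python
import hashlib
import hmac
import json
import time

# Hypothetical demo key; a real system would never ship a shared secret
# like this (see the note below about asymmetric signatures).
SHARED_KEY = b"demo-key-not-for-production"

def sign_message(msg, key=SHARED_KEY):
    """Wrap a geofencing order in an envelope with an HMAC over its payload."""
    payload = json.dumps(msg, sort_keys=True).encode()
    sig = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": msg, "sig": sig}

def verify_and_accept(envelope, key=SHARED_KEY, max_age_s=120.0, now=None):
    """Reject orders that are forged, tampered with, or stale (replayed)."""
    payload = json.dumps(envelope["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["sig"]):
        return False  # signature mismatch: forged or tampered order
    age = (now if now is not None else time.time()) - envelope["payload"]["issued_at"]
    return 0 <= age <= max_age_s  # refuse replayed old orders

# Assumed message shape, loosely following the bill's "emergency
# geofencing message" (issuer, action, area, timestamp):
order = sign_message({
    "issuer": "SFPD",
    "action": "vacate",
    "area": {"lat": 37.7925, "lon": -122.4103, "radius_m": 300},
    "issued_at": time.time(),
})
```

Even this toy version shows why the question matters: without the signature check anyone on the bus can forge a fleet-wide evacuation order, and without the freshness window an old legitimate order can be replayed at will. A real deployment would want asymmetric signatures (so receivers can verify police orders without being able to mint them), per-message nonces, and an audited key hierarchy.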

FL Tesla “Veered” 2AM Crash Into Pole Knocks Out Miami Power

A critical infrastructure incident with all the hallmarks of a cruise missile attack was actually just Tesla’s latest failed algorithm.

An overnight power pole fire that knocked out electricity to some homes in a Southwest Miami-Dade neighborhood for hours was apparently sparked by a driver who slammed into a power pole. Miami-Dade Fire Rescue units responded to the scene of the blaze along Southwest 94th Street, near 87th Avenue, early Sunday morning. The fire ignited after a Tesla sedan smashed into the power pole at around 2 a.m., causing it to fall down and light up the brush surrounding it.

Mars landing by 2018!

Data scientists regularly remind me that Teslas crash into trees and poles at an abnormal rate compared with other electric cars.

Notably, Waymo very loudly pushed a specific software update with a fleet-wide recall that better recognized the danger of driving into a pole. Hint, hint, nudge, nudge.

Waymo determined the vehicle’s software had “assigned a low damage score” to the pole and its maps did not properly account for the lack of a curb or clear road edge in the alleyway. This caused the car to misjudge the hazard posed by the pole.

Meanwhile, it seems Tesla continues tuning its fleet intentionally into dangerous loitering munitions, with the aim of selling out American safety to high-bidding foreign adversaries. What price would it take for Russia or the Saudis to slide Elon Musk cash to turn tens of thousands of Teslas into a swarm that attacks a US military base?