MO Tesla Kills One in “Veered” Head-On Crash

The Tesla, very likely on Autopilot, apparently couldn’t follow faded or weak lane markings on the Kansas City road just before a curve.

Sgt. Phillip DiMartino, a spokesperson with KCPD, said a preliminary investigation shows a gray Tesla 3 speeding south on North Platte Purchase Drive. The driver of the Tesla did not turn with the curve of the road and crossed the yellow center line, striking a northbound Buick Encore head-on, DiMartino said. The driver of the Buick was wearing a seat belt and was taken to a hospital, where she was pronounced dead.

Southbound view. Note the end of the yellow lines near where the Tesla crossed to crash head-on into northbound traffic. Source: Google Maps

Related: One, two, three… similar recent crashes where a Tesla failed to navigate a slight curve.

Australians Parody Thin-Skinned Elon Musk on Misinformation

Scathing commentary from down under.

…Elon Musk, found dead in his home last night, says it is not the role of social media networks to determine what is true or not.

The Tesla and X owner, who is believed to have died from a heroin overdose while watching animal porn, [left a note that] said he would fight any attempts to stop the spread of misinformation on his platform.

It’s perhaps worth noting that Elon Musk is fond of misinformation as a power strategy (profiting from a rejection of law and order).

He says his car company doesn’t make cars, for example.

We should be thought of as an AI robotics company. If you value Tesla as just an auto company — it’s just the wrong framework. If you ask the wrong question, then the right answer is impossible.

Cooking the question is not how to arrive at a responsible answer.

Tesla robots have in a few short years killed more innocent people than all robots in history.

Consider another example, when a door fit issue with a Tesla was falsely labeled “not a door fit issue.”

Magritte’s original title of this painting was L’usage de la parole I (The Use of Speech I), which implies that we should question the veracity of the words, not the image.

Musk thus uses childlike propaganda to distract and delay anyone from holding his car company accountable, while he makes horrible cars: far more deaths and injuries than any other brand.

Tesla’s reports to the NHTSA stand out (red dot, upper right) with far more deaths and injuries than any other car brand. Source: Alshival Data Service

Likewise, Elon Musk holds many titles to avoid being accountable to any of them.

The more people tell him to focus, the less they understand that his lack of focus, and his disdain for truth, is exactly his defense mechanism against anyone who wants him to be focused and to hold him accountable.

HK Tesla Blows Intersection and Flips Taxi

From the Hong Kong news comes another story of a Tesla failing to navigate an intersection, crashing into and flipping a taxi, and injuring five people.

At about 9 pm yesterday (Apr 27)… at the junction of Yau Tong Road and Lei Yue Mun Road, causing the taxi to overturn. …the Tesla turned right onto Yau Tong Road and accidentally blocked and collided with the taxi.

Source: Google Maps

Running intersections remains an ongoing problem for Tesla, especially with FSD version 12.

Related: MI Tesla Blows Red Light and Flips SUV

Self-Driving Cars Could “Create Hell on Earth”

This article originally appeared in 2016.

by Paul Wagenseil, senior editor for security, privacy and gaming.

LAS VEGAS — Autonomous driving systems such as Tesla’s Autopilot or Google’s self-driving cars are far from being safe, security researcher Davi Ottenheimer said last week at the BSides Las Vegas hacker conference here, adding that letting such machines make life-or-death decisions might “create hell on earth.”

The problem, Ottenheimer said, is that we put too much faith in the infallibility of computers and algorithms, especially when “machine learning” and massive amounts of data, just waiting to be processed, seem to promise a boundless future of rapid innovation. Instead, he said, it’s time to slow down, admit that machines will take a long time to catch up, and let humans take the wheel.

“We believe that machines will be better than us,” Ottenheimer said. “But they repeat the same mistakes we do, only faster.”

Ottenheimer said Tesla bears some responsibility for the death of Joshua Brown, the Ohio man who was killed May 7 when his Tesla Model S’s auto-driving system apparently failed to see a tractor-trailer crossing the Florida highway on which Brown was traveling. But Ottenheimer adds that Brown was just as culpable by letting the car take over under the assumption that it would be a better driver than he was.

“Computers are increasingly bearing the burden of making our decisions,” Ottenheimer said. “But Brown’s death was a tragedy that didn’t have to happen.”

In the six months before he died, Brown had posted nearly two dozen video clips to YouTube showing the performance of Autopilot in various conditions, and he seemed to understand Autopilot and its limitations well.

Yet Ottenheimer played Brown’s last clip, posted April 5, in which Brown’s Tesla is nearly forced off a highway near Cleveland by a work truck drifting into his lane. It looks as if Brown wasn’t being as attentive as he should have been, which Brown himself acknowledged in the text accompanying the clip.

“I actually wasn’t watching that direction and Tessy (the name of my car) was on duty with autopilot engaged,” Brown wrote. “I became aware of the danger when Tessy alerted me with the ‘immediately take over’ warning chime and the car swerving to the right to avoid the side collision.”

The big white work truck was forward and slightly to the left of Brown’s Tesla, about 11 o’clock in military parlance. An attentive driver would have noticed the truck merging into an adjoining lane and anticipated that he or she would soon be in the truck’s blind spot. Instead, Ottenheimer contends, Brown’s reliance on Autopilot created a double blind spot, in which neither Brown nor the work truck’s driver was aware of the other.

“Brown shifted the burden of safety to the car,” Ottenheimer said. “He should have reacted much earlier, but he waited 20 seconds for the car to take over.”

Tesla’s Autopilot saved Brown then, but it failed a month later, when Brown had his fatal encounter with the tractor-trailer in Florida. Tesla argued in the wake of Brown’s death that perhaps the car’s camera didn’t see the white trailer against a bright sky.

Ottenheimer had a slightly different theory, based on the truck driver’s recollection that Brown’s car changed lanes at the last moment to aim right for the center of the trailer. (Had the Tesla swerved left, it would have gone into a left-hand turn lane and perhaps been safer.)

“The Tesla may have thought there was separation between the front and rear wheels of the trailer,” Ottenheimer said. “It may have thought there was an open lane.”

Such mistakes are to be expected from machine algorithms, Ottenheimer said. They’re simply not as smart as people.

As examples, he cited Google searches for “professional hair” that returned images of white people, while “unprofessional hair” returned images of mostly black people; facial-recognition programs that classified pro wrestlers in the process of losing as happy; and a robot security guard that ran over a toddler at a shopping mall in the heart of Silicon Valley in June because it couldn’t predict the whereabouts of small children. (The toddler hurt his foot but was otherwise OK.)

“Was Joshua Brown a victim of innovation?” Ottenheimer asked. “Yes, and so was that toddler.”

Ottenheimer said self-driving cars can barely handle driving on a freeway, the most basic and easiest kind of driving. They won’t be able to deal with city traffic, or to navigate a crowded parking lot with vehicles and pedestrians coming from all directions.

To make self-driving cars safer, he suggested that Silicon Valley innovators step back, realize that they’re practicing what Ottenheimer called “authoritomation” — the process of transferring authority to automation — and consider what that means.

“We’re getting out of engineering, and moving into philosophical issues of responsibility and authority,” Ottenheimer said. “Don’t expect an easy help button.”