Autonomous Vehicles Need No Drugs Or Drink To Be Bad Drivers

Humans often get cited for being influenced by drugs or alcohol when they make terrible decisions.

Autonomous vehicles (AVs), on the other hand, need no drugs or alcohol to make the same bad decisions.

This often gets reported as if an AV isn’t at risk because it can’t drink, when in reality the risk comes from… influences.

And it turns out AVs are far more susceptible to influences than humans are.

Tesla owners, for example, often seem to think they should train the company’s “automation” products to be in a huge rush and ignore all traffic laws: stop signs, red lights and yellow lines all taught as optional in their world of influence.

That’s far worse than one human drinking too much, or even dozens, because the bad behavior of Tesla’s “evil insiders” gets ingested into an entire AV fleet as free speed extremism… a huge robot army in permanent improvisation, more dangerous than any drug or drink.

Serious food for thought when The Verge reports:

…there is scant data that proves that fully automated vehicles are safer than human drivers.

There is plenty of evidence that fully automated vehicles are easily put under the influence… of almost anything.

It’s like how TayBot was turned into a Nazi within 24 hours… which I’ve explained in great detail here before as a really dumb design flaw by Microsoft.

At least humans are influenced by known things and can be easily tested to determine how dangerous they will be when attempting to operate heavy machinery.

The next time a Tesla crashes and someone reports the driver was given a standardized, transparent test of influences, demand to know what the AV software was tested for after the crash, how and by whom.

Think of it like this. Microsoft disabled their bot after one day. Uber cancelled their bot after one pedestrian was killed. Tesla turned their bot into a pay-to-play for the wealthy to flout safety laws and dangerously influence it, such that more and more and more people are dead.

Tesla Keeps Crossing Double-Yellow Lines, Causing Head-on Collisions

Elon Musk infamously boasts that he makes mistakes whenever he likes and doesn’t respect the rules.

Well, I guess we might make some mistakes. Who knows?

This keeps coming up as bad news for his customers, not to mention anyone around them, when their car acts like the CEO and gloatingly crosses a double yellow.

Source: DC Fire

Another Tesla, another Tesla owner dead:

The preliminary investigation revealed, at approximately 7:50 a.m., a 2019 Tesla Model 3, occupied by five subjects, was traveling southbound in the left travel lane in the 3000 block of Connecticut Avenue, Northwest. The Tesla crossed the solid-double yellow lines into the northbound lane of travel, struck a 2018 Toyota C-HR, and then struck a 2010 Mercedes-Benz ML-350, head-on. […] On Sunday, February 26, 2023, the driver of the Tesla succumbed to his injuries and was pronounced dead.

The road layout suggests the driver was traveling in a straight line on a stretch known for speed abuse (e.g. “40 to 60 percent of the people completely disobeyed the speed limit by more than 12 miles per hour”). A related problem of Teslas ignoring the double yellow has been reported for many years: the algorithm treats the lines as open to cross in almost any case, such as when a vehicle slows ahead.

Source: Google StreetView

I’ll say that again: Tesla engineering allegedly treated a double yellow the same as a dotted white line. Cars were trained to attempt unsafe maneuvers to feed unnecessary speed.

In this case, when a car ahead slowed for the pedestrian crossing (as it should), the speeding Tesla’s algorithm likely reacted by accelerating across the double yellow into a devastating head-on crash… by design!
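To make the alleged flaw concrete, here is a minimal sketch of a passing decision that keys only on speed and momentary clearance, never on what the lane marking legally means. This is purely my own illustration in Python (names like `should_pass_flawed` are hypothetical, not Tesla’s actual code), but it shows how such logic produces exactly the behavior owners describe:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Marking(Enum):
    DASHED_WHITE = auto()         # passing permitted
    DOUBLE_SOLID_YELLOW = auto()  # passing prohibited

@dataclass
class Scene:
    lead_car_slowing: bool      # car ahead brakes, e.g. for a crosswalk
    oncoming_looks_clear: bool  # only within limited sensor range
    marking: Marking

def should_pass_flawed(s: Scene) -> bool:
    # Alleged flaw: every marking is treated as crossable,
    # so a braking lead car triggers an immediate pass attempt.
    return s.lead_car_slowing and s.oncoming_looks_clear

def should_pass_safe(s: Scene) -> bool:
    # Correct behavior: a double solid yellow vetoes the pass outright,
    # no matter how clear the oncoming lane looks right now.
    if s.marking is Marking.DOUBLE_SOLID_YELLOW:
        return False
    return s.lead_car_slowing and s.oncoming_looks_clear

# Car ahead slows for a pedestrian crossing on a double-yellow road:
s = Scene(True, True, Marking.DOUBLE_SOLID_YELLOW)
print(should_pass_flawed(s))  # True  -> surge across the line, head-on risk
print(should_pass_safe(s))    # False -> slow down and wait behind the lead car
```

The difference is one guard clause; omit it and every slowing vehicle becomes an invitation to cross into oncoming traffic.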

That crash scene reminds me of December 2016 when Uber driverless was caught running red lights, foreshadowing one death a year later that brought criminal charges and shut down their entire driverless program.

Source: Twitter

Tesla driverless software caused a similar fatality at basically the same time, April 2018. Their cruel and unusual reaction was to double down and hire an army of lawyers to make the victims and their story disappear.

…when another vehicle ahead of him changed lanes to avoid the group [of pedestrians], the Model X accelerated and ran into them [killing Umeda].

Source: US District Court. Tesla software accelerated into pedestrians, parked motorcycles and a van. The company has for years manipulated courts and press to cover up this important 2018 crash.

2018.

Tesla software fatally accelerated straight into a traffic hazard after it saw a vehicle ahead evade it. See why this 2023 crash reminds me of 2018?

Did you hear about this death and how bad Tesla was, or just the Uber case? The crashes were around the same time, and both companies should have been grounded. Yet Tesla instead bitterly sparred with a grieving family and manipulated the news.

Here’s a 2021 Tesla owner forum report showing safety engineering has regressed, with FSD five years on feeling worse than Uber’s cancelled program.

Had a super scary moment today. I was on a two lane road, my m3 with 10.6.1 activated was following a van. Van slowed down due to traffic ahead and my car decided to go around it by crossing the yellow line and on to oncoming traffic. […] The car ahead of me wasn’t idling. We were both moving at 25 mph. It slowed down due to traffic ahead. My car didn’t wait around. It just immediately decided to go around it crossing the yellow line.

A cacophony of voices then chimes in to say the same thing happened to them many times: their Tesla attempting to surge illegally into a crash (e.g. “FSD Beta Attempts to Kill Me; Causes Accident”). They’re clearly disappointed the software has been designed to ignore the double yellow in suicidal head-on acceleration, although some try to call it a bug.

In January 2023 a new post showed Tesla is still designed to ignore road safety. This time a Tesla ignored red lights and stopped cars, allegedly attempting to surge across a double yellow into the path of an oncoming train!

Source: Tesla Motors Club

2023. A ten-year-old bug? Ignoring all the complaints and fatalities?

You might think this sounds too incompetent to be true, yet Tesla recently was forced to admit it has intentionally been teaching cars to ignore stop signs (as I warned here). That’s not an exaggeration. The company dangerously pushes a highly lethal “free speed extremism” narrative all the way into courts.

A speed limiter is not a safety device.

That’s a literal quote from Tesla, which wanted a court to accept that reducing speed (e.g. why brakes are on cars) has no effect on safety.

As the car approached an intersection and signal, it accelerated, shifted and ran a red light. The driver then lost control… [killing him in yet another Tesla fire]. The NTSB says speeding heightens crash risk by increasing both the likelihood of a crash and the severity of injuries…
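The NTSB’s point is basic physics: kinetic energy grows with the square of speed, so even modest speed reductions cut crash severity substantially. Here is a minimal worked example of my own (the 1,800 kg mass is an assumed round figure for a midsize EV, not an official spec):

```python
def kinetic_energy_joules(mass_kg: float, speed_mph: float) -> float:
    """Kinetic energy = 1/2 * m * v^2, with speed converted to m/s."""
    speed_ms = speed_mph * 0.44704  # miles per hour -> meters per second
    return 0.5 * mass_kg * speed_ms ** 2

# Assumed ~1,800 kg vehicle, three speeds for comparison:
for mph in (25, 40, 60):
    e = kinetic_energy_joules(1800, mph)
    print(f"{mph} mph -> {e / 1000:.0f} kJ of crash energy")

# 25 mph -> 112 kJ
# 40 mph -> 288 kJ  (2.56x the energy at 25 mph)
# 60 mph -> 647 kJ  (5.76x the energy at 25 mph)
```

Double the speed and the energy that has to go somewhere in a crash quadruples. That is exactly why brakes, and speed limiters, are safety devices.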

Let me put it this way. The immature, impatient design logic at Tesla has been for its software to accelerate across yellow lines, through stop signs and red lights. It’s arguably regressive, learning like a criminal to be more dangerous and take worse risks the more it’s allowed to operate.

…overall [Tesla’s design feeds a] pattern here in Martin County of more aggressive driving, greater speeds and just a general cavalier sense towards their fellow motorists’ safety.

The latest research on Tesla’s acceleration-to-high-speed mentality shows it increased driver crash risk by 180 to 340 percent, with a survival rate near zero! Other EV brands such as the Chevy Bolt or Nissan LEAF simply do not have any of this risk and are never cited.

340 percent higher crash risk because of design. In plain terms, that is 4.4 times the baseline risk.

You have to wonder about the Tesla CEO falsely advertising his car would magically be safer than all others on the road… while also boasting he doesn’t know when obvious mistakes are being made. It’s kind of the opposite. The CEO likely is demanding known mistakes be made intentionally, very specific mistakes like crossing the double yellow and accelerating into basically everything instead of slowing and waiting.

People fall for this fraud and keep dying at an alarming rate, often in cases where it appears they might have survived in any other car.

And then you have to really wonder at the Tesla CEO falsely advertising his car would magically drive itself, inhumanely encouraging owners to let it wander unsafely on public roads (e.g. “wife freaks” at crossing double yellow on blind corner) yet never showing up for their funerals.

Let me just gently remind the reader that it was a 2013 Tesla crossing a double yellow and killing someone that Elon Musk boasted was his inspiration to rush “Autopilot” to market and prevent it from ever happening again.

2013.

Ten years later? Technology failures that indicate reckless management with a disregard for life that rises to the level of culpable negligence.

Note there is another big new lawsuit filed this week, although it talks about shareholder value as if that’s all we can measure.

Tesla finally is starting to be forced by regulators into fixing its “flat wrong” software. The gravestones are proof of the mistakes being made. The grieving friends and families know.

The deceased driver’s profile in this latest crash says he went to Harvard, proving yet again that intelligence doesn’t protect you from fraud.

Source: Tesladeaths.com

And that’s not even getting to reports of sudden steering loss from poor manufacturing quality, or wider reports about abuse of customers:

Spending over $20,000 on a $500 repair… a LOT of customers are getting shafted on this… TRUSTING Tesla do this, and they’re failing horribly at the expense of the customer.

If you don’t die in a crash from substandard hardware and software engineering, or grossly negligent designs, Tesla’s big repair estimate scams might kill you.

Why are Teslas allowed on the road?

Elon Musk Just Laid Off The Staff Who Believed His Promise They Wouldn’t Be Laid Off

If you’re still working at Twitter, you might think the lesson from the Titanic is that promotions were open territory soon after its skipper drove into an iceberg.

Musk fired Crawford last week, as part of yet another round of layoffs at the company — after Musk promised late last year that there would be no more layoffs.

Crawford apparently thought she would coldly benefit from others being pushed into lifeboats, as if a big promotion suddenly was hers if she slept on the deck or complimented the skipper’s pants.

Layoffs instead came for those who foolishly stayed, those who believed their CEO’s promise there would be no layoffs.

First they came for PR and she said nothing, then they came for engineering and again she said nothing, then they came for her…

This CEO is constantly saying things immediately proven false. He uses a classic tactic of permanent improvisation, formerly known as white supremacist dictatorship.

Layoffs are unfortunate, but layoffs soon after layoffs while promising no layoffs… that’s executive malfeasance.

Tesla Racing Instructor Warns Sudden Acceleration Is A Design Flaw: NOT Driver Fault

As I suggested a couple of weeks ago, Tesla sudden acceleration has the hallmarks of 1980s design flaws.

Now a Tesla racing instructor is trying to tell the world it happened even to him.

…nothing hits home as something like this happens to a Tesla fan. Greg Wester, a long-time Tesla owner from San Francisco, just shared a close encounter when his Tesla Model 3 suddenly accelerated in a parking garage. According to his story, the car was stopped when it suddenly bounced forward. Luckily, he had his foot on the brake pedal and was able to “overpower it.” Now, Greg is not a regular driver. He races his Tesla and is also a racing instructor, so he should know when to press the accelerator and when to brake. Greg is trying to comprehend what caused this and is willing to extract the info from the car computer if he finds a way. Nevertheless, he has lost confidence in Tesla now. Following his incident, he wrote on Twitter that he seriously considers installing a 400-volt kill switch next to the steering wheel. He also says that “no pedestrian should ever walk in front of a Tesla that is parking.”

No one is safe around a Tesla. Although this guy brags “I can handle dangerous cars,” he also admits a serious flaw:

…willing to extract the info from the car computer if he finds a way…

This is HUGE. A monster clue. He doesn’t trust Tesla and hints here that he needs total separation to get the info. It’s his car, his data. Yet Tesla isn’t giving owners a way to present facts about their own lives, because Tesla controls “info” entirely and selfishly. Their management practices have always been a four-alarm data security dumpster fire. Tesla staff can do what they are known to do and lie, with no simple trusted path designed for truth to be found by the people they put in harm’s way.

The driver gave his detailed explanation on Twitter, reminiscent of the 911 call that changed everyone’s mind about Toyota liability (Mark Saylor’s calm, professional tragedy disproved driver fault).

It felt almost like some extra electricity all of a sudden grounded and spun the motor. Very scary. I was sitting close to the steering wheel and my foot was 100% covering the brake, light pressure — not half on like heel-toe shifting and throttle blipping. Wearing flat tennis shoes.

In related news, another Tesla suddenly accelerated into a building and caught fire.

It then caught fire again while on the tow truck, injuring the driver and forcing crews to dump its burning toxic wreckage into the street.

That’s almost as bad as the news that a Tesla accelerated into a Girl Scout cookie stand in a Walmart parking lot.

The guy suddenly accelerating into Girl Scouts was given a sobriety test. His car was not given a similar test. Perhaps he could argue that unless Tesla agrees to independent, standardized third-party tests of its data in real time… he should be allowed (like Tesla) to test himself days later, in a closed box, and decide for himself whether he was at fault.

Tesla seems more and more like an airplane that can’t land without crashing. Don’t get in one. Don’t be around one.