NC Tesla Bursts Into Massive Fire While Driving on Highway

The buried lede in an alarming story about sudden massive combustion causing environmental disaster is that Tesla owners really DGAF about quality or safety.

…even if his just went up in flames. “I still love the Tesla and probably going to order another one,” said Lippe.


Tesla destroyed by NC road debris on I-140. Source: WECT

We see here the real reason why Tesla seems desperate to produce cars faster.

Men like Lippe “love” to buy and dispose of them over and over again like bulk packs of cheap white tube socks.

The news report also takes an interesting shot at Tesla, pointing out that the soulless, mediocre car company dumps fatal design defects onto public roads yet ignores the first responders who have to clean up the mess.

It’s a known issue, Tesla has a list of first responder guides for its different models, while companies like General Motors have special E-V emergency training for first responders.

Tesla has guides. And the guides suck.

Subtle slam, right? If you blinked you might have missed it. GM indeed has been providing in-person training and hands-on education to first responders.

They actually care about the full lifecycle of engineering, holding a systemic view of transit. That view includes significantly reducing the taxpayer burden for any emergency, as well as lowering risks to people and the environment.

I was just meeting with a member of the team that quickly reduced Porsche deaths to zero through novel engineering changes related to early GM safety innovations. It was big-picture stuff based on the science of ethics. You might not know or hear about it, which is the point: they aren't in the news, by design.

Tesla? DGAF. First responders get a blank stare, stale and incomplete documents, and contact details that go nowhere. Known unsafe cars are sold on false promises, which generates constant bad news like this story. Tesla’s “profit” comes from making someone else, anyone else, pay for its bad decisions.

And on that note, many other drivers called in the NC road debris. Because road debris is something car engineers are supposed to design for, those calls were just about flat tires.

In a call from 4:08 p.m., the caller says that they got multiple flat tires and collided with a piece of metal, which they say may have hit a few other cars as well. […] Another caller from around the same time said that pieces of metal had fallen out of a pickup truck and caused two cars to get flat tires, and the metal was still in the road. About five minutes later, a third caller said that the car they were driving got a flat tire after a metal object fell from a truck.

Those calls should probably have immediately triggered a ban on any Tesla from that road, because Teslas clearly are incapable of handling such a common hazard on their own. The reporter again cast serious shade on Tesla.

…right after he hit the debris the car was still drivable but drivers around him had reason to be alarmed. “A truck driver next to me opened up his window and was pointing frantically. So I pulled off into the breakdown lane. And when I got out you can see flames from underneath the car in the front,” said Lippe.

The car was still drivable?

Uh, NO. Being on fire and about to be totally consumed is not drivable.

A flat tire is more drivable.

Luckily a truck driver intervened using an open window and somehow injected a dose of reality into this Tesla owner’s empty head.

Speaking of which, did you know Tesla designed their own truck to make human interaction impossible, not least of all with windows that can’t be opened?

Fail-unsafe designs combined with gross negligence and ignorance of common risks should perhaps translate into an immediate, widespread stop order. Tesla manufactures the exact opposite of drivability.

Or to put it more cynically, if someone wants to destroy Teslas at scale, or wants to deplete emergency response capacity, they need only to strategically deploy simple road debris.

VA Tesla Crash Under Trailer Kills One

A truck driver has been charged with reckless driving after a Tesla drove under the truck’s trailer.

Teodoro, 57, of Remington, was behind the wheel of his Tesla, traveling north on James Madison Highway at about 6:31 p.m. Wednesday, July 19, when he struck the side of a tractor-trailer truck pulling out of the Quarles truck stop, according to Jeffery Long, Fauquier County Sheriff’s Office spokesman.

Teodoro’s car hit the side of the truck and went underneath it. He was pronounced dead at the scene, Long said in a news release.

It appears there was no highway on-ramp from the truck stop, meaning the trailer would have exposed its broad side to oncoming traffic, especially if it was headed southbound. This may explain how a very obvious trailer became invisible to Tesla’s faulty and unsafe Autopilot, as documented in at least three prior crashes where the robot drove into truck trailers.

Quarles truck stop entrance to James Madison highway. Source: Google Maps

US Court Green Lights Execution by Tesla: Probation for Robotic Manslaughter

In California, a Tesla on Autopilot ran a red light and killed two people, and its owner received only two years of probation. The penalty for harm caused by a Tesla with autonomous capabilities is apparently lenient: enable Autopilot and point it at someone, and the charge seems to top out at vehicular manslaughter, potentially giving a green light to robot attacks.

A 28-year-old man who was behind the wheel of a Tesla Model S on autopilot in 2019 when it ran a red light in Gardena and slammed into a car, killing two people, authorities said, has been sentenced to two years of probation after pleading no contest to two counts of vehicular manslaughter.

This outcome is disheartening because misuse of Autopilot now appears shielded from serious prosecution, even if it were exploited for deliberate, socially or politically motivated assassinations, including mass-casualty attacks.

ALTHOUGH MOST PERSONS WOULD NOT ADMIT THE WOULD-BE ASSASSIN’S RIGHT TO JUDGE, CONDEMN, AND EXECUTE, THEY MIGHT CONCEDE THE RIGHT OF AN INDIVIDUAL TO JUDGE, CONDEMN, AND ORDER EXECUTION, AS THEY DID TO HARRY TRUMAN IN HIS USE OF THE HYDROGEN BOMB.

The widespread release of faulty software in Tesla vehicles, which can be centrally controlled and updated, raises a significant concern. A fleet behaving like loitering cluster munitions (call them Autopilot attacks) poses a real threat: without any need for sophisticated intercontinental ballistic missiles (ICBMs), entities like white nationalist groups or North Korea could compromise data integrity and direct swarms of Teslas to ignore traffic lights or veer onto sidewalks and into buildings, causing harm and chaos.

Furthermore, manipulation of Tesla’s Autopilot could let individuals escape severe charges, downplaying intentional actions that result in explosions and capital murder as simple negligence. That raises serious concerns about the potential misuse and consequences of such technology.

Yoshihiro Umeda

In April 2018, Yoshihiro Umeda lost his life in a Model X Autopilot incident. The Tesla’s owner claimed to have been asleep when the vehicle suddenly accelerated and crashed into a group of people standing in front of a large white van, a scene bearing striking resemblance to a terror attack.

Following this event, Umeda’s family filed a wrongful death lawsuit against Tesla, holding Autopilot responsible and claiming defective vehicles caused the tragedy. In 2022, however, the lawsuit was coldly dismissed from federal court in California and pushed to Japan, raising suspicions about the reasons behind the decision.

For years, Tesla managed to prolong the proceedings, putting immense pressure on the victim’s family through its arguments. Tesla kept selling cars in Japan, and incidents of those cars injuring and even killing Japanese people kept being reported, yet the company argued it should be exempt from accountability under US or California law because “Japan is not a party to the Hague Convention.”

“Tesla further argues, persuasively, that access to third party evidence in Japan for a proceeding in the U.S. would be at best extraordinarily cumbersome, time-consuming, expensive and with uncertain results. Dkt. 15-1 (Yamashita Decl.) ¶ 15 (“Japan is not a party to the Hague Convention …”) Umeda v. Tesla Inc., Case No. 20-cv-02926-SVK, 10 (N.D. Cal. Sep. 23, 2020)

Let’s clarify some misleading information.

First, contrary to what a Tesla lawyer might claim or a possibly confused US judge might accept, Japan is indeed a party to the Hague Convention, and the evidence is concrete. The Convention on the Service Abroad of Judicial and Extrajudicial Documents in Civil or Commercial Matters (the Hague Service Convention) was adopted on November 15, 1965, at the Hague Conference on Private International Law. Japan signed the convention on March 12, 1970, ratified it on May 28, 1970, and it entered into force on July 27, 1970.

American courts have relied on the Hague Service Convention in cases involving Japanese auto manufacturers for many years. It is essential to dispel this misinformation and recognize that Japan is, in fact, a party to the Hague Convention.

During the Hague Conference on Private International Law in 2008, Japan provided its self-assessment regarding the general operation of the Service Convention. The rating they gave themselves is as follows:

[ ] Excellent
[X] Good
[ ] Satisfactory
[ ] Unsatisfactory

In this self-assessment, Japan marked “Good” to indicate their perception of the general operation of the Service Convention at that time.

Second, the Hague Convention is not the sole authority governing cross-border service and discovery. The European Union (EU) has its own superseding Council Regulation (EC) No. 1393/2007 covering the matter. Furthermore, even before the Hague Convention, Japan was already operating under its 1963 Consular Convention with the United States for handling requests to obtain evidence.

Japan’s own convention contains Article 17(1)(e), as mentioned in the link provided, under which a US attorney may only take information a Japanese witness offers voluntarily and cannot compel testimony. In light of this, Tesla’s objection might have been more a rejection of victim rights than anything else.

Third, service through the Central Authority under the Hague Service Convention is no quick and easy picnic either. It takes months simply to prepare the documents and work through the usual multilingual bureaucracy of an “official language” such as Japanese (Article 5 of the Service Convention).

Tesla deliberately employed stall tactics and delays as a cruel defense strategy to hinder the litigation. Its repeated efforts to avoid US courts, and English-language reporting in particular, have burdened grieving Japanese families with years of unnecessary delay, making it very clear who intentionally turned the judicial process into something “extraordinarily cumbersome, time-consuming, expensive, and uncertain.”

Taken as a whole, the record makes evident that Tesla’s Autopilot was unsafe, defective, and an immediate threat to pedestrians back in 2018, yet the company managed to evade accountability in the court system. Had US courts held Tesla’s robotics accountable for their defects, dozens if not hundreds of deaths might have been prevented.

Elaine Herzberg

As many of us are aware, in March 2019 Arizona prosecutors concluded that Uber was not criminally responsible for its “self-drive” vehicles. However, the “back-up” driver involved in the incident was charged with negligent homicide.

This echoed the Arizona Governor’s Executive Order 2018-04, which invited Uber to operate in the state.

If a failure of the automated driving system occurs that renders that system unable to perform the entire dynamic driving task relevant to its intended operational design domain, the fully autonomous vehicle will achieve a minimal risk condition; The fully autonomous vehicle is capable of complying with all applicable traffic and motor vehicle safety laws and regulations of the State of Arizona, and the person testing or operating the fully autonomous vehicle may be issued a traffic citation or other applicable penalty in the event the vehicle fails to comply with traffic and/or motor vehicle laws

The charges against the human operator of Uber’s autonomous vehicle were based on the vehicle’s failure to achieve a “minimal risk condition,” as specified in the Executive Order (EO). A significant problem with that liability theory is that Arizona had defined a “minimal risk condition” in terms of a robot operating without any human operator at all. That conflicting definition created a glaring hole in the case.

MINIMAL RISK CONDITION. A low-risk operating mode in which a fully autonomous vehicle operating without a human person achieves a reasonably safe state, such as bringing the vehicle to a complete stop, upon experiencing a failure of the vehicle’s automated driving system that renders the vehicle unable to perform the entire dynamic driving task.

This definition contains two major issues. First, it is circular to describe a robot’s “minimal risk” as a “reasonably safe state”: it essentially says safety should be reasoned to be safe, which specifies nothing measurable. Part of the same vagueness problem is the state’s definition of a “fully autonomous” system as one “designed to function as a level four or five,” so instead of focusing on preventing harms such as injury or death, the emphasis falls on designing a system that can predict when it is unable to predict, which is counterintuitive and confusing.

Second, it means a temporary, low-skilled human operator can be held liable for the “dynamic” driving of a robot whose own definition assumes full autonomy with no human operator present. That is a concerning accountability loophole for potentially harmful robots.

In essence, the definition suffers from vague safety language and risks holding the wrong parties accountable for the actions of autonomous systems, leaving room for misuse and harm.

The situation went from bad to worse when the Arizona district attorney had to step back from the investigation because of prior involvement in actively promoting Uber’s “self-drive” technology to win public trust, a clear conflict of interest. The district attorney’s campaign had suggested Uber’s autonomous vehicles reduced risk because they could not drink and drive, implying they were safer than human drivers under the influence of alcohol.

Despite that argument, liability for the incident fell on Uber’s “back-up” driver, who arguably was more impaired in perception and judgment than someone under the influence of drugs or alcohol. That raises serious questions about the fairness and accuracy of how responsibility was assigned in the case.

…she was looking toward the bottom of the SUV’s center console, where she had placed her cell phone at the start of the trip.

On top of those issues, Uber was revealed to have deliberately disabled the vehicle’s built-in driver-assistance and braking technology, features specifically designed to protect pedestrians. That raises significant concerns about how manufacturer responsibility for robot safety gets transferred onto hamstrung human operators when the well-being of pedestrians and other road users is at stake.

At 1.3 seconds before impact, the self-driving system determined emergency braking was needed. But Uber said, according to the NTSB, that automatic emergency braking maneuvers in the Volvo XC90 were disabled while the car was under computer control in order to “reduce the potential for erratic vehicle behavior.”

After disabling the automatic braking system, Uber also neglected to implement any alerting mechanism to tell its “back-up” driver that the car had detected the need for emergency braking. In simple terms, Uber’s system failed to recognize a human in its path, alternately misclassifying her as a vehicle and as a bicycle, and consequently did not prevent the harm despite the object being clearly visible at a considerable distance. The blame was then shifted to the driver.
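
To see why that classification flicker matters, here is a minimal illustrative sketch in Python. It is not Uber’s actual software and every name in it is hypothetical; it simply assumes a tracker that discards an object’s motion history whenever its label changes, which matches the NTSB’s finding that each reclassification restarted the system’s prediction of the pedestrian’s path.

# Minimal illustrative sketch, not Uber's actual software. Assumes a simplified
# tracker that resets an object's motion history whenever its classification
# label changes, so constant re-labeling defeats path prediction.
from dataclasses import dataclass, field

@dataclass
class Track:
    label: str                                      # current classification of the object
    positions: list = field(default_factory=list)   # observed positions (meters)

    def update(self, label: str, position: float) -> None:
        if label != self.label:
            # Re-classification resets the track and discards accumulated motion data.
            self.label = label
            self.positions = [position]
        else:
            self.positions.append(position)

    def predicts_crossing(self) -> bool:
        # Needs at least two samples under the same label to estimate motion.
        if len(self.positions) < 2:
            return False
        return (self.positions[-1] - self.positions[-2]) != 0

track = Track(label="unknown")
# A pedestrian walking steadily across the road, but re-labeled every sensor cycle:
for label, pos in [("vehicle", 0.0), ("bicycle", 0.5), ("vehicle", 1.0), ("bicycle", 1.5)]:
    track.update(label, pos)
    print(label, "crossing predicted?", track.predicts_crossing())
# Every line prints False: the tracker never holds enough history under one
# label to predict the pedestrian's path, so nothing triggers braking.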

To their credit, Uber eventually shut down the self-driving operation. Tesla, in contrast, downplayed its own involvement in a pedestrian death around the same time in 2018 and continued charging customers a substantial premium to run unsafe autonomous features on public roads, pushing the boundaries of what it could get away with on already proven hazardous technology.

Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez

The above two incidents provide important backdrop for a tragic event in 2019. In this later incident a Tesla, as has been observed in several cases, disregarded repeated speed warnings and ran a red light at more than 70 mph, predictably colliding with another car and killing its two occupants.

Los Angeles police Officer Alvin Lee testified that numerous traffic signs warn drivers to slow down as they approach the end of the freeway. […] The case was believed to be the first felony prosecution filed in the U.S. against a driver using widely available partial-autopilot technology.

According to NHTSA, its investigation teams have handled 26 crashes related to Tesla’s Autopilot since 2016, resulting in at least 11 deaths. However, data collected from local crash reports on tesladeaths.com shows a much higher tally of 432 deaths, with Autopilot at fault in 38 cases, more than three times NHTSA’s number.
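
As a quick back-of-the-envelope check of that “more than three times” claim, here is the arithmetic in a small Python sketch, using only the figures cited above (the numbers are the sources’ own, not independently verified):

# Figures as cited above: NHTSA-investigated Autopilot deaths vs. the
# tesladeaths.com tallies. Numbers are the sources', not independently verified.
nhtsa_autopilot_deaths = 11            # deaths in NHTSA Autopilot crash investigations since 2016
tesladeaths_autopilot_at_fault = 38    # Autopilot-at-fault deaths per tesladeaths.com
tesladeaths_total = 432                # all Tesla-involved deaths per tesladeaths.com

ratio = tesladeaths_autopilot_at_fault / nhtsa_autopilot_deaths
print(f"Autopilot-at-fault tally is {ratio:.1f}x NHTSA's count")  # about 3.5x, i.e. more than three times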

Like the family of the 2018 Tesla victim in Japan, the families of the 2019 victims in California have filed lawsuits against Tesla for selling defective vehicles. A joint trial is scheduled for mid-2023, and this time Tesla cannot prevent it from being conducted in English and within the United States.

One can’t help but wonder if this trial could have been avoided if Yoshihiro Umeda’s case had been given proper attention.

The lenient two-year probation for the intentional, violent use of a potentially dangerous autonomous system raises serious concerns about preventable deaths, especially if socially or politically motivated actors deploy it on public roads. The implications are significant: even global figures like Kim Jong Un, Trump, or Putin may be watching a new, widespread, destructive threat vector with interest.

Tesla Quietly Admits in 2023 Autopilot Isn’t Ready For Public Roads

Flashback to 2016, when Tesla very confidently said more of its cars on the road would mean fewer injuries. An absolute and total lie.

As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time…

Yeah, no. What do we call the opposite of that false prediction? Deaths have become increasingly common because of Tesla design and manufacturing defects.

Source: Tesladeaths.com

A new Tesla job post in Chicago is very revealing: Tesla is now in a panic state, rebooting as if it doesn’t know what to do, while gaslighting remains its go-to tactic.

Job Type: Full-time […] This is an at-will, temporary position. The assignment is expected to last 3 months.

Nothing says full-time like a temporary position. And what is seasonal about driving in America? As if people had a choice not to use their cars based on the seasons.

It’s a temp job that is not set up to be paid fairly as a temp job. The simple math is that temp staff are owed overtime (1.5x the hourly wage) for any time worked over 40 hours per week.
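
For a sense of what that rule is worth, here is a back-of-the-envelope sketch. The hourly rate and weekly hours below are purely hypothetical (not taken from the Tesla posting), chosen only to show how the 1.5x-over-40-hours rule changes a week’s pay compared with a flat hourly arrangement:

# Hypothetical overtime math: the rate and hours below are made up for
# illustration, not taken from the Tesla job posting.
def weekly_pay(hourly_rate: float, hours_worked: float) -> float:
    regular = min(hours_worked, 40) * hourly_rate
    overtime = max(hours_worked - 40, 0) * hourly_rate * 1.5
    return regular + overtime

rate = 25.00    # hypothetical hourly wage
hours = 60      # hypothetical long week of data-collection driving
print(weekly_pay(rate, hours))                 # 1750.0 with overtime
print(weekly_pay(rate, hours) - rate * hours)  # 250.0 more than a flat hourly rate pays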

Tesla wants to demand a lot of hours of work per day, yet will not pay per hour. Tesla will demand a lot of learning and sharing of information, yet will throw the learner and sharer out after three months.

To be fair, Tesla might not expect the person to live longer than 90 days. Benefits should describe the funeral and burial a Tesla temp driver’s family should expect.

The role is a high-risk, extreme-danger robot operator.

Job Category: Autopilot & Robotics […] Operate a vehicle in a designated area for data collection […] Minimum 4 years of licensed driving experiences

This is truly fascinating because the whole basis of Tesla dumping its unfinished and mislabeled “Autopilot” onto public roads was that it could be operated by anyone with almost no experience or restrictions.

Now, after years of mounting fatalities from “veered” collisions, Tesla is quietly bringing in unskilled temp drivers?

What is it about Tesla owners’ driving that has the company so worried now, such that they’re paying people behind the scenes to be their drivers instead?

  • Illinois
  • Minnesota
  • Utah
  • Texas
  • Colorado
  • Washington
  • California
  • Florida
  • Georgia
  • New York
  • Arizona
  • Massachusetts

Why so few states? And why only four years of experience? Is that the skill level you want behind your driverless robot?

Not me.

Four years is barely beginner level for driving expertise, let alone the vehicle knowledge required to assess “proper working order.”

Remember how Tesla based its marketing on robots having huge amounts of driving experience, arguing we shouldn’t trust drivers with just a few years?

Now they’re hiring drivers in isolated environments with almost no expertise.

So to recap: it’s full-time yet temporary, driverless yet driving, with a beginner-level driver expected to serve as road expert and trainer for “quality.”

Gaslighting. The literal opposite of quality.

It reads to me like Tesla has had no idea what it’s been doing and has started to panic now that regulators are noticing fraud. Like a college student trying to hire other students as tutors to pass final exams after wasting four years on partying.

Why hire a dedicated driver today if the entire promise of autopilot was supposed to manifest through the quick and early years of needlessly throwing away the lives of any and every Tesla owner? That promise must have been false.

I think we may be witnessing that Tesla knows its robots are only getting worse over time, but doesn’t understand why.

How many people would still be alive today if Tesla had not deployed such a faulty robot design? And can an inexperienced temp worker really bring the unnecessary Tesla fatality rate down?

Related:

Tesla more likely to run over Black people

A Simple Reason Why Tesla Keeps Crashing into Police Cars

Unsafe by Design: Tesla Fails to See Children, Red Lights, Emergency Vehicles…