Did a Robot Just Try to Kill Tiger Woods?

Catchy title, no? But seriously, cars (from the old word carriage) are also called automobiles because they automate mobility, kind of like robots that move people.

I’m calling a car a robot because that’s really what it is.

In the recent case of Tiger Woods, his robot sent him at high speed off the road.

First allow me to set the context on the automobile in question. It has a particular problem that by October 2012 had received far too little discussion, which is why automobile critics flagged it:

Hyundai has been having problems pop up here, and there, and the related news coverage at best has been extremely minimum. […] These incidents have been happening more often than none all over Korea to hapless drivers who don’t know anything about being prepared on handling an out of control car. After the second video I became very interested and wanted to know more about the Hyundai acceleration issue.

An out of control Hyundai? Accelerating suddenly without warning? That sounds familiar.

Second, let’s define sudden unintended acceleration (UA, also written SUA) by referencing the National Highway Traffic Safety Administration (NHTSA) explanation:

Unintended, unexpected, high power accelerations from a… low initial speed accompanied by an apparent loss of braking effectiveness.

Third, there was a well documented case of Toyota covering up its SUA problem, exposed by a whistleblower (Betsy Benjaminson, as cited by Chase law) who also exposed the NHTSA as rather shallow in its investigations.

Through the Senator’s whistleblower program, I gave hundreds of documents to his Judiciary Committee staffers. I sorted the documents to show that many electronics issues related to UA were known inside Toyota but not even touched upon by NHTSA and NASA in their studies of Toyota electronics and UA. I also organized the documents to show that it seemed the executives were misrepresenting facts in their sworn testimony before three Congressional committees. Senator Grassley was thus concerned about whether NHTSA had done a proper job, especially with the NASA study it had commissioned, and sent a public letter of inquiry to NHTSA administrator David Strickland. NHTSA’s response to Senator Grassley was cleverly worded and noncommittal.

Which now brings us back to the recent news of Tiger Woods in a 2021 [Hyundai] Genesis GV80 experiencing UA.

Woods’ SUV was traveling between 84 and 87 miles per hour just prior to impact, investigators learned. There was no indication that he hit the brakes. It’s possible that he hit the accelerator pedal by accident. “It is speculated and believed that Tiger Woods inadvertently hit the accelerator instead of the brake pedal,” LASD Capt. James Powers told reporters.

Old problem, new car?

At least he wasn’t driving a Tesla.

True/False? “NHTSA reports an average of one accident per 484,000 miles”

As soon as Tesla was on the road it had to start reporting deaths. Source: https://www.tesladeaths.com/

I keep reading the following sentence in safety reports about Tesla, but only about Tesla:

NHTSA reports an average of one accident per 484,000 miles.

Do you see the NHTSA reporting that anywhere? I do not. And I do not see any other car manufacturer quoting this number either.

I see only a sentence Tesla put on their website to claim they aspire “to be” the safest car on the road. And then they wrote that sentence without any source or qualifications.

In other words, the 484,000-mile figure is found nowhere but the Tesla site, which claims it found the number somewhere else without telling us exactly where.

This December 2020 NHTSA report (DOT HS 813 060) is perhaps the closest thing: “Overview of Motor Vehicle Crashes in 2019”

Source: NHTSA’s National Center for Statistics and Analysis, Research Note: “Overview of Motor Vehicle Crashes in 2019”, DOT HS 813 060
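If the magic figure were derived from this report, the arithmetic is easy enough to check. Here is a back-of-the-envelope sketch; the inputs are round approximations of the kind of national figures in the 2019 Overview, not quotations from it:

```python
# Back-of-the-envelope check: if "one accident per 484,000 miles" comes
# from dividing national vehicle-miles traveled (VMT) by police-reported
# crashes, numbers of this magnitude land nearby.
# Both inputs are approximate, for illustration only.

vmt_2019 = 3.26e12        # ~3.26 trillion vehicle-miles traveled
crashes_2019 = 6_756_000  # ~6.76 million police-reported crashes

miles_per_crash = vmt_2019 / crashes_2019
print(f"{miles_per_crash:,.0f} miles per police-reported crash")
```

Even with rounded inputs this lands within a few thousand miles of 484,000, which is exactly why a one-line citation from Tesla would settle the question instantly, and why its absence is so strange.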

Wow, as a percentage of total fatalities since 1995 more and more people outside cars are being killed!

Speaking of charts, here’s a real one based on the data that Tesla itself publishes.

Source: Tesla Vehicle Safety Report for the first quarter of 2021. It clearly shows that the average distance per accident, both on and off Autopilot, declined year-over-year.

I am not kidding: their own data shows a precipitous decline in safety over recent quarters, while the NHTSA number they cite shows almost no change. These are the real numbers they publish themselves. Bizarre.
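Because Tesla reports this metric as miles per accident, the year-over-year comparison is one division away. A minimal sketch, with placeholder mileage values that are hypothetical, not Tesla’s published figures:

```python
# Year-over-year comparison of a "miles per accident" metric.
# All mileage values below are hypothetical placeholders, not
# Tesla's actual published figures.

def yoy_change_pct(current_miles_per_accident, prior_miles_per_accident):
    """Percent change in miles per accident; a negative result means
    fewer miles between accidents, i.e. worse by this metric."""
    delta = current_miles_per_accident - prior_miles_per_accident
    return delta / prior_miles_per_accident * 100

# Hypothetical Q1 figures (miles per accident), for illustration only
on_autopilot = yoy_change_pct(4.2e6, 4.7e6)
off_autopilot = yoy_change_pct(1.0e6, 1.3e6)
print(f"On Autopilot: {on_autopilot:+.1f}%  Off Autopilot: {off_autopilot:+.1f}%")
```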

So can someone find the magic 484,000 number anywhere in NHTSA reports? I have questions even if you can:

  1. Why didn’t Tesla put in a simple NHTSA reference to their claim? Don’t they want us to connect directly to the NHTSA and read that report if true?
  2. Why do people keep repeating this without any direct NHTSA reference? People say Tesla says that the NHTSA says a number. What? Nobody just says please show us this report? Can anybody find an actual NHTSA report that says this number?
  3. Does anyone understand what NHTSA might actually be talking about when they are cited improperly in this Tesla quote?

Until I see this report where NHTSA says the exact magic 484,000 number, I continue to believe something is very wrong with media channels repeating it as though it’s true.

Take this report that uses the number for example:

Stock in the electric-vehicle pioneer Tesla is wobbling after a Tesla vehicle crashed and police said no one appears to have been at the wheel.

Here’s another one that uses the number:

Tesla Q1 Safety Report Shows Rise In Autopilot Accidents

Why is that 484,000 data point being sourced from Tesla in these articles about Tesla safety failures, and NOT some statement or report directly from the actual NHTSA?

Perhaps Tesla is engaging in disinformation so that safety news is always controlled by Tesla, and Tesla alone, to poison the safety narrative?

Here are some guesses why Tesla doesn’t want someone to find or read an NHTSA report, even though Tesla wants us to believe they base their safety engineering on it:

  • NHTSA averages are for all vehicles in all conditions everywhere
  • Tesla averages are for a tiny subset of vehicles and conditions
  • Tesla doesn’t define methods or terms such as miles, crash, accident
  • Tesla crashes have been increasing, worsening not improving
  • Other car manufacturers are reporting their safest records in history during a rise in Tesla fatalities and injuries

Saying Autopilot in a Tesla is safer than a 1995 rust-bucket on a dirt road where Autopilot can’t even function is a completely bogus comparison.

Tesla seems to be willfully misleading with its claims about crash data.

As an example of a more meaningful comparison, here is an actual NHTSA report on factors in crashes in the United States:

Source: NHTSA’s National Center for Statistics and Analysis, Research Note: “The Relationship Between Passenger Vehicle Occupant Injury Outcomes and Vehicle Age or Model Year in Police-Reported Crashes”, DOT HS 812 937

From this table we see 1995-2011 cars are clocking 1,030,624 severe injuries.

Meanwhile, 2012-2018 cars have only 199,480. So is the 2021 Tesla safety marketing campaign comparing itself to a 1995 car on purpose?
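Raw counts like these can’t be compared directly either: the 2012-2018 fleet is smaller and younger, so it has accumulated far fewer miles on the road. At minimum the counts need normalizing by exposure. A sketch, where the injury counts come from the NHTSA table but the mileage denominators are entirely hypothetical:

```python
# Why raw counts mislead: a newer, smaller fleet logs fewer total
# injuries partly because it has driven fewer total miles. Normalizing
# by exposure is the minimum for a fair comparison.

def severe_injuries_per_billion_miles(injuries, vehicle_miles):
    """Normalize an injury count by miles of exposure."""
    return injuries / (vehicle_miles / 1e9)

# Injury counts from the NHTSA table; the mileage denominators below
# are hypothetical, purely to show how exposure changes the picture.
old_fleet = severe_injuries_per_billion_miles(1_030_624, 20e12)  # 1995-2011
new_fleet = severe_injuries_per_billion_miles(199_480, 5e12)     # 2012-2018
print(f"1995-2011: {old_fleet:.1f}  2012-2018: {new_fleet:.1f} per billion miles")
```

With these (made-up) denominators the five-to-one gap in raw counts shrinks to a far smaller gap in rates, which is the whole point: without exposure data the comparison says very little.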

The NHTSA also issues a warning about its own numbers:

…while the present analysis shows that the newer vehicle model year groups were inversely associated with occupant injury severity outcome, this study does not identify which aspects of the model year group with particular vehicular designs are responsible for the reduction in the risk of severe injury to vehicle occupants.

That’s literally the opposite of Tesla marketing, which repeatedly says their particular vehicle is responsible for reduction of crashes… despite no actual evidence to support such claims.

Tesla put its first cars on the road in 2013, right? So you can see it’s patently unfair to compare a 2013 or later model with anything prior, unless you’re making a completely different point about car safety (e.g. buy any new car, not an old car, because the data shows new cars generally are safer than old ones).

Do you see a problem with Tesla comparing its particular cars to all crashes ever for all cars on the road instead of doing a true comparison with proper analysis?

What if we just run the numbers of Teslas crashing versus Teslas delivered? What percentage of Teslas crash, and how soon after being delivered?
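The per-fleet comparison proposed above is also a one-liner. A sketch with hypothetical inputs; neither the crash counts nor the delivery totals here are real figures for any manufacturer:

```python
# Crashes (or deaths) as a share of vehicles delivered: a crude but
# honest per-fleet metric. All inputs are hypothetical, for
# illustration only.

def crash_share_pct(reported_incidents, vehicles_delivered):
    """Percentage of delivered vehicles involved in a reported incident."""
    return reported_incidents / vehicles_delivered * 100

# Hypothetical comparison of two fleets of very different sizes
small_new_fleet = crash_share_pct(2, 2_500)        # e.g. a young fleet
large_mature_fleet = crash_share_pct(250, 500_000) # e.g. a mature fleet
print(f"Small fleet: {small_new_fleet:.3f}%  Large fleet: {large_mature_fleet:.3f}%")
```

The same incident count looks very different against a few thousand delivered cars than against a few hundred thousand, which is why "deaths per cars delivered" is the question to ask.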

Remember that table at the start of this post?

After putting only a few thousand cars on the road, and with a CEO publicly stating his cars are the safest of all cars on the road, Tesla had to report two deaths from a car that “veers into opposite lane”.

Is there another car manufacturer that has as many deaths per cars delivered?

If you went out to buy a car today, Tesla misleadingly claims you should see its cars as safer than ALL cars ever made, even when you are only in the market for NEW cars.

And when you’re in the market for new cars, Tesla may in fact be significantly less safe than other options (Volvo, Honda, etc). Here’s some proper analysis:

The fundamental problem here is that Tesla does a poor job of driver monitoring. Unlike several other automakers, Tesla only uses a torque sensor in the steering wheel to try to detect when the driver is moving the wheel. This is a cheap but very imprecise method.

A brand new Tesla uses “cheap but very imprecise” engineering for its safety.

Why would Tesla hide the reference to the NHTSA and make it hard to see the actual math? Seems cheap and imprecise of them.

Again, here’s some proper analysis.

General Motors’ similar Super Cruise feature, which is advertised as hands-free, uses facial recognition technology to ensure that a driver is watching the road while it is in operation and recently ranked higher than Autopilot in a Consumer Reports test

I don’t like “hands-free” marketing either, but you have to recognize that Tesla was ranked lower than other brands in safety using independent analysis.

If nothing else, you should know Tesla clearly doesn’t want the NHTSA to speak for itself because it never seems to say to anyone “here’s the NHTSA” or “go read the NHTSA”.

Until I see people start to use original source NHTSA documentation when talking about NHTSA reports, I am extremely skeptical of the NHTSA being fairly referenced by Tesla.

If Tesla builds cars like they build their arguments to drive their cars, you shouldn’t buy their cars.

Here’s some poetry that might help explain:

Electric cars were the future in 1981.
Reagan shut it all down.
Electric cars were the future in 2001.
Bush shut it all down.
Electric cars were the future in 2021.
Tesla is a dumpster fire.

If you want to know why people are sticking with fossil fuels, it’s pretty clear who is keeping them alive. Yes, that’s silly. Let’s get rid of the combustion engine and get in our electric cars.

Just don’t get into a Tesla unless you’re prepared to be misled by funny numbers straight into a tree and die in a fire.

Why put people into an electric clown car? That does not help bring electric cars to market faster, as it destroys trust in new cars and their manufacturers.

Perhaps the best take in the news so far has been the Chaser:

“When we say we want a fully driverless future, we mean it” said Tesla CEO, Elon Musk at a press conference on Monday. Musk harked back to his childhood days as the heir of a Zambian blood-diamond empire “this tactical disdain for human life is crucial for any entrepreneur looking to really embrace change”.

For my take on broader disinformation issues in Tesla marketing, see my earlier post on their CEO tactics.


Tesla CEO Gaslighting Autopilot Safety Failures

Update April 22, 2021: A statement from Consumer Reports’ senior director of auto testing, Jake Fisher, confirms that Tesla vehicles lack basic safety: they fail to include a modern-day equivalent of a seat belt.

In our test, the system not only failed to make sure the driver was paying attention — it couldn’t even tell if there was a driver there at all.

Other manufacturers neither have the safety failures of Tesla, nor the exaggerated safety claims, nor a CEO who encourages known unsafe operation of his sub-par engineering.

Source: My presentation at MindTheSec 2021

“2 dead in Tesla crash after car ‘no one was driving’ hits tree”. Source: NBC

Just a few days ago on April 14th the CEO of Tesla tweeted a prediction:

Major improvements are being made to the vision stack every week. Beta button hopefully next month. This is a “march of 9’s” trying to get probability of no injury above 99.999999% of miles for city driving. Production Autopilot is already above that for highway driving.

Production Beta

You might see a problem immediately with such a future-leaning statement, clearly meant to mislead investors. “Production Autopilot” means it already is in production, yet the prior sentence was “Beta… next month”.

Can it be in production, under continuous modification for testing, and still a month away from reaching beta, all at once? It sounds more like none of it is true: the least possible work is being done (real testing and safe deployment are hard) while they claim extra credit.

Also make special note of the highway driving reference. “Production” is being used for a very limited subset of production. City driving is still in testing because it is not ready, highway driving supposedly is production-ready, yet all of it gets called production while remaining unready?

This is very tortured marketing double-speak, to the point where Tesla language becomes meaningless. Surely that is intentional, a tactic to avoid responsibility.

Let’s move on to April 17th at 3:45PM, when the CEO of Tesla was tweeting claims that Autopilot is standard on all Teslas, a full endorsement juiced by third parties working as Tesla PR, who cooked up false and misleading advertising like this:

Source: Twitter

Hold that thought. No matter what, even with manual mode, Tesla’s PR campaign is telling drivers that Autopilot is always there preventing crashes. Got it? This is important in a minute.

Also, this is not a statement about a production highway-only Autopilot. It does not specify the beta button or the vision stack of Autopilot. There is nothing about this or that version, in this or that situation.

This is a statement about ALL Autopilot versions on all Teslas.

ALWAYS on, looking out for you.

Standard. On ALL Teslas.

These are very BOLD claims, also known as DISINFORMATION. Here it is in full effect with CEO endorsement:

Passive means active? Who engages in such obviously toxic word soup?

Just to be clear about sources, the above @WholeMarsBlog user operating like a PR account and tweeting safety claims is… a Tesla promotional stunt operation.

It tweets things like “2.6s 0-60 mph” promoting extreme acceleration right next to a video called “Do not make the mistake of underestimating FSD @elonmusk”

Do not underestimate “full self driving”? Go 0-60 in 2.6s?

That seems ridiculously dangerous advice that will get people killed, an uncontrolled acceleration actively launching them straight into a tree with NO time even for a passive chance of surviving.

Here’s the associated video, a foreshadowing at this point.

To summarize, an account linked to the CEO explicitly has been trying to encourage Tesla owners to do highly dangerous performance and power tests on small public roads that lack markings.

Now hold those two thoughts together. We can see Tesla’s odd marketing system promoting: Autopilot is always looking out for you on all Tesla models without exception, and owners should try extreme tests on unmarked roads where underestimating Autopilot is called the “mistake”. In other words…

Tesla PR, in effect, is telling you to DRIVE DANGEROUSLY AND DIE.

See the connections?

I see the above as evidence of an invitation from Tesla (they certainly didn’t object) to use Autopilot for high performance stunts on small roads where even a slight miscalculation could be disastrous.

Next, on April 17th just hours before yet another fatal Tesla accident, the CEO tweeted his rather crazy idea that a Tesla offers a lower chance of accident when it is compared to all automobile crash data combined.

Specifically, the CEO points to his own report that states:

NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

Where does it show this? Most recent data means what? Are we talking about 2016?

This is what I see in the 2020 report, which presumably includes Teslas:

Overview of Motor Vehicle Crashes in 2019. Source: NHTSA

Tesla offers no citations that can be verified, no links, no copy of the recent data or even a date. Their claims are very vague, written into their own report that they publish, and we have no way of validating with them.

Also, what is defined by Tesla as a crash? Is it the same as the NHTSA? And why does Tesla say crash instead of the more meaningful metric of fatality or injury?

NHTSA publishes a lot of fatality data. Is every ding and bump on every vehicle of any kind being compared with just an “accident” for Tesla? All of it seems extremely, unquestionably misleading.

And this misuse of data sits below company statements like “Tesla vehicles are engineered to be the safest cars in the world.” That is promissory language. They are “to be” safe, when? Sometime in the future? Are they not the safest yet, and why not? Again misleading.

The reverse issue also comes to mind. If a child adds 2+2=4 a billion times, that doesn’t qualify them as ready to take a calculus exam.

However Tesla keeps boasting it has billions of miles “safely” traveled, as though 2+2 is magically supposed to be equivalent to actual complex driving conditions with advanced risks. It’s a logical fallacy, which seems intentionally misleading.

You can see the CEO pumps up generic Autopilot (all of them, every version, every car described as totally equivalent) as something that will prevent huge numbers of crashes and make an owner exponentially safer, based only on hand-wavy numbers.

Now let’s watch after a crash happens and he immediately disowns his own product, splitting hairs about this or that version and claiming there’s no expectation of capability in any common situation.

His next tweet on the subject comes April 19th at 2:14PM when he rage tweets about insider information (secret logs) to dispute claims made by witnesses and reporters.

To recap, before a fatal accident the CEO describes Autopilot as a singular product across all Tesla that dramatically reduces risk of a crash no matter what. And then immediately following a fatal accident the CEO is frantically slicing and dicing to carve out exceptions:

  • Enabled
  • Purchased FSD
  • Standard Autopilot
  • Lane lines
  • This street

These caveats seem entirely disingenuous compared with just a day prior when everything was being heavily marketed as safer without any detail, any warning, any common sense or transparency.

Note that the WSJ report that prompted the tweet is gathering far lower social numbers than the CEO’s own network effects, which helps explain how and why he pushes selfish narratives even while admitting facts are not yet known.

The CEO is trying to shape beliefs and undermine the voice of professionals to get ahead of the facts being reported accurately.

Now just imagine if the CEO cared about safety. On April 17th he could have tweeted what he was saying on the 19th instead:

Dear loyal fans, just so you are aware your Standard Autopilot isn’t like Purchased FSD and it won’t turn on unless it sees something that looks like a lane line…don’t overestimate its abilities. In fact, it doesn’t turn on for a minute or more so you could be in grave danger.

Big difference right? It’s much better than that very misleading “always on” and “safest car in the world” puffery that led right into another tragic fatality.

Seriously, why didn’t his tweets on the 17th have a ton of couched language and caveats like the 19th?

I’ll tell you why: the CEO is pushing disinformation before a fatality and then more disinformation after a fatality.

Disinformation from a CEO

Let’s break down a few simple and clear problems with the CEO statement. Here it is again:

First, the CEO invokes lane lines only when he replies to the tweet. That means he completely side-steps the mention of safety measures. He knows there are widespread abuses and bypasses of the “in place” weighted seat and steering wheel feedback measures.

We know the CEO regularly promotes random evidence of people who promote him, including people who practice hands-off driving, and we should never be surprised his followers will do exactly what he promotes.

The CEO basically likes and shares marketing material made by Tesla drivers who do not pay attention, so he’s creating a movement of bad drivers who practice unsafe driving and ignore warnings. Wired very clearly showed how a 60 Minutes segment with the CEO promoted unsafe driving.

Even Elon Musk Abuses Tesla’s Autopilot. Musk’s ’60 Minutes’ interview risks making the public even more confused about how to safely use the semi-autonomous system.

We clearly see in his tweet response that he neither reiterates the safety measure claims, nor condemns or even acknowledges the well-known flaws in Tesla engineering.

Instead he tries to narrow the discussion down to just lines on the road. Don’t let him avoid a real safety issue here.

In June of 2019 a widely circulated video showed a Tesla operating with nobody in the driver seat.

…should be pretty damn easy to implement [prevention controls], and all the hardware to do so is already in the car. So why aren’t they doing that? That would keep dangerous bullshit like this from happening. Videos like this… should be a big fat wake-up call that these systems are way too easy to abuse… and sooner or later, wrecks will happen. These systems are not designed to be used like this; they can stop working at any time, for any number of reasons. They can make bad decisions that require a human to jump in to correct. They are not for this. I reached out to Tesla for comment, and they pointed me to the same thing they always say in these circumstances, which basically boils down to “don’t do this.”

In September of 2020 a widely circulated video showed people drinking in a Tesla at high speed with nobody in the driver seat.

This isn’t the first time blurrblake has posted reckless behavior with the Tesla…. He has another video up showing a teddy bear behind the wheel with a dude reclining in the front passenger seat.

Show me the CEO’s condemnation, or his call for regulation, when an owner puts a teddy bear behind the wheel in sheer mockery of Tesla’s negligent safety engineering.

In March of 2021 yet another story hit the news: teenagers in a Tesla, with nobody in the driver seat, ran into a police car.

That’s right, a Tesla crashed into a police car (reversing directly into it) after being stopped for driving on the wrong side of the road! Again, let me point out that police found nobody in the driver seat of a car that crashed into their police car. I didn’t find any CEO tweets about “lane line” or versions of Autopilot.

Why was a new Tesla driving on the wrong side of the road with nobody in the driver seat, let alone crashing into a police car with its safety lights flashing?

And in another case during March 2021, Tesla gave an owner ability to summon the car remotely. When they used the feature the Tesla nearly ran over a pregnant woman with a toddler. The tone-deaf official response to this incident was that someone should be in the driver seat (completely contradicting their own feature designed on the principle that nobody is in the car).

People sometimes point out how the CEO begs for regulation of AI, and talks about AI being bad if unregulated. Yet those same people never seem to criticize the CEO for failing to lift a finger himself to regulate and shut down these simple bad behaviors, right here, right now.

Regulation by others wouldn’t even be needed if Tesla would just engineer real and basic security.

The CEO for example calls seat belts an obviously good thing nobody should ever have delayed, but there’s ample evidence that he’s failing to put in today’s seat belt equivalent. Very mixed messaging. Seat belts are a restraint, reducing freedom of movement, and the CEO is claiming he believes in them while failing to restrain people.

There must be a reason the CEO avoids deploying better safety while also telling everyone it’s stupid to delay better safety.

Second, lines may be needed for Autopilot to turn on. Ok. Now explain whether Autopilot can continue without lines. More to the obvious point, does a line have to be seen for a second, or a minute? The CEO doesn’t make any of these detailed distinctions, while pretending to care about facts. In other words, if a line is erroneously detected then we must assume Autopilot is enabled. Done. His argument is cooked.

Third, what’s a line? WHAT IS A LINE? Come on people. You can’t take any statement from this CEO at face value. He is talking about faded, worn, vague, confused lines like there is some kind of universal definition, when Autopilot has no real idea of what a line is. Again his argument is cooked.

Sorry, but this is such an incredibly important point about the CEO’s deceptive methods as to require shouting again WHAT IS A LINE?

Anything can be read as a line if a system is dumb enough and Tesla has repeatedly been proven to have extremely dumb mistakes. It will see lines where there are none, and sometimes it doesn’t see double-yellow lines.

Fourth, the database logs can be wrong or corrupted, especially if they are being handled privately and opaquely to serve the CEO’s agenda. The statement was “logs recovered so far”. Such a statement is extremely poor form; why say anything at all?

The CEO is actively failing to turn data over to police to be validated and instead trying to curry favor with his loyalists by yelling partial truths and attacking journalists. Such behavior is extremely suspicious, as the CEO is withholding information while at the same time knowing full well that facts would be better stated by independent investigators.

Local police responded to the CEO tweets with “if he has already pulled the data, he hasn’t told us that.”

Why isn’t the CEO of Tesla working WITH investigators instead of trying to keep data secret and control the narrative, not to mention violate investigation protocols?

…the National Transportation Safety Board (NTSB), which removed Tesla as a party to an earlier investigation into a fatal crash in 2018 after the company made public details of the probe without authorisation.

The police meanwhile are saying what we know is very likely to be true.

We have witness statements from people that said they left to test drive the vehicle without a driver and to show the friend how it can drive itself.

Let’s not forget this CEO is also the same guy who in March 2020 tweeted the disinformation that “Kids are essentially immune” to COVID-19. Today we read stories showing the opposite.

…government data from Brazil suggest that over 800 children under age 9 have died of Covid-19, an expert estimates that the death toll is nearly three times higher…

Thousands of children dying from the pandemic after the Tesla CEO told the world to treat them as immune. Essentially immune? That’s double-speak again, like saying Autopilot is “in production” when it means highway-only, because urban is still in testing and beta is a month away.

Or double-speak like saying Autopilot makes every Tesla owner safer always, except in this one road or this one car because of some person.

Who trusts his data, his grasp of responsibility for words and his predictions?

Just as a quick reminder, this crash is the 28th for Tesla to be investigated by the NHTSA. And in 2013 when this Model S was released the CEO called it the safest car on the road. Since then as many as 16 deaths have been alleged to be during Autopilot.

Fifth, the location, timing (9:30 PM) and style of the accident suggest extremely rapid acceleration that didn’t turn to follow the road and instead went in a straight line into a tree.

This is consistent with someone trying to test/push extreme performance “capabilities” of the car (as promoted and encouraged by the CEO above and many times before), which everyone knows would include people trying to push Autopilot (as recorded by witnesses).

Remember those thoughts I asked you to hold all the way up at the top of this post? A reasonable person listening to “Autopilot is always on and much safer than human” and watching videos of “Don’t underestimate FSD” next to comments about blazing acceleration times… it pretty obviously adds up to Tesla creating this exact scenario.

Tesla owners dispute CEO claims

Some of this already has been explored by owners of Tesla vehicles, who started uploading proof of their cars operating on Autopilot with no lines on the road.

In one thread on Twitter the owner of a 2020 Model X with Autopilot and FSD Capability shares his findings, seemingly contradicting the Tesla CEO’s extremely rushed and brash statements.

@LyftGift Part One

7:55am, I returned to the parking lot to show you folks how the Autopilot engages with no lines marked on the road as @elonmusk claims is necessary. I engaged autopilot without an issue. I didn’t post this video right away, because I wanted to see how y’all would twist it.

@LyftGift Part Two

Show me a line. Any line. Show me a speed limit sign. Any sign.

@LyftGift then reiterates again with a screenshot: “At 2:15 both icons are activated. Cruise and AP” with no lines on the road.

Something worth noting here, because the tiny details sometimes matter, is the kind of incongruity in Tesla vehicle features.

The CEO is saying the base Autopilot without FSD shouldn’t activate without lines, yet @LyftGift gives us two important counter-points.

We see someone not only upload proof of Autopilot engaging without lines, but do it in a 2020 Model X Performance with free unlimited premium connectivity.

An eagle-eyed observer (as I said these details tend to matter, if not confuse everything) asked how that configuration is possible given Tesla officially discontinued it in mid-2018.

@LyftGift replies “Tesla hooked me up”.

So let’s all admit for the sake of honesty here, since Tesla bends its rules arbitrarily to say what is or is not available on a car, it is really hard to trust anything the CEO might say he knows or believes about any car.

Was it base Autopilot, or is he just saying that because he hasn’t found out “yet”, in his extremely early announcements, that someone at Tesla “hooked” a modification for the owner and didn’t report it?

22 Hammock Dunes Place

Maps show that the empty wooded lot where the car exploded had desolate, simple lanes, near a golf club, where the roads were in perfect condition. The only complication seems to be the roads are constantly curved.

The car allegedly only went several hundred yards on one “S” curve and lost control, before exploding on impact with a tree. The short narrow path and turn suggests rapid acceleration that we’ve read about in other fatal Tesla crash and burn reports.

I would guess the Tesla owners thought they had chosen a particularly safe place to do some extreme Autopilot testing to show off the car.

Apple satellite imagery looks like this:

Google StreetView shows these areas aren’t being mapped, which honestly says to me traffic is very low including police and thus a prime area for vehicle stunts:

Zillow offers a rather spooky red arrow in their description of the lot, also pointing roughly to where the burning car was found.

And I see lines, do you see lines?

How about in this view? Do you see lines plausibly indicating the side of a road?

Ok, now this will surely blow your mind. The men who allegedly told others they were going to show off the Autopilot capability on this road were driving at night.

Look closely at the yellow light reflecting on this curve of the road like a yellow… wait for it… line!

Emergency services personnel stand near the site of the Tesla vehicle crash in Spring, Texas, on April 17, 2021. PHOTO: REUTERS

Fighting the Fire

The Houston Chronicle quotes the firefighters in self-contradictory statements, which is actually kind of important to the investigation.

With respect to the fire fight, unfortunately, those rumors grew way out of control. It did not take us four hours to put out the blaze. Our guys got there and put down the fire within two to three minutes, enough to see the vehicle had occupants

This suggests firefighters had a very good idea of where the passengers were in the vehicle and how they were impacted, when everyone was reporting nobody in the driver seat.

The firefighter then goes on to say fighting the fire took several hours after all, but the technical description means it wasn’t live flames, just the ongoing possibility of live flames. Indeed, other Teslas have reignited after crashes multiple times over several hours, if not longer.

Buck said what is termed in the firefighting profession as “final extinguishment” of the vehicle — a 2019 Tesla — took several hours, but that classification does not mean the vehicle was out-of-control or had live flames.

And then a little bit later…

…every once in a while, the (battery) reaction would flame.

It wasn’t on fire for more than three minutes. It could have reignited so we were on it for several hours. It was reigniting every once in a while.

So to be clear, the car was a serious fire hazard for hours yet burned intensely only for minutes. Technically it did burn for hours (much like an ember burns even when no flames are present), although the firefighters prefer to say it was a controlled burn.

Conclusion

Tesla is a scam.

As I’ve posted on this blog before:

Tesla, without a question, has a way higher incidence of fire deaths than other cars.

There already are many twists to this new story (pun not intended) because the CEO of Tesla is peddling disinformation and misleading people — claiming Autopilot is always there and will save the world until it doesn’t and then backpedaling to “there was no Autopilot” and tightly controlling all the messaging and data.

Seems to fit the bill for gaslighting. Autopilot is both on always but off, as the car is to be safest yet smashed into a tree and on fire for minutes and burning for hours.

Tesla’s production highway tested beta manual autopilot using passive active safety literally couldn’t prevent hitting a tree as advertised.

Even when you’re driving manually, Autopilot is [A TOTAL SCAM] looking out for [NOTHING]