Brand New Teslas Have an Explosion of Hardware Failures

The critical safety trend is clear from 232 documented Tesla fires causing 83 fatalities, with patterns suggesting systemic issues are growing worse over time rather than improving. We have already seen a 2025 model Tesla abruptly kill its owner.

The latest reports of brand new vehicles experiencing critical computer failures within just tens of miles of delivery, affecting core safety features like cameras and active safety systems, indicate Tesla is now shipping cars with fundamental flaws that create immediate safety risks.

Source: tesla-fire.com

The progression being charted is troubling, given the unusually high death rate from design and manufacturing defects:

  • Early incidents (2013-2016) often involved battery punctures or severe crashes
  • Mid-period (2017-2020) saw increasing cases of spontaneous combustion in parked vehicles
  • Recent period (2021-2024) shows acceleration of incidents plus new failure modes like:
    1. Computer systems failing immediately after delivery
    2. Multiple instances of fires spreading to buildings
    3. Cases requiring massive water resources to extinguish
    4. Batteries reigniting days later
    5. Higher fatality rates per incident (a back-of-envelope aggregate rate follows below)
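
For scale on that last point: the only hard numbers cited above are the aggregate totals, 232 documented fires and 83 fatalities, which already work out to roughly one death per three fires. A trivial check anyone can reproduce (the cited data publishes no per-period breakdown here, so this is aggregate only):

```python
# Aggregate fatality rate from the figures cited above (tesla-fire.com):
# 232 documented fires, 83 fatalities. No per-period split appears in the
# excerpt, so only the overall rate is computed here.

fires = 232
fatalities = 83

print(f"{fatalities / fires:.2f} deaths per documented fire")  # -> 0.36
```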

Despite Tesla’s plummeting sales, such as a nearly 20% decline in California amid growing safety concerns, some consumers continue to purchase these vehicles, echoing the puzzling consumer behavior once seen with dangerous products like lawn darts.

Chart: Michael Thomas. Source: CA New Dealers Association

The company’s mounting inventory of unsold cars and ongoing failures in basic quality, particularly regarding fire risks, make the continued demand difficult to comprehend. The unfortunate market failure is highlighted by sad stories like the following one, which documents Tesla fans’ growing fears that they may be the next to die in a fire.

Tesla drivers are reporting computer failures after driving off with their brand-new cars over just the first few tens to hundreds of miles. Wide-ranging features powered by the computer, like active safety features, cameras, and even GPS, navigation, and range estimations, fail to work. …the broken rear-view camera goes against federal safety regulations, which should force a recall. At the moment, the main remedy being discussed is a computer replacement…. Tesla service is currently being overwhelmed by the issue, and Tesla is pushing service appointments to next year.

Tens of miles.

Presumably an official Tesla response will soon be released saying that people who want their cars to work shouldn’t sign papers that declare Elon Musk their lord and savior who can never be exposed or challenged.

Elon Musk Doesn’t Want Tesla to Have to Do Car-Crash Reporting…

A few years ago we noted Tesla engineering was so poor the cars were unlikely to last past 10,000 miles, as evidenced by reports from junkyards.

Now the brand is down to just tens of miles before the new owners flood service centers complaining of a critical safety failure.

Widespread computer failures in new Tesla deliveries suggest quality control has deteriorated to dangerous levels. Rather than addressing known fire risks, Tesla appears to be engaged in a coverup, shipping vehicles with compounding safety problems, from battery fires to critical system failures that render safety features inoperable within the first few miles of operation.

It’s not hard to imagine why a management culture that silences critics leads directly to products full of dangerous flaws, products that keep worsening over time as fewer and fewer critics are allowed to remain.

Its vehicles accounted for 21% of all U.S. recalls in the first three quarters of the year, according to recall management firm BizzyCar. […] Tesla recalled 1,858,774 vehicles in the September quarter, the highest in the U.S.

When car quality declines to the point where brand new vehicles can’t complete their delivery drives without disabling critical safety systems, it indicates severe manufacturing and engineering problems are being ignored rather than fixed. The fire incident data shows this pattern of allowing known safety issues to expand rather than investing in solutions.

Truly, and so painfully obviously, Tesla is the worst in manufacturing history. Imagine if, after the Hindenburg burned up, the response had been “let’s make a million more of these disasters and prevent anyone from reporting!” That’s Tesla.

My 2024 LSE Commencement Speech

The Dogs of Cyberwar
A Lowly Hacker’s Warning
LSE Commencement 2024

Distinguished faculty, dear students, and those venture capitalists or intelligence agencies inevitably lurking in the back hoping to recruit our graduates into their latest ethical catastrophe:

Thirty years ago, I sat where you’re sitting now, though with considerably fewer people and a significantly more embarrassing haircut. Back then, I was the American oddity who lived day and night in the computer lab while my half-dozen classmates assembled in the Three Tuns, competing to see whose understanding of the Cuban Missile Crisis would solve all of humanity’s problems over another round of pints that cost a pound twenty each.

I spent considerable effort to get LSE on something new called the “World Wide Web” – a phrase that now sounds as charmingly dated as “information superhighway” or “freeze dried coffee”, which was by the way the only coffee you could find in London in 1993. Can you imagine a young American hacker stepping off a plane in London and realizing only too late he was expected to drink tea and write with a pen?

I almost immediately died from caffeine and keyboard withdrawal.

To keep calm and carry on I volunteered to write code helping a blind PhD student of political philosophy digitize his dozens of books into robotic speech. He taught me more about seeing the world clearly (page breaks, I learned, don’t really exist in the mind’s eye) than I ever taught him about data integrity flaws in OCR algorithms. However, I did save him from accidentally submitting his thesis with hidden instances of the letter S replaced by the number five — a substitution that in retrospect could have meant his analysis of the Hobbe55tate of Nature would be credited as the invention of modern passwords.
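
For the curious: the integrity check was conceptually simple. Here is a minimal modern sketch of the idea in Python (purely illustrative; whatever I wrote in 1993 looked nothing like this), flagging digit-for-letter substitutions that OCR tends to smuggle into text:

```python
import re

# Common OCR confusions: digits that masquerade as letters.
# This mapping is illustrative, not exhaustive.
CONFUSIONS = {"5": "S", "0": "O", "1": "l", "8": "B"}

def flag_ocr_substitutions(text):
    """Yield (word, suggestion) for words where a digit is embedded
    in otherwise alphabetic text, e.g. '5tate' -> 'State'."""
    for word in re.findall(r"\b\w+\b", text):
        # A digit adjacent to a letter inside one word is rarely intentional prose.
        if re.search(r"[A-Za-z][0-9]|[0-9][A-Za-z]", word):
            suggestion = "".join(CONFUSIONS.get(ch, ch) for ch in word)
            yield word, suggestion

for found, fix in flag_ocr_substitutions("Hobbe5 5tate of Nature"):
    print(f"suspicious: {found!r} -> suggest {fix!r}")
# suspicious: 'Hobbe5' -> suggest 'HobbeS'
# suspicious: '5tate' -> suggest 'State'
```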

As you might guess, I arrived here still a raw and immature sod raised on the dirt roads of rural America. When an LSE student repeatedly left their World War I essay about military vulnerability completely exposed on one of our four shared lab computers, the irony proved as irresistible as… relieving myself on a hidden electric fence back home. A risky temptation that I really should have resisted. After watching the pattern repeat daily with the stubborn predictability of a BBC weather forecast, I did what any country bumpkin would do facing an open barn door: I scattered pointed commentary about undefended positions throughout their work. Professor Stevenson, to my great relief, marked every single edit with a bright red circle, proving he dutifully read each word that we turned in — which is more than I could say for my fellow student about their own work.

Little did I know this kind of penchant for exposing vulnerability would become the perfect metaphor later in my career. Professor Kent, the best advisor anyone could ever ask for, encouraged me upon graduation to throw myself straight into the tech industry. And so I did. For three decades I’ve helped big and small organizations see vulnerabilities, in order to keep them grounded, to make their grandiose claims about safety less full of the stuff we knew in Kansas as meadow muffins. Or prairie pancakes.

As I learned quickly as a kid, one taste and you immediately knew it’s a good thing you didn’t step in it.

From LSE’s tiny computer lab up into the largest corporate skyscraper boardrooms, I have spilled gallons of red ink around tens of thousands of undefended positions. Although now the stakes are rather higher than a student’s marked-down essay, and the giant hidden electric fences are incredibly more… well, shocking.

Let me explain.

Take Palantir, named without a trace of irony after Tolkien’s all-seeing stones that invariably corrupt those who use them. They pitched venture capitalists a “revolutionary” surveillance system to “predict and prevent terrorism.” Of course you can imagine how VCs’ eyes lit up with dollar signs, presumably the same way medieval merchants’ eyes lit up with doubloons at the prospect of selling torture devices to the Spanish Inquisition. “Think of the market opportunity!” they must have said. “Every Queen Isabel will want one!” Did you know studies today show that Palantir actually created the terrorists they promised to predict and prevent? The self-licking ISIS cream cone is real.

Similarly, Tesla’s ‘Autopilot’ promised to end traffic deaths, then proceeded to invent entirely new ways for cars to kill people. It has achieved the remarkable distinction of making the Ford Pinto look like a triumph of safety engineering. Who needs a faulty gas tank when you have AI that can find entirely new ways to turn cars into crematoriums? Henry Ford may have won the Third Reich’s highest honor, but at least he didn’t try to rebrand his Dearborn Independent newspaper with a hakenkreuz and call himself Twittler.

You might think I’m being too glib about death or unfair to visionaries. “Surely,” you say, “their companies must have some redeeming qualities, like what about South Africans dreaming of turning Mars into New Rhodesia?” Well yes, I suppose in the same way the East India Company really streamlined the tea trade. Have you seen the grand old counting house? I noticed the gift shop doesn’t mention how they balanced their moral ledger. The problem isn’t that the technology being assembled is unimpressive — it’s measuring who really pays for it.

Which brings me to why your, and my, LSE education matters more than ever. You see, the world is perhaps being affected by Silicon Valley today in the same way that Dresden was fire-bombed by some pioneering Palo Alto radar engineers. Tech desperately needs people who can spot the rather subtle differences between innovations and repackaged historical tragedies. It needs people who, when presented with a “revolutionary” surveillance system called Bluesky, can say, “Ah yes, this is exactly like the Stasi, but with better UX design.”

You’ve been trained to see patterns that even the most brilliant engineers miss – not because they lack intelligence, but because they’ve never had to explain to Dr. Preston why Franco’s “move fast and break things” wasn’t about innovations in Jerry cans. You understand that every “disruption” has a history, every “innovation” a context, and every hot-rushed philosophy eventually breaks something rather important – like democracy, or human rights, or that quaint notion that public transit shouldn’t spontaneously combust.

Let me give you a current example. Are you aware of the thousands of networked autonomous vehicles quietly amassing at a former Cold War airfield outside Berlin? The press has cheered deforestation around the German capital as “Tesla’s biggest European output.” With your training, you might recognize this as rather like how France celebrated the Maginot Line as their biggest investment in concrete. We’re staring down the barrel of a cybernetic equivalent to Chekhov’s gun. Thousands of hackable vehicles in Act One are going to cause chaos by Act Three.

You’re entering a world where technology companies have more power than most nations, yet demonstrate all the ethical sophistication of a first-year philosophy student discovering moral relativism. They need people who can see through the Silicon Valley doublespeak, who understand that “making the world a better place” often means “making ourselves richer at everyone else’s expense.”

When I left LSE directly for California, with only $50 and dried coffee crystals in my pockets, I thought I was leaving behind the rigorous historical thinking this institution taught me. Instead, I found it was my most valuable skill. While engineers around me focused on rapid valuation from throwaway ideas, I was trained to ask whether they should. And more importantly, I was trained to recognize when “unprecedented” innovation was actually a very precedented bad idea in a shiny new package.

At one point I sat in charge of software release gates that affected two billion users, navigating the dawn of modern mobile phones and gaming consoles. With an official title of “dedicated paranoid” I wore a t-shirt that simply said “why?” It turned out to be the most important question in Silicon Valley, though one that got me uninvited from a surprising number of launch events. Venture capitalists, I learned, prefer historical parallels to stop at the Wright brothers and skip the Hindenburg.

So, Class of 2024, as you leave these strangely sunny and bright, airy halls that I somehow remember as windowless and always wet from rain, please know that your historical training isn’t just about understanding mistakes in the past. It’s about recognizing when someone tries to repeat them while hoping nobody realizes. In a world where tech companies are speedrunning through every bad idea of the 20th century, we desperately need people who can find the causes of things to avoid every AI implementation becoming a case study in our successors’ dissertations.

You have been trained to see through a growing fog of cyberwar, whether rising from hundreds of thousands of burning Model 3s attacking European cities or from disinformation spread by social media tycoons about their robots. Use your training in clarity of vision to improve society. The world needs your sharp tongues and sharper minds.

And to those venture capitalists in the back: yes, our graduates are available for hire. But I should warn you – they’ve been trained to spot patterns. Your term sheets look remarkably like Victorian labor contracts, just with time measured by TikToks.

Thank you, and congratulations.


Swasticars: Remote-controlled high-explosive vehicles stockpiled by Twittler outside Berlin.

FL Tesla Kills One Motorcyclist

Many motorcyclists have died from sudden impact with the rear of a Tesla. It’s another example of a Tesla tragedy that calls into question the capability of the company’s sensor systems to react intelligently to common traffic.

According to the Florida Highway Patrol, the 47-year-old man was driving a Harley Davidson motorcycle northbound in the outside lane of Interstate 275 just north of milepost 43. Around 2:40 p.m., police said the motorcyclist drove across the [triangular lane merge buffer] gore at an entrance ramp and collided with the back of a Tesla Model 3 that was traveling next to it.

While crossing a gore area is technically improper, it’s a relatively common quick-exit maneuver made by motorcyclists, especially on Harleys, who want to avoid dangerous merging through drivers’ blind spots. Little do they expect a Tesla to react in a dangerous and unpredictably inhuman way. The key expert questions here are listed below, with a sketch after the list of how an investigator might begin to test them:

1. Did Tesla’s systems detect the approaching motorcycle?

2. If so, did the car initiate sudden braking in response?

3. If there was sudden braking, was it an appropriate response or a dangerous overreaction by Tesla’s systems causing a deadly crash?
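
On question 2, the practical first step is to pull the event data recorder (EDR) speed trace and look for an abrupt deceleration spike in the seconds before impact. Here is a minimal sketch of that check; the (time, speed) sample format and the 0.7 g threshold are my own illustrative assumptions, not Tesla’s actual EDR layout or any regulatory standard:

```python
# Sketch: flag sudden-braking events in a pre-crash speed trace.
# ASSUMPTIONS: (time_s, speed_mps) samples at roughly regular intervals;
# deceleration beyond ~0.7 g between samples counts as "hard braking".
# Real EDR data, sampling rates, and thresholds would differ.

G = 9.81  # m/s^2

def hard_braking_events(trace, threshold_g=0.7):
    """Return (time, deceleration_in_g) wherever deceleration between
    consecutive samples exceeds the threshold."""
    events = []
    for (t0, v0), (t1, v1) in zip(trace, trace[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip out-of-order or duplicate timestamps
        decel_g = (v0 - v1) / dt / G
        if decel_g > threshold_g:
            events.append((t1, round(decel_g, 2)))
    return events

# Hypothetical trace: cruising near 29 m/s, then an abrupt drop.
trace = [(0.0, 29.1), (0.5, 29.0), (1.0, 28.9), (1.5, 24.5), (2.0, 19.8)]
print(hard_braking_events(trace))  # -> [(1.5, 0.9), (2.0, 0.96)]
```

A deceleration spike with no corresponding obstacle in the sensor log would point toward question 3’s “dangerous overreaction” scenario; no spike at all would point back to question 1.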