Failed White Ethnostate Was the Blueprint for Twitter Takeover

There’s a predictable path from Tesla’s killing machines to Twitter’s destruction, one I warned about in 2016. That’s why crucial historical context is missing from this late-to-the-party Atlantic article about Twitter’s transformation into an authoritarian platform. Here’s its seemingly provocative headline:

Musk’s Twitter Is the Blueprint for a MAGA Government: Fire everyone. Turn it into a personal political weapon. Let chaos reign.

Except, these warning signs were visible long before Twitter’s acquisition. In 2016 I presented a BSidesLV Keynote called “Great Disasters of Machine Learning,” analyzing how automated systems become tools of authoritarian control. The patterns were already clear in Tesla’s operations, showing striking parallels to historical examples of technological authoritarianism.

The Lesson of Rhodesia

Consider the history of a self-governing British colony that became an unrecognized state in southern Africa (now Zimbabwe), a legacy that quietly animates many online trolls today. Rhodesia’s abrupt collapse stemmed from an elitist minority rule that systematically disenfranchised the majority population based on race. When Ian Smith’s government unilaterally declared independence in 1965, it was presented as a “necessary” administrative action to maintain white “order” and white “efficiency” and prevent societal decay.

Sound familiar? As the Atlantic notes:

Musk’s argument for gutting Twitter was that the company was so overstaffed that it was running out of money and had only “four months to live.” Musk cut so close to the bone that there were genuine concerns among employees I spoke with at the time that the site might crash during big news events, or fall into a state of disrepair.

“Authorimation” Pattern Called Out in 2016

Great Disasters of Machine Learning: Predicting Titanic Events in Our Oceans of Math

My keynote presentation at the Las Vegas security conference highlighted three key warning signs that predicted this slide towards tech authoritarianism:

  1. Hiding and Rebranding Failures: Tesla’s nine-day delay in reporting a fatal autopilot crash—while vehicle parts were still being recovered weeks later—demonstrated how authoritarian systems conceal their failures. As the Atlantic observes about Twitter/X:

    Small-scale disruptions aside, the site has mostly functioned during elections, World Cups, Super Bowls, and world-historic news events. But Musk’s cuts have not spared the platform from deep financial hardship.

  2. Automated Unaccountability: I coined the term “authorimation” – authority through automation – to describe how tech platforms avoid accountability while maintaining control. The Atlantic notes this pattern continuing:

    Their silence on Musk’s clear bias coupled with their admiration for his activism suggest that what they really value is the way that Musk was able to seize a popular communication platform and turn it into something that they can control and wield against their political enemies.

  3. Technology as a Mask for Political Control: Just as Rhodesia’s government used administrative language to mask apartheid, today’s tech authoritarians use technical jargon to obscure power grabs. The Atlantic highlights this in Ramaswamy’s proposal:

    Ramaswamy was talking with Ezra Klein about the potential for tens of thousands of government workers to lose their job should Donald Trump be reelected. This would be a healthy development, he argued.

The “Killing Machine” Warning

My 2016 “killing machine” warning wasn’t just about Tesla’s vehicle safety—it revealed how automated systems amplify power imbalances while their operators deny responsibility. Back then, discussing Tesla’s risks made people deeply uncomfortable, even as Musk himself repeatedly boasted that “people will die,” wearing the phrase as a badge of honor.

Claims of “90% accuracy” in ML systems masked devastating failures, just as today’s “necessary” cuts conceal the systematic dismantling of democratic institutions. Musk reframed these failures as stepping stones toward his deceptively branded “Mars Technocracy” or “Occupy Mars”—a white nationalist state in technological disguise.

As the Atlantic concludes:

Trump, however, has made no effort to disguise the vindictive goals of his next administration and how he plans, in the words of the New York Times columnist Jamelle Bouie, to “merge the office of the presidency with himself” and “rebuild it as an instrument of his will, wielded for his friends and against his enemies.”

Rhodesia’s fifteen-year “bush war” wasn’t a business failure any more than Twitter’s transformation is about efficiency. Labeling either as a mere administrative or business challenge obscures the truth: these are calculated attempts to exploit unregulated technology, creating bureaucratic loopholes that enable authoritarian control while denying the human costs.

Trust and Digital Ethics

Dismissing Twitter as a business failure echoes attempts to frame IKEA’s slave labor as simply an aggressive low-cost furniture strategy.

While it’s encouraging to see digital ethics finally entering mainstream discourse, some of us flagged these dangers the moment Musk first eyed Twitter, years after his “driverless” fraud claimed its first lives in 2016 yet was cruelly allowed to keep killing.

The more Teslas, the more tragic deaths, unlike any other car brand. Without fraud, there would be no Tesla. Source: Tesladeaths.com

Now, finally, others are recognizing the national security threats lurking within “unicorn” technology companies funded by foreign adversaries (one reason I deleted my Facebook account in 2009). A stark warning about “big data” safety that I presented as “The Fourth V” at BSidesLV in 2012 has come true in the worst ways.

2024 U.S. Presidential election headlines indicate that major integrity breaches in online platforms have been facilitating a rise of dangerous extremism

What have I presented more recently? I just met with a professor of war history to discuss why Tesla’s CEO accepts billions from Russia while amassing thousands of VBIED drones near Berlin. Perhaps academia will finally formalize the public safety warnings that some of us deep within the industry have raised for at least a decade.
