Bottom Line on Kohler Toilet E2EE Claims

A security researcher, supposedly exposing privacy risk in a Kohler networked toilet, didn’t get to the bottom of anything.

But seriously, what is this crap?

The initial issue with Kohler using the term “end-to-end encryption” is that it’s not obvious how it could apply to their product. The term is generally used for applications that allow some kind of communication between users, and Kohler Health doesn’t have any user-to-user sharing features. So while one “end” would be the user, it’s not clear what the other end would be.
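To make the terminology concrete, here is a minimal sketch of what E2EE conventionally means, using the PyNaCl library and hypothetical users: only the two user endpoints hold private keys, and whatever relays the ciphertext cannot read it. If a vendor holds one of the keypairs, the vendor is an “end,” and that is a very different promise.

```python
# Minimal E2EE sketch using PyNaCl (illustrative only; actor names hypothetical).
# In genuine E2EE, only the two user endpoints hold private keys; the relay
# (server) sees ciphertext it cannot decrypt.
from nacl.public import PrivateKey, Box

# Each user endpoint generates its own keypair and keeps the private half.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts to Bob: one "end" to the other "end".
sending_box = Box(alice, bob.public_key)
ciphertext = sending_box.encrypt(b"only Bob can read this")

# A server merely relays `ciphertext`; without a private key it sees noise.

# Bob, the other end, decrypts with his own private key.
receiving_box = Box(bob, alice.public_key)
assert receiving_box.decrypt(ciphertext) == b"only Bob can read this"

# Contrast: if the *vendor* generates and holds one of these keypairs, the
# vendor is an "end" and can decrypt everything. That may still be useful
# transport protection, but calling it E2EE stretches the term.
```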

The researcher takes issue with Kohler’s use of E2EE, a term whose meaning is already compromised, pretends it has a pure canonical definition, and then catches Kohler failing to meet that fictional standard.

That’s definitional sleight of hand, and to what end, exactly?

WhatsApp is a Facebook product that falsely claims E2EE. When they say “we can’t read your messages,” they actually mean they can read your messages when your contact taps a Report button, and they harvest all the metadata, and cloud backups may be accessible, and….

The researcher even cites WhatsApp as an example of E2EE. That’s like saying Exxon is an example of how to protect the environment, or that Marlboro is an example of healthy living.

At least Kohler is plain and honest about being one end of its own encryption. They say they will use the data, and they say why. What the hell are huge warehouses of WhatsApp staff doing with all the data they harvest from bogus E2EE, which apparently fooled even this researcher into promoting it?

Talk about burying the lede: if you want to hunt vulnerabilities, the Kohler AI training angle is actually interesting research! That’s where you could say it’s behind in the privacy department.

What happens when the de-identified stool image datasets get breached or sold? What’s the actual re-identification risk? What are the clinical validation standards for the insights they’re selling?

Instead we got “users at a company who use your data can access your data.”

No shit.

A real security/privacy analysis of the back-end architecture was available, and the researcher chose definitional games instead. I mean, if you want to hate on Kohler, there is plenty to dislike without cooking up encryption semantics.

The subscription model is $600 for hardware that becomes a brick if you stop paying or they shut down. That’s the enshittification lifecycle applied to your actual toilet.

De-identification is hard, and these are distinctive dumps. Stool images are biometric-adjacent. The claim that de-identified toilet photos can’t be re-identified is… doubtful.
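As a toy illustration of why “de-identified” deserves skepticism, consider a basic linkage attack. Strip the names but keep the timestamps, and anyone holding an auxiliary dataset can often join records straight back to people. The field names below are hypothetical, sketching the technique rather than any actual Kohler schema.

```python
# Toy linkage attack: "de-identified" records re-identified by joining on a
# quasi-identifier (here, a coarse timestamp). All field names are hypothetical.

# Vendor's "anonymized" dataset: no names, but timestamps survive.
deidentified = [
    {"record_id": "a1", "scan_hour": "2025-03-02T07"},
    {"record_id": "a2", "scan_hour": "2025-03-02T22"},
]

# Auxiliary data an attacker might hold: smart-home logs, ISP records, a
# breached fitness app... anything tying a person to a time and place.
auxiliary = [
    {"person": "resident@example.com", "bathroom_motion_hour": "2025-03-02T07"},
]

# Join on the shared quasi-identifier.
aux_by_hour = {row["bathroom_motion_hour"]: row["person"] for row in auxiliary}
for record in deidentified:
    person = aux_by_hour.get(record["scan_hour"])
    if person:
        print(f"record {record['record_id']} likely belongs to {person}")

# One overlapping column and the "de-identification" evaporates. Distinctive,
# biometric-adjacent images only make the linkage easier.
```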

The gut health insight market is largely unvalidated. What evidence-based intervention follows from the data? “Your stool is different today” brings what actionable change beyond what you can already detect naturally? It’s quantified self for a process that mostly works fine without surveillance.

Attack surface expansion. Your toilet worked fine before. Now it’s a networked sensor with dependencies, firmware updates, and an app that needs permissions. Every connected device adds more liability; this one points at you with your pants down.

Subscription healthtech has misaligned incentives. They need you anxious enough to keep paying but not so alarmed you see a real doctor. That’s a weird optimization target to sit on.

And so forth… as I’ve said on this blog about “log” data in wastewater for at least ten years, if not more.

Log analysis for wastewater plants

AZ Tesla in “Veered” Crash Head-on Into Dump Truck

The report makes it clear: Tesla caused a sudden “veered” acceleration event, like a suicide attack.

Based on the initial investigation, the driver of the Tesla was originally traveling eastbound on Cactus Road before veering left and driving the wrong way into the westbound lanes of Cactus Road, where it collided head-on with the dump truck in the curb lane.

The video makes it even clearer.

Data Integrity Breach Kills Seven, Injures 700: Abbott Discloses Failed Diabetes Monitor

Breaches of data privacy are still far more documented and discussed than breaches of data integrity, but one of the clearest integrity cases yet has been posted:

Certain glucose monitors from Abbott Diabetes Care are providing users with incorrect glucose readings, an error that has been linked with the deaths of at least seven people and more than 700 serious injuries worldwide, according to an alert from the US Food and Drug Administration.
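To make the integrity-breach framing concrete, here is a minimal sketch of defensive validation on sensor readings, with hypothetical thresholds and obviously not Abbott’s actual firmware. The idea is to flag out-of-range or implausibly fast-moving values before they reach the user as truth.

```python
# Minimal sketch of integrity checks on sensor readings (hypothetical pipeline
# and thresholds, not Abbott's actual firmware). The point: reject or flag
# implausible values instead of displaying them as fact.

PHYSIOLOGICAL_RANGE = (20, 600)  # mg/dL bounds beyond which a CGM reading is suspect
MAX_DELTA_PER_MINUTE = 10        # glucose rarely moves more than a few mg/dL per minute

def validate_reading(value_mg_dl, previous_value, minutes_elapsed):
    """Return (ok, reason). Flag values out of range or changing too fast."""
    low, high = PHYSIOLOGICAL_RANGE
    if not (low <= value_mg_dl <= high):
        return False, f"out of physiological range: {value_mg_dl} mg/dL"
    if previous_value is not None and minutes_elapsed > 0:
        delta = abs(value_mg_dl - previous_value) / minutes_elapsed
        if delta > MAX_DELTA_PER_MINUTE:
            return False, f"implausible rate of change: {delta:.1f} mg/dL/min"
    return True, "ok"

print(validate_reading(110, previous_value=105, minutes_elapsed=5))  # (True, 'ok')
print(validate_reading(700, previous_value=105, minutes_elapsed=5))  # flagged: out of range
print(validate_reading(300, previous_value=100, minutes_elapsed=5))  # flagged: 40 mg/dL/min
```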

This calls for integrity breach regulation. If Facebook, let alone OpenAI, were forced to disclose their integrity breaches (inaccurate data causing harm), it would significantly improve public safety.