Americans tend to put a lot of emphasis on popularity and wealth, to the point where integrity is completely ignored. Here’s a notable exception. A new ProPublica article reports that the popular and wealthy Montana oncologist Thomas Weiner just had his medical license revoked after a years-long investigation into patient deaths.
Unfortunately, the journalists who caught him have framed the story as a rogue doctor, a system that eventually corrected itself, and accountability finally delivered.
The framing is wrong.
He wasn’t rogue; he was the system.
Look at the DOJ lawsuit against Weiner, for example: it doesn’t allege mistakes. It alleges prescribing needless treatments, double billing, seeing patients more often than necessary, and upcoding. None of that is technically charged as malpractice. It is a system of revenue extraction in which patients were inventory. The lawsuit, in effect, asserts a standard of integrity and documents how it was breached.
Consider Scot Warwick, diagnosed with Stage 4 lung cancer in 2009. For eleven years, Weiner subjected him to chemotherapy, until he died. The autopsy found he never had cancer. The medical board confirmed that eleven years of expensive, unnecessary chemotherapy is what killed him.
Eleven years of billing events. Eleven years of being told he had a fast-moving terminal cancer that never existed.
Staying alive for more than three months became a cognitive trap: it convinced him, year after year, that a genius was curing him, when he was actually being poisoned.
Closed Loop Abuse
Weiner became the highest-paid doctor at St. Peter’s Health, and that was undoubtedly how he wielded power over more ethical colleagues. Tens of millions of dollars are much easier to generate when revenue is decoupled from the reality of actual medicine. Lesser earners couldn’t compete with his false narratives. He drove out hospital leaders who questioned his judgment. Colleagues feared him because of his wealth. The medical board, surely mindful of the revenue stream, renewed his license three times after receiving thousands of pages of hospital documentation alleging malpractice.
Judged by its own design, the system didn’t fail.
The system apparently measured only a very narrow, market-based set of outcomes. The patients who were paying to be poisoned by their doctor were cynically reclassified as survivors happy to pay to be alive.
Healthcare accounting controls can audit codes for financial consistency. Did the bill match the procedure? Did the procedure match the documentation? In this case, Weiner’s documentation was the fraud. He wrote false diagnoses into medical records.
The billing system accurately processed a doctor who lied. The accountants had no controls to stop the fraud, since all they could verify was that cancer treatments were properly coded as cancer treatments.
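To make the gap concrete, here is a minimal sketch of that closed-loop audit, assuming hypothetical field names (diagnosis_code, procedure_code, billed_code) rather than any real EHR or claims schema. Every field agrees with every other field, because one person authored all of them, so the check passes.

```python
# Minimal sketch of an internal-consistency audit: bill <-> procedure <-> documentation.
# Field names and the code mapping are hypothetical, for illustration only.

# Toy mapping: which procedures look "consistent" with which documented diagnoses.
CONSISTENT = {
    "C34.90": {"chemotherapy"},   # documented lung cancer makes chemo look consistent
}

def internal_consistency_audit(chart):
    """Checks that the bill matches the procedure and the procedure matches the chart."""
    diagnosis = chart["diagnosis_code"]
    procedure = chart["procedure_code"]
    billed = chart["billed_code"]
    return (
        procedure in CONSISTENT.get(diagnosis, set())  # procedure matches documentation
        and billed == procedure                        # bill matches procedure
    )

# A fabricated chart: the diagnosis is false, but every field agrees with
# every other field, because the same person authored all of them.
fabricated_chart = {
    "diagnosis_code": "C34.90",      # documented lung cancer (never existed)
    "procedure_code": "chemotherapy",
    "billed_code": "chemotherapy",
}

print(internal_consistency_audit(fabricated_chart))  # True -- the audit passes
```

The audit verifies internal consistency and nothing else; a fabricated chart is indistinguishable from a true one.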
Using revenue as the integrity check is therefore the obvious mistake. A motto of “if they pay it’s ok” undermines the entire concept of measuring outcomes in healthcare terms. The big money kept flowing, and it did nothing to stop an extreme reversal from care into harm.
What Catches What
Consider the asymmetry: if Weiner had committed an old-fashioned privacy breach, stealing his patients’ credit cards or selling their data, 1990s-era HIPAA enforcement would have activated immediately. Privacy has an architecture: technical controls, mandatory reporting, statutory penalties, institutional liability.
But he breached integrity instead. The doctor fabricated patient data. He created documentary evidence of cancers that nobody had. And for that, American medicine showed up empty-handed and a day late. Patients were dying, and there was no equivalent tripwire to trip.
Privacy is enforced to stop breaches.
Integrity is… dead.
The False Claims Act (FCA) is the accounting control that finally caught what the medical board wouldn’t. The FCA doesn’t ask “was this good medicine?” It asks “did you bill the government for things that didn’t happen?” Federal fraud investigators asked whether charts matched reality. That exposed a critical failure: the medical profession’s own accountability systems never asked that question.
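Here is the same toy chart run through an FCA-style question instead, sketched with a hypothetical pathology_findings source standing in for any evidence the biller doesn’t control.

```python
# Minimal sketch of an external reconciliation: did what was billed actually happen?
# The pathology_findings source is hypothetical, standing in for any evidence
# not authored by the person who submitted the claim.

def external_reconciliation(chart, pathology_findings):
    """Compares the billed diagnosis against evidence the biller doesn't control."""
    claimed = chart["diagnosis_code"]
    confirmed = pathology_findings.get(chart["patient_id"], set())
    return claimed in confirmed   # False means the claim has no independent support

fabricated_chart = {
    "patient_id": "p-001",
    "diagnosis_code": "C34.90",   # billed as lung cancer
}

# Independent evidence (pathology, and in the worst case autopsy) shows
# no such diagnosis for this patient.
pathology_findings = {"p-001": set()}

print(external_reconciliation(fabricated_chart, pathology_findings))  # False -- flagged
```

The difference from the earlier sketch is the source of truth: it sits outside the control of whoever writes the chart.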
Beneficiaries of Integrity Breaches
One of the controversial aspects of privacy breach laws since 2003 is the concept of externality. If a hospital fails to protect the data it handles, the harm falls directly on the patient, not on the hospital itself. Breach laws are how regulators forced hospitals to care about that data, to prevent harms they otherwise don’t feel directly.
Integrity breaches are a different risk model, because the hospital stands to gain directly when patients are harmed. St. Peter’s Health settled for $10.8 million over False Claims Act violations related to Weiner’s billing. The hospital wasn’t just failing to stop him; it was billing Medicare for his treatments. The false diagnoses Weiner entered into its databases generated huge revenue for the institution.
That is why the “rogue doctor” narrative should be refused: it insulates the hospital. It obscures that an operating model created the incentive, the institution collected the payments, and the oversight mechanisms protected the revenue stream. It took external journalism to make the design flaws untenable.
Weiner was so good at breaching integrity that he still has supporters, many of them his own victims. Facebook groups. Billboards. Former patients believe he saved their lives. Some of them never had cancer; they just don’t know it. The chilling reality is that the ones who finally figured it out, the ones who could testify against him, are the dead.
Anthony Olson received nine years of chemotherapy for a cancer that never existed. His body will be recovering for a while. He says he thought about joining the Weiner support group to share what happened to him and convince others, but concluded: “I assume there is nothing I can say to them that will bring them around to reason.”
The AI Death Canary
This case is instructive for reasons far beyond a corrupt healthcare system in Montana, or even America. It is really about any healthcare system even thinking about AI.
Weiner exploited a simple structural fact: American healthcare validates activity with documentation, not with the reality of outcomes. He wrote false diagnoses, detached from standards and transparent validation. Systems processed the lies. Revenue flowed regardless of need. No integrity control sat between what he claimed and what medicine could prove. Experts were dismissed.
Now consider AI diagnostic systems entering a Weiner-phase.
The AI recommends treatment, as Weiner did. The recommendation becomes documentation, as Weiner’s did. The documentation generates billing, as Weiner’s did. The billing generates revenue, as Weiner’s did. The revenue validates the AI, and the business operations department celebrates banner profits.
What validates the diagnosis?
The same closed loop becomes even more deadly when automated. The same absence of integrity architecture. But at scale, at speed, and with an additional accountability gap: when the AI is wrong, who is responsible? The vendor? The hospital that deployed it? The doctor who followed the recommendation? The regulator who approved it?
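A minimal sketch of that loop, with every function name hypothetical, shows that the only signal which ever flows back is money:

```python
# Minimal sketch of the closed loop described above. All names (recommend,
# document, bill, reimburse) are hypothetical, not any vendor's API.

def recommend(patient):
    return {"patient": patient, "treatment": "chemotherapy"}   # model output

def document(recommendation):
    return {**recommendation, "diagnosis": "C34.90"}           # output becomes the chart

def bill(chart):
    return {"claim": chart["diagnosis"], "amount": 12_000}     # chart becomes a claim

def reimburse(claim):
    return claim["amount"]                                     # claim becomes revenue

revenue = 0
for patient in ["p-001", "p-002", "p-003"]:
    revenue += reimburse(bill(document(recommend(patient))))

# Nothing in this loop ever asks whether the diagnosis was real.
print(f"validated by revenue: ${revenue}")
```

Swap the human for a model and the loop gets faster, not truer.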
Weiner was obviously wrong and repeatedly implicated, and it still took Montana years to admit its mountain of mistakes.
It required an autopsy, a whistleblower, thousands of pages of documentation that regulators ignored, and finally external journalism that didn’t care how popular or wealthy a doctor is. That’s just one doctor. In one hospital. With a body count investigators are still working to determine.
AI systems will inflate the harms as they generate millions of recommendations. The vendors will profit whether those recommendations are correct or lethal. It is the same business model Weiner exploited, integrity breaches for revenue extraction, with a documentation flow that no one validates against reality.
Industrialized integrity breach.
You wouldn’t put your medical records into a database that didn’t have authentication to control privacy. Why would you put your medical records into a database that doesn’t have controls to detect and prevent tampering and fraud?
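One familiar integrity control is a hash-chained audit log, sketched minimally below with hypothetical record fields. It makes after-the-fact edits to a chart detectable, though it cannot catch a lie entered “correctly” the first time, which is why tamper-evidence has to be paired with external reconciliation like the earlier sketch.

```python
# Minimal sketch of a hash-chained audit log, one familiar integrity control.
# Record fields are hypothetical; real EHR integrity controls involve far more.
import hashlib
import json

def append_entry(log, record):
    """Appends a record, chaining its hash to the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify(log):
    """Recomputes the chain; any edit or reordering breaks it."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False            # tampering detected
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"patient": "p-001", "diagnosis": "none", "author": "pathology"})
append_entry(log, {"patient": "p-001", "diagnosis": "C34.90", "author": "oncology"})

print(verify(log))                           # True -- chain intact
log[0]["record"]["diagnosis"] = "C34.90"     # rewrite history to match the lie
print(verify(log))                           # False -- the edit breaks the chain
```

For the chain to mean anything, its head has to be witnessed somewhere the record’s author cannot reach, just as a diagnosis has to be validated by evidence the biller doesn’t produce.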
Goodhart’s Oncology
The problem described here isn’t novel, and even has a name: Goodhart’s Law. When a measure becomes a target, it ceases to be a good measure.
American medicine still has concepts of quality. But it redefined its quality measures as revenue. Each step of the logic looks reasonable while the whole is a fallacy: good care attracts patients, patients generate billing, therefore billing indicates good care.
The proxy replaced the thing. And people were killed by the breaches of integrity.
McNamara’s Pentagon made the same error, to take an obvious example. Body counts became the measure of success in Vietnam. How much rice was seized? The metrics were infamously gameable, so they got gamed. The Air Force claimed to have destroyed more trucks than existed in all of Vietnam. Villages were destroyed, babies and the elderly were counted as combatants, officers were promoted for numbers that meant nothing. The system optimized to a metric while losing all support and the war. McNamara even commissioned what became the Pentagon Papers, which by 1968 admitted such accounting couldn’t work and the war was already lost.
Professions once had guild structures with internal quality controls, peer accountability, and ethical codes enforced by exclusion. Inefficient, sometimes corrupt, often exclusionary. But they asked “is this good work?”
Market logic promised to replace guild subjectivity with objective metrics. Let the numbers decide. But the numbers only measure what they are made to count. Units shipped. Procedures performed and prescriptions written. And what’s countable is not inherently the thing that matters as an outcome.
An economics professor once told me the Soviet Union collapsed because its economy was so thoroughly gamified. When window production was measured in square meters, the glass came out so thin it broke before installation. When it was measured in kilograms, it came out so thick it couldn’t fit the frames. Until someone measured successful window installations, nobody was getting windows, while someone was getting very rich for broken ones.
Weiner understood the assignment exactly like the Soviet “window boss”. He made revenue flow stand in as proof that care was happening. He proved the model works for the provider, for the institution, for everyone except the patients being poisoned for profit.
The AI vendors must be dissuaded from treating this as a proof of concept. They cannot be permitted to inherit a system that measures activity as revenue instead of outcomes as truth. If medicine only measures whether a diagnosis bills, that’s not medicine. And yet that’s clearly the system into which AI is already being deployed.