In 2017, Attorney General Jeff Sessions became convinced that the opioid crisis was not the fault of cartels smuggling fentanyl across porous American checkpoints. Nor was it due to pharmaceutical companies corrupting drug approval officials and DEA administrators by hiring them as consultants after they made decisions in those companies’ favor. No. The opioid crisis was caused by American physicians coddling pain patients and addicts.
But reviewing all the actions of tens of thousands of physicians would take literally hundreds of years and thousands of agents, so the government paid for the development of a secret weapon. It hired a private company to create a new artificial intelligence that would be trained to look for actions taken by physicians that the DEA deemed “illegitimate.” The theory was that AI-driven “precrime” predictive models would provide focused criminal deterrence in the practice of medicine. Sessions then committed twelve federal prosecutors to enforce the AI’s decisions in twelve different federal districts where the DEA perceived the opioid crisis to be at its worst.
The health care data services company tasked with creating the AI had been founded in 1973 and had gone through several phases of rebranding and restructuring, ultimately unifying under one brand in 2017. This was a company that provided analytical services to the U.S. government, boasting forty-five years of experience detecting fraud, waste, and abuse (FWA) in programs like Medicaid and Medicare. There is a strong incentive in this industry to err on the side of labeling activity as illegal; indeed, “bounties” can be paid to persons and companies identifying FWA. This company’s main selling point was the use of innovative data analysis, including data mining and predictive analytics, to identify patterns and improve operational efficiencies. “Dynamic dashboards” were created to focus on what the AI programmers perceived to be criminal behavior.
AI decision-making is only as good as the data and parameters it is fed. But what defined criminal behavior to these neural networks? Not the law itself, as “usual practice of medicine” and “legitimate medical purpose” have no set metric. Instead, the developers asked the insurance companies to help them identify the patterns associated with the greatest amount of potentially recoverable FWA.
Companies like this keep their algorithms secret as “proprietary” risk-scoring methodology, but through court action, physician researchers have identified some of the criteria used to train this particular AI. Insurance companies told the AI that the following treatments were “medically unnecessary” and therefore indicative of fraud: genetic testing, orthotics, prosthetics, respiratory pathogen panels, high volume of beneficiaries, telehealth expansion, and ancillary lab services. The DEA then provided its interpretation of actions it thought showed criminal medical practice and would highlight “aberrant” behavior indicating a likelihood of diversion. These included the number of patients seen and prescriptions written per month, particularly the number of Schedule II prescriptions; the distance a patient drove; whether or not the doctor’s office accepted cash payments; and the prescribing of a “trinity” of drugs, meaning a narcotic combined with a sedative, a muscle relaxer, or even, amazingly, an antibiotic.
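To see how individually legal and reasonable practices can stack into a “high risk” label, consider a minimal sketch of the kind of rule-based flagging described above. The real vendor’s model is proprietary; every field name and threshold here is an assumption for illustration only.

```python
# Hypothetical sketch of rule-based "risk scoring" of the kind described
# in the criteria above. All field names and cutoffs are assumptions for
# illustration; the actual proprietary model is not public.

def risk_score(provider):
    """Sum crude flags over a provider's aggregate claims profile.

    Each rule mirrors a criterion the article lists: Schedule II volume,
    patient driving distance, cash payments, and "trinity" combinations.
    """
    score = 0
    if provider.get("schedule_ii_rx_per_month", 0) > 100:   # assumed cutoff
        score += 1
    if provider.get("avg_patient_distance_miles", 0) > 50:  # assumed cutoff
        score += 1
    if provider.get("accepts_cash", False):
        score += 1
    if provider.get("trinity_combinations", 0) > 0:
        score += 1
    return score

# A provider can accumulate every flag through entirely legal practice:
rural_pain_clinic = {
    "schedule_ii_rx_per_month": 120,   # a busy pain specialist
    "avg_patient_distance_miles": 80,  # normal in a rural setting
    "accepts_cash": True,              # common for uninsured patients
    "trinity_combinations": 2,         # e.g., opioid plus muscle relaxer
}
print(risk_score(rural_pain_clinic))  # → 4: maximally "aberrant"
```

Note that nothing in such a scheme measures intent or medical judgment; a conscientious rural pain specialist scores exactly as high as the model’s imagined pill mill.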
These were all seen as indicative of criminal behavior, ignoring the fact that medical textbooks, continuing medical education (CME), and evidence-based practice say otherwise. I’ll give you one example. Continuing to treat a patient after a negative drug screen is interpreted by the algorithm and the DEA as clear evidence of diversion, despite other branches of the federal government, namely the CDC and DHHS, specifically saying NOT to “fire” patients over drug screens, as doing so puts them at high risk of death and addiction. Doctors are now being convicted for “ignoring the risk of addiction” when the CDC says that treatment for pain can continue even if a patient has been diagnosed with addiction. (CDC Clinical Practice Guideline for Prescribing Opioids for Pain — United States, 2022, page 57 of 100, bullet 2: “Although identification of an opioid use disorder can alter the expected benefits and risks of opioid therapy for pain, patients with co-occurring pain and opioid use disorder require ongoing pain management that maximizes benefits relative to risks.”)
The AI also uses claims, encounters, pharmacy invoices, beneficiary and provider enrollment files, state licensing board information, property records of targeted health care physicians, ownership/asset and financial filings of targeted health care physicians, and court records, as well as other custom data in its decision-making algorithm. Now we know why they throw a fit when a physician has a nice car, like Dr. David Lewis’ 1963 Rolls-Royce. Perceived financial success seems to be one of the “sins” that brought the Inquisition down on him. Once the AI was trained to be on the lookout for these individually legal and reasonable practices, Chief Executive Officer Ronald G. Forsythe said, “We have combined the ‘book smarts’ of AI and our analysts with the ‘street smarts’ of our investigators to create one beast of a program integrity tool. Unleash the beast!” And unleash it they did, becoming, in effect, a private police force.
The following was found in documents in the possession of the DOJ after a protracted legal battle. The company’s “experienced technicians will accompany DEA agents and investigators executing warrants on a provider’s office and using previously identified patients from the PDMP data analysis (emphasis added) can digitally scan the identified medical records into an electronically retrievable format.” Scanning medical records without a warrant allowed the DEA to target these patients for investigation and brand their physician an “over prescriber.” Physicians started to be targeted all over the country, not because they had criminal intent or had actually broken any codified law, not because they had sold pills by the dose or charged for services not rendered, but because the AI had determined that they were at high risk of doing so and, in accordance with the precrime initiative, had to be shut down to make the opioid crisis better, despite their never having broken any law passed by Congress. This creates a self-reinforcing bias: by arresting and prosecuting identified doctors, often in the face of clear evidence of innocent intent, the government justifies its own actions.
Physicians began to see their colleagues’ clinic doors kicked down and watched as these doctors were pilloried in the media for prescribing “millions of morphine milligram equivalents!”, a figure that would then be conflated with millions of pills. No matter that this number had accumulated over several years, across sometimes thousands of patients, and that palliative, cancer, sickle cell, and HIV/AIDS patients had not been excluded from the totals.
Once in court, the testimony of “experts” cherry-picked by the prosecutors for their extreme views made conviction easy. According to a study in the journal of the International Association for the Study of Pain, trained medical experts agree only 43.0 percent of the time when using ICD-10 criteria to diagnose chronic pain and 63.2 percent of the time when using ICD-11. That makes it roughly a coin toss as to whether two physicians will agree about chronic pain. How in the world can we expect randomly chosen lay people, without the benefit of at least four years of medical education, to do better?
Next, the papers would announce that the targeted physician had been prosecuted and convicted for reasons that, to other medical professionals, were nonsensical, like the driving distance of patients, even in rural and suburban settings, or prescribing an antibiotic to a pain patient. The as-yet-unindicted doctors were terrified. At least one, Dr. Charles Szyman, committed suicide AFTER being acquitted at trial. Many medical providers decided it simply was not safe to treat pain anymore at all, and not just primary care providers. Pain specialists were targeted as if their training and education did not make them more qualified than government agents, or even juries, to make these judgments, and they started quitting in droves. This placed an increasing burden on the physicians who continued to treat patients suffering from chronic pain or addiction, driving up those doctors’ numbers and making them more vulnerable to attack, until, in the end, tens of thousands of pain and addiction patients could not find a physician anywhere brave enough to treat them.
These patients were at extreme risk of either killing themselves, as many did, or going to the streets to try to buy their regular pain medications. There they found instead fake hydrocodone and oxycodone tablets laced with fentanyl, and they died by the thousands. Since fentanyl can be prescribed, every one of these deaths was classified as a prescription-related overdose, creating another false metric to justify the previous actions. American citizens have the right to receive compassionate, evidence-based health care, and physicians have the right to treat their patients in accordance with their education, training, and experience, without being prosecuted for violating the opinions of some health care executive or DEA agent. No matter what the AI “beast” says.
L. Joseph Parker is a distinguished professional with a diverse and accomplished career spanning the fields of science, military service, and medical practice. He currently serves as the chief science officer and operations officer of Advanced Research Concepts LLC, a pioneering company dedicated to propelling humanity into the realms of space exploration. At Advanced Research Concepts LLC, Dr. Parker leads a team of experts committed to developing innovative solutions for the complex challenges of space travel, including space transportation, energy storage, radiation shielding, artificial gravity, and space-related medical issues.