Every time Dr. Gina Huhnke writes a prescription for opioids, she first consults a number in the patient’s electronic health record.
That number, a risk score that providers can use to gauge a patient’s risk for a substance-use disorder, tells her whether she should prescribe the drug or instead have a conversation about drug addiction.
“Sometimes even the patients themselves are surprised,” said Huhnke, director of medical affairs and emergency medicine at Deaconess Health System, based in Evansville, Ind.
The score is algorithmically generated by a platform created by Appriss Health with information from prescription drug monitoring programs and patient health histories.
Wait a minute – how did this corporation get access to not only our PDMP data, but our full health histories?
Ever since our medical records went electronic, law-enforcement and business interests have felt free to browse through them. I thought HIPAA had privacy protections, or does that apply only to the patients themselves and the office staff?
Forty-two of the 52 prescription drug monitoring programs in the U.S. use Appriss Health to access state data, drawing on the company’s Awarxe database, which catalogs all of a state’s controlled-substance prescriptions.
So Appriss sucks all the data off almost every state's PDMP and keeps it for itself, to manipulate clandestinely any way it wants.
Appriss can design its software and database however it wants, feeding our most private medical information into its "proprietary algorithm," which then arrives at our "risk score": a secretly fabricated "magic number" that will decide whether we qualify for effective pain care.
Not my doctor, not another medical professional, but endless lines of code in the software from Appriss will decide my fate. And you can’t talk back because it’s a “proprietary algorithm” making the decision, not a human.
Am I the only one who sees a problem with this?
That information is integrated into pharmacy and clinical workflows with Appriss' PMP Gateway, for which Appriss Health charges a fee per provider, except in several states, such as Indiana and Virginia, that have purchased licenses for all the providers and dispensers within their borders.
Whereas a basic PDMP report might contain a historical list of prescriptions for, say, the last two years for a patient, NarxCare visualizes that information and produces risk scores so providers don't have to sort through heaps of raw data. And it does so from within the pharmacy management system or EHR, including those made by Epic, Cerner, Allscripts, eClinicalWorks and Athenahealth. NarxCare has been around since early 2017, after growing out of software Appriss first released in 2011.
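To make concrete what a "risk score produced from prescription history" even means: the real NarxCare formula is proprietary and unpublished, so the sketch below is purely invented. Every input, factor, weight and threshold here is my own assumption for illustration, not anything Appriss has disclosed.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Prescription:
    drug_class: str   # e.g. "opioid", "benzodiazepine"
    prescriber: str
    pharmacy: str
    start: date
    days_supply: int

def toy_risk_score(history: list[Prescription]) -> int:
    """Invented toy score on a 0-999 scale: counts distinct prescribers
    and pharmacies and flags opioid/benzodiazepine overlap. Bears no
    relation to the actual (secret) NarxCare algorithm."""
    prescribers = {rx.prescriber for rx in history}
    pharmacies = {rx.pharmacy for rx in history}
    opioids = [rx for rx in history if rx.drug_class == "opioid"]
    benzos = [rx for rx in history if rx.drug_class == "benzodiazepine"]
    # Flag any calendar overlap between an opioid fill and a benzo fill.
    overlap = any(
        o.start <= b.start <= o.start + timedelta(days=o.days_supply)
        or b.start <= o.start <= b.start + timedelta(days=b.days_supply)
        for o in opioids for b in benzos
    )
    score = 50 * max(len(prescribers) - 1, 0)   # "doctor shopping" proxy
    score += 50 * max(len(pharmacies) - 1, 0)   # "pharmacy shopping" proxy
    score += 100 * len(opioids)                 # opioid fill count
    score += 200 if overlap else 0              # concurrent opioid + benzo
    return min(score, 999)
```

Notice the point the sketch makes: even this trivial version would silently raise a patient's score for perfectly legitimate situations, like seeing a second prescriber after moving, and with the real system neither the patient nor the doctor can inspect which factor drove the number.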
There it is again: they have access not only to our opioid prescriptions but our whole medical record.
How is this not a HIPAA issue, or even a civil-liberties one?
I’ve had depression diagnosed and treated with ongoing medication for decades, and that would bump up my risk score in such software. And I remember talking to my doctor about my anxiety – that would bump up my risk score even more.
I never consented to give all my medical information to some corporation that will use it against me and destroy my chances of ever getting opioid pain management again.
(Or is this another right we sign away in our “opioid contracts” or “pain agreements”?)
Appriss Health's NarxCare analyzes the data.
"These numerical scores are awareness triggers," said Dr. Jim Huizenga, Appriss' chief clinical officer. However, providers still need to talk with their patients, he said.
Well, we’ve all learned how such “suggestions” end up becoming prohibitions.
The data make those discussions with patients easier, said Rob Cohen, president of Appriss Health. “They can now say, ‘I’ve got this score that was produced through machine learning, and based on this score, I think you should be concerned,’ ” he said.
It takes the judgment out of providers' hands, and patients react better.
So now it’s a good idea for medical judgment to be taken out of the doctors’ hands?
Impersonal algorithms are taking over what used to be the deeply personal practice of medicine.
Instead of basing such a judgment on the personal knowledge your doctor has of you, perhaps from a long-running relationship, they insist we base it on data processed and presented by "proprietary algorithms" whose inner workings cannot even be audited.
And if the result is wrong?
Who are you going to argue your case with? The algorithm?
The risk scores make patients more amenable to conversations, Huhnke said, especially when most patients have no idea that their doctors have this kind of information in the first place.
What kind of “secret” information do these doctors have?
The data about us in the PDMP seems to be secret only from us patients; now that our medical records are electronic, our personal information is routinely scanned by private corporations for their own ends.
When the algorithm used to calculate our risk scores is kept secret from everyone outside the company, who knows what's hidden in those hundreds of thousands of lines of code?
Since Deaconess began using NarxCare in May 2017, "I've had several patients who were not happy that I was able to access this information," she said. "The risk score changes the conversation from, 'We're the police, and we've caught you doing something bad' to 'We're the healthcare providers, and we're here to help.' "
This is a joke, right?
When it comes to opioids, we're not treated by healthcare providers and doctors anymore; we're treated by algorithm and law enforcement.