My construct of the "Ddulite" orientation largely driving health IT is not merely theoretical. Ddulites (the term "Luddite" with its first four letters reversed) are:
Hyper-enthusiastic technophiles who either deliberately ignore or are blinded to technology's downsides, ethical issues, and repeated local and mass failures.
Here is an example of this disposition on display:
In a March 14, 2012 Pittsburgh Post-Gazette article by Bill Toland, "Digital ease may complicate health care," about the recent controversy over a Harvard study showing that EHRs may actually increase test ordering, thus raising rather than lowering medical costs, Nancy Finn, a medical consultant and author of "e-Patients Live Longer," is quoted as saying:
... In an ideal world, management would know if a software suite is going to improve health outcomes before it's rolled out, said Nancy Finn, a medical consultant and author of "e-Patients Live Longer." Unfortunately, though, uncertainty is built into the process.
"The only way to know [the systems] are inefficient and flawed is to deploy them, then correct them as we go," she said. [That is, they are experimental - ed.]
"That is the way that all of the new innovative technologies have worked over the years. We have to take the risk, and then improvements get made."
This statement is highly alien to medical ethics.
She is explicitly stating that this technology is experimental - "The only way to know [the systems] are inefficient and flawed is to deploy them" - and then states "We have to take the risk," where the "we" are unconsenting patients, who are not afforded the opportunity for true informed consent, and the 'investigators' are clinicians who are themselves often coerced into using these systems.
Never mentioned are the downsides of experimental technology such as health IT: patient injury, death, litigation against physicians and other clinicians entrapped into "use error" (errors promoted by the mission hostility common in today's health IT) or led into errors by poor software quality that causes data corruption, misidentification, or outright loss, along with additional issues described by FDA (link) and others. Nor are the ethical issues considered.
NO, Ms. Finn: "We" do NOT have to "take the risk."
There are scientific methods for improving experimental technologies, such as controlled clinical trials with informed consent, opt-out provisions, and built-in protections for patients and investigators.
The "trial and error", "learn-as-we-go", "computers' rights supercede patients' rights" approach you suggest, while perhaps appropriate for mercantile computing, is highly inappropriate for healthcare.
Such issues, I had believed, were settled after WW2.
There is nothing to argue, and nothing to discuss.
-- SS