A June 25, 2013 Bloomberg News article, "Digital Health Records’ Risks Emerge as Deaths Blamed on Systems" by technology reporter Jordan Robertson (http://go.bloomberg.com/tech-blog/author/jrobertson40/), mentions an EHR-harms case in which I am (unfortunately) intimately involved as substitute plaintiff:

When Scot Silverstein’s 84-year-old mother, Betty, started mixing up her words, he worried she was having a stroke. So he rushed her to Abington Memorial Hospital in Pennsylvania.

After she was admitted, Silverstein, who is a doctor, looked at his mother’s electronic health records, which are designed to make medical care safer by providing more information on patients than paper files do. He saw that Sotalol, which controls rapid heartbeats, was correctly listed as one of her medications.

Days later, when her heart condition flared up, he re-examined her records and was stunned to see that the drug was no longer listed, he said. His mom later suffered clotting, hemorrhaged and required emergency brain surgery. She died in 2011. Silverstein blames her death on problems with the hospital’s electronic medical records.

“I had the indignity of watching them put her in a body bag and put her in a hearse in my driveway,” said Silverstein, who has filed a wrongful-death lawsuit. “If paper records had been in place, unless someone had been using disappearing ink, this would not have happened.”

How can I say that? Because I trained in this hospital and worked as resident Admitting Officer in that very ED in the pre-computer era. The many personnel in 2010 who were given the medication history by my mother and me directed it not to paper, where others could see it, but to /dev/null.

Why can I say that?  Because the hospital's Motion for Prior Restraint (censorship) against me was denied outright by the presiding judge just days before the Bloomberg article was published (http://en.wikipedia.org/wiki/Prior_restraint):

Prior restraint (also referred to as prior censorship or pre-publication censorship) is censorship imposed, usually by a government, on expression before the expression actually takes place. An alternative to prior restraint is to allow the expression to take place and to take appropriate action afterward, if the expression is found to violate the law, regulations, or other rules.

Prior restraint prevents the censored material from being heard or distributed at all; other measures provide sanctions only after the offending material has been communicated, such as suits for slander or libel. In some countries (e.g., United States, Argentina) prior restraint by the government is forbidden, subject to certain exceptions, by a constitution.

Prior restraint is often considered a particularly oppressive form of censorship in Anglo-American jurisprudence because it prevents the restricted material from being heard or distributed at all. Other forms of restrictions on expression (such as actions for libel or criminal libel, slander, defamation, and contempt of court) implement criminal or civil sanctions only after the offending material has been published. While such sanctions might lead to a chilling effect, legal commentators argue that at least such actions do not directly impoverish the marketplace of ideas. Prior restraint, on the other hand, takes an idea or material completely out of the marketplace. Thus it is often considered to be the most extreme form of censorship.

The First Amendment lives.

(I wonder if it irks the hospital that, now that the censorship motion has been denied, they cannot retaliate with sham peer review. Sham peer review is a common reaction by hospital executives to "disruptive" physicians, but I have not worked there since 1987 and no longer practice medicine.)

In the Bloomberg story Mr. Robertson wrote:

... “So far, the evidence we have doesn’t suggest that health information technology is a significant factor in safety events,” said Jodi Daniel (http://www.healthit.gov/newsroom/jodi-daniel-jd-mph), director of ONC’s office of policy and planning. “That said, we’re very interested in understanding where there may be a correlation and how to mitigate risks that do occur.”

In my opinion this statement represents gross negligence by a government official. Ms. Daniel unarguably works for a government agency that is pushing this technology. She makes the claim that "so far the evidence we have doesn't suggest significant risk" while surely being aware (or having the fiduciary responsibility to be aware) of the impediments to gathering such evidence.

From my March 2012 post "Doctors and EHRs: Reframing the 'Modernists v. Luddites' Canard to The Accurate 'Ardent Technophiles vs. Pragmatists' Reality" at http://hcrenewal.blogspot.com/2012/03/doctors-and-ehrs-reframing-modernists-v.html  (yes, this was more than a year ago):

... The Institute of Medicine of the National Academies noted this in its late 2011 study on EHR safety:


... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety. [IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care. Washington, DC: The National Academies Press, p. S-2.]

Also in the IOM report:

… “For example, the number of patients who receive the correct medication in hospitals increases when these hospitals implement well-planned, robust computerized prescribing mechanisms and use barcoding systems. But even in these instances, the ability to generalize the results across the health care system may be limited. For other products— including electronic health records, which are being employed with more and more frequency— some studies find improvements in patient safety, while other studies find no effect.

More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.”


I also noted that the 'impediments to generating evidence' effectively rise to the level of legalized censorship, as observed by Koppel and Kreda regarding gag and hold-harmless clauses in their JAMA article "Health Care Information Technology Vendors' Hold Harmless Clause: Implications for Patients and Clinicians", JAMA 2009;301(12):1276-1278. doi: 10.1001/jama.2009.398.

FDA had similar findings about impediments to knowledge of health IT risks; see my Aug. 2010 post "Internal FDA memorandum of Feb. 23, 2010 to Jeffrey Shuren on HIT risks. Smoking gun?" at http://hcrenewal.blogspot.com/2010/08/smoking-gun-internal-fda-memorandum-of.html.

I also note this from amednews.com's coverage of the ECRI Deep Dive Study (http://hcrenewal.blogspot.com/2013/02/peering-underneath-icebergs-water-level.html):


... In spring 2012, a surgeon tried to electronically access a patient’s radiology study in the operating room but the computer would show only a blue screen. The patient’s time under anesthesia was extended while OR staff struggled to get the display to function properly. That is just one example of 171 health information technology-related problems reported [voluntarily] during a nine-week period [from 36 hospitals] to the ECRI Institute PSO, a patient safety organization in Plymouth Meeting, Pa., that works with health systems and hospital associations in Kentucky, Michigan, Ohio, Tennessee and elsewhere to analyze and prevent adverse events. Eight of the incidents reported involved patient harm, and three may have contributed to patient deaths, said the institute’s 48-page report, first made privately available to the PSO’s members and partners in December 2012. The report, shared with American Medical News in February, highlights how the health IT systems meant to make care safer and more efficient can sometimes expose patients to harm.


One wonders if Ms. Daniels' definition of "significant" is when body bags start to accumulate on the steps of the Capitol.

I also note that she is not a clinician but a JD/MPH.

I am increasingly of the opinion that non-clinicians need to be removed from positions of health IT leadership at regional and national levels.

In large part, many just don't seem to have the experience, insight, and perhaps the ethics necessary to understand the implications of their decisions.

At the very least, such people who never made it to medical school or nursing school need to be kept on a very short leash by those who did.

-- SS
