A paper I recently wrote on a critical issue in healthcare IT was rejected on first pass by the Medical Informatics academic community.

The paper concerns the profound lack of publicly available data on unintended adverse consequences of healthcare IT, and it proposes steps that ethical clinicians and others could take to remediate this gap.

I have decided not to make "revisions." I believe the "problems" with the paper were more about its topic than its substance or format, and that the topic itself is effectively unpublishable in the medical informatics literature. I am therefore making the paper available publicly.

It is entitled "Remediating an Unintended Consequence of Healthcare IT: A Dearth of Data on Unintended Consequences of Healthcare IT." The full paper is available via Scribd at this link:

http://www.scribd.com/doc/28747771/
(MS Word .doc format).


Only this month has the FDA even acknowledged patient injuries and deaths due to health IT problems, and it admits its numbers are likely the "tip of the iceberg." (In a future essay I will explain my reasoning as to why I believe those numbers may be three orders of magnitude or more off the mark, and perhaps four orders of magnitude off when HIT goes national.)

About the paper:

Abstract:

Case reports, systematic statistical data, and other information on unintended consequences (UCs) of healthcare information technology (HIT) are relatively scarce despite ample literature on potential HIT benefits. This scarcity impedes optimal efforts at the computerization of healthcare, and it can and should be remediated.

Objectives: To illustrate the relative scarcity of information on HIT UCs, suggest contributing factors, and recommend tactical measures for improvement, such as better user reporting of HIT UCs and better diffusion of the existing literature on the phenomenon.

Methods: A number of recent indicators of the scarcity of UC information were compiled, and possible reasons were described. Examples of suboptimal adverse-results disclosures in related domains (e.g., the pharmaceutical industry) that may hold lessons for HIT were included.

Results: UC information on HIT is relatively scarce, likely due to a variety of influences and complex interactions among medicine, informatics, government, and industry. Left unaddressed, these factors may delay or otherwise harm good-faith efforts to computerize the informational aspects of healthcare delivery and research.

Conclusions: The relative scarcity of definitive information on the extent of HIT UCs should be addressed in a responsible and ethical manner by clinicians, regulators, and other stakeholders if this technology is to be rolled out successfully nationwide.

While some reviewers commented on the paper's organization and formatting, which is fair, the most striking review comments received were these:

This paper addresses a potentially important issue but adds little that is new or that goes beyond what a reader might find in a major city newspaper.
and
Proposing a classification of sources of UC [unintended consequences - ed.] and analysis of reasons for underreporting of each type in the resulting classification could be a useful addition to the field.

I do not recall reading many, if any, articles about the covering up of healthcare IT dangers in major city newspapers. Further, it is hard to "classify UCs" when scant data about them is available in the first place. I feel it is more important to propose reasons for the underreporting of unintended consequences in a global fashion and to propose remediation steps, as I did in the paper, rather than perform a useless exercise of classifying that which is tightly suppressed.

I have the experience of being one of a very few medical informatics professionals to publicly challenge the HIT hysteria, beginning over a decade ago at the website at this link, and of observing the informatics community's reactions to that site. In addition to that experience, here are a few more points on why I think the paper unpublishable by the informatics community due to its controversial, HIT-business-unfriendly topic:

One reviewer opined that they recognized the writing style and:

... may have seen the paper prepublished on a blog somewhere.

Coming from supposed information experts, who must be aware of search engines and their indexing of blogs (this blog's stories, for instance, uniformly come up very high in Google searches), this comment was remarkable.

It would seem the smart thing to do would have been to test their hypothesis with a five-minute search rather than disparage me to the editor. Further, if they had recognized my writing, they would surely have known that I once ran a scientific library, am well aware of such publication issues, and write for this blog on the ethical issues concerning scientific publication. I propose that the reason behind that comment was hostility.

Finally, a reviewer offered this gem:

Out of curiosity, I also wonder why all the web sites cited were accessed on the same date [the date of paper submission - ed.], if the date was noted at all.

Coming from a community of supposed computing and information experts, this puzzlement over the stated "last accessed" dates of cited websites left me shaking my head; verifying all web citations immediately before submission, so that they share a single access date, is entirely ordinary practice.

Peer review being somewhat of an echo chamber regarding controversial social issues in healthcare informatics (i.e., issues that run against the flow of the HIT business), I turn the paper over to the court of public opinion.

Fortunately, the paper will likely get far more exposure where it matters, i.e., outside the academic informatics orthodoxy, via web-based dissemination than via publication in rarefied informatics journals.

-- SS
