Improving patient safety is one of the most urgent issues facing healthcare today. Patient Safety and Quality Healthcare (PSQH) is written for and by people who are involved directly in improving patient safety and the quality of care.
PSQH welcomes original submissions from all healthcare professionals on topics related to safety and quality. PSQH publishes a variety of articles, to reflect the breadth of work being done in this field: case studies, surveys, research, book or technology reviews, guest editorials, essays, and letters to the editor.
Ensuring that clinicians are treating the patient they intend to treat—accurate patient identification (ID)—is one of many perennial patient safety problems. Some of these perennials, including handwashing, wrong-site surgery, and patient ID, seem simple enough on the surface. They remain unsolved, however, because they are deceptively complex. It is understandable that “improvement fatigue” may set in for organizations and individuals who have known for years that they have an unsolved problem, one they may have been addressing unsuccessfully for just as long.
The annual conference known simply as HIMSS will take place in Chicago next week, April 12–16. By far the largest educational program and exhibition focused on health information technology each year, HIMSS is both daunting and exciting. Each year, preparing to attend HIMSS begins to demand attention shortly after New Year’s. The conference usually takes place in February. When it is held in Chicago—where the parent organization, the Healthcare Information and Management Systems Society, is based—the schedule moves back to April for obvious reasons. Of course, the last time HIMSS was held in Chicago in April, it snowed. This year, few of us will be shocked by cold precipitation after the winter we’ve had.
This is Patient Safety Awareness Week (PSAW), which is focused on the theme “United for Safety.” The National Patient Safety Foundation (NPSF) explains that the theme is meant to rally all stakeholders—from care providers, corporate executives, and vendors, to patients and their families—around a commitment to “keeping patients and those who care for them free from harm.” The campaign emphasizes patient engagement, communication between patients and clinicians, and the ways the quality of relationships affects the delivery of care. United for Safety is also the theme of NPSF’s Patient Safety Congress, to be held next month in Austin, Texas.
On January 28, the Institute for Safe Medication Practices (ISMP) published an alarming report exposing a dirty secret about drug safety in America. The report documents that pharmaceutical companies are largely responsible for collecting and reporting adverse drug events to the Food and Drug Administration (FDA) and, most notably, that they are doing a substandard job of it.
Although consumers and providers may also report adverse events, the FDA Adverse Events Reporting System (FAERS) relies primarily on reports from pharmaceutical companies. Those pharmaceutical companies view this task as burdensome and expensive. According to the Wall Street Journal, they have outsourced it for the lowest possible cost or allocated the fewest resources to it internally. Clearly, there’s a conflict of interest in a system that relies on self-regulation. And that conflict is only amplified when those tasked with monitoring post-approval drug safety turn around and outsource the responsibility to the lowest bidder. The result is a drug safety monitoring system that, according to ISMP, “suffers from a flood of low quality reports from drug manufacturers.”
As many expected, a malpractice lawsuit was filed on Monday, January 26, against the physicians and endoscopy clinic that treated the late Joan Rivers. CNN reported that the family of the 81-year-old comedian wants to "make certain that the many medical deficiencies that led to Joan Rivers' death are never repeated by any outpatient surgery center."
As this lawsuit unfolds in the months ahead, it will examine alleged errors made at Yorkville Endoscopy in New York City, including failing to identify Ms. Rivers’ deteriorating vital signs and respond in a timely manner. Significantly, the lawsuit will also review numerous deficiencies cited by the New York State Department of Health regarding basic patient rights, including failing to obtain informed consent for each procedure performed.
The high-profile nature of this case can be expected to increase scrutiny of healthcare organizations’ informed consent processes and documents. Articles in the lay press will likely advise the public to do the following...
In the title of every post in his long-running blog, Mark Neuenschwander offers readers a snapshot of what’s on his mind. From “I’ve been thinking about Tina, Sarah, Slinkies, and how improv may improve your group’s productivity,” to “…Dodgers, Webinars, leadership, and the importance of sitting near an exit,” Neuenschwander covers a wide swath of what’s important in life while always zeroing in on medication safety, point-of-care technologies (especially barcoding), and foundational principles of patient safety. He’s always entertaining, real, and informative.
Most recently, in “…drugs, wars, Christmases, and your hospital,” Neuenschwander describes a medication error that killed Loretta Macpherson in December 2014. He points out that although more than two-thirds of hospitals in the United States now (finally!) barcode scan patients and medications at the bedside, only five or six percent scan to verify the component ingredients in compounded products.
PSQH editor Susan Carr shares her experience hearing Don Berwick—who recently ran for governor of Massachusetts and previously served as CEO of the Institute for Healthcare Improvement and administrator of the Centers for Medicare and Medicaid Services—speak on election night.
I pay close attention to efforts to quantify the number of errors and the amount of harm caused by medical error, and I am often uncertain what to make of the results, at least as they are sometimes reported.
Numbers are powerful, especially when conveyed in concrete, familiar units.
“…80,000 die each year partly as a result of iatrogenic injury, the equivalent of three jumbo-jet crashes every 2 days” (Leape, 1994).
“…the results of the study in Colorado and Utah imply that at least 44,000 Americans die each year as a result of medical errors. The results of the New York study suggest the number may be as high as 98,000” (Institute of Medicine, 2000).
Those numbers caught everyone’s attention years ago and became mantras for the patient safety movement. More recently, John James’ estimate that between 210,000 and 440,000 patients die prematurely each year due to preventable harm renewed discussion about quantifying the magnitude of the problem (James, 2013).
Our responses to the news that Ebola had been diagnosed in the United States for the first time reveal gaps in our understanding of how to protect others and ourselves from Ebola and other infectious diseases. When we overreact in fear and take comfort from actions that don’t actually make us safer, we may overlook aspects of our systems and institutions that really do put us at risk.
The case of Thomas Duncan, diagnosed in Dallas with Ebola on September 26, 2014, reveals how unreliable our systems can be, especially under stress. The actions of Texas Health Presbyterian Hospital Dallas, where Duncan went for emergency care when he first became ill, reveal broad problems with implications that reach beyond the immediate response to one patient with Ebola.
First, I must disclose a conflict of interest: I am co-author of one chapter in this edited collection, and the editor, Lorri Zipperer, is a close friend and colleague. I was pre-disposed to like this book, and as I spend more time with the other chapters, my respect for Lorri’s vision and the resulting text continues to grow.
Efforts to improve patient safety should be informed by the best evidence, information, and knowledge (EI&K) available, but often they are not. This is a familiar, if unexamined, problem in this time of “information overload,” but first I should review how the terms EI&K are defined and used in the book. These terms are common but not often used in the safety and quality improvement literature as precisely as they are in Patient Safety:
- Evidence is the result of research, of tested hypotheses, such as trials and studies published in peer-reviewed publications across all disciplines (not just medicine).
- Information is data that has been analyzed, organized, and presented for a specific use.
- Knowledge is what individuals know, either implicitly or explicitly. Knowledge is dynamic, with elements of action or experience.
Many of us are familiar with the challenge posed by the abundance of evidence, information, and knowledge currently available about all things. It is exhilarating that we live in a time of rich and increasingly available resources, but it is rarely self-evident how best to access the EI&K we need or easy to feel confident that we’ve found the best advice on a given subject. How do we know, for example, that what we really need is on page 10 or 25 of our Google search results, or that it will appear only if we use a particular search term? Social media such as Twitter and email discussion groups have made experts more accessible than ever, but knowing who has the answer to your question, having the time to search, and simply knowing where to begin are not always easy. These challenges exist in patient safety, too, with potentially profound implications for patients and all who are involved in their care.
New guidance from the Centers for Medicare & Medicaid Services (CMS) recommends monitoring of patients receiving opioids.
AdverseEvents’ primary customers are health plans, PBMs, health systems, and hospitals. We provide these healthcare decision makers with important insight on drug safety concerns that were not revealed during clinical trials and are not being communicated by the manufacturer.
I had the radio on as I drove to the market, but I wasn’t really listening until I heard “It's very important to have a culture of safety that says, if you've got a problem, talk about it.” I didn’t recall ever having heard the phrase “culture of safety” outside of safety improvement circles.
A recent CDC report found that 1 in 25 hospital patients develop healthcare-associated infections (HAIs). According to the report, about 75,000 of these patients die during their hospital stay.
On May 7, the Department of Health and Human Services (HHS) reported on the effects of federal efforts to decrease the rates of hospital-acquired conditions (HACs) and readmissions. These efforts, implemented through a system of Hospital Engagement Networks (HENs), have been supported with funding from the Affordable Care Act (ACA), starting with grants to the HENs in October 2011.
In 2005, the Pennsylvania Patient Safety Reporting System received a report of a near miss that brought up a new issue in the nursing field. It involved a nurse who worked in two hospital facilities; one facility used yellow wristbands to indicate limb restrictions (do not use this limb), and the other used them to indicate DNR (do not resuscitate). This nurse had a patient with arm restrictions, so, with the best of intentions, she placed a yellow wristband on the patient’s arm.
OpenNotes is a program that allows patients to read their clinicians’ notes as they appear in the medical record. The program started less than five years ago and currently includes 2 million patients in health systems across the country, including the Department of Veterans Affairs.
Patient safety has been a topic of interest at the annual HIMSS conference for the 10 years that I have attended, and probably longer. Safety improvement usually appears on the program and in discussion as the result—or at least the goal—of technology implementations.
The latest issue of Patient Safety & Quality Healthcare (PSQH) offers articles that focus on the pros and cons of information technology in healthcare, an interdisciplinary approach to managing drug shortages, new efforts to improve workplace safety for healthcare professionals, and more.
Rick Boothman has been thinking about when he knew that the University of Michigan Health System, where he is executive director of clinical safety, had reached the “point of no return” about openly discussing preventable harm with patients. In his keynote to the MITSS Annual Dinner in November, Boothman explained that he had been struck by something he heard during a meeting in Washington, which led him to reflect on Michigan’s commitment to this approach.