Making Sense of a Safety Reporting System’s Data with BI Software

January / February 2012

Safety incident detection and analysis are key components of a framework for patient safety improvement—both allow for understanding the nature of adverse events and can inform healthcare process enhancements to prevent error recurrence (Pronovost, et al., 2009). As the culture of error reporting has grown, the accumulation of data has outpaced our ability to analyze it effectively and efficiently for safety and quality interventions (Johnson, 2003; Boxwala, et al., 2004). Several methodologies have been described for detection of adverse events, including chart review, voluntary reporting, and surveillance trigger tools (Ferranti, et al., 2008b; Naessens, et al., 2009). Each has limitations; voluntary reporting systems, in particular, depend on staff willingness to report, which is shaped by time constraints, anonymity concerns, and impressions of whether reports lead to action (Holden & Karsh, 2007). Consequently, feedback from incident reporting is essential not only to stimulate improvement strategies but also to promote future reporting (Gandhi, Seger, & Bates, 2000; Kaplan & Fastman, 2003).

Incident feedback can come in multiple forms, such as leadership walk-rounds, education and training, and publications (Gandhi, et al., 2000). Benn et al. (2009) outline 15 requirements for effective feedback, including feedback at multiple levels of the organization, appropriateness of delivery mode, and empowering front-line staff to take responsibility for local safety improvements. Front-line staff members are well positioned to provide insight into failure causes and improvement strategies, but they typically lack the time or tools to generate aggregate reports. Although review of a single incident can yield lessons learned, aggregate review and trending are key to revealing underlying causes and targeting improvements. Empowering staff requires providing the right tools to take action efficiently and effectively. To date, most efforts have focused on developing reporting systems and collecting incidents, with less attention to analyzing the data and returning meaningful information to administrators and clinicians (Pronovost, et al., 2007; Farley, et al., 2008).

Technological advances in the form of business intelligence (BI) tools have simplified data extraction and analysis, transforming data stores into useful business knowledge. BI tools offer user-friendly interfaces, timely access to data, and flexibility for data mining (Copacino & Pendrock, 2007). Recently, healthcare organizations have begun using business intelligence to facilitate quality, safety, and financial improvements (Horvath, Cozart, Ahmad, Langman, & Ferranti, 2009; Ferranti, Langman, Tanaka, McCall, & Ahmad, 2010). Although successful use of BI tools has been documented, their implementation at the organizational level has not yet been addressed. In this article, we describe one health system’s approach to delivering feedback to front-line users with a modern BI tool.

Implementation Site
Setting
Duke University Health System (DUHS) comprises Duke University Hospital (DUH), an academic medical center, along with two community hospitals and outpatient clinics throughout the Raleigh-Durham, North Carolina, area.

Approach to Adverse Event Detection
DUHS uses a two-pronged approach for adverse event detection: an electronic voluntary reporting system and a computerized surveillance trigger tool. The Safety Reporting System (SRS) is a web-based, voluntary reporting tool, offering a single portal for employee reporting of witnessed events or concerns across inpatient, outpatient, and home-care environments (Kilbridge, Campbell, Cozart, & Mojarrad, 2006; Ferranti, et al., 2008b). This custom reporting tool began collecting data in May 2002 and captures medication, blood transfusion, fall (Whitehurst, et al., 2010), treatment/testing, perioperative, patient dissatisfaction, and laboratory incidents.

For additional detection of medication-related incidents, a computerized trigger tool, ADE-Surveillance (ADE-S), was deployed in November 2004 at all three hospitals (Kilbridge, et al., 2006; Cozart, et al., 2010). This internally developed tool fires alerts for potential adverse drug events (ADEs) when relevant criteria based on laboratory results, antidote dispenses, or drug-laboratory combinations are electronically detected (Ferranti, Horvath, Cozart, Whitehurst, & Eckstrand, 2008a). ADEs identified through evaluation of these triggers were ultimately integrated into the SRS system so as to capitalize on operational workflow and features of SRS, notably additional data fields for event classification (Cozart, et al., 2010).
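
The actual ADE-S rule set is not described in detail here, but the criteria-based triggering it performs can be sketched in a few lines of code. The sketch below is a hypothetical illustration: the drug names, laboratory tests, thresholds, and field names are assumptions for demonstration, not the DUHS rules. It simply shows how alerts might fire when an antidote dispense, an abnormal laboratory result, or a drug-laboratory combination is detected.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PatientEvent:
    """A simplified clinical observation drawn from lab and pharmacy feeds."""
    patient_id: str
    lab_name: Optional[str] = None
    lab_value: Optional[float] = None
    dispensed_drug: Optional[str] = None
    active_drugs: tuple = ()

# Illustrative trigger rules: an antidote dispense, a critical lab result,
# and a drug-laboratory combination. Drug names and thresholds are hypothetical.
def antidote_trigger(e: PatientEvent) -> bool:
    return e.dispensed_drug == "naloxone"  # possible opioid over-sedation

def lab_trigger(e: PatientEvent) -> bool:
    return e.lab_name == "INR" and e.lab_value is not None and e.lab_value > 5.0

def drug_lab_trigger(e: PatientEvent) -> bool:
    return ("warfarin" in e.active_drugs
            and e.lab_name == "INR"
            and e.lab_value is not None
            and e.lab_value > 4.0)

TRIGGERS = [
    ("antidote dispense", antidote_trigger),
    ("critical lab result", lab_trigger),
    ("drug-laboratory combination", drug_lab_trigger),
]

def evaluate(event: PatientEvent) -> list:
    """Return the names of all trigger rules that fire for this event."""
    return [name for name, rule in TRIGGERS if rule(event)]

if __name__ == "__main__":
    event = PatientEvent("MRN001", lab_name="INR", lab_value=5.6,
                         active_drugs=("warfarin",))
    print(evaluate(event))  # ['critical lab result', 'drug-laboratory combination']
```

In the real workflow, events flagged this way are evaluated by clinicians and, when confirmed, entered into SRS as described above.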

A centralized safety data warehouse stores all incident data for retrospective review, interfacing with clinical and administrative systems that record patient demographics, allergies, clinical service, and attending physicians. Redesign of the SRS interface using the World Health Organization’s International Classification for Patient Safety, which allows for standardized categorization of safety information, provides structure for improved codified data collection necessary for meaningful aggregate report generation (Whitehurst, et al., 2010).

SRS incident reports are automatically routed to safety reviewers (nurse managers, administrators, medical directors, pharmacists, or other healthcare professionals) based upon attributes such as incident category, medication class, clinical service, severity, location, or any combination of those factors. This sophisticated routing capability facilitates efficient report distribution. Reviewers confirm that data fields are completed in a standardized manner, which ensures data integrity.
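
The article does not specify how these routing rules are encoded, so the sketch below is only an illustration of the attribute-based matching described above. The incident fields, rule criteria, and reviewer addresses are assumptions, not the actual SRS configuration.

```python
from dataclasses import dataclass, field

@dataclass
class Incident:
    """Attributes used for routing; field names and values are illustrative."""
    category: str               # e.g., "medication", "fall", "lab"
    severity: str               # e.g., "near miss", "harm"
    location: str               # e.g., "DUH 9100 Oncology"
    medication_class: str = ""

@dataclass
class RoutingRule:
    reviewers: list                               # reviewer addresses or roles
    criteria: dict = field(default_factory=dict)  # attribute -> required value

    def matches(self, incident: Incident) -> bool:
        # A rule matches only if every listed attribute equals its required value,
        # so rules can combine category, severity, location, and so on.
        return all(getattr(incident, attr, None) == value
                   for attr, value in self.criteria.items())

# Hypothetical routing table; the real configuration is not published.
RULES = [
    RoutingRule(["oncology.pharmacist@example.org"],
                {"category": "medication", "location": "DUH 9100 Oncology"}),
    RoutingRule(["risk.management@example.org"], {"severity": "harm"}),
    RoutingRule(["falls.champion@example.org"], {"category": "fall"}),
]

def route(incident: Incident) -> set:
    """Collect every reviewer whose rule matches the incident's attributes."""
    return {r for rule in RULES if rule.matches(incident) for r in rule.reviewers}

print(route(Incident(category="fall", severity="harm", location="DUH 9100 Oncology")))
# e.g., {'risk.management@example.org', 'falls.champion@example.org'}
```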

Each care area’s core safety team is responsible for addressing issues identified in incident reports and escalating concerns to broader departmental and hospital-based patient safety and clinical quality committees. Voluntary reporting and adverse event detection are integral components of the organization’s overall goal for clinical quality, as demonstrated by the positioning of related metrics on the balanced scorecards for the health system, hospitals, and individual departments.

Moving to Business Intelligence
During its 8 years of existence, SRS has recorded over 130,000 safety incidents and currently averages 1,600 incidents monthly across DUHS. At this volume, it is difficult to obtain a holistic understanding of systemic risk and needed improvements through review of individual incidents. Fulfilling ad hoc requests for more detailed or stratified data, longitudinal trending, and aggregate reports was a time-consuming, manual process performed by IT safety analysts.

With the expansion of safety initiatives, it became essential for staff to be able to ask questions of the data in an interactive, real-time fashion, a need likely shared by many healthcare organizations trying to understand their adverse event data. To this end, in 2008, in consultation with quality improvement leaders and staff, the Duke Health Technology Solutions (DHTS) health analytics and patient safety team set out to develop the framework for a BI reporting platform accessible to front-line clinicians and health system administrators.

Although BI tools allow users to create reports through a simple drag-and-drop interface, most of our users rely on IT staff to create them, given time constraints or knowledge limitations. Given the size of DUHS and the number of interested parties, a clear data strategy was essential for safety data management across all DUHS locations. Building the safety reports with a commercial BI tool was appropriate given the resources available through DHTS, the technology support entity for DUHS.

Report Design Strategy
Report generation involves three key players: a technical person knowledgeable in the data warehousing infrastructure, an analyst to build the reports, and clinicians to define report needs. It is essential to design from the end user’s perspective to ensure report applicability and usability. Points that must be addressed during design include the questions to be asked of the data and how results will be visualized. Subject matter experts play a pivotal role in both asking and answering these questions. For example, a falls clinical champion has insight into the historical analyses needed for fall interventions and relevant quality reporting initiatives. The reporting and analysis needs of all customers, however, must not be ignored—what is important to one group may not be of interest to another.

An iterative design process was used with a representative subset of users during report development. The group consisted of managers, physicians, pharmacists, quality improvement leaders in focused areas (e.g., transfusions or falls), and health system safety officers. After defining clinical interests, the analyst built basic report functionality: filters to narrow report focus; data displays in graphical, tabular, or list format; and static vs. drill-through functions to access granular data. This was the analyst’s interpretation of how to utilize the tool given the available data and the capabilities of the BI application. Prototypes were then presented using live data to offer working knowledge of the report functionality, and modifications were made based on feedback. This iterative process—design, demonstration, and feedback—was repeated three times over the course of two weeks, ultimately generating static, automated reports as well as interactive reports with drill-down functionality for real-time data exploration.

In our experience, executives preferred high-level trends with few modifiable data points, whereas other clinicians (e.g., nurse managers) wanted the same report to have greater capabilities for refinement to accommodate their dynamic needs. For example, quarterly reports may demonstrate a high volume of ADEs on a unit; without the ability to drill down into medication classes or system failures, front-line staff are less able to extract meaning and formulate actions to reduce risk. As a result, dynamic reports with filtering and drill-down functionality were created, as were list reports that gather individual incident details based on selected criteria. Though the design process was relatively fast, we published reports sequentially using 60% of one FTE’s effort. Automated reports were piloted in May 2009, with final versions in use by July 2009. Medication, fall, and laboratory reports were completed from July to August 2008, November to December 2008, and November to December 2009, respectively.

Report Navigation and Execution
In keeping with the current workflow of SRS users, we created a reporting dashboard, with links to all reports via an Analysis Reports tab within the SRS (Figure 1). A dashboard approach allows centralized and secure access to all reports at any time. Users first access a ‘prompt’ page where they select the fields to be used as filters to limit their query. Figure 2 displays a sample falls report prompt page with filters for incident dates, locations, and fall descriptors (e.g., time of day, fall type, contributing factors, or age). Parameter combinations can be used to investigate data in a relevant and timely manner. For example, locating falls among elderly male oncology patients who were not identified as a falls risk during the last year can be accomplished with a few clicks. Multiple reports can be executed to compare incidents across specific nursing units, medical services, or hospitals.
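
As a rough illustration of how prompt-page selections might translate into a filtered query against the safety data warehouse, the sketch below reproduces the example just given (falls among elderly male oncology patients not flagged as a fall risk during the last year). The column names and sample rows are hypothetical; the actual SRS schema and the BI tool's query engine are not described in this article.

```python
import pandas as pd

# Hypothetical extract of fall incidents; columns are illustrative, not the SRS schema.
falls = pd.DataFrame({
    "incident_date": pd.to_datetime(["2011-03-02", "2011-06-15", "2010-01-20"]),
    "sex":           ["M", "M", "F"],
    "age":           [78, 81, 66],
    "service":       ["Oncology", "Oncology", "Cardiology"],
    "fall_risk_identified": [False, True, False],
})

# Prompt-page selections captured as simple filter parameters.
start = pd.Timestamp("2011-01-01")
end   = pd.Timestamp("2011-12-31")

query = falls[
    (falls["incident_date"].between(start, end))
    & (falls["sex"] == "M")
    & (falls["age"] >= 65)
    & (falls["service"] == "Oncology")
    & (~falls["fall_risk_identified"])
]
print(query)  # the single matching incident from 2011-03-02
```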

Figure 1: Safety Data Reporting Dashboard
The Safety Reporting System (SRS) is a web-based voluntary incident reporting system used throughout DUHS. Users can either submit a new incident report (“New Report” tab) or access the safety reporting dashboard via the “Analysis Reports” tab. From here, available reports are displayed by incident category with details such as report names, general descriptions, visual format, presence of protected health information, and number of report pages. SRS is now a central site for individual and aggregate incident review.

Figure 2: Falls Report Prompt Page
Upon selection of a specific report from the safety reporting dashboard, users are presented with a prompt page. This page displays the available modifiable data fields, such as incident date ranges, patient demographics, and fall parameters, to enable tailoring a query to the needs of individual users.

Report results can be visualized differently depending on report design. Static or dynamic report output can be in list and/or graphical formats. Drill-down functionality was heavily utilized in reports to produce series of clickable charts, graphs, or text, allowing users to focus on areas of interest. BI technology can make data easier to understand, even for novice computer users. Figure 3 provides a visual example of how the drill-down function may be used.

Figure 3: Falls Report Drill-Down Functionality
Business intelligence tools offer drill-down functionality, allowing users to transition from aggregate to more granular data via point-and-click. In this example, selection of Duke University Hospital from an initial graphical report of total lab/pathology incidents by hospital populates a pie chart demonstrating the percent contribution of each care unit toward the total number of incidents for Duke University Hospital. From here, selection of a single care area, or slice of the pie, leads to a spreadsheet of individual incident reports for detailed investigation.
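
The drill-down path in Figure 3 (hospital totals, then unit-level percentages, then individual incidents) can be approximated with a few aggregation steps. The sketch below uses a hypothetical incident extract with illustrative unit names; the real reports are built inside the BI tool rather than in code, so this is only a conceptual analogue of what each click produces.

```python
import pandas as pd

# Hypothetical lab/pathology incident extract; columns and values are illustrative.
incidents = pd.DataFrame({
    "hospital": ["DUH", "DUH", "DUH", "Durham Regional"],
    "care_unit": ["9100 Oncology", "9100 Oncology", "7800 Cardiology", "3 East"],
    "incident_id": [101, 102, 103, 104],
    "description": ["mislabeled specimen", "hemolyzed sample",
                    "transport delay", "unlabeled specimen"],
})

# Level 1: totals by hospital (the initial graphical report).
print(incidents.groupby("hospital").size())

# Level 2: user clicks "DUH" -> percent contribution of each care unit (the pie chart).
duh = incidents[incidents["hospital"] == "DUH"]
print(duh["care_unit"].value_counts(normalize=True) * 100)

# Level 3: user clicks a pie slice -> individual incident rows for that unit.
print(duh[duh["care_unit"] == "9100 Oncology"][["incident_id", "description"]])
```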

Privacy and Security of Reports
Privacy of healthcare information is protected: DUHS employees must use Microsoft Windows Server 2003 Active Directory accounts to access SRS. Access rights to the BI reports must be authorized by an employee’s manager and the director of risk management for DUHS. Restrictions are placed at the entity (e.g., DUH access only), report (e.g., fall reports only), and incident location levels. Limitations on drill-down features also prevent access to report details where necessary, thus protecting reporter privacy and patient identifiers.
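
A simplified sketch of how such layered restrictions might be enforced is shown below. The role, field, and rule names are illustrative assumptions; the actual controls rely on Active Directory accounts and the BI tool's own security model rather than application code like this.

```python
from dataclasses import dataclass, field

@dataclass
class ReportAccess:
    """Per-user authorization, granted by the manager and risk management."""
    entities:   set = field(default_factory=set)   # e.g., {"DUH"}
    categories: set = field(default_factory=set)   # e.g., {"fall"}
    locations:  set = field(default_factory=set)   # e.g., {"9100 Oncology"}
    drill_down: bool = False                       # access to incident-level detail

def can_view(access: ReportAccess, entity: str, category: str,
             location: str, needs_detail: bool) -> bool:
    """Apply entity-, report-, and location-level restrictions in turn."""
    return (entity in access.entities
            and category in access.categories
            and location in access.locations
            and (access.drill_down or not needs_detail))

nurse_manager = ReportAccess(entities={"DUH"}, categories={"fall"},
                             locations={"9100 Oncology"}, drill_down=True)
print(can_view(nurse_manager, "DUH", "fall", "9100 Oncology", needs_detail=True))       # True
print(can_view(nurse_manager, "DUH", "medication", "9100 Oncology", needs_detail=False)) # False
```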

Organizational Utilization
Automated reports are distributed monthly and quarterly to various business unit administrators and safety leadership. These reports are sent via email in printable, ready-to-use formats and are shared with staff in safety and operational meetings to identify areas for further investigation. Interactive reports allow investigation of underlying causes or trends given their ad hoc searching, filtering, and display capabilities. This investigative work can be tasked to any number of personnel, from safety leaders to front-line staff, given the user-friendliness of the BI application.

We present two cases demonstrating the use of the reports we designed.

Case Study 1. Analysis of Laboratory Errors
Laboratory incidents at DUH, such as mislabeling, unacceptable specimen collection, and specimen transport errors, increased during two time periods: 9 to 11 a.m. and 3 to 5 p.m. The morning window coincides with clinical rounds, when the larger volume of lab orders generated may account for the higher number of issues. The afternoon window corresponds to a laboratory personnel shift change. With interactive reports, the laboratory safety team was able to obtain granular data on how report submissions differed across nursing units. With a few clicks, the reports showed that oncology unit lab errors occur during the morning period, while cardiology and children’s units have higher volumes in the afternoon. The ability to quickly ask and answer questions gave the laboratory safety team the insight needed to correct the problem.
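
The kind of cross-tabulation the laboratory safety team performed interactively can be approximated as follows. The timestamps, units, and counts below are invented for illustration; only the two peak windows (9 to 11 a.m. and 3 to 5 p.m.) come from the case study.

```python
import pandas as pd

# Hypothetical lab-incident extract; timestamps, units, and counts are invented.
labs = pd.DataFrame({
    "unit": ["Oncology", "Oncology", "Cardiology", "Children's", "Children's"],
    "occurred_at": pd.to_datetime([
        "2011-04-05 09:40", "2011-04-12 10:15",
        "2011-04-07 16:05", "2011-04-20 15:30", "2011-04-22 16:20",
    ]),
})

def window(ts):
    """Map a timestamp to the peak windows identified in the report."""
    if 9 <= ts.hour < 11:
        return "9-11 a.m."
    if 15 <= ts.hour < 17:
        return "3-5 p.m."
    return "other"

labs["window"] = labs["occurred_at"].map(window)

# Cross-tabulate window by unit to see which units drive each peak.
print(pd.crosstab(labs["unit"], labs["window"]))
# In this toy data, oncology incidents cluster in the morning window while
# cardiology and children's incidents cluster in the afternoon window.
```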

Case Study 2. Patient Attendants for High-Fall-Risk Patients
Given the potentially devastating consequences of falls in elderly patients, an area of concern is the usefulness and/or availability of patient attendants (staff who monitor patients) for high-risk patients. Initial investigation began with understanding how often patient attendants are requested but unavailable due to staff limitations. Using the interactive BI report, falls safety investigators confirmed that falls occurring in patients monitored by a patient attendant were more often classified as assisted falls, meaning the attendant aided the patient to the ground and likely reduced the extent of injury. Querying falls incidents in this manner represents a data-driven approach to safety improvement; it highlighted an unmet staffing need and thereby provided justification for additional resources. Beyond investigations of risk, the falls reports have saved approximately 400 hours of staff time annually that falls champions previously spent on highly manual data collection and aggregation for participation in the National Database of Nursing Quality Indicators program (Whitehurst, et al., 2010).
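
A minimal sketch of the comparison behind this finding, assuming a hypothetical extract of fall incidents with an attendant-presence flag and a fall-type classification, might look like the following; the field names and proportions are illustrative, not DUHS data.

```python
import pandas as pd

# Hypothetical fall-incident extract; fields and proportions are illustrative.
falls = pd.DataFrame({
    "attendant_present": [True, True, True, False, False, False, False],
    "fall_type": ["assisted", "assisted", "unassisted",
                  "unassisted", "unassisted", "assisted", "unassisted"],
})

# Share of assisted vs. unassisted falls with and without an attendant present.
summary = pd.crosstab(falls["attendant_present"], falls["fall_type"],
                      normalize="index")
print(summary)
# In this toy data, falls with an attendant present are classified as assisted
# more often, mirroring the pattern the investigators confirmed in the live reports.
```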

Discussion
A sizable volume of safety incident reports generated within a health system may impede efficient review and action implementation. Our BI application provides a query pathway that patient safety leadership and front-line staff now use to perform immediate data analysis. We created an easily accessible, centralized location for report access by capitalizing on current incident review workflow and constructing the safety reporting dashboard within SRS. This one-stop shop smoothed the transition of the BI product to front-line users.

Implementation involves several considerations regarding report construction, access, and training. During initial development, it is vital to involve subject matter experts who know what questions to ask of the data and how best to visualize it for operational or analytic purposes. Engaging clinicians requires system demonstrations to display tool capabilities and collect feedback. Throughout our process, we realized that different clinical environments and personnel may have unique reporting requirements. This lack of a one-size-fits-all solution lends further support to involving subject matter experts from various sub-groups to accommodate varying needs. As with any IT system implementation, report development teams should revisit stakeholders to see how they are using the data, making report enhancements as necessary.

Our institution took an incremental approach to report go-live; however, the actual design phase for each incident type followed a structured process over several weeks. Once the BI tool was in place, the safety analyst’s responsibilities shifted from ad hoc reports to minimal maintenance and routine quality assurance checks of the BI reports. As a result, the analyst more effectively utilizes their clinical and technical skills to assist in analyses and discussions at the safety committee level.

Our overarching BI mission across DUHS is to empower users at all levels of the healthcare arena to navigate the flood of clinical safety data generated during patient care, enabling the generation of wisdom that guides change and promotes a culture of safety. In alignment with this mission, clinicians in our organization welcomed the BI tool, as it supports our overarching strategy of safety-incident transparency. Stakeholders can now devise meaningful interventions and document outcomes based on the new ability to browse and visualize safety data.

Andrea Long is a clinical pharmacist with the Maestro Care ambulatory electronic health record (EHR) deployment team for Duke Health Technology Solutions (DHTS). She obtained her doctor of pharmacy degree in 2006 and completed a general practice residency. Long previously assisted in analysis of electronic data for quality improvement and patient safety metrics. She can be contacted at andrea.long@duke.edu.

John Schroder is a nurse analyst with the Maestro Care Inpatient deployment team for DHTS. He obtained his bachelor of nursing degree from Duke University. Previously, Schroder designed and provided support for the Safety Reporting System and its business intelligence platform. He can be contacted at john.schroder@duke.edu.

Julie Whitehurst served as a clinical pharmacist with the health analytics and IT patient safety group for DHTS. She obtained her doctor of pharmacy and master of public health degrees in 2003 and 2006, respectively, and completed pharmacy practice and drug information residencies. She can be contacted at julie.whitehurst@yahoo.com.

Monica Horvath serves as team lead for health intelligence and research services at DHTS.  She provides statistical and methodological leadership for both business analytics and health IT research within the office of the Duke Medicine chief medical information officer.  She obtained her PhD in molecular biophysics & computational biology in 2005, completed postdoctoral training at the National Institute of Environmental Science in 2006, and previously served as a senior research analyst for the DHTS health analytics and IT patient safety group. She can be contacted at monica.horvath@duke.edu.

Dave Leonard is a senior IT analyst with the DHTS information management team. He obtained his bachelor of science degree from the University of North Carolina at Chapel Hill. He was the original architect of the safety reporting system and continues contribution to its development. He can be contacted at dave.leonard@dm.duke.edu.

Heidi Cozart is the inpatient project director of the Maestro Care EHR project at Duke Medicine.  She previously served as the clinical director of computerized provider order entry (CPOE) and IT patient safety in DHTS for Duke Medicine.  She graduated from the University of Iowa College of Pharmacy in 1994 and began her career in informatics in 1996. Cozart has been involved in informatics initiatives focused on patient safety and outcomes research, software design, and clinical system integration and deployment. She can be contacted at cozar011@mc.duke.edu.

Jeffrey Ferranti is the chief medical information officer for Duke Medicine. He is responsible for leading a team charged with the visioning, strategic planning, and effective adoption of integrated technology and information solutions that enable high quality clinical care, research, and education. He is an assistant professor of pediatrics and informatics, and holds a masters degree in medical informatics. He maintains practice in neonatal critical care at Duke University and Durham Regional Hospitals. He can be contacted at ferra007@mc.duke.edu.

References
Benn, J., Koutantji, M., Wallace, L., Spurgeon, P., Rejman, M., Healey, A., & Vincent, C. (2009). Feedback from incident reporting: Information and action to improve patient safety. Quality and Safety in Health Care, 18, 11-21.

Boxwala, A. A., Dierks, M., Keenan, M., Jackson, S., Hanscom, R., Bates, D. W., & Sato, L. (2004). Organization and representation of patient safety data: Current status and issues around generalizability and scalability. Journal of the American Medical Informatics Association, 11, 468-478.

Copacino, W., & Pendrock, M. (2007). New solutions make BI attractive to all companies. DM Review, November, 14-17.

Cozart, H., Horvath, M. M., Long, A., Whitehurst, J., Eckstrand, J., & Ferranti, J. (2010). Culture counts-sustainable inpatient computerized surveillance across Duke University Health System. Quality Management in Health Care, 19, 282-291.

Farley, D. O., Haviland, A., Champagne, S., Jain, A. K., Battles, J. B., Munier, W. B., & Loeb, J. M. (2008). Adverse-event-reporting practices by US hospitals: Results of a national survey. Quality and Safety in Health Care, 17, 416-423.

Ferranti, J., Horvath, M., Cozart, H., Whitehurst, J., & Eckstrand, J. (2008a). Re-evaluating the safety profile of pediatrics: A comparison of computerized adverse drug event surveillance and voluntary reporting in the pediatric environment. Pediatrics, 121, e1201-e1207.

Ferranti, J., Horvath, M., Cozart, H., Whitehurst, J., Eckstrand, J., Pietrobon, R., Rajgor, D., & Ahmad, A. (2008b). A multifaceted approach to safety: The synergistic detection of adverse drug events in adult inpatients. Journal of Patient Safety, 4, 184-190.

Ferranti, J. M., Langman, M. K., Tanaka, D., McCall, J., & Ahmad, A. (2010). Bridging the gap: Leveraging business intelligence tools in support of patient safety and financial effectiveness. Journal of the American Medical Informatics Association, 17, 136-143.

Gandhi, T. K., Seger, D. L., & Bates, D. W. (2000). Identifying drug safety issues: From research to practice. International Journal for Quality in Health Care, 12, 69-76.

Holden, R. J., & Karsh, B. T. (2007). A review of medical error reporting system design considerations and a proposed cross-level systems research framework. Human Factors, 49, 257-276.

Horvath, M. M., Cozart, H., Ahmad, A., Langman, M. K., & Ferranti, J. (2009). Sharing adverse event data using business intelligence technology. Journal of Patient Safety, 5, 35-41.

Johnson, C. W. (2003). How will we get the data and what will we do with it then? Issues in the reporting of adverse healthcare events. Quality and Safety in Health Care, 12(Suppl 2), ii64-67.

Kaplan, H. S., & Fastman, B. R. (2003). Organization of event reporting data for sense making and system improvement. Quality and Safety in Health Care, 12, ii68-ii72.

Kilbridge, P. M., Campbell, U. C., Cozart, H. B., & Mojarrad, M. G. (2006). Automated surveillance for adverse drug events at a community hospital and an academic medical center. Journal of the American Medical Informatics Association, 13, 372-377.

Naessens, J. M., Campbell, C. R., Huddleston, J. M., Berg, B. P., Lefante, J. J., Williams, A. R., & Culbertson, R. A. (2009). A comparison of hospital adverse events identified by three widely used detection methods. International Journal for Quality in Health Care, 21, 301-307.

Pronovost, P. J., Goeschel, C. A., Marsteller, J. A., Sexton, J. B., Pham, J. C., & Berenholtz, S. M. (2009). Framework for patient safety research and improvement. Circulation, 119, 330-337.

Pronovost, P., Holzmueller, C., Young, J., Whitney, P., Wu, A. W., Thompson, D. A., Lubomski, L. H., & Morlock, L. (2007). Using incident reporting to improve patient safety: A conceptual model. Journal of Patient Safety, 3, 27-33.

Whitehurst, J., Cozart, H., Leonard, D., Schroder, J., Horvath, M., Avent, S., & Ferranti, J. (2010). Tailoring “best-of-breed” safety classification for patient fall voluntary reporting. Journal of Patient Safety, 6, 192-198.