Variation
The title of this section should really be "managing variation" because variation is at the root of all quality issues. Whether found in a highly mechanical production environment or a consumer-oriented service industry, variation invariably precedes system failure. Hospitals exhibit strong characteristics of both types of industries. This mix of organizational designs presents unique challenges as hospitals attempt to reduce variation in the care they provide. High-reliability production decreases waste and risk exposure, while excellent service results in loyal patients and engaged physicians and nurses. Measurement is the most fundamental tool in the hospital leader's toolkit to identify and mitigate variation.
Performance Measurement
Performance measurement is simply a step in the feedback mechanism telling a unit (service or production) how it is performing. Hospitals have always been data-driven organizations. Historically, it has been financial processes that have been measured, analyzed, and acted upon. Just as hospitals have collected financial data to give feedback to multiple stakeholders, they now must collect quality data for an expanding number of internal and external stakeholders. The three major foci of measurement are:
- Regulatory/Accreditation—Examples would include the Centers for Medicare & Medicaid Services (CMS) required core measures (e.g., fibrinolytic therapy received within 30 minutes of emergency department (ED) arrival, aspirin at arrival) and documentation demonstrating achievement of Joint Commission standards.
- Mission—In addition to financial data, this would include department-specific quality improvement goals identified in a hospital's strategic plans. Another example would be data needed to establish additional credentials (Stroke Certification) or for award applications (Baldrige, American Hospital Association [AHA] NOVA). Hospitals benefit from comparing their performance to similar organizations through participation in benchmarking projects.
- Rapid Cycle Change—Project-specific data collected during the Plan-Do-Study-Act (PDSA) process to test small-scale process improvements and determine whether the change should be accepted, modified, or rejected. Measurement is usually done at the unit level by the same staff delivering the care, and collection is short term. This type of measurement is one of the most effective levers for achieving and sustaining process improvements; a minimal sketch of such unit-level measurement follows this list. More will be said about PDSA in the next section.
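To illustrate what rapid cycle measurement can look like at the unit level, the sketch below compares a week of baseline data with a week collected during a test of change and applies a locally chosen decision rule for the Act step. The measure, the time values, and the 10-minute threshold are all hypothetical, not drawn from any CMS specification.

```python
# A minimal sketch of unit-level PDSA measurement. The data and the
# decision threshold are hypothetical, chosen only for illustration.
from statistics import median

# Door-to-analgesia times (minutes) charted by unit staff: one week of
# baseline, one week during a small test of change (a standing-order protocol).
baseline = [72, 65, 80, 58, 91, 77, 69]
test_of_change = [48, 55, 41, 60, 52, 46, 57]

improvement = median(baseline) - median(test_of_change)
print(f"Baseline median: {median(baseline)} min")
print(f"Test median:     {median(test_of_change)} min")

# A simple, locally chosen rule for the Act step: adopt the change if the
# median improved by at least 10 minutes; otherwise modify it and retest.
if improvement >= 10:
    print("Act: adopt the change and continue monitoring.")
else:
    print("Act: modify the change and run another PDSA cycle.")
```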
Data to Information
Measurement usually begins with a question and quickly moves to data collection. There are three major steps required to collect data that can be used to provide feedback to clinicians and other stakeholders, and each has a unique set of challenges. The first is data generation, which includes all the processes and opportunities clinicians have to enter information into the medical record or management information system. Clinicians need to be aware of the definitions of the data elements they are recording and the rationale for collecting the information. Data elements should be easy to document; otherwise, clinician cooperation wanes and data accuracy suffers. Periodic surveillance and audits (stratified by provider) will help ensure that accurate data are created.
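As one way to stratify such an audit by provider, the sketch below tallies documentation completeness for a required data element across a set of audited charts; the providers and results are invented for illustration.

```python
# A minimal sketch of a data-generation audit stratified by provider.
# The audited charts and provider names are hypothetical.
from collections import defaultdict

# Each audited chart: (provider, whether the required element was documented).
audited_charts = [
    ("Provider A", True), ("Provider A", True), ("Provider A", False),
    ("Provider B", True), ("Provider B", True),
    ("Provider C", False), ("Provider C", True), ("Provider C", False),
]

totals = defaultdict(int)
documented = defaultdict(int)
for provider, is_documented in audited_charts:
    totals[provider] += 1
    documented[provider] += is_documented  # True counts as 1, False as 0

# Reporting completeness by provider lets feedback and retraining be targeted.
for provider in sorted(totals):
    rate = documented[provider] / totals[provider]
    print(f"{provider}: {documented[provider]}/{totals[provider]} complete ({rate:.0%})")
```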
The second phase is data abstraction, during which data are harvested from the system. This can be a very resource-intensive process, depending on the capabilities of the organization's data system; over a third of U.S. EDs remain exclusively paper-based.20 Interoperability of computer systems continues to be a challenge for many hospitals. Data elements drawn from billing or coding systems tend to have a consistent location, whereas data elements reflecting clinical processes may appear in multiple locations throughout the medical record, increasing staff time and training costs for abstraction. Interdepartmental cooperation may be required for successful abstraction, creating workflows that must be choreographed between busy departments. The final step in the abstraction process should be validation: a systematic, random spot check performed by a second abstractor to ensure the data are accurate.
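One way to operationalize that spot check is to draw a random sample of abstracted records for a second abstractor and report the agreement rate. The sketch below stubs the second abstractor's re-read; the record identifiers and values are hypothetical.

```python
# A minimal sketch of abstraction validation via a random spot check
# re-abstracted by a second abstractor. All records are hypothetical.
import random

# Minutes abstracted by the primary abstractor, keyed by record ID.
primary = {"rec1": 32, "rec2": 45, "rec3": 28, "rec4": 61,
           "rec5": 40, "rec6": 55, "rec7": 37, "rec8": 49}

random.seed(0)  # fixed seed so the audit sample is reproducible
sample_ids = random.sample(sorted(primary), k=3)

# In practice the second abstractor independently re-reads each sampled
# chart; here that step is stubbed, with one discrepancy simulated.
second_pass = {i: primary[i] for i in sample_ids}
second_pass[sample_ids[0]] += 1

matches = sum(primary[i] == second_pass[i] for i in sample_ids)
print(f"Spot-check agreement: {matches}/{len(sample_ids)} records")
```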
The third and final phase is data reporting. Key decisions include how much to report and to whom; these are strategic planning decisions that need to align with administrative, departmental, and unit goals.
As hospitals move fully into the world of reporting quality data, they are learning that there is a difference between collecting financial data and collecting quality data. Whereas the number of individuals generating, abstracting, reporting, and receiving financial data is fairly limited within the facility, reporting of quality data is a hospital-wide enterprise. Quality issues arise from variation, and no one knows the sources of and solutions to variation better than the front-line staff. Therefore, hospital leaders must establish an expectation that unit and department care teams will identify key process variables, measure them, report the results widely, and improve them as needed. This may require structure and culture modifications along the power/authority continuum (Figure 1). To manage the current complexity and future uncertainty of modern health care, quality improvement can no longer be just a department; it must be a way of thinking and behaving.
Figure 1. Power/authority continuum
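One common tool for separating routine variation from special-cause variation in a key process variable is an individuals (XmR) control chart. The sketch below applies the standard XmR limits to hypothetical daily door-to-doctor times; the values and the choice of measure are illustrative only.

```python
# A minimal sketch of an individuals (XmR) control chart for one key
# process variable. The daily values are hypothetical.
from statistics import mean

# Hypothetical daily median door-to-doctor times (minutes) for one unit.
daily_values = [34, 41, 38, 29, 45, 36, 95, 33, 40, 37, 31, 44]

moving_ranges = [abs(b - a) for a, b in zip(daily_values, daily_values[1:])]
center = mean(daily_values)
avg_mr = mean(moving_ranges)

# Standard XmR limits: center line +/- 2.66 times the average moving range.
ucl = center + 2.66 * avg_mr
lcl = max(center - 2.66 * avg_mr, 0)  # a time measure cannot be negative
print(f"Center: {center:.1f}  UCL: {ucl:.1f}  LCL: {lcl:.1f}")

# Points outside the limits signal special-cause variation for the care
# team to investigate, rather than routine noise to be tampered with.
for day, value in enumerate(daily_values, start=1):
    if value > ucl or value < lcl:
        print(f"Day {day}: {value} min is outside the control limits")
```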
Pending Measures
Multiple measures affecting the ED are already in place, and new measures are scheduled to begin affecting hospital payment in 2012 and beyond (Figure 2). These measures will ultimately appear on Medicare's Hospital Compare Web site,b as the core measures have.
Figure 2. Pending emergency department measures
| Measure Name | Effective Date |
|---|---|
| Use of Brain Computed Tomography (CT) in the Emergency Department (ED) for Atraumatic Headache | 2012 |
| Head CT Scan Results for Acute Ischemic Stroke or Hemorrhagic Stroke Patients Who Received Head CT Scan Interpretation Within 45 Minutes of Arrival | 2013 |
| Troponin Results for ED Acute Myocardial Infarction (AMI) Patients or Chest Pain Patients (with Probable Cardiac Chest Pain) Received Within 60 Minutes of Arrival | 2013 |
| Median Time to Pain Management for Long Bone Fracture | 2013 |
| Patient Left Before Being Seen | 2013 |
| Door to Diagnostic Evaluation by a Qualified Medical Professional | 2013 |
| Median Time from ED Arrival to ED Departure for Discharged ED Patients | 2013 |
| Median Time from ED Arrival to ED Departure for Admitted ED Patients | 2014 |
| Admit Decision Time to ED Departure Time for Admitted Patients | 2014 |
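Several of the pending measures are medians of timestamp differences. As a sketch of the underlying arithmetic, the snippet below computes a median arrival-to-departure time from a few visit records; the timestamps and format are illustrative and do not reflect CMS abstraction specifications.

```python
# A minimal sketch of computing a median arrival-to-departure throughput
# measure from timestamp pairs. The visit records are hypothetical.
from datetime import datetime
from statistics import median

# (ED arrival, ED departure) for discharged patients.
visits = [
    ("2011-06-01 08:12", "2011-06-01 10:47"),
    ("2011-06-01 09:03", "2011-06-01 11:58"),
    ("2011-06-01 10:30", "2011-06-01 12:05"),
]

fmt = "%Y-%m-%d %H:%M"
lengths_of_stay = [
    (datetime.strptime(dep, fmt) - datetime.strptime(arr, fmt)).total_seconds() / 60
    for arr, dep in visits
]
print(f"Median ED arrival-to-departure time: {median(lengths_of_stay):.0f} minutes")
```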
Previous CMS ED measures related primarily to clinical processes (fibrinolytic therapy received within 30 minutes of ED arrival and median time to ECG). Pending measures continue to focus on clinical processes (time to pain management and troponin results), but CMS has signaled a willingness to look more globally at ED processes by including the throughput measures (arrival to departure for admitted and discharged patients, decision to admit, door to diagnostic evaluation, and left before being seen). CMS has fended off criticism of these "nonclinical" measures by stating that despite their lack of focus on a specific clinical issue, they capture the totality of the ED experience, which frequently includes collaboration and coordination among many departments throughout the hospital. This rationale was supported by the results of a first-of-its-kind field test of the ED throughput measures.21 For 12 months, the Urgent Matters Learning Network (UMLN) II hospitals collected and reported monthly on the "arrival to departure for admitted and discharged patients" and the "decision to admit" measures. The hospital staff members were then interviewed to better understand the benefits and burdens of collecting and reporting the measures.
Staff reported that the measures were initially difficult to collect but that the learning curve quickly flattened. The need to access multiple IT systems was the challenge most frequently identified. Staff did not anticipate a need to hire additional staff if the measures became permanent, nor was additional training required to abstract the measures. One staff member needed only "a 5-minute phone call" to learn how to access the nursing documentation system. Staff overwhelmingly voiced support for the measures. An ED medical director said the throughput measures were like "barometers" because they gave a global view of ED performance, while other, narrower measures, such as Door to Doctor, were "yardsticks" yielding more specific information.
An ED nurse recalled how his facility chose the throughput target of 150 minutes for discharged patients: "We saw that our patients are grumpy after 150 minutes... that's how we picked the 150 minutes... But that's not really the best way to pick." Several staff reported that having and sharing the data gave them "greater legitimacy" when dealing with other departments and helped create a "culture of continuous quality improvement within the ED." Most importantly, staff used this information to support their position that ED crowding requires hospital-wide solutions and that it is not just an ED problem.
b. Centers for Medicare & Medicaid Services. Hospital Compare Web site. Available at: https://www.medicare.gov/hospitalcompare/search.html