ORIGINAL CONTRIBUTION

Is Emergency Department Quality Related to Other Hospital Quality Domains?

Megan McHugh, PhD, Jennifer Neimeyer, PhD, Emilie Powell, MD, MS, Rahul K. Khare, MD, MS, and James G. Adams, MD

Abstract

Objectives: Systems theory suggests that there should be relatively high correlations among quality measures within an organization. This was an examination of hospital performance across three types of quality measures included in Medicare's Hospital Inpatient Value-Based Purchasing (HVBP) program: emergency department (ED)-related clinical process measures, inpatient clinical process measures, and patient experience measures. The purpose of this analysis was to determine whether hospital achievement and improvement on the ED quality measures represent a distinct domain of quality.

Methods: This was an exploratory, descriptive analysis using publicly available data. Composite scores for the ED, inpatient, and patient experience measures included in the HVBP program were calculated. Correlations and frequencies were run to examine the extent to which achievement and improvement were related across the three quality domains and the number of hospitals that were in the top quartile for performance across multiple quality domains.

Results: Achievement scores were calculated for 2,927 hospitals, and improvement scores were calculated for 2,842 hospitals. There was a positive, moderate correlation between ED and inpatient achievement scores (correlation coefficient of 0.50, 95% confidence interval [CI] = 0.47 to 0.53), but all other correlations were weak (0.16 or less). Only 96 hospitals (3.3%) scored in the top quartile for achievement across the three quality domains; 73 (2.6%) scored in the top quartile for improvement across all three quality domains.

Conclusions: Little consistency was found in achievement or improvement across the three quality domains, suggesting that ED performance represents a distinct domain of quality.
Implications include the following: 1) there are broad opportunities for hospitals to improve, 2) patients may not experience consistent quality levels throughout their hospital visit, 3) quality improvement interventions may need to be tailored specifically to the department, and 4) consumers and policy-makers may not be able to draw conclusions on overall facility quality based on information about one domain.

ACADEMIC EMERGENCY MEDICINE 2014;21:551-557 © 2014 by the Society for Academic Emergency Medicine

The Centers for Medicare and Medicaid Services (CMS) is changing the way that it pays for health services.1 In an effort to improve the value of its expenditures, CMS now reimburses providers based on care quality, not just the quantity of services provided. One important component of CMS's value-based purchasing strategy is the Hospital Inpatient Value-Based Purchasing (HVBP) program.2 Beginning October 2012, CMS began withholding 1% of Medicare payments and redistributing those funds back to hospitals based on achievement or improvement on 12 process measures and eight patient satisfaction measures (Table 1). Of the 12 process measures included in the first year of the program, four are related to care delivered in the emergency department (ED): fibrinolytic therapy received within 30 minutes of hospital arrival (acute myocardial infarction [AMI]-7a), primary percutaneous coronary intervention (PCI) received within 90 minutes of hospital arrival (AMI-8a), blood cultures performed in the ED prior to initial antibiotic received in hospital (pneumonia [PN]-3b), and initial antibiotic selection for community-acquired pneumonia (CAP) in immunocompetent patients (PN-6).2,3 There is a growing interest in emergency medicine to understand factors influencing performance on these measures.4,5

Systems theory holds that high performance results from a culture of excellence that permeates throughout a hospital and that one should see correlation among quality measures within an organization.6,7 However, previous studies have found that hospitals that perform highly on one dimension of quality (e.g., patient experience) do not necessarily perform highly on others (e.g., mortality).8-10 One could speculate that ED performance represents a distinct dimension of hospital quality. EDs are physically separate and have different reimbursement structures, management, and staffing than inpatient units. If ED performance is not related to hospital performance, it signals a lack of consistency in quality within an organization and suggests that broad hospital quality improvement initiatives may need to be tailored to individual departments.

We have previously described hospital performance on the ED measures included in the HVBP program.11 However, to date, there has been no examination of the extent to which ED performance mirrors performance on other domains of hospital quality. We examined hospital achievement and improvement across the three domains of hospital quality included in Medicare's HVBP program: ED-related clinical process measures, inpatient clinical process measures, and patient experience measures. Our purpose was to determine whether a hospital's achievement and improvement on the ED quality domain is related to achievement and improvement on the inpatient and patient experience quality domains.

From the Center for Healthcare Studies (MM, JN, EP, RKK) and the Department of Emergency Medicine (MM, EP, RKK, JA), Northwestern University, Feinberg School of Medicine, Chicago, IL. Received October 21, 2013; revisions received November 15 and November 17, 2013; accepted November 18, 2013. The authors did not receive outside support or funding for this research. This work has not been published or presented elsewhere. The authors have no potential conflicts of interest to disclose. Supervising Editor: Lowell Gerson, PhD. Address for correspondence and reprints: Megan McHugh, PhD; e-mail: megan-mchugh@northwestern.edu. © 2014 by the Society for Academic Emergency Medicine. ISSN 1069-6563. doi: 10.1111/acem.12376
Although several studies have investigated hospital performance on publicly reported quality measures,12,13 this effort is unique in its focus on emergency care, its examination of both achievement and improvement, and its use of measures included in the new HVBP program. Results have important implications for department and quality improvement leaders, consumers, and policy-makers.

METHODS

Study Design
This was an exploratory, descriptive analysis of secondary data. Our institutional review board determined that approval was not required because the study did not involve human subjects.

Study Setting and Population
We obtained 2008 through 2010 performance data for the four ED-related clinical process measures, eight inpatient clinical process measures, and eight patient experience measures from the CMS Web site Hospital Compare (http://www.hospitalcompare.hhs.gov/). The clinical process measures are chart-abstracted measures that assess hospitals' compliance with evidence-based care related to AMI, heart failure, pneumonia, and surgical care improvement.
The patient experience measures are derived from the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey, which was developed by the Agency for Healthcare Research and Quality and asks patients about their experience in the hospital. Although public reporting of data on Hospital Compare is voluntary, only hospitals that report their performance measures are eligible for a full Medicare payment update. Ninety-seven percent of hospitals satisfactorily met the reporting requirements in 2010.14 We linked these data to the 2009 American Hospital Association Annual Survey, which contains information on hospital characteristics (e.g., size, ownership, region, teaching status), to compare the characteristics of the study hospitals with all other general medical and surgical hospitals.

Table 1
Measures Included in the Hospital Inpatient Value-Based Purchasing Program, Fiscal Year 2013

ED-related Clinical Process Measures*
AMI-7a: Fibrinolytic therapy received within 30 minutes of hospital arrival
AMI-8a: Primary PCI received within 90 minutes of hospital arrival
PN-3b: Blood cultures performed in the ED prior to initial antibiotic received in hospital
PN-6: Initial antibiotic selection for CAP in immunocompetent patients

Inpatient Clinical Process Measures
HF-1: Discharge instructions
SCIP-Inf-1: Prophylactic antibiotic received within 1 hour prior to surgical incision
SCIP-Inf-2: Prophylactic antibiotic selection for surgical patients
SCIP-Inf-3: Prophylactic antibiotics discontinued within 24 hours after surgery ends
SCIP-Inf-4: Cardiac surgery patients with controlled 6 AM postoperative serum glucose
SCIP-Card-2: Surgery patients on a beta-blocker prior to arrival who received a beta-blocker during the perioperative period
SCIP-VTE-1: Surgery patients with recommended venous thromboembolism prophylaxis ordered
SCIP-VTE-2: Surgery patients who received appropriate venous thromboembolism prophylaxis within 24 hours prior to surgery to 24 hours after surgery

Patient Experience Measures
Nurses always communicated well
Doctors always communicated well
Patients always received help as soon as they wanted
Pain was always well controlled
Staff always explained about medicines
Room was always clean and room was always quiet at night
Yes, staff did give patients discharge information
Patients who gave a rating of 9 or 10 (high)

Source: Federal Register 2011;76:26490-547. Additional information on measure specifications can be found in the measure specifications manual on CMS's QualityNet website: http://www.qualitynet.org/dcs/contentserver?c=page&pagename=qnetpublic%2FPage%2FQnetTier4&cid=1228771525863.
AMI = acute myocardial infarction; CAP = community-acquired pneumonia; Card = cardiac; HF = heart failure; Inf = infection; PCI = percutaneous coronary intervention; PN = pneumonia; SCIP = surgical care improvement project; VTE = venous thromboembolism.
*ED-related measures.

Study Protocol
We limited our analysis to hospitals that met the criteria for the HVBP program. Hospitals must be acute care hospitals paid under the Inpatient Prospective Payment System (IPPS). An exception was made for acute care hospitals in Maryland, which are not paid under the IPPS but are included in the program. Additionally, between 2009 and 2010, hospitals must have reported data from at least 100 HCAHPS surveys and data for at least four clinical process measures with at least 10 eligible cases.

We calculated scores for each performance measure according to the method used by CMS for the HVBP program, the details of which are described in the program's Final Rule published in the Federal Register on May 6, 2011.2 In brief, for each performance measure, hospitals receive an achievement score between 0 and 10 based on how much their current performance score exceeds the median for all hospitals. If the score is below the median, the hospital receives an achievement score of 0. Additionally, hospitals also receive an improvement score between 0 and 10 based on how much the score on the performance measure improved from the previous (i.e., baseline) year. If performance did not improve, the hospital receives an improvement score of 0.
Under the HVBP program, the final performance score is the higher of the achievement or improvement score. However, because we were interested in looking at both achievement and improvement, we investigated both scores separately. In administering the HVBP program, CMS calculates a composite score for all clinical process measures included in the program. We applied CMS's composite score methodology separately to the four ED measures, the eight inpatient measures, and the eight patient experience measures for both achievement and improvement, to create six composite scores. For each hospital, we summed total points earned for the performance measures and divided by the total number of points for which the hospital was eligible. Eligible points is equal to 10 (the highest possible score on a performance measure) times the number of performance measures for which the hospital reported at least 10 cases. Following CMS's methodology, we then multiplied by 100. Each hospital had two composite scores (achievement and improvement) for each quality dimension. The scores ranged from 0 to 100.

There are two important differences between CMS's methodology and our approach. First, CMS uses a 9-month performance period for the current and baseline time periods. However, due to the way data are reported in Hospital Compare, we used 12-month periods. October 2008 through September 2009 represented our baseline period, and October 2009 through September 2010 was our current period. These were the most recent data available from Hospital Compare at the time of our analysis. Second, CMS uses a minimum of four clinical process measures to calculate program composite scores. Because there are only four ED-related performance measures, we did not set a minimum for the calculation of the ED quality dimension scores.

Data Analysis
For both achievement and improvement, we calculated the distribution of hospitals by performance quartile.
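The scoring mechanics described in the Study Protocol can be sketched in a few lines of code. This is an illustrative simplification, not CMS's implementation: the function names and all numeric inputs are hypothetical, and the point formulas follow the general form in the 2011 Final Rule (points scaled between the median, or achievement threshold, and a benchmark representing top performance).

```python
# Illustrative sketch of HVBP-style scoring; names and numbers are
# hypothetical. Points are scaled between a threshold (the national
# median) and a benchmark, per the general form in the Final Rule.

def achievement_points(score, median, benchmark):
    """0-10 points for current performance relative to all hospitals:
    0 below the national median, 10 at or above the benchmark."""
    if score < median:
        return 0
    if score >= benchmark:
        return 10
    return round(9 * (score - median) / (benchmark - median) + 0.5)

def improvement_points(current, baseline, benchmark):
    """0-10 points for gains over the hospital's own baseline year."""
    if current <= baseline:
        return 0
    if current >= benchmark:
        return 10
    return round(10 * (current - baseline) / (benchmark - baseline) - 0.5)

def composite(points_by_measure):
    """Points earned divided by eligible points (10 per measure with
    at least 10 reported cases), multiplied by 100; range 0-100."""
    eligible = 10 * len(points_by_measure)
    return 100 * sum(points_by_measure) / eligible

# Hypothetical hospital reporting three of the four ED measures:
ed_points = [
    achievement_points(0.95, 0.92, 0.99),  # between median and benchmark
    achievement_points(0.88, 0.90, 0.98),  # below median -> 0 points
    achievement_points(1.00, 0.93, 0.99),  # at/above benchmark -> 10 points
]
print(composite(ed_points))  # ED achievement composite for this hospital
```

Note how the denominator counts only the measures the hospital actually reported, so a hospital is not penalized for measures with too few eligible cases.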
We calculated the number of hospitals that scored within the top quartile for the ED quality domain and then calculated whether those hospitals were also in the top quartile for the inpatient and patient experience domains. Because we were interested in the association between quality domain scores, but we did not want to assume a one-way causal effect (e.g., ED achievement is dependent on inpatient achievement), we used correlations and scatterplots to explore the relationship between the raw composite scores on the three quality domains. We ran the analyses separately for achievement and improvement. Analyses were performed using Stata 10.0.

RESULTS

Number of Hospitals in the Analysis
We determined that 3,030 hospitals met the criteria for the HVBP program. A total of 103 low-volume hospitals were dropped from the analysis because, for every ED measure, they either did not report or reported fewer than 10 cases. The final sample for our analysis related to achievement included 2,927 hospitals. Because 85 of those hospitals did not report enough cases in the baseline period to receive improvement scores, our analyses related to improvement included 2,842 hospitals. Compared to all general medical and surgical hospitals in the United States, the hospitals in the analysis included fewer small hospitals.

Correlations and Frequencies
There was a positive, moderate correlation between the ED and inpatient achievement scores (Figure 1). The correlation coefficient was 0.50 (95% confidence interval [CI] = 0.47 to 0.53). Of the 731 hospitals that scored in the top quartile for ED achievement, 382 (52.3%) scored within the top quartile for inpatient achievement (Table 2). All other correlations were weak (0.16 or less). Only 198 hospitals (6.7%) scored in the top quartile for achievement on both the ED and patient experience domains. Similarly, there was little overlap in high performance for improvement.
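The correlation and top-quartile cross-tabulation described under Data Analysis can be illustrated on synthetic data. The study itself used Stata on the Hospital Compare composites; the scores generated below are arbitrary assumptions chosen only to show the mechanics.

```python
# Synthetic illustration of the correlation and top-quartile overlap
# analysis; the generated scores are arbitrary, not the study data.
import numpy as np

rng = np.random.default_rng(0)
n = 2927  # number of hospitals with achievement scores in the study

ed = rng.uniform(0, 100, n)                    # ED composite scores
inpatient = 0.5 * ed + rng.uniform(0, 50, n)   # loosely related inpatient scores

# Pearson correlation between the raw composite scores
r = np.corrcoef(ed, inpatient)[0, 1]

# Cross-tabulation: hospitals in the top quartile on both domains
ed_top = ed >= np.percentile(ed, 75)
inpatient_top = inpatient >= np.percentile(inpatient, 75)
both_top = int(np.sum(ed_top & inpatient_top))

print(f"r = {r:.2f}, top-quartile overlap = {both_top} of {int(ed_top.sum())}")
```

In the actual analysis the composites came from Hospital Compare rather than a random generator; the sketch shows only how a Pearson correlation and a quartile overlap count are computed from two score vectors.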
Of the 678 hospitals that scored in the top quartile for ED improvement, only 254 (37%) scored in the top quartile for inpatient improvement and only 187 (28%) scored in the top quartile for patient experience improvement. Only 96 hospitals (3.3%) scored in the top quartile across all three achievement dimensions, and only 73 (2.6%) scored in the top quartile for all three domains of improvement.

[Figure 1. Scatterplots for the ED, inpatient, and patient experience quality dimensions, achievement and improvement scores. Panels: A) ED and inpatient scores, achievement; B) inpatient and patient experience scores, achievement; C) ED and patient experience scores, achievement; D) ED and inpatient scores, improvement; E) inpatient and patient experience scores, improvement; F) ED and patient experience scores, improvement.]

DISCUSSION

Our investigation of hospital performance using measures included in Medicare's new HVBP program revealed a positive, moderate correlation for achievement on the ED and inpatient measures. One would generally expect to see a strong positive relationship between ED and inpatient performance, as hospital quality is ultimately overseen by a single hospital board of directors, and resources that enhance quality (e.g., electronic health records) are often consistently available (or unavailable) across departments. However, our findings also show a weak relationship between achievement on the ED and patient experience measures. These findings are consistent with other studies showing a weak relationship between process of care and patient experience measures.8,15

This is the first study to investigate whether improvement on the ED quality domain is related to improvement on other quality domains. We found that the three quality domains do not necessarily improve in tandem. There are several implications of our findings for policy-makers, hospital leaders, and consumers.
First, for policy-makers, our results lend support to the notion that value-based purchasing programs should include measures that represent multiple dimensions of performance. If ED measures were excluded from Medicare's Hospital Inpatient VBP program, a different group of hospitals would have achieved the highest relative performance (and therefore the highest rewards). Although hospitals report that submitting multiple quality measures can be burdensome,2 our findings indicate that including measures from multiple dimensions provides a more complete picture of hospital quality.

Second, few hospitals achieve high performance across the three quality dimensions, and almost half of hospitals did not qualify as top performers in any dimension. This highlights a broad opportunity for improvement. Hospital and department leaders should review their scores on the measures included in the HVBP program, identify the dimensions of performance that are weakest, and focus attention in those areas, with the goal of improvement. Further, our findings have implications for hospital boards, which are ultimately responsible for hospital quality.16,17 Given that quality is one of many topics that boards discuss during their meetings,18 they may be inclined to review few measures. Our results suggest that they should request information on quality across departments and consider the consistency of quality across the organization.

Our findings also have important implications for consumers. Our results suggest that the level of quality a consumer will encounter during an ED-initiated hospital stay is likely to vary during that single stay. While quality of care may be high in the ED, this will not necessarily be consistent across the inpatient stay and measures of patient experience. Further, the variation in achievement across the three quality domains illustrates how difficult it may be for consumers to choose a hospital based on publicly available quality data. Assuming a
[Table 2. Distribution of hospitals by quality domain: (A) achievement quartile and (B) improvement quartile, cross-tabulating the ED, inpatient, and patient experience quality dimension quartiles.]
consumer is willing to review quality scores, he or she may have difficulty deciding between hospitals that have high patient experience scores versus those with high clinical quality scores.

Our results raise an important question about why hospitals perform strongly (or greatly improve) on one quality domain but not on others. In this effort, we did not explore the link between organizational culture or resources and performance; however, the absence of a strong correlation among the measures calls into question the notion that a culture of excellence can permeate through an organization, raising performance in all areas.6,19 Instead, targeted approaches in each quality domain may be necessary. Specifically, the ED may require separate and tailored quality initiatives to earn the highest quality improvement or achievement. This is an important topic for future research.

Another possible explanation for the absence of a strong correlation is that our implicit assumption about hospital leaders' attention to the three dimensions may be incorrect. Hospital leaders may be focused on aspects of quality that they believe are more meaningful (e.g., resuscitation performance, hospital-acquired conditions) than the measures included in the HVBP program. Importantly, the Institute of Medicine defined six domains of quality,20 and the measures currently included in the HVBP program do not encompass all six domains. Instead, the program measures provide insight only into selected aspects of performance.

LIMITATIONS

Our findings should be viewed in light of several limitations. First, as noted above, we used a 12-month period rather than a 9-month period of performance.
Doing so likely resulted in the inclusion of certain small hospitals that would not have met the reporting sample size criteria (at least four measures with 10 eligible cases) for the HVBP program had we used a 9-month reporting period. Second, in the 2011 Federal Register final rule for the HVBP program, several commenters expressed concern that 10 eligible cases per clinical process measure (the minimum threshold used by CMS and in this analysis) may be insufficient to produce reliable measure scores. CMS maintains that an independent analysis found that 10 cases was sufficient to produce reliable scores.2 Still, it is possible that some of our estimates in this analysis were unstable. Third, the four ED-related process of care measures were selected because of their relevance to care provided in the ED.3 However, not all cases for a particular measure represent care provided in the ED. For example, initial antibiotic selection (PN-6) is sometimes performed in an inpatient unit rather than the ED. That may explain the larger correlation between the ED and inpatient measures.21 Using the Hospital Compare data, there was no way for us to limit our analysis to patients who received care in the ED.

CONCLUSIONS

With the exception of a moderate positive relationship between performance on the ED and inpatient quality domains, we found little consistency in achievement and improvement across the three quality domains included in Medicare's new Hospital Value-Based Purchasing program. Most hospitals that demonstrated high achievement or improvement in one quality domain did not demonstrate high achievement or improvement in the other domains.
Our findings suggest that 1) there are broad opportunities for hospitals to improve, 2) patients may not experience consistent quality levels throughout their hospital visit, 3) quality improvement interventions may need to be tailored specifically to the department, and 4) consumers and policy-makers may not be able to draw conclusions on overall facility quality based on information about one domain.

References

1. Centers for Medicare & Medicaid Services. Report to Congress: Plan to Implement a Medicare Hospital Value-based Purchasing Program. Available at: http://www.cms.gov/medicare/medicare-fee-for-service-payment/acuteinpatientpps/downloads/hospitalVBPPlanRTCFINALSUBMITTED7.pdf. Accessed Feb 11, 2014.
2. Centers for Medicare & Medicaid Services. Medicare Program; Hospital Inpatient Value-based Purchasing Program. Federal Register, 2011. Available at: http://www.gpo.gov/fdsys/pkg/fr-11-5-6/pdf/11-1568.pdf. Accessed Feb 11, 2014.
3. Cheung D, Wiler JL, Newell R, Brenner J. The State of Emergency Medicine Quality Measures. ACEP News. Available at: http://www.acep.org/content.aspx?id=82861. Accessed Feb 11, 2014.
4. Newell R, Slesinger T, Cheung D, et al. Emergency medicine quality measures. ACEP News. August 2012.
5. American College of Emergency Physicians. Value Based Emergency Care (VBEC) Task Force. Report of the Value Based Emergency Care (VBEC) Task Force Quality and Performance Strategic Plan. Available at: http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&ved=ccwqfjaa&url=http%3a%2f%2fwww.acep.org%2fcontent.aspx%3fid%3d46846&ei=nceuuctgmeu2wxc3i-GYBQ&usg=AFQjCNEvM5QCI8AujpQANuipNxd7nzBqA&sig2=w6biGdAcKAfz4jZYsDyUg&bvm=bv.563433,d.b2I. Accessed Feb 11, 2014.
6. Shwartz M, Cohen AB, Restuccia JD, et al. How well can we identify the high-performing hospital? Med Care Res Rev 2011;68:290-310.
7. Wilson IB, Landon BE, Marsden PV, et al. Correlations among measures of quality in HIV care in the United States: cross sectional study. BMJ 2007;335:1085.
8.
Jha AK, Orav EJ, Zheng J, Epstein AM. Patients' perception of hospital care in the United States. N Engl J Med 2008;359:1921-31.
9. Lehrman WG, Elliott MN, Goldstein E, et al. Characteristics of hospitals demonstrating superior performance in patient experience and clinical process measures of care. Med Care Res Rev 2010;67:38-55.
10. Rosenthal MB, Landrum MB, Meara E, et al. Using performance data to identify preferred hospitals. Health Serv Res 2007;42:2109-19.
11. McHugh M, Neimeyer J, Powell E, et al. An early look at performance on the emergency care measures included in Medicare's hospital inpatient value-based purchasing program. Ann Emerg Med 2013;61:616-23.
12. Pines JM, Localio AR, Hollander JE, et al. The impact of emergency department crowding measures on time to antibiotics for patients with community-acquired pneumonia. Ann Emerg Med 2007;50:510-6.
13. Atzema CL, Austin PC, Tu JV, Schull MJ. Emergency department triage of acute myocardial infarction patients and the effect on outcomes. Ann Emerg Med 2009;53:736-45.
14. Centers for Medicare & Medicaid Services. APU Recipients: Hospital Inpatient Quality Reporting Program. Available at: http://www.qualitynet.org/dcs/contentserver?cid=1154977996543&pagename=QnetPublic%2FPage%2FQnetTier3&c=Page. Accessed Feb 11, 2014.
15. Young GJ, Meterko M, Desai KR. Patient satisfaction with hospital care: effects of demographic and institutional characteristics. Med Care 2000;38:325-34.
16. Conway J. Getting boards on board: engaging governing boards in quality and safety. Jt Comm J Qual Patient Saf 2008;34:214-20.
17. Murphy S, Mullaney A. The New Age of Accountability: Board Education and Certification, Peer Review, Director Credentialing and Quality. Chicago, IL: Center for Healthcare Governance, 2010.
18. Jha A, Epstein A. Hospital governance and the quality of care. Health Aff 2010;29:182-7.
19. Curry LA, Spatz E, Cherlin E, et al. What distinguishes top-performing hospitals in acute myocardial infarction mortality rates? Ann Intern Med 2011;154:384-90.
20. Institute of Medicine. Crossing the Quality Chasm. Washington, DC: National Academies Press, 2001.
21. Friedberg M, Mehrotra A, Linder J. Reporting hospitals' antibiotic timing in pneumonia: adverse consequences for patients? Am J Manag Care 2009;15:137-44.