Public Dissemination of Provider Performance Comparisons

Richard F. Averill, M.S.

Recent health care cost control efforts in the U.S. have focused on the introduction of competition into the health care system. This has led to the development of managed care plans that compete for patients based on price, quality, and scope of services. The assumption underlying any competitive system is that patients will select providers, at least in part, based on price and quality. For competition to be effective, sufficient information about the cost and quality of different providers must be available to patients (consumers) for them to make informed and effective choices. Thus, as managed care has expanded, the importance of information about provider performance has increased. As a result, many states have established health data agencies that are responsible for the public dissemination of information on provider performance.

Disseminating Provider Performance Comparisons

To date, the focus of the information disseminated by health data agencies has been on inpatient care. In most states, hospitals are required to submit a coded discharge abstract for every inpatient to the state data agency. The comparative information produced by the health data agencies is often referred to as provider report cards because, in essence, the reports grade providers on their performance. The provider report cards usually include comparative information on length of stay (LOS), charges, mortality, and sometimes complication rates. For the comparisons to be meaningful, they need to be adjusted for differences in severity of illness and risk of mortality. As a result, the state data agencies have needed to select a system for measuring severity of illness and risk of mortality before provider report cards could be produced.

Measuring Severity of Illness and Risk of Mortality

The most prevalent system for measuring severity of illness and risk of mortality used by state data agencies is the All Patient Refined DRGs (APR-DRGs), which are in use in more than 20 states. APR-DRGs refine the basic concept of Diagnosis Related Groups (DRGs), which are used for payment in the U.S. Medicare system, by adding two sets of four subgroups. One set of subgroups addresses patient differences relating to severity of illness, and the second set addresses risk of mortality. In APR-DRGs, severity of illness is defined as the extent of organ system loss of function or physiologic decompensation, while risk of mortality is the likelihood of dying. Since severity of illness and risk of mortality are distinct patient attributes, separate subgroups are assigned to a patient for severity of illness and for risk of mortality. The four severity of illness subgroups and the four risk of mortality subgroups represent minor, moderate, major, or extreme severity of illness or risk of mortality. The assignment of a patient to one of these four subgroups takes into consideration not only the specific secondary diagnoses, but also the interaction among secondary diagnoses, age, principal diagnosis, and certain procedures.
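Conceptually, each discharge record ends up carrying two independent subclass values. The short Python sketch below illustrates this shape; the field names and the example APR-DRG code are hypothetical, and in practice the values are assigned by APR-DRG grouper software from the coded claim:

    from dataclasses import dataclass

    # Subclass values shared by both scales: 1=minor, 2=moderate, 3=major, 4=extreme.
    SUBCLASS_LABELS = {1: "minor", 2: "moderate", 3: "major", 4: "extreme"}

    @dataclass
    class AprDrgAssignment:
        # Field names are hypothetical; real grouper output layouts differ.
        apr_drg: str              # base APR-DRG (the code below is a placeholder)
        severity_of_illness: int  # 1-4: extent of organ system loss of function
        risk_of_mortality: int    # 1-4: likelihood of dying

        def describe(self) -> str:
            return (f"APR-DRG {self.apr_drg}: "
                    f"{SUBCLASS_LABELS[self.severity_of_illness]} severity of illness, "
                    f"{SUBCLASS_LABELS[self.risk_of_mortality]} risk of mortality")

    # The two attributes are assigned separately: a patient can have major
    # severity of illness but only minor risk of mortality.
    patient = AprDrgAssignment("045", severity_of_illness=3, risk_of_mortality=1)
    print(patient.describe())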

In the APR-DRGs, the assessment of the severity of illness or risk of mortality of a patient is specific to the patient's underlying problem. In other words, the determination of the severity of illness and risk of mortality is disease specific. The most important component of determining patient severity of illness or risk of mortality is the recognition of the impact of interactions among secondary diagnoses. In APR-DRGs, high severity of illness and high risk of mortality are primarily determined by the interaction of multiple diseases. Patients with multiple diseases involving multiple organ systems are the patients with a more extensive disease process, who are difficult to treat and who have poor outcomes. As an example of the impact of severity of illness and risk of mortality on resource use and outcomes, the average LOS, charges, and mortality for the intracranial hemorrhage APR-DRG are shown in Table 1. Across the severity of illness and risk of mortality subclasses for intracranial hemorrhage, there is a 2.53-fold difference in average LOS, a 3.74-fold difference in average charges, and a 9.03-fold difference in mortality.
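A Table 1-style summary can be rebuilt from any discharge extract that carries the subclass assignments. The following sketch (Python with pandas; the column names and the tiny sample are invented for illustration) computes per-subclass averages and the fold differences between the extreme and minor subclasses:

    import pandas as pd

    # Hypothetical mini-extract for a single APR-DRG; real files have many columns.
    discharges = pd.DataFrame({
        "severity": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],  # severity subclass
        "los_days": [3, 4, 4, 5, 6, 6, 7, 8, 9, 9, 11, 12],
        "charges":  [8, 9, 10, 11, 13, 14, 17, 19, 22, 26, 30, 34],  # $000s
        "died":     [0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1],
    })

    # Per-subclass averages: the Table 1 layout (LOS, charges, mortality rate).
    summary = discharges.groupby("severity").agg(
        avg_los=("los_days", "mean"),
        avg_charge=("charges", "mean"),
        mortality=("died", "mean"),
    )
    print(summary)

    # Fold difference between the extreme (4) and minor (1) subclasses.
    for measure in summary.columns:
        lo, hi = summary.at[1, measure], summary.at[4, measure]
        print(measure, f"{hi / lo:.2f}-fold" if lo else "undefined (no minor events)")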

Provider Report Cards

An example of a provider report card is contained in Figure 1. This type of report card is produced annually by the state of Florida. The format of the provider report card is similar to the format used by consumer magazines that rate consumer products. The Florida provider report cards are produced separately for individual hospital product lines; the report card in Figure 1 is for cardiac surgery. Florida uses APR-DRGs to adjust for differences in severity of illness and risk of mortality. The squares indicate poor performance and the triangles indicate good performance. In the Florida provider report cards, good performance indicates that the difference between the actual performance of a hospital and the severity adjusted or risk of mortality adjusted expected performance, based on state or local norms, is positive and statistically significant, and that the hospital's positive performance ranks it in the top 15 percent of hospitals in terms of the magnitude of the difference between expected and actual performance. Conversely, poor performance indicates that the difference between the actual performance of a hospital and the severity adjusted or risk of mortality adjusted expected performance, based on state or local norms, is negative and statistically significant, and that the hospital's negative performance ranks it in the bottom 15 percent of hospitals in terms of the magnitude of the difference between expected and actual performance. A circle indicates that the performance difference was statistically significant but the hospital did not rank in the top or bottom 15 percent of hospitals. A dash means that the performance difference was not statistically significant.
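The symbol assignment just described reduces to a rule on three inputs: whether the adjusted difference favors the hospital, whether it is statistically significant, and whether the hospital falls in the extreme 15 percent. A minimal sketch of that logic, with illustrative function and parameter names:

    def report_card_symbol(favorable: bool, significant: bool, extreme_15pct: bool) -> str:
        """Map a hospital's severity- or risk-adjusted result to a Florida-style symbol.

        favorable      -- actual performance better than the adjusted expected value
        significant    -- difference from the norm is statistically significant
        extreme_15pct  -- hospital ranks in the top/bottom 15% by size of difference
        """
        if not significant:
            return "dash"       # no statistically significant difference
        if not extreme_15pct:
            return "circle"     # significant, but not among the extreme 15%
        return "triangle" if favorable else "square"  # good vs. poor performance

    print(report_card_symbol(favorable=True, significant=True, extreme_15pct=True))    # triangle
    print(report_card_symbol(favorable=False, significant=True, extreme_15pct=False))  # circle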

The provider report cards were widely distributed and highly publicized in Florida. There were many newspaper articles reviewing the results. It was not uncommon to see the results of the provider report cards published by providers in radio or newspaper advertisements or on billboards. The advertisements would typically encourage patients to seek care at the hospital because, based on the provider report cards, it had the lowest mortality rate or was the most cost-effective hospital in the area.

Preparing for Provider Report Cards

Provider report cards place increased demands on the quality of medical records data and information systems within hospitals. The severity of illness and risk of mortality adjustments require thorough reporting of a patient's diagnoses. In particular, the presence of specific combinations of secondary diagnoses is a key factor in a patient being assigned to the higher severity of illness or risk of mortality subclasses. While DRGs have been used in the U.S. for payment for more than 15 years, the DRGs used for payment are not nearly as sensitive to the completeness and accuracy of the diagnostic information as the severity of illness and risk of mortality adjustments. Thus, in states where provider report cards are published, hospitals have needed to implement processes to ensure that incomplete or undercoded medical records data do not negatively impact the evaluation of their institution on the provider report cards.

Hospitals also need to be proactive in the evaluation of their own data. It is essential that individual hospitals not be surprised when the results of a provider report card are published. Hospitals need to anticipate their performance on the report cards using their own internal information systems. This permits the hospital to have a well-planned response to any negative results, and to develop improvement programs to correct any identified problem areas. In general, states that publish provider report cards make the APR-DRG severity of illness and risk of mortality adjusted state norms available to individual hospitals. In addition, national and regional APR-DRG severity of illness and risk of mortality norms are commercially available. These APR-DRG norms can be integrated into a hospital's internal information system to estimate the hospital's performance on future public provider report cards.

Table 2 contains an example of the type of internal report that is essential for anticipating and responding to public provider report cards. This report examines the average LOS of patients having a percutaneous cardiovascular procedure such as a PTCA. In this report, the hospital as a whole, and each physician who treats at least 20 patients, are compared to the state norm. The total line in the report is for the hospital as a whole and shows that there were 270 percutaneous cardiovascular procedures performed, with an average length of stay of 5.21 days. The expected LOS (exp LOS) is the average LOS that would be expected after adjusting for severity of illness, using state APR-DRG LOS norms. Thus, the average LOS for percutaneous cardiovascular procedures is 1.02 days longer than would be expected, and this difference is statistically significant as indicated by a P-value of 0.05 (i.e., there is less than a 5 percent chance that the difference between the average and expected LOS was due to chance). The report also shows that two of the physicians (D50147 and E59541) have an average LOS of approximately 3 days, even though their patients have a relatively high severity of illness, as indicated by an expected LOS of approximately 5 days. In a similar report, the risk-adjusted mortality for these two physicians was found to be consistent with state norms. The practice pattern of these two physicians can therefore be the basis for developing a clinical pathway for percutaneous cardiovascular procedures with a target LOS of three days. A successful implementation of such a clinical pathway would result in the hospital having an average LOS on the state report card that is one day lower than expected instead of one day higher than expected. Although the report in Table 2 is quite basic, it contains all the essential information necessary to anticipate and prepare for the results of provider report cards.
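A report like Table 2 can be approximated from per-case data once each case carries a severity-adjusted expected LOS from the norms. The sketch below is illustrative only: the data are randomly generated, and the one-sample t-test is an assumption, since the report does not state which significance test was used:

    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical per-case file: one row per percutaneous cardiovascular procedure.
    cases = pd.DataFrame({
        "physician": ["D50147"] * 25 + ["E59541"] * 25 + ["A10003"] * 30,
        "los_days": np.concatenate([
            rng.normal(3.0, 1.0, 25),   # two short-stay practice patterns
            rng.normal(3.2, 1.0, 25),
            rng.normal(6.3, 2.0, 30),   # one long-stay practice pattern
        ]),
        # Severity-adjusted expected LOS per case, looked up from state APR-DRG norms.
        "exp_los": rng.normal(5.0, 0.5, 80),
    })

    for physician, grp in cases.groupby("physician"):
        if len(grp) < 20:               # mirror the report's 20-patient minimum
            continue
        diff = grp["los_days"] - grp["exp_los"]
        t, p = stats.ttest_1samp(diff, 0.0)  # does actual LOS differ from expected?
        print(f"{physician}: n={len(grp)}  avg LOS={grp['los_days'].mean():.2f}  "
              f"exp LOS={grp['exp_los'].mean():.2f}  diff={diff.mean():+.2f}  p={p:.3f}")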

Communicating Results to Physicians

The development of an effective response to any negative results from a provider report card requires the active participation of the medical staff. A key factor in achieving the necessary participation is the presentation of data in a form that can be readily understood by the medical staff. The severity of illness adjustment and test of significance in Table 2 can give the medical staff confidence that the results are reliable. In addition, graphic displays of the data can be useful for illustrating the magnitude and nature of performance differences. A bubble graph can be particularly useful for this purpose. Figure 2 contains a bubble graph that displays the mortality and LOS results for the 12 physicians who annually admit 20 or more patients with a CVA. The size of each bubble is proportional to the number of patients treated by that physician. The horizontal axis plots the difference between the actual and expected average LOS, and the vertical axis plots the difference between actual and expected mortality. The expected LOS is severity adjusted and the expected mortality is risk of mortality adjusted using APR-DRGs. Physician C, who treats the greatest number of CVA patients, has an average LOS 3 days higher than expected and a mortality rate 7 percentage points higher than expected. In contrast, physicians A and L have an average LOS and mortality that are both substantially lower than expected. Although physician B has a lower than expected LOS, his or her mortality rate is higher than expected. On a single page, a bubble graph clearly illustrates the extent of the differences in resource use and outcomes. (A minimal plotting sketch of such a display appears after the summary.)

Summary

As competition among health care providers has increased in the U.S., there has been rapid growth in the amount of comparative information on provider performance made available to consumers. This availability requires hospitals to have information systems with the capabilities needed to prepare proactively for the public dissemination of comparative performance data. The information contained in provider report cards can be a valuable tool for hospitals to use for internal management and planning.

© 3M 1998
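As referenced above, here is a minimal sketch of a Figure 2-style bubble graph (Python with matplotlib); the physician values are hypothetical, loosely patterned on the description of physicians A, B, C, and L:

    import matplotlib.pyplot as plt

    # Hypothetical physician-level results for CVA patients:
    # (actual - expected LOS in days, actual - expected mortality in percentage
    # points, number of patients treated).
    physicians = {
        "A": (-2.0, -4.0, 35),
        "B": (-1.5, 5.0, 28),
        "C": (3.0, 7.0, 60),
        "L": (-2.5, -5.0, 40),
    }

    fig, ax = plt.subplots()
    for name, (d_los, d_mort, n) in physicians.items():
        ax.scatter(d_los, d_mort, s=n * 20, alpha=0.4)  # bubble area ~ patient count
        ax.annotate(name, (d_los, d_mort), ha="center", va="center")

    ax.axhline(0, linewidth=0.5)  # at expected mortality
    ax.axvline(0, linewidth=0.5)  # at expected LOS
    ax.set_xlabel("Actual minus expected average LOS (days, severity adjusted)")
    ax.set_ylabel("Actual minus expected mortality (pct. points, risk adjusted)")
    plt.show()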