EVALUATING THE MEDICARE HOSPITAL VALUE-BASED PURCHASING PROGRAM
A Thesis submitted to the Faculty of the Graduate School of Arts and Sciences of Georgetown University in partial fulfillment of the requirements for the degree of Master of Public Policy

By Lars D. Johnson, B.A.

Washington, DC
April 16, 2014
Copyright 2014 by Lars D. Johnson. All Rights Reserved.
EVALUATING THE MEDICARE HOSPITAL VALUE-BASED PURCHASING PROGRAM

Lars D. Johnson, B.A.

Thesis Advisor: William Encinosa, Ph.D.

ABSTRACT

The Patient Protection and Affordable Care Act (ACA) contains numerous provisions intended to improve the quality of health care in the United States. The act established the Medicare Hospital Value-Based Purchasing (HVBP) program, which distributes a small portion of net Medicare fee-for-service dollars to hospitals based on how well they perform on a set of quality measures. Prior evaluations of smaller hospital value-based purchasing programs, primarily the Premier Hospital Quality Incentive Demonstration (PHQID) that began in 2003, found few significant program effects. This study sought to answer two questions about the new Medicare HVBP program: Did the program have an effect on patient quality outcomes, as measured by hospital inpatient mortality rates? And, if the program did have an effect, how much would we expect outcomes to improve if a hospital improved on its quality measures by, say, 10 percent? Using data on all Medicare fee-for-service hospital stays from 2010 through 2012 in the states of Arizona (where the first HVBP performance period began on July 1, 2011) and Maryland (which was exempt from the HVBP program), this study applied a difference-in-differences logit hospital fixed effects analysis and found that HVBP had a statistically significant effect on patient outcomes. Hospital inpatient mortality rates declined an estimated 4.04 percent as a result of the policy. Additionally, it was estimated that if hospitals were to improve their HVBP total performance score by 10 percent, the probability of inpatient mortality in that hospital would fall by 6.6 percent. While there are inherent limitations to difference-in-differences evaluations, and further research should seek to confirm these findings, this initial evidence suggests that large-scale value-based purchasing may be an effective strategy to improve the quality of health care, at least for the Medicare population.
I dedicate this thesis to those whose lives have been, could be, or could have been improved by high quality health care.

Thanks are due to my teachers and supporters, including:

Hannah Carlson-Donohoe
William Encinosa
Walter McClure
Ted Kolderie
Judy Feder
Jean Mitchell
Keith Johnson
Natalee Johnson
TABLE OF CONTENTS

Introduction
Institutional Background
Review of Quality Theory Literature
Review of Value-Based Purchasing Literature
Conceptual Model
Data and Variables
Methodology
Results
Suggestions for Further Research
Policy Implications and Conclusions
Bibliography
TABLES

Table 1: HVBP program performance periods and measure domain weightings
Table 2: Measures used in the first performance period of the HVBP program
Table 3: Number of Medicare FFS stays and hospitals in data set, by state and year
Table 4: Average inpatient death rate, by state and HVBP performance period
Table 5: Descriptive statistics of primary variables of interest
Table 6: Logit estimates of probability of inpatient death, by state and year
Table 7: Logit estimates of probability of inpatient death, by HVBP TPS scores

FIGURES

Figure 1: Determinants of health care quality
Figure 2: Distribution of hospital HVBP total performance scores
Introduction

Year after year the federal budget is squeezed by the high and ever-rising costs of public health care programs (Congressional Budget Office 2013). In the private sector, the growing cost of employee health insurance has wiped out all gains in real income for employees over the last decade (Auerbach and Kellerman 2011). We might find this cost growth acceptable if we were getting what we pay for. Unfortunately, the American health care system spends more per capita than any other country, yet ranks in the middle of the pack on many measures of care quality, such as in-hospital patient mortality rates (Squires 2011). Health care in the United States is, in relative terms, low value. For the health of our people, and for the fiscal health of our nation, public policy should seek to address this dilemma.

Policy makers and private employers have tried various approaches to improve the value of U.S. health care, by attempting both to reduce cost and to improve quality. Through capitated managed care plans, they have tried paying a flat yearly rate for each patient rather than paying for any and all services delivered. They have tried using rate-setting systems combined with market purchasing power to lower procedure reimbursement rates (Sommers et al. 2012). Most recently, as data collection and measure design in health care have improved, a new approach is on the rise, one that focuses on the quality half of the value equation: paying for performance. Under this approach, a portion of payments to providers and hospitals depends on how well they score on measures that attempt to capture patient quality outcomes, care processes, hospital structure, and even patient satisfaction levels (Rosenthal et al. 2004).
As prescribed by section 3001 of the Patient Protection and Affordable Care Act (ACA), the Centers for Medicare & Medicaid Services (CMS) now pays partly for performance with its new Hospital Value-Based Purchasing (HVBP) program. Under HVBP, up to two percent of net Medicare fee-for-service payments to hospitals are based on their scores on a set of quality measures. Ideally, this incentive will push hospitals to improve on these measures and therefore to improve their quality of care. This raises two important policy questions. First, does the incentive imposed by HVBP work as intended to improve health care quality? And second, if HVBP does work as intended and leads hospitals to improve their quality measures by, for example, 10 to 20 percent, how much do patient outcomes improve, if at all? While it would also be worth considering the impact of HVBP on the other half of the value equation, the total cost of care, that question is beyond the scope of this paper.

Institutional Background

The Medicare Hospital Value-Based Purchasing (HVBP) program distributes a small and gradually increasing portion of Medicare Part A payments to hospitals according to their quality measure scores. In its first year of operation, fiscal year 2013, the payments were equal to 1 percent of the total expended under Part A. That number rises to 1.25 percent in 2014, 1.5 percent in 2015, 1.75 percent in 2016, and will finally plateau at 2 percent in 2017. Funds redistributed as part of the program are reallocated from the lowest performing hospitals to the highest performing hospitals, thus maintaining overall budget neutrality. One important quirk that will be discussed further below is that all hospitals in the state of Maryland are exempt from the HVBP program. This allows researchers to use Maryland hospitals as a natural control group.

Most of the measures upon which the HVBP is based were originally gathered through the Inpatient Quality Reporting (IQR) program, established by the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA). In its first year, HVBP used twelve IQR process measures, which are designed to evaluate the extent to which critical medical best practices are being followed. Additionally, the program used a set of eight measures based on patient responses to the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) patient experience of care survey. In subsequent years, the program adds five additional measures for clinical outcomes (such as death and complication rates) and one measure for overall efficiency (i.e., Medicare spending per beneficiary). Each year, the aggregated scores in each measure domain are weighted as described in Table 1 to yield a final total performance score (TPS) for each hospital. This score is used as the basis for HVBP incentive payments.

Table 1. HVBP program performance periods and measure domain weightings

Fiscal Year (Period)   Dates Covered                          Process   Patient Survey   Outcome   Efficiency
FY 2013 (Period 1)     July 1, 2011 - March 31, 2012          70%       30%              N/A       N/A
FY 2014 (Period 2)     April 1, 2012 - December 31, 2012      45%       30%              25%       N/A
FY 2015 (Period 3)     January 1, 2013 - December 31, 2013    20%       30%              30%       20%

Note: Fiscal year (FY) refers to the year of the incentive payment, which is based on past performance over the period in the dates covered column.
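The TPS construction described above amounts to a weighted sum over domain scores. The following is a minimal sketch of that aggregation; the domain score values are hypothetical, and the real CMS methodology first converts individual measures into achievement and improvement points before producing each 0-100 domain score.

```python
# Sketch of how a hospital's total performance score (TPS) is formed from
# its domain scores, using the FY 2013 weights from Table 1 (70% process,
# 30% patient survey). Domain scores here are hypothetical 0-100 values.

def total_performance_score(domain_scores, weights):
    """Weighted sum of domain scores; domains not yet in use (N/A) are omitted."""
    return sum(domain_scores[d] * w for d, w in weights.items())

fy2013_weights = {"process": 0.70, "patient_survey": 0.30}

example_hospital = {"process": 60.0, "patient_survey": 40.0}  # hypothetical scores
tps = total_performance_score(example_hospital, fy2013_weights)
print(tps)  # 0.70*60 + 0.30*40 = 54.0
```

In later fiscal years the same calculation simply gains outcome and efficiency entries in the weights dictionary, per the Table 1 schedule.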
Several of the process and outcome measures in use under HVBP are specific to three conditions that CMS considers high priority for improvement. Namely, five of the twelve process measures and three of the five outcome measures are specific to acute myocardial infarction (commonly known as heart attack), heart failure, and pneumonia. The remaining process and outcome measures, as well as all of the patient survey measures, are more general in their focus. The full set of measures used by HVBP in its initial performance period is described in Table 2.
Table 2. Measures used in the first performance period of the HVBP program

AMI7a (Process): The percentage of heart attack patients who were given fibrinolytic medication within 30 minutes of arriving at the hospital.
AMI8a (Process): The percentage of heart attack patients who were given percutaneous coronary intervention (PCI) within 90 minutes of arriving at the hospital.
HF1 (Process): The percentage of heart failure patients to which clear, detailed discharge instructions were given.
PN3b (Process): The percentage of pneumonia patients who received their initial emergency room blood culture prior to administration of antibiotics.
PN6 (Process): The percentage of pneumonia patients who were initially given the most appropriate antibiotic(s).
SCIPCard2 (Process): The percentage of surgery patients who were already taking beta-blocker drugs before surgery, and continued taking those drugs right before and right after their surgery.
SCIPVTE1 (Process): The percentage of surgery patients who were prescribed blood clot treatments, after some types of surgeries.
SCIPVTE2 (Process): The percentage of surgery patients who received their blood clot treatment within 24 hours, after some types of surgeries.
SCIPInf1 (Process): The percentage of surgery patients who were given an antibiotic within one hour before their surgery.
SCIPInf2 (Process): The percentage of surgery patients who were given the correct type of antibiotic before surgery.
SCIPInf3 (Process): The percentage of surgery patients who stopped preventative antibiotics within 24 hours of completing their surgery.
SCIPInf4 (Process): The percentage of heart surgery patients whose blood glucose levels were kept at the correct levels in the several days before surgery.
Nurse Communication (Patient Survey): Percent of patients who reported that their nurses 'Always' communicated well.
Doctor Communication (Patient Survey): Percent of patients who reported that their doctors 'Always' communicated well.
Responsiveness (Patient Survey): Percent of patients who reported that they 'Always' received help as soon as they wanted.
Pain Management (Patient Survey): Percent of patients who reported that their pain was 'Always' well controlled.
Medicine Communication (Patient Survey): Percent of patients who reported that staff 'Always' explained medicines before administering them.
Cleanliness & Quietness (Patient Survey): Percent of patients who reported that their room and bathroom were 'Always' clean, and that the area around their room was 'Always' quiet at night.
Discharge Info Given (Patient Survey): Percent of patients at each hospital who reported that, YES, they were given information about what to do during recovery.
Overall Hospital Rating (Patient Survey): Percent of patients who gave their hospital a rating of 9 or 10 on a scale from 0 (lowest) to 10 (highest).
Review of Quality Theory Literature

For the last half-century, researchers have sought to define and develop a theory around the concept of quality in health care. Seminal thinkers in the field settled on a three-term taxonomy for measures of quality: those based on structure, on process, and on outcomes (Donabedian 1966). Structural measures refer to the physical properties of a hospital, such as the technology and devices available and whether the hospital is for-profit or non-profit. Process measures refer to actions taken by providers in the delivery of care, such as the administration of a specific medication when a patient arrives at the hospital with a heart attack. Process measures are usually constructed in consultation with clinical best practices research. Outcome measures refer to the resulting effects of care, such as whether a patient died within 30 days of their operation or whether they acquired an infection in the hospital (Brook and McGlynn 1996). Process and outcome measures are the most commonly used measures of quality in financial incentive programs (Ibid).

Researchers and policy makers often debate which measure type is most appropriate; in reality the two have different strengths and are appropriate in different contexts (Mant 2001). Ultimately, positive patient outcomes are the final objective: they have inherent value to those receiving care. However, their use as a measure of provider quality is complicated by the fact that providers cannot control many things that contribute to patient outcomes. For example, patient age and health status affect outcomes regardless of the actions taken by a care provider. Outcome measures are also sensitive to the effects of chance; for example, individual providers and even whole hospitals may have their outcome measures heavily distorted by just a couple of additional
patient mortalities. Gathering a truly representative sample could require many years of data. The advantage of outcome measures is that they capture all care given, not just that measured by process measures. Outcome measures are analogous to a diagnosis of exclusion; we assume that, controlling for patient severity and with enough observations to normalize chance, the remaining difference in outcomes is due to variation in the quality of care.

The primary advantage of process of care measures is that they can be considered valid with far fewer observations. For example, Mant and Hicks described how detecting a significant 10 percent difference in a mortality outcome for patients with myocardial infarction would require data on 3,619 patients using outcome measures, whereas process measures shown to be correlated with those outcomes could be valid with only 48 patients (Mant and Hicks 1995). Additionally, process measures, when incorporated into the design of an incentive program, have the advantage of being directly actionable. For example, providers can easily see how to improve on a measure of whether fibrinolytic medication was administered on arrival at the hospital with a heart attack. In contrast, improving on outcome measures can be less straightforward, and can leave providers feeling like they don't know where to start (Mant 2001).

Fortunately for policy makers, process measures are linked with outcome measures, at least for certain conditions (Werner and Bradlow 2006; Peterson et al. 2006). Werner and Bradlow showed that ten process measures for acute myocardial infarction, pneumonia, and heart failure were jointly significantly correlated with risk-adjusted mortality rates, as determined from 2004 Medicare Hospital Compare and administrative
claims data (Ibid). The correlations between process and outcome measures for acute myocardial infarction and pneumonia were each independently significant, while the correlation for heart failure was not. Many of the ten measures studied by Werner and Bradlow are the same as those now in use under the HVBP program.

Review of Value-Based Purchasing Literature

The idea of paying for performance, or purchasing value, has been on the rise in recent decades (Rosenthal et al. 2004; Rosenthal and Frank 2006). Numerous programs based on these concepts have been launched for physicians, clinics, and hospitals. By 2006, nearly half of all HMOs were using pay-for-performance approaches, and the number has continued to rise (Rosenthal and Damberg 2013). An environmental scan in 2007 uncovered 40 value-based purchasing programs specifically at the hospital level (Damberg et al. 2007). The central idea behind such programs is that those delivering care, be they individual physicians or entire hospitals, should face incentives to deliver high quality care, as measured either by outcome measures or, more commonly, process measures.

In 2009, Mehrotra et al. conducted a thorough literature review of evaluations of hospital value-based purchasing, and discovered eight studies of three different programs (Mehrotra et al. 2009). Two studies evaluating the Hawaii Medical Service Association Hospital Quality Service and Recognition Pay for Performance Program found that rates of complication for surgical and obstetric patients declined by 2 percentage points after program onset, and that length of hospital stay also declined. Three studies examined the Blue Cross Blue Shield Michigan Participating Hospital Agreement Incentive Program. One of the studies (Nahra et al. 2006) found that process measures did improve under the
program, resulting in an estimated saving of 733 to 1,701 quality-adjusted life years. Mehrotra et al. concluded, however, that these five studies had serious methodological flaws, most notably that they all lacked control groups.

Also among the eight studies that Mehrotra et al. reviewed were three that addressed the Premier Hospital Quality Incentive Demonstration (PHQID) program. The PHQID was initiated in 2003 among hospital participants of the Premier Inc. data system across the nation, and is considered to be the program after which the current Medicare HVBP program was modeled. In perhaps the most widely cited of all hospital value-based purchasing studies, Lindenauer and colleagues found that PHQID resulted in significant improvement on ten process measures for acute myocardial infarction, heart failure, and pneumonia relative to hospitals outside of the program; the amount of improvement ranged from 2.6 percent (for acute myocardial infarction) to 4.1 percent (for heart failure) over the program's first three years (Lindenauer et al. 2007). In contrast to the previously discussed studies, Lindenauer et al. made use of matched control hospitals, selected on the basis of properties such as number of beds, teaching status, location, and other characteristics.

In another study of PHQID, Grossbart examined a limited set of hospitals in Cincinnati, Ohio, and found that four hospitals participating in PHQID showed a 2.6 percentage point greater improvement on the program's 17 process measures, relative to six non-participating control hospitals (Grossbart 2006). In a third PHQID study, Glickman et al. studied 500 hospitals participating in CRUSADE (Can Rapid Risk Stratification of Unstable Angina Patients Suppress Adverse Outcomes With Early Implementation of the
American College of Cardiology/American Heart Association Guidelines), including 54 hospitals that participated in both PHQID and CRUSADE as well as 446 control hospitals that participated only in CRUSADE (Glickman et al. 2007). They examined six acute myocardial infarction measures that were common to both PHQID and control group hospitals. They found no significant difference in a six-measure composite score between the two groups; however, two of the six measures showed individual significance. The study also found that there was no significant improvement in hospital mortality rates between the PHQID and control groups.

While the initial three PHQID evaluations showed some promising results, many of those results disappeared in later follow-up studies of the program. Werner et al. compared 260 hospitals participating in PHQID with 780 hospitals not participating in the program (Werner et al. 2011). They found that, while in the initial years of the program participating hospitals showed greater improvement in process measures than non-participants, those differences began to taper off. In 2008 there was no longer any statistically significant difference between the hospitals that were and were not participating. The authors hypothesize that this was due to ceiling effects, whereby leading hospitals neared the upper limits of the measures and were not incentivized to improve further.

Ryan et al. also found results that contradicted those of the initial round of PHQID studies (Ryan et al. 2012). They looked specifically at the effects of changes to the PHQID incentive structure that were introduced in the demonstration's second phase. Per the phase 2 changes, incentive payments were based partly on improvement from the prior year, rather than being awarded only to those in the top two quintiles as originally structured; the change was
intended to increase participation among the lowest performing hospitals. Ryan et al. used a difference-in-differences analysis and found that, while hospitals continued to improve in phase 2, their rate of improvement slowed. And hospitals in the lowest quartile, which would have benefitted most from the new incentive structure, did not show significantly more improvement than hospitals in the second-lowest quartile. In light of these results, Ryan et al. questioned whether quality improvement in phase 1 of the demonstration was "simply an artifact of unobserved selection" (Ibid).

Two studies examined the effect of the PHQID program on mortality rates specifically, and both found no statistically significant results (Ryan 2009; Jha et al. 2012). Ryan generated a 30-day mortality measure for each hospital, based on the actual mortality rate relative to the mortality rate as predicted by the characteristics of the population served. He then used a hospital fixed effects regression model to control for program selection effects. His analysis showed no significant effect of program participation on risk-adjusted 30-day mortality. In a similar evaluation, Jha et al. studied 30-day mortality among six million patients with acute myocardial infarction, heart failure, or pneumonia admitted after the program began in 2003. They too found that, after controlling for differences among PHQID hospital participants and patient risk factors, there was no significant difference in mortality rates under the pay-for-performance program. The results of these two studies are in keeping with the 2007 Glickman study results on mortality rates discussed earlier.

While the vast majority of academic evaluations of hospital pay-for-performance were directed at the PHQID program, one recent study examined the effects of a pay-for-performance program in northwest England (Sutton et al. 2012). In 2008 the English
National Health Service introduced the Advancing Quality demonstration project, modeled after PHQID. Sutton et al. examined risk-adjusted 30-day in-hospital mortality rates for acute myocardial infarction, heart failure, and pneumonia in 24 hospitals included in the program. They used a difference-in-differences analysis, comparing the 18 months before and the 18 months after adoption of the program, for patients both inside and outside the demonstration project. They found that risk-adjusted mortality decreased significantly at participating hospitals, by 1.3 percentage points, or 6 percent from prior levels. Sutton hypothesized that these results differed from those of the PHQID studies for two primary reasons: the incentive size was substantially larger for the English demonstration, and the participating hospitals had agreed to invest heavily in quality improvement activities as part of their involvement in the program.

One unpublished study examined the early effects of the new Medicare HVBP program (Ryan et al. 2013b). Ryan et al. compared 2,791 hospitals included in the program (all acute care hospitals except those in Maryland) to 1,018 hospitals not included (critical access hospitals, Veterans Affairs hospitals, and hospitals in Maryland), using a difference-in-differences analysis of three years before and three quarters after HVBP took effect. Ryan and colleagues found that hospitals participating in HVBP did not show significant improvement beyond their previous trends in improvement, and also that they did not show significant improvement relative to non-participating hospitals. The authors suspected that the relatively small size of the performance bonus might have been responsible for the insignificant results.
Conceptual Model

Patients go to the hospital when they get sick, seeking medical care to return them to a healthier state. In other words, patients seek a high quality outcome from their care. Patients have a multifaceted conception of a high quality outcome; for example, they want to avoid future readmission to the hospital due to preventable causes, to avoid persistent chronic pain, to avoid restrictions to their daily activities, and to avoid a reduction in their life span. Unfortunately, these are all difficult concepts to measure, especially at a time scale feasible for incentivizing better care. At the extreme end of the outcome spectrum, patients clearly do not want to die in the hospital. Fortunately for researchers, inpatient hospital deaths are commonly tracked and reported and thus easy to study. For this reason this study, along with many of the other hospital value-based purchasing studies previously reviewed, takes patient mortality to be the primary health outcome of interest.

There are three primary types of factors that contribute to the outcome quality of a hospital stay. Some factors are related to the patient, such as their age, gender, and whether they have chronic conditions. Second, the physical structure of the hospital has an influence including, for example, the medical equipment available and whether the hospital is a teaching hospital. Lastly, the actions taken by physicians and other providers in delivering care are highly important. A surgeon with a sharp mind and a commitment to research-based clinical best practice will perform better than one without these characteristics. These concepts are depicted in Figure 1 and described in further detail below.
Figure 1: Determinants of health care quality

Patient severity factors include properties inherent in the patient, such as their age, gender, and level of income. Other factors relate to the health conditions a patient has upon admission including, for example, whether the patient has renal failure, congestive heart failure, diabetes, or other chronic conditions. Still other severity factors are related to the urgency of the patient's situation, for example whether they entered the hospital through the emergency room or through non-emergency physician referral.

Hospital structural factors include things like whether a hospital has advanced medical technology on site, which may result in higher costs but gives doctors quicker results upon which to act. Non-technical factors come into play as well, such as whether a hospital is clean and well maintained. Structural factors also include the quality of management at the hospital and its organizational form, such as for-profit vs. nonprofit.
This study makes use of hospital fixed effects to factor out the static structural effects of each hospital. In doing so, it avoids the need for data on hospital structural characteristics.

Medical practice factors, such as the actions taken by doctors and other providers during a stay, also influence the quality of care outcomes. Care given by careful, well-trained, and intellectually current providers will be more likely to result in positive outcomes. Care quality is also influenced by whether effective processes are in place at a hospital and whether the hospital has a culture of consistently following best practices. Clearly there are countless ways in which medical practice influences patient outcomes, and it is not possible to measure all of them. The process measures that attempt to evaluate medical practice usually target only the most important best practices, used to treat the most prominent medical conditions.

Patient satisfaction, as measured in this study via the Medicare HCAHPS patient survey, is the final theoretical concept presented in Figure 1 above. Patients derive satisfaction partly from perceptions of the care received, for example depending on whether they felt listened to and whether they thought the hospital was clean and well managed. But patients also care greatly about the final quality outcomes of their stay, and even about the total costs, especially if they must pay a coinsurance or copay fee. A patient's satisfaction would likely be tainted if, for example, they suffered complications that required readmission or if their final bill seemed unreasonably high.
Data and Variables

This study uses two source data sets, which were ultimately merged into a single data set as described below. The first data set is a record of all hospital stays in the states of Arizona and Maryland for the years 2010 through 2012. Each observation includes information about the patient (such as age, gender, race, etc.), the diagnoses and treatments they received (stored in the form of ICD-9 codes), and a dummy variable indicating whether they died during the hospital stay. The data is part of the collection of Healthcare Cost and Utilization Project State Inpatient Databases (HCUP SID 2012), managed by the Agency for Healthcare Research and Quality (AHRQ). The second data set is the HVBP total performance scores (TPS) for all hospitals in Arizona during the first HVBP performance period, which is publicly available on the Medicare Hospital Compare website.

The HCUP SID data were restricted to include only stays for which the primary payer was fee-for-service (FFS) Medicare. Specific hospital and stay counts for each state and year, after limiting to FFS Medicare, are presented in Table 3. Additionally, given that inpatient mortality is a relatively rare occurrence, in running the fixed effects regressions described in the results section, several hospitals with small numbers of stays were dropped automatically due to issues with perfect prediction of the dependent variable. This, however, resulted in dropping only 0.78 percent of stays in the sample.

The HVBP TPS data from Hospital Compare were then merged into the HCUP SID data on the hospital identifier column for only those stays that took place in Arizona during the first HVBP period. This allowed each such stay to be associated with the TPS score of the hospital in which the stay took place. TPS scores were only available for
Arizona hospitals because Maryland hospitals are exempt from the HVBP program and so are not given scores; and TPS scores were only available for stays in the first HVBP period because CMS had not yet released scores for the second period at the time of analysis. Merging between the HCUP SID and the Hospital Compare data set was accomplished using two crosswalk tables, first to link between the HCUP hospital identifier number and the American Hospital Association (AHA) identifier number, and then to link from the AHA identifier to the Medicare Provider Number used in the Hospital Compare data.

Table 3. Number of Medicare FFS stays and hospitals in data set, by state and year

            Year 2010              Year 2011              Year 2012
State       Stays      Hospitals   Stays      Hospitals   Stays      Hospitals
Arizona     227,...    ...         ...        ...         ...        ...
Maryland    261,...    ...         ...        ...         ...        ...

The dependent variable of interest in this study is the HCUP SID dummy variable indicating whether or not a hospital stay ended with the patient dying. As previously described in the literature review section, patient mortality has been the most commonly used construct for outcome quality in past evaluations of value-based purchasing programs. Across all stays in the sample, approximately 2.65 percent resulted in the patient dying in the hospital. A more detailed breakdown by state and by the HVBP performance period in which the stay occurred is given in Table 4. These preliminary summary statistics seem to suggest that HVBP may have had an effect in Arizona relative to Maryland, where the policy was not in effect.
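The two-step crosswalk merge described above can be sketched in pandas. All column names and identifier values below are illustrative assumptions, not the actual HCUP SID, AHA, or Hospital Compare field names:

```python
import pandas as pd

# Hypothetical stay records and crosswalk tables; the real files use their
# own identifiers (an HCUP hospital ID, an AHA ID, and a Medicare provider
# number). This sketches only the chain of left merges.
stays = pd.DataFrame({"hcup_id": ["H1", "H2"], "died": [0, 1]})
hcup_to_aha = pd.DataFrame({"hcup_id": ["H1", "H2"], "aha_id": ["A1", "A2"]})
aha_to_cms = pd.DataFrame({"aha_id": ["A1", "A2"], "provider_num": ["030001", "030002"]})
tps = pd.DataFrame({"provider_num": ["030001", "030002"], "tps": [52.9, 61.2]})

merged = (stays
          .merge(hcup_to_aha, on="hcup_id", how="left")   # HCUP ID -> AHA ID
          .merge(aha_to_cms, on="aha_id", how="left")     # AHA ID -> Medicare provider number
          .merge(tps, on="provider_num", how="left"))     # attach TPS scores
print(merged[["hcup_id", "tps"]])
```

Left merges keep every stay in the sample; stays at hospitals without a TPS score (Maryland hospitals, or stays outside the first period) would simply carry a missing value in the `tps` column.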
Table 4. Average inpatient death rate, by state and HVBP period
              Pre-HVBP*   HVBP Period 1*   HVBP Period 2*   All Years*
Arizona       2.14%       2.09%            2.08%            2.11%
Maryland      3.10%       3.12%            3.21%            3.13%
Both States   2.64%       2.62%            2.69%            2.65%
The pre-HVBP period includes all stays prior to July 1, 2011; period 1 includes all stays between July 1, 2011 and March 31, 2012; period 2 includes all stays between April 1, 2012 and December 31, 2012. The HVBP policy was in effect in Arizona during period 1 and period 2. An * indicates a statistically significant difference in mean inpatient mortality between Arizona and Maryland during this period.

The primary independent variables of interest in this study are a dummy variable indicating whether the stay was at an Arizona hospital or at a Maryland hospital, and two variables indicating the month and year of the stay. The month and year variables allowed each stay to be assigned to either the pre-HVBP period, the first HVBP performance period, or the second performance period; three new indicator variables were generated for these three periods, and an extra combined indicator was generated for stays that occurred within either of the two performance periods covered by the policy.

The final independent variable of interest is the HVBP TPS score for the hospital at which the stay occurred. As previously described, this variable was only available for Arizona hospitals during the first HVBP performance period. The average TPS score across Arizona hospitals was 52.90, with a minimum of 16.1, a maximum of 81.8, and a standard deviation of 13.06. Technically, according to HVBP rules, a TPS score can range from 1 to 100. The distribution of TPS scores across Arizona hospitals is shown in Figure 2.
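The period assignment described above follows directly from the cutoff dates given in the note to Table 4, and can be sketched as a small date-bucketing function (variable and function names here are illustrative, not taken from the study's code):

```python
from datetime import date

def hvbp_period(stay_date):
    """Assign a stay to the pre-HVBP period or one of the two HVBP
    performance periods, per the cutoffs in the note to Table 4."""
    if stay_date < date(2011, 7, 1):
        return "pre"
    if stay_date <= date(2012, 3, 31):
        return "period1"
    return "period2"

def indicators(stay_date, state):
    """Build the indicator variables used in the regressions:
    the Arizona dummy, the combined HVBP-period dummy, and the
    separate period-1 / period-2 dummies."""
    period = hvbp_period(stay_date)
    return {
        "arizona": int(state == "AZ"),
        "hvbp_period": int(period != "pre"),  # combined indicator
        "period1": int(period == "period1"),
        "period2": int(period == "period2"),
    }

print(indicators(date(2012, 6, 15), "AZ"))
# {'arizona': 1, 'hvbp_period': 1, 'period1': 0, 'period2': 1}
```

The source data store month and year separately; any day-level date construction from them would bucket identically, since the cutoffs fall on month boundaries.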
Figure 2. Distribution of the HVBP total performance score across hospitals
Total performance scores as calculated by the Centers for Medicare & Medicaid Services (CMS) for all Arizona hospitals eligible for HVBP incentive payments over the first HVBP period, between July 1, 2011 and March 31, 2012.

There are also a number of control variables from the HCUP SID data that are used in this study. These include age, gender, race, admission source, and the patient's income quartile within their home zip code. The age variable was also coded as a series of five age quintile indicator variables. Descriptive statistics for these variables are presented in Table 5, both separated by state and in aggregate.
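The age-quintile coding mentioned above can be sketched as follows. This is an assumption-laden illustration: the study does not report its quintile cutpoints, so the sketch computes nearest-rank cutoffs from a tiny made-up age sample.

```python
def quintile_cutoffs(values):
    """20th/40th/60th/80th percentile cutoffs (nearest-rank method)."""
    s = sorted(values)
    n = len(s)
    return [s[min(n - 1, int(n * q / 5))] for q in (1, 2, 3, 4)]

def age_quintile_dummies(age, cutoffs):
    """Five mutually exclusive 0/1 indicators; in a regression, quintile 1
    would serve as the omitted reference category."""
    q = sum(age >= c for c in cutoffs) + 1  # quintile number, 1..5
    return {f"age_q{k}": int(q == k) for k in range(1, 6)}

# Illustrative ages only; the real cutoffs come from the full sample.
ages = [34, 51, 58, 63, 67, 70, 74, 78, 81, 88]
cuts = quintile_cutoffs(ages)
print(age_quintile_dummies(72, cuts))
# {'age_q1': 0, 'age_q2': 0, 'age_q3': 1, 'age_q4': 0, 'age_q5': 0}
```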
Table 5. Descriptive statistics of primary variables of interest
Age: Arizona Only: mean …, std. dev. …; Maryland Only: mean …, std. dev. …; Full Sample: mean …, std. dev. …
Gender: Arizona Only: Female 52.91%, Male 47.09%; Maryland Only: Female 56.62%, Male 43.38%; Full Sample: Female 54.87%, Male 45.13%
Race: Arizona Only: White 83.56%, Black 3.48%, Hispanic 9.24%, Other 3.72%; Maryland Only: White 68.08%, Black 27.95%, Hispanic 1.21%, Other 2.76%; Full Sample: White 75.36%, Black 16.45%, Hispanic 4.98%, Other 3.21%
Home zip code income quartile: Arizona Only: 1st 24.58%, 2nd 27.00%, 3rd 28.65%, 4th 19.77%; Maryland Only: 1st 34.19%, 2nd 26.35%, 3rd 22.02%, 4th 17.45%; Full Sample: 1st 29.74%, 2nd 26.65%, 3rd 25.09%, 4th 18.52%
Admission source: Arizona Only: Emergency 64.53%, Urgent 14.41%, Elective 20.95%, Other 0.11%; Maryland Only: Emergency 77.02%, Urgent 5.70%, Elective 13.76%, Other 3.51%; Full Sample: Emergency 71.12%, Urgent 9.81%, Elective 17.16%, Other 1.85%

Methodology

This study uses two different models to answer the first and the second questions presented in the introduction. To answer the first question, of whether the HVBP program successfully improved patient health outcomes, this study uses a series of logit hospital fixed effects regressions at the hospital discharge level, of the form shown in Equation 1. This form allows for a difference-in-differences analysis to test for the effect of the HVBP program on the probability of inpatient death. HVBP Period in the equation is an indicator variable equal to one if the stay occurred during a period incentivized by the HVBP program. Arizona is an indicator variable equal to one if the stay occurred at an Arizona hospital. By looking for significance on the interaction between these two indicators, it's possible to determine whether the policy had an effect. Additionally, if the
coefficient on the interaction is negative, it can be concluded that the policy successfully reduced the inpatient death rate.

Equation 1. Predicted probability of inpatient death based on HVBP policy status
died = β0 + β1(Arizona) + β2(HVBP Period) + β3(Arizona × HVBP Period) + γ + δ
γ = patient controls (age, gender, race, income, emergency admission, comorbidities)
δ = hospital-specific fixed effects

To answer the second research question, of the extent of the link between a hospital's HVBP TPS score and the likelihood of inpatient death at that hospital, this study uses a second model, described in Equation 2. Since the TPS score is only available for stays in Arizona during the first HVBP performance period, all regressions based on this model use a smaller subset of stays. Hospital fixed effects are not included in this second model since the regressions use only stays during one HVBP period, and since it's desirable that the full outcome variation among hospitals be reflected in the coefficient on the TPS Score variable. By looking for significance on the β1 coefficient, it can be determined whether the TPS score is linked with inpatient death rates. Additionally, if that coefficient is negative, it can be said that a higher TPS score is associated with lower death rates.

Equation 2. Predicted probability of inpatient death based on HVBP TPS score
died = β0 + β1(TPS Score) + γ
γ = patient controls (age, gender, race, income, emergency admission, comorbidities)

In addition to the control variables already described above, the γ placeholder in Equation 1 and Equation 2 also includes a set of 25 dummy variables that indicate the presence of several standard chronic conditions, also known as comorbidities (Elixhauser
et al. 1998). These indicator variables were generated using SAS software available on the HCUP website. The software references the diagnosis codes for each hospital stay to determine whether any of the chronic conditions apply.

Results

Regression using model 1 yielded significant results on the policy interaction term, suggesting that the HVBP policy did have an effect on inpatient mortality (see Table 6). Specifically, three regressions were run. In the first regression, all variables of interest and all control variables were used, but no fixed effects were included; in the second and third regressions, hospital fixed effects were added. In both the first and second regressions, the HVBP Period dummy variable used was an indicator of whether the stay happened during either of the two HVBP performance periods, i.e., whether the stay occurred between July 1, 2011 and December 31, 2012. In the third regression, the two performance periods were separated into two distinct coefficients and interaction terms.

It's interesting to note that in the third regression, only the coefficient on the second performance period's policy interaction term is significant. This seems to suggest that, while the policy effects were significant across both periods when considered together, the effects were more pronounced in the second period. This makes sense given that the quality improvement efforts incentivized by the program were likely implemented gradually over time. While the third regression tells us more about the timing of the quality improvement, the second regression is best suited to answer this study's question of whether the policy had an overall effect, given that the period indicator variable used in regression 2 covers both of the HVBP performance periods.
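The comorbidity flag generation described above, performed in the study by HCUP's SAS software, can be approximated in Python. The prefix map below is a tiny illustrative subset (congestive heart failure, diabetes, and hypertension, using their broad ICD-9 category prefixes), not the full 25-category Elixhauser definitions.

```python
# Illustrative subset of Elixhauser-style comorbidity categories, keyed by
# ICD-9 code prefix. The real HCUP software uses the complete published
# code lists for all 25 comorbidities.
COMORBIDITY_PREFIXES = {
    "428": "chf",           # congestive heart failure
    "250": "diabetes",      # diabetes mellitus
    "401": "hypertension",  # essential hypertension
}

def comorbidity_flags(diagnosis_codes,
                      categories=("chf", "diabetes", "hypertension")):
    """Return a 0/1 flag per comorbidity category, set if any of the
    stay's diagnosis codes falls under a matching prefix."""
    found = set()
    for code in diagnosis_codes:
        for prefix, name in COMORBIDITY_PREFIXES.items():
            if code.startswith(prefix):
                found.add(name)
    return {name: int(name in found) for name in categories}

print(comorbidity_flags(["4280", "25000", "7862"]))
# {'chf': 1, 'diabetes': 1, 'hypertension': 0}
```

Each resulting flag would enter the regressions as one of the 25 comorbidity dummy variables in γ.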
Interpreting regression 2 gives results consistent with the hypothesis that HVBP had a positive effect on outcomes, and with the trends shown in Table 4. The odds ratio coefficient for the Period FY 2013 & 2014 dummy indicates that a stay in Maryland during period one or two is 1.026 times as likely to result in inpatient death as a stay that took place in Maryland prior to the first period. The odds ratio coefficient for the period/Arizona interaction term indicates that a stay at a hospital in Arizona during period one or two has 4.4 percent lower odds of ending in mortality relative to a stay in Maryland in the pre-HVBP period. Note that with the introduction of fixed effects in regression 2, the coefficient on Arizona alone is no longer significant, and thus cannot be interpreted directly. It's likely that the significant effect on Arizona originally shown in regression 1 was absorbed into the hospital fixed effects coefficients for Arizona hospitals. Manual examination of the fixed effects coefficients corroborates this.

Regression using model 2 also yielded significant results on the variable of interest (see Table 7). In total, three regressions were run. The first regression included TPS Score as the only independent variable; the second and third regressions also included the standard set of patient controls. Additionally, in the third regression the sample was limited to stays where the patient was admitted for one of the three conditions specifically targeted by the HVBP program measures, namely acute myocardial infarction, heart failure, and pneumonia. The Total Performance Score odds ratio coefficient remains significant and less than one across all three regressions, indicating that an increase in a hospital's TPS score is linked with a decrease in the probability of inpatient mortality.
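The odds ratio interpretations above follow a simple rule: the percent change in odds implied by a logit odds ratio is (OR − 1) × 100. A minimal sketch, applied to the regression 2 coefficients from Table 6 (small differences from the rounded figures quoted in the text may reflect unreported digits):

```python
def pct_change_in_odds(odds_ratio):
    """Percent change in the odds of the outcome implied by a logit odds ratio."""
    return (odds_ratio - 1.0) * 100.0

# Combined-period dummy (odds ratio 1.026): higher odds of inpatient death
# in the HVBP periods relative to the pre-HVBP baseline (in Maryland).
print(round(pct_change_in_odds(1.026), 1))  # 2.6

# Period/Arizona interaction term (odds ratio 0.954): lower odds of death
# for Arizona stays during the HVBP periods.
print(round(pct_change_in_odds(0.954), 1))  # -4.6
```

For a rare outcome like inpatient death (about 2.65 percent of stays), percent changes in odds closely approximate percent changes in probability.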
Table 6. Logit estimates of probability of inpatient death, by state and year (odds ratios)
MODEL 1. Columns: (1) No FE; (2) FE, Combined Period; (3) FE, Split Period. Standard errors in parentheses.

VARIABLES                     (1)                 (2)                 (3)
Arizona                       0.793*** (0.0125)   … (0.297)           … (0.295)
Period FY 2013 & 2014         …** (0.0139)        1.026* (0.0139)
Period FY 2013 & 2014 * AZ    0.948** (0.0207)    0.954** (0.0210)
Period FY 2013                                                        … (0.0166)
Period FY 2014                                                        …*** (0.0175)
Period FY 2013 * AZ                                                   … (0.0258)
Period FY 2014 * AZ                                                   0.941** (0.0259)
Female                        0.864*** (…)        0.872*** (…)        0.872*** (…)
Black                         0.840*** (0.0132)   0.837*** (0.0146)   0.837*** (0.0146)
Hispanic                      0.932** (0.0259)    0.893*** (0.0257)   0.892*** (0.0257)
ER admission                  1.336*** (0.0191)   1.443*** (0.0233)   1.443*** (0.0233)
Age quintile 2                …*** (0.0258)       1.301*** (0.0265)   1.301*** (0.0265)
Age quintile 3                …*** (0.0282)       1.466*** (0.0293)   1.466*** (0.0293)
Age quintile 4                …*** (0.0314)       1.723*** (0.0332)   1.723*** (0.0332)
Age quintile 5                …*** (0.0403)       2.202*** (0.0429)   2.202*** (0.0429)
Zip income quartile 2         …*** (0.0132)       … (0.0151)          … (0.0151)
Zip income quartile 3         …*** (0.0128)       0.966* (0.0170)     0.966* (0.0170)
Zip income quartile 4         …*** (0.0147)       … (0.0196)          … (0.0196)
Constant                      …*** (…)            …*** (…)            …*** (…)
25 comorbidity controls       YES                 YES                 YES
Hospital fixed effects        NO                  YES                 YES
Observations                  1,441,380           1,430,255           1,430,255
Pseudo R2                     …                   …                   …
Standard errors in parentheses. *** p<0.01, ** p<0.05, * p<0.1
Table 7. Logit estimates of probability of inpatient death, by HVBP TPS scores (odds ratios)
MODEL 2. Columns: (1) No Controls; (2) With Controls; (3) With Controls, AMI/HF/PN Only. Standard errors in parentheses.

VARIABLES                     (1)              (2)                 (3)
Total Performance Score       0.990*** (…)     0.987*** (…)        0.985*** (…)
Female                                         0.843*** (0.0312)   0.821** (0.0747)
Black                                          0.790** (0.0875)    … (0.257)
Hispanic                                       … (0.0604)          … (0.166)
ER admission                                   1.134*** (0.0479)   … (0.138)
Age quintile 2                                 …*** (0.0843)       1.463** (0.277)
Age quintile 3                                 …*** (0.0976)       1.863*** (0.336)
Age quintile 4                                 …*** (0.110)        2.279*** (0.394)
Age quintile 5                                 …*** (0.138)        2.504*** (0.444)
Zip income quartile 2                          …* (0.0427)         … (0.111)
Zip income quartile 3                          …*** (0.0333)       … (0.0981)
Zip income quartile 4                          …*** (0.0411)       … (0.112)
Constant                      …*** (…)         …*** (…)            …*** (…)
25 comorbidity controls       NO               YES                 YES
Observations                  161,…            …,018               16,222
Pseudo R2                     …                …                   …
Standard errors in parentheses. *** p<0.01, ** p<0.05, * p<0.1
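As a rough check on the magnitudes implied by Table 7: the regression 2 odds ratio of 0.987 applies per TPS point, so a 10 percent improvement from the mean score of 52.9 (about 5.29 points) compounds to roughly a 6.7 percent reduction in the odds of death. This is a back-of-envelope sketch under a rare-event approximation, and will not exactly reproduce the predicted-probability figures reported in the text.

```python
OR_PER_POINT = 0.987  # Table 7, regression 2: odds ratio per TPS point
MEAN_TPS = 52.9       # average TPS across Arizona hospitals

def odds_multiplier(pct_improvement):
    """Compound the per-point odds ratio over a percentage TPS improvement
    from the mean score."""
    points = MEAN_TPS * pct_improvement / 100.0
    return OR_PER_POINT ** points

for pct in (10, 20):
    mult = odds_multiplier(pct)
    print(f"{pct}% TPS improvement -> odds of death x {mult:.3f} "
          f"({(1 - mult) * 100:.1f}% lower)")
```

The small gap between these figures and the study's 6.6 and 12.7 percent estimates is expected, since the study computed full predicted probabilities stay by stay rather than this odds approximation.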
To estimate the magnitude of the effect of the HVBP policy, predictions were calculated using the coefficients from model 1, regression 2. The second regression was chosen because the HVBP period variable in that regression represented the full period covered by the policy. First, the variables Arizona and Period FY 2013 & 2014 were manually set to 0, the Arizona * Period FY 2013 & 2014 interaction variable was set to 1, and a predicted chance of mortality was calculated for each hospital stay in the sample. Then, the interaction variable was also set to 0, and a second probability of mortality was calculated. Comparing the mean predicted probability of dying across stays under the former prediction with that under the latter yields an estimated 4.04 percent reduction in the probability of inpatient mortality as a result of the policy. A paired t-test found the two sets of predicted values to be significantly different.

Using a similar approach, question 2 can be answered by estimating the percent change in the probability of dying if the HVBP TPS score improved by, say, 10 to 20 percent. The TPS Score variable was manually increased, first by 10 percent and then by 20 percent, and after each increment the probability of dying was estimated, yielding an average predicted probability of dying at the initial TPS score, after a 10 percent increase, and after a 20 percent increase. This indicates that if a hospital were to improve its score by 10 percent, it would expect to see its mortality rate drop by 6.6 percent; if it improved by 20 percent, it would drop by 12.7 percent. A 20 percent improvement doesn't seem impossible for a hospital to achieve; the average TPS score was 52.9 and the standard deviation was 13.06, indicating a 20 percent improvement is less than a single standard deviation change. A paired t-test found the differences between the baseline, a 10 percent
TPS score improvement, and a 20 percent improvement to be significant in both cases.

Suggestions for Further Research

Both question 1 and question 2 yielded statistically significant results indicating that HVBP had a positive effect on health care quality. While the substantive impact of the policy is relatively small, a 4.04 percent reduction in the death rate, it is notable, especially given that hospital fixed effects were used. Still, additional research should be done to confirm the validity of the policy effect. Namely, research should seek to identify whether there were other quality improvement efforts in Arizona that began during the same period in which the HVBP policy came into effect, to determine whether they may have played a role in the observed quality improvement. Similarly, research should confirm that other independent external forces did not cause Maryland's outcomes to decline over the same period. While external forces that had an impact on both states would be controlled for by the difference-in-differences analysis, any forces acting in only one of the states that began at roughly the same time as the HVBP program could confound the effects of HVBP itself.

In addition to looking more closely at what was happening in Arizona and Maryland during the years studied, it might be wise to consider a different set of control and experimental hospital stays. On the control side, this could mean using a data set that includes Veterans Affairs (VA) and critical access hospitals. These types of hospitals are not subject to the HVBP program, and they are present in nearly every state, thus providing a more geographically diverse control group. On the experimental side, hospital data from
other states, or even a much broader set of data from all non-Maryland states, could be used in place of Arizona hospital data as the experimental group.

Future research should also consider the implications of the policy for non-Medicare fee-for-service patients, including members of Medicare Advantage, members of other public health care programs, and members of private insurance plans. This study considered the Medicare fee-for-service population alone and did not seek to identify whether quality improvements incentivized through HVBP spilled over to other payer types.

Lastly, a full look at the topic of health care value must also consider the effects of the HVBP program on costs. While quality outcomes did appear to improve under the policy, this study did not seek to identify whether costs increased or decreased as a result. It is therefore not possible to say whether the net value of care improved under the policy.

Policy Implications & Conclusions

Pending further confirmation that the policy is indeed working as intended, policy makers should consider how to amplify its effects. Researchers Sutton et al., as described earlier, studied a value-based purchasing project in northwest England with significant and substantial positive effects on outcomes, on the order of a six percent reduction in risk-adjusted mortality rates (Sutton et al. 2012). The initial incentive size for the top quartile of hospitals under the English program was four percent of their revenue received (ibid.). To further incentivize quality improvement, policy makers could increase the payment incentive size beyond its current cap of two percent.
Joint Replacement Outweighs Other Factors in Determining CMS Readmission Penalties Abstract Many hospital leaders would like to pinpoint future readmission-related penalties and the return on investment
More informationModel VBP FY2014 Worksheet Instructions and Reference Guide
Model VBP FY2014 Worksheet Instructions and Reference Guide This material was prepared by Qualis Health, the Medicare Quality Improvement Organization for Idaho and Washington, under a contract with the
More informationIntroduction and Executive Summary
Introduction and Executive Summary 1. Introduction and Executive Summary. Hospital length of stay (LOS) varies markedly and persistently across geographic areas in the United States. This phenomenon is
More informationPASSPORT ecare NEXT AND THE AFFORDABLE CARE ACT
REVENUE CYCLE INSIGHTS PATIENT ACCESS PASSPORT ecare NEXT AND THE AFFORDABLE CARE ACT Maximizing Reimbursements For Acute Care Hospitals Executive Summary The Affordable Care Act (ACA) authorizes several
More informationHospital Value-Based Purchasing Program
Hospital Value-Based Purchasing (VBP) Program Fiscal Year (FY) 2017 Percentage Payment Summary Report (PPSR) Overview Presentation Transcript Moderator/Speaker: Bethany Wheeler-Bunch, MSHA Project Lead,
More informationFrequently Asked Questions (FAQ) Updated September 2007
Frequently Asked Questions (FAQ) Updated September 2007 This document answers the most frequently asked questions posed by participating organizations since the first HSMR reports were sent. The questions
More informationSpecial Open Door Forum Participation Instructions: Dial: Reference Conference ID#:
Page 1 Centers for Medicare & Medicaid Services Hospital Value-Based Purchasing Program Special Open Door Forum: FY 2013 Program Wednesday, July 27, 2011 1:00 p.m.-3:00 p.m. ET The Centers for Medicare
More informationChapter VII. Health Data Warehouse
Broward County Health Plan Chapter VII Health Data Warehouse CHAPTER VII: THE HEALTH DATA WAREHOUSE Table of Contents INTRODUCTION... 3 ICD-9-CM to ICD-10-CM TRANSITION... 3 PREVENTION QUALITY INDICATORS...
More informationThe Determinants of Patient Satisfaction in the United States
The Determinants of Patient Satisfaction in the United States Nikhil Porecha The College of New Jersey 5 April 2016 Dr. Donka Mirtcheva Abstract Hospitals and other healthcare facilities face a problem
More informationMedicare Spending and Rehospitalization for Chronically Ill Medicare Beneficiaries: Home Health Use Compared to Other Post-Acute Care Settings
Medicare Spending and Rehospitalization for Chronically Ill Medicare Beneficiaries: Home Health Use Compared to Other Post-Acute Care Settings Executive Summary The Alliance for Home Health Quality and
More informationNebraska Final Report for. State-based Cardiovascular Disease Surveillance Data Pilot Project
Nebraska Final Report for State-based Cardiovascular Disease Surveillance Data Pilot Project Principle Investigators: Ming Qu, PhD Public Health Support Unit Administrator Nebraska Department of Health
More informationPG snapshot Nursing Special Report. The Role of Workplace Safety and Surveillance Capacity in Driving Nurse and Patient Outcomes
PG snapshot news, views & ideas from the leader in healthcare experience & satisfaction measurement The Press Ganey snapshot is a monthly electronic bulletin freely available to all those involved or interested
More informationThe Current State of CMS Payfor-Performance. HFMA FL Annual Spring Conference May 22, 2017
The Current State of CMS Payfor-Performance Programs HFMA FL Annual Spring Conference May 22, 2017 1 AGENDA CMS Hospital P4P Programs Hospital Acquired Conditions (HAC) Hospital Readmissions Reduction
More informationHospital Inpatient Quality Reporting (IQR) Program
Hospital Readmissions Reduction Program Early Look Hospital-Specific Reports Questions and Answers Transcript Speakers Tamyra Garcia Deputy Division Director Division of Value, Incentives, and Quality
More informationUnderstanding Risk Adjustment in Medicare Advantage
Understanding Risk Adjustment in Medicare Advantage ISSUE BRIEF JUNE 2017 Risk adjustment is an essential mechanism used in health insurance programs to account for the overall health and expected medical
More informationUsing the patient s voice to measure quality of care
Using the patient s voice to measure quality of care Improving quality of care is one of the primary goals in U.S. care reform. Examples of steps taken to reach this goal include using insurance exchanges
More informationMinnesota Statewide Quality Reporting and Measurement System: Quality Incentive Payment System
Minnesota Statewide Quality Reporting and Measurement System: Quality Incentive Payment System JUNE 2015 DIVISION OF HEALTH POLICY/HEALTH ECONOMICS PROGRAM Minnesota Statewide Quality Reporting and Measurement
More informationCME Disclosure. HCAHPS- Hardwiring Your Hospital for Pay-for-Performance Success. Accreditation Statement. Designation of Credit.
CME Disclosure Accreditation Statement Studer Group is accredited by the Accreditation Council for Continuing Medical Education (ACCME) to provide continuing medical education for physicians. Designation
More informationAppendix. We used matched-pair cluster-randomization to assign the. twenty-eight towns to intervention and control. Each cluster,
Yip W, Powell-Jackson T, Chen W, Hu M, Fe E, Hu M, et al. Capitation combined with payfor-performance improves antibiotic prescribing practices in rural China. Health Aff (Millwood). 2014;33(3). Published
More informationHospital Value-Based Purchasing (At a Glance)
Hospital Value-Based Purchasing (At a Glance) Healthcare Financial Management Association South Carolina Chapter March 20, 2012 Presenters: Linda Moore, RN, Manager of Federal Programs and Services, CCME
More informationMissed Nursing Care: Errors of Omission
Missed Nursing Care: Errors of Omission Beatrice Kalisch, PhD, RN, FAAN Titus Professor of Nursing and Chair University of Michigan Nursing Business and Health Systems Presented at the NDNQI annual meeting
More informationHospital Inpatient Quality Reporting (IQR) Program
Hospital Inpatient Quality Reporting (IQR) and Hospital Value-Based Purchasing (VBP) Programs Claims-Based Measures Hospital-Specific Report (HSR) Overview and Updates Questions and Answers Moderator Bethany
More informationHospital Inpatient Quality Reporting (IQR) Program
Clinical Episode-Based Payment (CEBP) Measures Questions & Answers Moderator Candace Jackson, RN Project Lead, Hospital IQR Program Hospital Inpatient Value, Incentives, and Quality Reporting (VIQR) Outreach
More informationRegulatory Advisor Volume Eight
Regulatory Advisor Volume Eight 2018 Final Inpatient Prospective Payment System (IPPS) Rule Focused on Quality by Steve Kowske WEALTH ADVISORY OUTSOURCING AUDIT, TAX, AND CONSULTING 2017 CliftonLarsonAllen
More informationDemographic Profile of the Officer, Enlisted, and Warrant Officer Populations of the National Guard September 2008 Snapshot
Issue Paper #55 National Guard & Reserve MLDC Research Areas Definition of Diversity Legal Implications Outreach & Recruiting Leadership & Training Branching & Assignments Promotion Retention Implementation
More informationImproving Hospital Performance Through Clinical Integration
white paper Improving Hospital Performance Through Clinical Integration Rohit Uppal, MD President of Acute Hospital Medicine, TeamHealth In the typical hospital, most clinical service lines operate as
More informationMBQIP Quality Measure Trends, Data Summary Report #20 November 2016
MBQIP Quality Measure Trends, 2011-2016 Data Summary Report #20 November 2016 Tami Swenson, PhD Michelle Casey, MS University of Minnesota Rural Health Research Center ABOUT This project was supported
More informationNORTHWESTERN LAKE FOREST HOSPITAL. Scorecard updated May 2011
NORTHWESTERN LAKE FOREST HOSPITAL Performance Scorecard 2011 updated May 2011 Northwestern Lake Forest Hospital is committed to providing the communities we serve the highest quality health care through
More information2014 MASTER PROJECT LIST
Promoting Integrated Care for Dual Eligibles (PRIDE) This project addressed a set of organizational challenges that high performing plans must resolve in order to scale up to serve larger numbers of dual
More informationFrequently Asked Questions (FAQ) The Harvard Pilgrim Independence Plan SM
Frequently Asked Questions (FAQ) The Harvard Pilgrim Independence Plan SM Plan Year: July 2010 June 2011 Background The Harvard Pilgrim Independence Plan was developed in 2006 for the Commonwealth of Massachusetts
More information(202) or CMS Proposals to Improve Quality of Care during Hospital Inpatient Stays
DEPARTMENT OF HEALTH & HUMAN SERVICES Centers for Medicare & Medicaid Services Room 352-G 200 Independence Avenue, SW Washington, DC 20201 FACT SHEET FOR IMMEDIATE RELEASE April 30, 2014 Contact: CMS Media
More informationHCAHPS: Background and Significance Evidenced Based Recommendations
HCAHPS: Background and Significance Evidenced Based Recommendations Susan T. Bionat, APRN, CNS, ACNP-BC, CCRN Education Leader, Nurse Practitioner Program Objectives Discuss the background of HCAHPS. Discuss
More informationMaking the Business Case
Making the Business Case for Payment and Delivery Reform Harold D. Miller Center for Healthcare Quality and Payment Reform To learn more about RWJFsupported payment reform activities, visit RWJF s Payment
More informationCER Module ACCESS TO CARE January 14, AM 12:30 PM
CER Module ACCESS TO CARE January 14, 2014. 830 AM 12:30 PM Topics 1. Definition, Model & equity of Access Ron Andersen (8:30 10:30) 2. Effectiveness, Efficiency & future of Access Martin Shapiro (10:30
More informationAnalysis of 340B Disproportionate Share Hospital Services to Low- Income Patients
Analysis of 340B Disproportionate Share Hospital Services to Low- Income Patients March 12, 2018 Prepared for: 340B Health Prepared by: L&M Policy Research, LLC 1743 Connecticut Ave NW, Suite 200 Washington,
More informationIs there an impact of Health Information Technology on Delivery and Quality of Patient Care?
Is there an impact of Health Information Technology on Delivery and Quality of Patient Care? Amanda Hessels, PhD, MPH, RN, CIC, CPHQ Nurse Scientist Meridian Health, Ann May Center for Nursing 11.13.2014
More informationEducational Innovation Brief: Educating Graduate Nursing Students on Value Based Purchasing
Rhode Island College Digital Commons @ RIC Master's Theses, Dissertations, Graduate Research and Major Papers Overview Master's Theses, Dissertations, Graduate Research and Major Papers 1-1-2014 Educational
More informationBCBSM Physician Group Incentive Program
BCBSM Physician Group Incentive Program Organized Systems of Care Initiatives Interpretive Guidelines 2012-2013 V. 4.0 Blue Cross Blue Shield of Michigan is a nonprofit corporation and independent licensee
More informationSubmitted electronically:
Mr. Andy Slavitt Acting Administrator Centers for Medicare and Medicaid Services Department of Health and Human Services Attention: CMS-5517-FC P.O. Box 8013 7500 Security Boulevard Baltimore, MD 21244-8013
More informationHospital Value-Based Purchasing (VBP) Program
Hospital Value-Based Purchasing (VBP) Program: Overview of the Fiscal Year 2020 Baseline Measures Report Presentation Transcript Moderator Gugliuzza, MBA Project Manager, Hospital VBP Program Hospital
More informationScoring Methodology SPRING 2018
Scoring Methodology SPRING 2018 CONTENTS What is the Hospital Safety Grade?... 4 Eligible Hospitals... 4 Measures... 6 Measure Descriptions... 9 Process/Structural Measures... 9 Computerized Physician
More informationMinnesota Statewide Quality Reporting and Measurement System: Quality Incentive Payment System
Minnesota Statewide Quality Reporting and Measurement System: Quality Incentive Payment System JUNE 2016 HEALTH ECONOMICS PROGRAM Minnesota Statewide Quality Reporting and Measurement System: Quality Incentive
More information