
Patient Experience Survey Study of Equivalency: Comparison of CG-CAHPS Visit Questions Added to the CG-CAHPS PCMH Survey

Summary Report of Findings and Recommendations

Submitted to: Minnesota Department of Health, Health Care Homes Unit

By Dale Shaller, MPA; Alistair James O'Malley, PhD; and Carla Zema, PhD

November 23, 2012

Acknowledgements

The authors wish to acknowledge Cherylee Sherry of the Minnesota Department of Health for her leadership in spearheading the planning and implementation of this project. We also thank Sarah Fryda and Jason Newton of NRC Picker for conducting the survey data collection and preparation of data files for analysis. We are grateful to the clinic sites of HealthEast Care System, HealthPartners, North Memorial Health Care, and Stillwater Medical Group for their voluntary participation, and to their many patients for responding to the surveys used in this study.

About the Authors

Dale Shaller, MPA, is principal of Shaller Consulting Group, a health policy analysis and management consulting practice based in Stillwater, Minnesota. Alistair James O'Malley, PhD, is an associate professor of statistics in the Department of Health Care Policy at Harvard Medical School. Carla Zema, PhD, is an independent consultant and assistant professor of economics and health policy at St. Vincent College in Pittsburgh, Pennsylvania. All three authors are members of the national CAHPS Consortium providing technical assistance to CAHPS survey users across the country.

Introduction

This report presents a summary of findings and recommendations from a study commissioned by the Health Care Homes Unit of the Minnesota Department of Health (MDH) to assess the equivalency of two versions of the CAHPS Clinician & Group (CG-CAHPS) Survey: 1) the CG-CAHPS Visit Survey (CG-Visit), compared to 2) the CG-CAHPS Patient-Centered Medical Home Survey (CG-PCMH) modified to include the core questions from the CG-Visit Survey. Under current State of Minnesota health care reporting requirements, all clinics with sufficient patient volume are required to collect and report data using the CG-Visit survey. In addition, the MDH Health Care Homes Unit is considering asking certified health care home clinics to collect and report the CG-PCMH survey as part of its patient experience recertification requirements. To meet both surveying requirements, practices would have to administer two different survey versions. If survey results for the core CG-Visit survey questions used in both surveys are found to be the same, health care home clinics would be able to use only the modified CG-PCMH survey to meet both State of Minnesota reporting requirements, thus reducing survey burden and cost. This report summarizes the study background and aims, methods, key findings, and implications for state policymakers regarding the use of the CG-Visit survey versus the CG-PCMH survey with core CG-Visit items added.

Background and Aims

Minnesota's health reform law, enacted in 2008, includes a number of initiatives aimed at improving the quality and affordability of care. The health care homes provision of the law is designed to promote a new approach to primary care emphasizing the delivery of coordinated care from a team of health care providers focusing on common goals. Health care providers must meet a set of standards and criteria in order to be certified as a health care home in Minnesota.
Among these requirements, MDH is considering asking health care homes to collect and report the new CG-CAHPS Patient-Centered Medical Home Survey (CG-PCMH) as part of its patient experience recertification requirements. The Minnesota health reform law also includes requirements for providers to collect and report a set of quality measures through the Minnesota Statewide Quality Reporting and Measurement System. As part of this system, a rule that took effect in 2012 requires all primary care clinics with sufficient patient volume to collect and report the CG-CAHPS Visit Survey (CG-Visit). A large number of clinics throughout the state are currently collecting CG-Visit survey data that will be publicly reported in the spring of 2013. Given these two requirements for patient experience survey data -- one currently in effect for the CG-Visit survey and the other under consideration by the MDH Health Care Homes Unit -- primary care clinics seeking health care home recertification would need to collect and report both CG-Visit and CG-PCMH data. However, if a modified version of the CG-PCMH survey including the required CG-Visit core could satisfy both requirements, health care home clinics would need to collect and report only one survey instead of two. The ability of health care homes to use just one survey would reduce data collection and reporting burdens, and also would make possible the comparison of core CG-Visit results from primary care clinics that are health care homes to primary care clinics that are not health care homes.

Methods

To test equivalency between the core questions administered using the CG-Visit survey alone and the CG-PCMH survey modified to include the CG-Visit core questions, a randomized split sample of patients was drawn from six participating clinic sites in the Twin Cities metropolitan area recruited by the MDH Health Care Homes Unit. The distribution of sites volunteered by health care organizations participating in the study is shown in Table 1.

Table 1. Distribution of Participating Clinic Sites

Health Care Organization      Number of Sites
HealthEast Care System        2
HealthPartners                2
North Memorial Health Care    1
Stillwater Medical Group      1
Total                         6

Within each site, one arm of the sample received the CG-Visit survey; the other arm received the CG-PCMH survey plus the Visit survey core questions. Figure 1 illustrates the split sample study design.

Figure 1. Randomized Split Sample Design

The CG-Visit Survey (adult version) is a 37-item questionnaire that assesses the patient's experience of care in three composite domains: Access, Communication, and Office Staff. The CG-Visit Survey also includes individual items for Provider Rating, Office Recommendation, and Follow-up on Test Results. The Access composite consists of 5 core questions that ask the patient to report on their experience in the last 12 months. The communication (6 core questions) and

office staff (2 core questions) composites ask patients to report on their experience during their most recent visit. (See Appendix A for further detail on the CG-Visit questionnaire.)

The CG-PCMH Survey (adult version) is a 52-item questionnaire based on the 12-month version of CG-CAHPS. In addition to the core Access, Communication, and Office Staff composites, the CG-PCMH survey includes supplemental item sets assessing coordination of care, shared decision-making, comprehensiveness, and self-management. The Access composite measure is the same for both the CG-Visit and the CG-PCMH surveys. Both surveys also include the same core individual item for Provider Rating. The Communication and Office Staff composites, as well as an individual core item on Follow-up on Test Results, have similar items but ask about a different timeframe (i.e., last 12 months vs. most recent visit). The CG-Visit survey also has an additional core item for Office Recommendation. For purposes of this study, the CG-PCMH survey was modified to include the core Communication and Office Staff composite questions from the CG-Visit version, as well as the individual items for Office Recommendation and Follow-up on Test Results. These additional CG-Visit core questions were added as supplemental items at the end of the survey. The modified version had a total of 66 questions.

MDH contracted with NRC Picker through a competitive bidding process to conduct all sampling and data collection for the study. The sample frame for the study consisted of adult (age 18 or over), English-speaking primary care patients who had at least one visit to the clinic site in the previous twelve months. Primary care was defined to include visits by patients to family practice and internal medicine, including Med-Peds (Combined Internal Medicine & Pediatrics), providers. Samples were drawn from patient data files provided by the clinic sites.
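The per-site randomization can be sketched as follows. This is an illustrative reconstruction, not the contractor's actual procedure; the patient identifiers and fixed seed are hypothetical.

```python
# Illustrative sketch of the randomized split-sample assignment: a site's
# sampled patients are shuffled and divided evenly between the two survey
# arms. Patient IDs and the fixed seed are hypothetical.
import random

def split_sample(patient_ids, seed=0):
    """Randomly divide a site's sample into two equal survey arms."""
    ids = list(patient_ids)
    random.Random(seed).shuffle(ids)
    half = len(ids) // 2
    return ids[:half], ids[half:]   # (CG-Visit arm, CG-PCMH arm)

site_sample = [f"pt{i:04d}" for i in range(2500)]   # 2,500 patients per site
visit_arm, pcmh_arm = split_sample(site_sample)
print(len(visit_arm), len(pcmh_arm))   # 1250 1250
```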
The sample was drawn irrespective of either the reason for the visit or the duration of the patient/provider relationship so that the full range of patients was represented. To achieve the target of 250 completed surveys for each of the two study arms within each site, samples of 1,250 patients per arm were drawn at the site level for each participating clinic, for a total of 2,500 patients sampled at each of the 6 clinics. Data were collected using a standard two-wave mailing methodology consisting of an initial mailing followed by a second mailing to non-respondents approximately three weeks after the first questionnaire was sent. The field period for the study opened in early May 2012 and closed at the end of June 2012.

Data collected for the study were provided to members of the CAHPS Consortium (the authors) for analysis. Results from the two surveys collected from both sample arms were compared at the site level and overall to assess equivalency with respect to response rates, respondent characteristics, item scores, item reliability, and site-level reliability. Any differences between the arms also were assessed to determine the impact on the overall ranking of the practice site.

Survey results were analyzed as both mean and top box scores. Mean scores are reported as the numerical average of a given response scale after converting response categories to a number, as shown in Table 4. For all mean scores, a higher number is better. Top box scores refer to the percentage of respondents choosing the most positive response category in a given response scale. Higher top box scores are better. The exact question wording for each rating and composite measure can be found in Appendix A.
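As a concrete illustration of the two scoring rules, consider a minimal sketch; the sample responses are hypothetical, and the category-to-number mapping follows a 4-point Never/Sometimes/Usually/Always scale.

```python
# Minimal sketch of mean scoring vs. top-box scoring. Mean scoring maps
# response categories to numbers and averages them; top-box scoring is the
# percentage choosing the most positive category. Sample data are hypothetical.

def mean_score(responses, values):
    """Numerical average after converting categories to numbers (higher is better)."""
    return sum(values[r] for r in responses) / len(responses)

def top_box_score(responses, top_category):
    """Percentage of respondents choosing the most positive category."""
    return 100.0 * sum(r == top_category for r in responses) / len(responses)

scale = {"Never": 1, "Sometimes": 2, "Usually": 3, "Always": 4}
answers = ["Always", "Always", "Usually", "Sometimes", "Always"]

print(mean_score(answers, scale))        # 3.4
print(top_box_score(answers, "Always"))  # 60.0
```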

Table 4. Definition of Labels, Response Scales, and Top Box Scores

RateDoc: Patients' rating of the provider. Response scale: 0-10, where 0 = worst and 10 = best. Top box score: % 9 and 10 combined.
RecDoc: Willingness to recommend the provider's office. Response scale: 3 = Yes, definitely; 2 = Yes, somewhat; 1 = No. Top box score: % Yes, definitely.
Access: Getting timely appointments, care, and information (composite of 5 questions). Response scale: 4 = Always; 3 = Usually; 2 = Sometimes; 1 = Never. Top box score: % Always.
Communication: How well providers communicate with patients (composite of 6 questions). Response scale: 3 = Yes, definitely; 2 = Yes, somewhat; 1 = No. Top box score: % Yes, definitely.
Staff: Helpful, courteous, and respectful office staff (composite of 2 questions). Response scale: 3 = Yes, definitely; 2 = Yes, somewhat; 1 = No. Top box score: % Yes, definitely.

Adjusted and unadjusted results were calculated. Adjusted scores were obtained using a hierarchical regression model that accounts for patient characteristics and unmeasured clinic-level survey-invariant factors. Because it allows for the possibility of small differences in patient populations across survey type, the adjusted ratings provide a more rigorous estimate of the magnitude of the survey effect for each item. The associated t-statistics and p-values quantify the level of statistical evidence that the true difference is non-zero. The adjusted mean is the expected mean for a baseline patient, and so represents the mean outcome for an item that one would expect if the same patient counterfactually answered both surveys.

Results

Response Rates

Table 2 shows individual clinic site-level responses as well as totals for the two survey versions. Response rates for the two surveys were similar, with the CG-Visit survey receiving an average unit response rate of close to 33% (i.e., 33% of individuals returned at least a partially complete survey) and the CG-PCMH survey just over 29%. For all clinic sites, the average response rate for the CG-Visit survey was higher than for the CG-PCMH survey. In some sites, the response rate for the CG-PCMH survey was significantly lower than for the CG-Visit, while in other sites the response rates were similar.
These findings suggest that surveys of even considerable length (n = 66 questions) are able to achieve response rates comparable to shorter surveys.
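Whether a site-level response-rate gap is larger than chance can be checked with a simple two-proportion z-test. The sketch below is illustrative only: it assumes roughly 1,250 mailed surveys per arm (the exact denominators are not reported here), and the study's own significance testing may have differed.

```python
# Sketch of a two-proportion z-test for comparing response rates between
# the two arms at one site. Counts are approximate (about 1,250 sampled
# per arm, an assumption); the study's own tests may differ.
from math import sqrt, erf

def two_prop_pvalue(x1, n1, x2, n2):
    """Two-sided p-value for H0: equal response rates, via a pooled z-test."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                           # pooled proportion
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail, two-sided

# Site D: 400 vs. 329 completed returns of roughly 1,250 sampled per arm.
p = two_prop_pvalue(400, 1250, 329, 1250)
print(p < 0.05)   # True: a gap this large is unlikely to be chance alone
```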

Table 2. Comparison of Response Rates and Completed Survey Returns
(Response rate, with number of completed survey returns in parentheses)

Clinic Site    CG-Visit          CG-PCMH
A              35.93% (447)      34.36% (425)
B              32.50% (404)      29.69% (369)
C              33.03% (405)      30.19% (368)
D              32.21% (400)      26.77% (329)
E              22.88% (283)      22.05% (273)
F              38.92% (481)      32.09% (397)
Totals         32.58% (2,420)    29.19% (2,161)

Respondent Characteristics

An important consideration for comparing the performance of the two surveys is to assure that the characteristics of patients surveyed in both arms of the study are approximately the same. Table 3 compares key characteristics of survey respondents across all 6 sites. The comparison indicates that patients surveyed with the two survey versions were strikingly similar with respect to gender, race, age, education, and self-reported physical and mental health status. Significance testing found no differences (p > .10 for all comparisons).

Table 3. Comparison of Respondent Characteristics Across All Sites
(# Respondents, then percent of total: CG-Visit / CG-PCMH)

Gender
  Female                1452 / 1310    62.24 / 62.98
  Male                  881 / 770      37.76 / 37.02
Hispanic
  No                    2184 / 2016    98.47 / 98.78
  Yes                   34 / 25        1.53 / 1.22
Race
  White                 2151 / 1955    94.92 / 94.54
  Black                 34 / 35        1.5 / 1.69
  Asian                 44 / 47        1.94 / 2.27
  Pacific Islander      2 / 2          0.09 / 0.1
  Native American       2 / 6          0.09 / 0.29
  Other                 33 / 23        1.46 / 1.11
Age
  18-24 years           49 / 42        2.1 / 2.02
  25-34 years           154 / 121      6.6 / 5.81
  35-44 years           169 / 147      7.25 / 7.06

  45-54 years           329 / 249      14.11 / 11.97
  55-64 years           557 / 524      23.89 / 25.18
  65-74 years           547 / 537      23.46 / 25.8
  75 years or older     527 / 461      22.6 / 22.15
Education
  8th grade or less     20 / 19        0.89 / 0.93
  High school, partial  47 / 41        2.09 / 2
  High school, finish   497 / 493      22.06 / 24
  College, partial      707 / 607      31.38 / 29.55
  College, finish       423 / 402      18.77 / 19.57
  Graduate study        559 / 492      24.81 / 23.95
Physical health
  Excellent             314 / 278      13.56 / 13.45
  Very Good             941 / 846      40.65 / 40.93
  Good                  784 / 716      33.87 / 34.64
  Fair                  240 / 191      10.37 / 9.24
  Poor                  36 / 36        1.56 / 1.74
Mental health
  Excellent             665 / 641      28.59 / 30.91
  Very Good             953 / 845      40.97 / 40.74
  Good                  547 / 460      23.52 / 22.18
  Fair                  147 / 108      6.32 / 5.21
  Poor                  14 / 20        0.6 / 0.96

Internal Validity

Table 4 presents correlations between the composite items and their constituent items. The objective is to assess whether the constituent items belong to a given composite (indicated by a high correlation). We compared the results between surveys to assess whether the composite items hold together equally well on each; almost all correlations are 0.7 or higher, implying that they do.

Table 4. Correlations Between Question Items Within Composites
(CG-Visit: N, Cor (mean), Cor (top); then CG-PCMH: N, Cor (mean), Cor (top))

Access composite          2420                      2161
  Q6                      929    0.779    0.773     757     0.784    0.776
  Q8                      1638   0.722    0.760     1539    0.750    0.746
  Q10                     662    0.719    0.702     648     0.745    0.707
  Q12                     84     0.819    0.775     97      0.796    0.781
  Q13                     2139   0.772    0.734     1926    0.743    0.703
Communication composite   2420                      2161

Table 4 (continued)

  Q16                     2164   0.816    0.796     1932    0.841    0.817
  Q17                     2163   0.883    0.846     1930    0.892    0.872
  Q19                     1865   0.858    0.827     1554    0.853    0.834
  Q20                     2154   0.758    0.701     1915    0.772    0.723
  Q21                     2165   0.848    0.809     1926    0.850    0.817
  Q22                     2159   0.814    0.778     1924    0.805    0.771
Office staff composite    2420                      2161
  Q27                     2132   0.924    0.922     1923    0.950    0.945
  Q28                     2155   0.870    0.860     1919    0.928    0.925

Site-Level Reliability

The analysis of clinic site-level reliability indicates that results are consistent across the survey versions. For most of these measures, reliabilities based on mean scoring are close to or exceed the .70 threshold generally recommended for public reporting of CAHPS results (assuming 250 completed surveys at the clinic site level). As expected, reliabilities are consistently lower when using top-box scoring as opposed to mean scoring. However, reliability estimates will likely improve when more clinics are included.

Table 5. Site-level Reliability Estimates

Survey Version              CG-Visit    CG-PCMH
Mean
  RateDoc                   0.762       0.715
  RecDoc                    0.707       0.754
  Access composite          0.844       0.833
  Communication composite   0.627       0.690
  Office Staff composite    0.839       0.908
Top Box
  RateDoc                   0.486       0.639
  RecDoc                    0.644       0.760
  Access composite          0.813       0.782
  Communication composite   0.554       0.663
  Office Staff composite    0.817       0.902

Mean and Top Box Scores

Given that the patient characteristics of the samples drawn for both surveys are the same, and that overall response rates are quite similar, the scores obtained using the two surveys can be fairly compared, even without adjusting for individual characteristics, to determine whether the CG-Visit rating and composite measures used in both surveys yield equivalent results. Tables 6 and 7 present comparisons of unadjusted and adjusted scores for the CG-Visit rating and composite measures, for mean and top box results respectively.

Table 6. Comparison of Mean Scores

Measure             N      Unadjusted             Adjusted               p-value
                           CG-Visit   CG-PCMH     CG-Visit   CG-PCMH
Rating questions
  RateDoc           2146   8.980      9.025       8.736      8.732      0.938
  RecDoc            2142   2.818      2.828       2.730      2.730      0.978
Composites
  Access            2164   3.408      3.380       3.355      3.321      0.001
  Communication     2179   2.866      2.844       2.825      2.802      0.016
  OffStaff          2168   2.926      2.879       2.851      2.809      0.0001

Table 7. Comparison of Top Box Scores

Measure             N      Unadjusted             Adjusted               p-value
                           CG-Visit   CG-PCMH     CG-Visit   CG-PCMH
Rating questions
  RateDoc           2146   76.1       76.5        71.3       70.6       0.592
  RecDoc            2142   85.6       86.0        78.5       78.4       0.911
Composites
  Access            2164   58.1       56.4        53.9       51.7       0.001
  Communication     2179   88.7       87.0        85.6       83.6       0.007
  OffStaff          2168   92.2       88.7        85.4       81.6       0.0001

Two important findings emerge from the comparisons shown in Tables 6 and 7. First, no significant differences are found when comparing the rating questions, implying that there is no evidence of any difference for these measures due to the surveys. Second, although small in magnitude, there are significant differences for the composite items in both unadjusted comparisons and adjusted comparisons based on the hierarchical model. In all cases, the mean and top box scores for CG-PCMH composites are consistently lower than the CG-Visit survey composite scores.
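The logic of the adjusted comparison can be illustrated with a simplified sketch. This is not the authors' hierarchical model: it uses synthetic data and a crude stratified comparison (averaging within-clinic differences) to show how clinic-level effects are held out of the survey-version contrast.

```python
# Hedged sketch of case-mix adjustment (not the study's hierarchical model):
# compare survey versions within clinic strata, then average the within-stratum
# differences so clinic composition cannot drive the result. Data are synthetic.
import random

rng = random.Random(0)
records = []
for clinic in range(6):
    base = 3.2 + 0.05 * clinic                 # clinic-level quality differences
    for arm in ("visit", "pcmh"):
        # Simulate the PCMH arm scoring slightly lower, the direction observed.
        effect = -0.04 if arm == "pcmh" else 0.0
        for _ in range(400):
            records.append((clinic, arm, base + effect + rng.gauss(0, 0.3)))

def stratum_mean(clinic, arm):
    vals = [s for c, a, s in records if c == clinic and a == arm]
    return sum(vals) / len(vals)

# Average within-clinic difference (PCMH minus Visit) across the 6 strata.
diffs = [stratum_mean(c, "pcmh") - stratum_mean(c, "visit") for c in range(6)]
adjusted_effect = sum(diffs) / len(diffs)
print(round(adjusted_effect, 2))   # negative, near the simulated -0.04 effect
```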

Impact on Clinic Rankings

Given the small but significant differences in composite results between the two surveys, we also examined the impact of survey version on clinic rankings. Each clinic was assigned a ranking from highest (= 1) to lowest (= 6) on each of the two surveys. The concordance of the results (using both mean scoring and top-box scoring) between the surveys was calculated using the Kendall correlation coefficient. Unlike the Pearson correlation, which uses the actual means and thus is susceptible to outliers, the Kendall correlation coefficient is based on the clinic ranks. Coefficients closer to +1 indicate a high degree of concordance in the rankings of the clinics. Table 9 shows the Kendall correlation coefficients.

Table 9. Kendall Correlation of Clinic Rankings

Measure                    Mean Scores    Top Box Scores
RateDoc                    0.600          0.867
RecDoc                     0.467          0.467
Access composite           0.733          0.733
Communication composite    0.867          1.000
Staff composite            0.333          0.333

The results for the Provider Rating, Access, and Communication composites indicate positive concordance of clinic rankings; in other words, the rankings of the clinics based on the CG-PCMH were similar to the rankings based on the CG-Visit. However, the results for the Provider Recommendation and the Office Staff composite indicate that clinic rankings change based on the survey version used. This finding has significant implications for public reporting.

Discussion and Recommendations

When patients respond to the same Access, Communication, and Office Staff questions on the shorter CG-Visit survey, their mean and top box responses are consistently higher than when similar patients respond using the longer CG-PCMH survey modified to include the core Visit survey items. Although the differences are statistically significant (p < .05), the absolute difference in scores for the composite measures is not large -- approximately .02 points on average between the two surveys.
Differences that are statistically significant are not surprising given the large number of surveys analyzed in this study. Perhaps the most compelling observation is that, while not large in magnitude, the differences are so consistent in the same direction. Interestingly, there is no difference in responses between the two surveys on the Rating and Recommend questions. Such small differences in the composite measures are not likely to be material, but they may matter to a clinic concerned about getting slightly lower scores on average by using the CG-PCMH survey with the core Visit questions added as supplemental items. In most cases the difference is not likely to substantially affect rank order, but it is possible that it could in some cases, and that

might matter to the affected clinics. The reasons why the CG-PCMH plus Visit survey scores are lower are not known, but this finding is consistent with the survey literature suggesting that adding extra similar questions to the end of a survey results in lower scores for those questions. Cognitive testing by the CAHPS Consortium suggests that lower scores may be due to respondent fatigue and/or frustration at having to answer similar questions twice.

The initial decision by the State of Minnesota to use the CG-Visit Survey for public reporting was based largely on the fact that most practices were already using a visit-based survey for their own internal surveying efforts. Some medical groups had even adopted the CG-Visit version as their routine internal survey. The State sought to minimize the burden to practices. However, in the years since that decision was made, other stakeholders, such as the Centers for Medicare & Medicaid Services (CMS) and the National Committee for Quality Assurance (NCQA), have chosen CG-CAHPS versions based on the 12-month core survey (e.g., the PCMH version). Psychometric analyses show that the CG-PCMH Survey performs better than the CG-Visit Survey at differentiating between practices. Given the high-stakes nature of public reporting, we recommend that the most prudent course for the State would be to require the same survey for all primary care clinics, and that that survey be the CG-PCMH survey (without the Visit items added). This would satisfy the MDH health care homes objective, and also help clinics meet other evolving national reporting requirements.
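As a methodological footnote, the Kendall coefficient used for the clinic-ranking comparison in Table 9 can be computed directly from the ranks. The clinic ranks below are hypothetical, chosen only to show how a single swapped pair of clinics yields a coefficient of 0.867, a value appearing in several entries of that table.

```python
# Kendall's tau for clinic rankings (no ties): concordant minus discordant
# pairs, divided by the total number of pairs. Example ranks are hypothetical.
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation for two equal-length rankings without ties."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

visit_rank = [1, 2, 3, 4, 5, 6]   # clinics A-F ranked on CG-Visit
pcmh_rank  = [1, 2, 4, 3, 5, 6]   # same clinics, one adjacent pair swapped
print(round(kendall_tau(visit_rank, pcmh_rank), 3))   # 0.867
```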

Appendix A

CG-CAHPS Visit Survey Reporting Composites and Ratings

ACCESS: Getting Timely Appointments, Care, and Information
(Response options for each Access question: Never / Sometimes / Usually / Always)

Q6: In the last 12 months, when you phoned this doctor's office to get an appointment for care you needed right away, how often did you get an appointment as soon as you thought you needed?
Q8: In the last 12 months, when you made an appointment for a check-up or routine care with this doctor, how often did you get an appointment as soon as you thought you needed?
Q10: In the last 12 months, when you phoned this doctor's office during regular office hours, how often did you get an answer to your medical question that same day?
Q12: In the last 12 months, when you phoned this doctor's office after regular office hours, how often did you get an answer to your medical question as soon as you needed?
Q13: Wait time includes time spent in the waiting room and exam room. In the last 12 months, how often did you see this doctor within 15 minutes of your appointment time?

COMMUNICATION: How Well Doctors Communicate With Patients

Q18: During your most recent visit, did this doctor explain things in a way that was easy to understand?
Q19: During your most recent visit, did this doctor listen carefully to you?
Q21: During your most recent visit, did this doctor give you easy-to-understand instructions about taking care of these health problems or concerns?
Q22: During your most recent visit, did this doctor seem to know the important information about your medical history?
Q23: During your most recent visit, did this doctor show respect for what you had to say?
Q24: During your most recent visit, did this doctor spend enough time with you?
OFFICE STAFF: Helpful, Courteous, and Respectful Office Staff

Q28: During your most recent visit, were clerks and receptionists at this doctor's office as helpful as you thought they should be?

Q29: During your most recent visit, did clerks and receptionists at this doctor's office treat you with courtesy and respect?

Follow-up on Test Results

Q17: During your most recent visit, when this doctor ordered a blood test, x-ray, or other test for you, did someone from this doctor's office follow up to give you those results?

Patients' Rating of the Doctor

Q25: Using any number from 0 to 10, where 0 is the worst doctor possible and 10 is the best doctor possible, what number would you use to rate this doctor? (Response: 0-10)

Willingness to Recommend

Q26: Would you recommend this doctor's office to your family and friends?

Appendix B

Clinic-Level Results

The following table compares mean scores and top box scores for the 6 clinic sites that participated in the MDH Health Care Homes comparison study of the CG-CAHPS Visit Survey (CG-Visit) and the PCMH Survey modified to include the core CG-CAHPS Visit Survey items (CG-PCMH). There are small differences in clinic mean and top box scores, although Site F appears to have consistently low scores across the ratings and composites; the Access score is especially low. Site A and Site D appear to have consistently high scores, but not for all measures.