Best Practices: Data for Learning and Improvement


These presenters have nothing to disclose.

Best Practices: Data for Learning and Improvement
Kevin Little, Ph.D., Improvement Advisor
Kris White, IHI Faculty
DRAFT, November 2014

Session Objectives
At the conclusion of this session, participants will be able to:
- Develop changes to improve analysis of survey data
- Evaluate the current state of their use of qualitative and quantitative patient experience data
- Distinguish slow feedback from fast feedback

Agenda
- Overview
- Recognize many data sources
- Survey data: top five advice
- Qualitative data: start with stories
- Slow vs. fast feedback

Patient Experience Data Overview
- No single perfect measure of patient experience exists
- Good enough data, multiply sourced, drives improvement
- Focus on things that matter
- Understand organizational opportunities AND team-specific opportunities

STOP DATA CRAZINESS: Stages of Dealing with Patient Experience Data
- Deny: "We have patient experience data????"
- Ignore: Just don't make eye contact, don't open the email, and if the subject comes up, change it and talk fast!
- Shoot the messenger: "The survey tool is biased, my patients are crabbier than anybody else's; it cannot possibly reflect what is going on in my unit!!"
- Accept: "OK, help me learn how to use this to drive change and understand our impact on patients and families."
- Use: Identify high-leverage improvements to create the best care outcomes and the best environment in which to work.

Symptoms of trouble
- "We pretty much just look at our performance internally, and overall we feel pretty good about it."
- "We look only at organizational numbers, rolled up; that's what matters at the end of the day."
- "We regularly review our data and form teams around the lowest scores."
- "Every month we review our scores; if we drop down, we form a team to fix it, and if it's up, we get a pizza party."
- "It's all so overwhelming; it's just so hard to know where to start."
- "CAHPS has really changed our focus; it's really the only thing we are focusing on now in my organization."

It's all about the clues
Patients and families pick up clues from how things work, look, and feel. These clues are then translated into an evaluation of the care and quality of the providers and the organization. We must constantly be asking ourselves: what are the data really telling us?

Sources of Patient Experience Data
A holistic perspective is critical!
- CAHPS: respecting its influence, understanding its limitations
- Press Ganey, NRC Picker, Gallup, Avatar, etc.
- Focus groups
- Patient Relations
- Patient/family advisors
- Billing
- Physicians
- Safety culture surveys
- Staff and provider engagement surveys
- "Hot comments": a gold mine!

A Table To Organize Patient Experience Data
(Each source is classified by data type and by whether it captures patient experience directly or indirectly.)

Survey data:
- CAHPS surveys (national government-sponsored patient experience surveys in the U.S.)
- 3rd-party formal surveys, linked to a common set of questions across multiple organizations
- In-house comment cards / open-ended questions of patients
- Staff vitality surveys, safety culture surveys

Focus groups and conversations:
- Patient/family advisors
- Patients and families
- Staff
- Physicians

Workplace ("Gemba") data:
- Front-line process/service performance data
- Rounding observations

Admin/operations data:
- Patient Relations data (grievances, complaints, and positive letters)
- Billing complaints and issues (U.S.)
- Dashboard metrics: LWBS, errors, safety performance, etc.

Why are multiple data sources important?

Patient experience is multi-faceted. Each data source gives you one view of patient experience; multiple sources provide different views and details and enable you to build a coherent picture.

Activity: Data self-assessment

Activity Instructions
- Individual (3 minutes): fill out the chart
- Table Bingo (5 minutes): find out who has at least one row with YES filled in across all four columns. What's the row that is most rarely filled in across the columns?
- Group debrief (5 minutes)

CAHPS and other formal survey data

What about CAHPS?
Why we care:
- Common across all U.S. hospitals (and now clinical groups, too)
- Public access
- Ballpark right stuff
- Suitable for dashboards and run charts
- CMS has your attention

Limitations in our work:
- Time lag: too delayed for improvement work
- Global numbers may not reflect targeted unit work
- Low response rates
- Silo focus, not team focus, for care
- The n problem (to double precision, you need to quadruple sample size)

CAHPS* data: five items to know
1. Use Top Box
2. Understand percentiles
3. Interpret correlations
4. Remember how the n matters
5. Plot your data in time order
*This advice also applies to every other formal patient experience survey we know.

Top Box: Why does it matter? What are the implications?

For survey data, "Top Box" refers to the most positive choice on an ordered scale.* In CAHPS data, "Always" is often the Top Box response.

*There are a few exceptions. For example, on HCAHPS survey question 21, Top Box refers to an overall rating of the hospital as 9 or 10 on the 0-10 scale, where 10 is the best hospital possible.

Source: CAHPS Clinician & Group Surveys, 12-Month Survey 2.0; population: adult; language: English; last updated September 1, 2011.
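As a concrete illustration of the definition above, here is a minimal sketch (with illustrative category labels) of how a top-box percent is computed from raw responses; the skip-missing convention is an assumption, since each vendor documents its own handling of unanswered items:

```python
def top_box_percent(responses, top="Always"):
    """Percent of answered responses that fall in the most positive category."""
    answered = [r for r in responses if r is not None]
    return 100 * sum(r == top for r in answered) / len(answered)

# Five answered responses, three of them "Always": top box = 60.0
print(top_box_percent(["Always", "Usually", "Always", "Sometimes", "Always"]))
```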

Three Levels of Caring (Fred Lee): Correlation of Patient Care and Evaluation

Staff Motivation    Staff Performance    Patient Evaluation
Inspired            Compassion           5 Very Satisfied
Required            Courtesy             4 Satisfied / 3 Neutral
Hired, Fired        Competence           1-2 Dissatisfied

Patients' Perception of Overall Quality of Care: likelihood to recommend (percent Excellent) by quality-of-care rating:
- Excellent: 87.0%
- Very Good: 22.8%
- Good: 7.9%
- Fair: 1.1%
- Poor: 2.8%
Patients who rate Quality of Care as Excellent are roughly four times more likely to recommend you than those who rate Quality of Care as Very Good.
Source: HCAHPS Survey & PRC Loyalty Survey: Why measure with both? PRC Inc., January 2008.

Reasons to Focus on Top Box
- It is the common way for CMS and 3rd-party vendors to report information
- Top-box responses are linked to strong customer loyalty
- Why would you want to focus on less-than-best service and performance?
- It partially accounts for "positive bias" in survey responses by patients

Percentiles: Why do they matter? What are the implications?

Trend in Inpatient Experience
Source: 2011 Press Ganey Hospital Pulse Report

Four-Year Trend in CG CAHPS Visit Survey Data
[Run chart: percent Top Box for Overall Rating of Provider, 2009-2013, roughly 70%-95%, showing the overall line and the 90th, 75th, 50th, and 25th percentile lines. Source: CAHPS database, 2010-2013.]

Example: Communications Composite, based on six questions
https://www.cahpsdatabase.ahrq.gov/cahpsidb/public/cg/cg_topscores.aspx (accessed 4 October 2014)

Select the best answers (True/False) based on the Percentile Top Box Scores table from 2013 CG CAHPS patient responses:
1. For the top 10% of providers nationally, more than 97 out of 100 patients assessed the provider as "always listened carefully." (T/F)
2. More than 50% of providers nationally had at least 9 out of 10 of their patients say the provider always spent enough time with them. (T/F)
3. For the composite, a change of 3% in the top-box score translates into a change in the percentile score of 25 or more. (T/F)

Overall Inpatient Experience Score by Hospital Size (Number of Beds)
Source: 2010 Press Ganey Hospital Pulse Report

Understanding data, not for excuses:
- By hospital size?
- By specialty?
- By insurance status?

Patient Experience by Type of Insurance
Source: 2010 Press Ganey Hospital Pulse Report

Why percentiles matter
- You get a sense of the market and of performance by competitors; it can be a shock!
- Overall organization performance masks service-line performance (stratify!)
- The same top-box percent for different service lines has different interpretations
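The point that the same top-box percent carries different meanings for different service lines can be made concrete with a small sketch. The percentile-rank convention and the peer distributions below are invented for illustration; real vendors document their own tie-handling rules:

```python
def percentile_rank(score, peer_scores):
    """Percent of peer scores at or below `score` (one common convention)."""
    return 100 * sum(s <= score for s in peer_scores) / len(peer_scores)

# Invented peer distributions for two hypothetical service lines:
med_surg_peers = [70, 74, 76, 78, 81, 83, 85, 88, 90, 92]
obstetrics_peers = [80, 84, 86, 88, 90, 91, 92, 94, 95, 96]

# The same 80% top box lands mid-pack in one market and near the bottom in the other.
print(percentile_rank(80, med_surg_peers))
print(percentile_rank(80, obstetrics_peers))
```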

Correlations: Why do they matter? What are the implications?

All survey questions are not created equal!!!

Note (Kevin Little, 9/30/2013): We have to make sure we don't miss a key point. Low correlations may arise from lack of scatter in the skill of nurses or physicians in the analysis. Absence of correlation does not mean no relationship.

What is correlation?
Correlation, based on either scores or ranks, measures strength of association. It ranges from 1 (perfect positive linear or rank-order relationship) through 0 (no linear or rank relationship) to -1 (perfect negative linear or reverse rank-order relationship). Here's a picture that shows some invented data, with the correlation coefficient ranging from 0.96 to 0.55.

[Chart: Adult Inpatient survey items plotted against Overall Rating: how well your pain was controlled, staff addressed emotional needs, room cleanliness, staff include decision re: treatment, noise level in and around room, staff attitude toward visitors, staff sensitivity to inconvenience, skill of physician, teach/instruct self-care, medications, treatment, nurses kept you informed. Source: Press Ganey National database through June 30, 2012.]
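For readers who want the mechanics behind the pictures, a Pearson correlation can be computed directly from its definition. This is a minimal sketch with invented unit-level scores, not the vendors' (sometimes rank-based) calculation:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Invented data: one item's score vs. the overall rating across five units.
item = [72, 75, 80, 84, 90]
overall = [70, 74, 79, 82, 91]
print(round(pearson_r(item, overall), 2))
```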

Adult Inpatient: Overall Rating correlations
- Staff addressed emotional needs: .79
- Staff sensitivity to inconvenience: .78
- Teach/instruct self-care, medications, treatment: .78
- Staff attitude toward visitors: .74
- Nurses kept you informed: .73
- How well your pain was controlled: .69
- Skill of physician: .67
- Room cleanliness: .62
- Noise level in and around room: .52
Source: Press Ganey National database through June 30, 2012

[Chart: Ambulatory Surgery items plotted against Overall Rating: nurses' courtesy toward family, degree staff worked together, convenience of parking, information given your family, our concern for privacy, information day of surgery. Source: Press Ganey National database through June 30, 2011.]

Ambulatory Surgery: Overall Rating correlations
- Degree staff worked together: .79
- Our concern for privacy: .76
- Information day of surgery: .75
- Information given your family: .74
- Nurses' courtesy toward family: .69
- Convenience of parking: .53
Source: Press Ganey National database, April 1, 2012 - June 30, 2012

Why correlations matter
- Correlations are a first step to making sense of relations among multiple survey questions
- The lowest score on a panel of questions may not be strongly associated with the overall evaluation
- Tackling the lowest score may not be a good use of organization resources
- See the Data Tools Self-Assessment for more details about correlation

How n matters

n should affect interpretation. Suppose your top-box performance really is 80%. Variation from sampling effects gives this table as the number n of surveys increases:*

80.0% "no surprise" range
n      Low      High
10     50.0%    100.0%
20     60.0%    95.0%
30     63.3%    90.0%
50     70.0%    90.0%
100    73.0%    87.0%
200    74.0%    85.0%
400    75.8%    83.5%

*Using the binomial model and determining the range that you get 95% of the time on repeated sampling. CMS tables use the 95% threshold.
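The "no surprise" table can be checked with nothing more than the binomial model named in the footnote. This sketch uses only the Python standard library; individual entries can differ slightly from any given published table depending on the convention chosen for the discrete quantile:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def no_surprise_range(n, p, alpha=0.05):
    """Central (1 - alpha) range of the observed top-box percent
    when the true top-box rate is p."""
    lo = next(k for k in range(n + 1) if binom_cdf(k, n, p) >= alpha / 2)
    hi = next(k for k in range(n + 1) if binom_cdf(k, n, p) >= 1 - alpha / 2)
    return 100 * lo / n, 100 * hi / n

for n in (10, 20, 30, 50, 100, 200, 400):
    lo, hi = no_surprise_range(n, 0.80)
    print(f"n = {n:3d}: {lo:.1f}% to {hi:.1f}%")
```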

n should affect interpretation. Suppose your top-box performance really is 90%. Variation from sampling effects gives this table:

90.0% "no surprise" range
n      Low      High
10     70.0%    100.0%
20     75.0%    100.0%
30     76.7%    100.0%
50     80.0%    96.0%
100    84.0%    95.0%
200    84.0%    95.0%
400    87.0%    92.8%

Connection to control charts
Given n and the percent of top-box responses, p, control chart calculations give a range of plausible percents. Example: if you observe p = 80%, here are the lower and upper control limits (LCL and UCL) for several values of n:

n      LCL     UCL
10     42.1    100.0
25     56.0    100.0
50     63.0    97.0
100    68.0    92.0
200    71.5    88.5
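The LCL/UCL figures follow from the standard three-sigma limits for a proportion, capped at the 0-100% boundaries; a short sketch reproduces them:

```python
from math import sqrt

def p_chart_limits(p, n, sigma=3):
    """Control limits (in percent, capped at [0, 100]) for a proportion p
    observed with subgroup size n, using the usual three-sigma multiplier."""
    half_width = sigma * sqrt(p * (1 - p) / n)
    return max(0.0, 100 * (p - half_width)), min(100.0, 100 * (p + half_width))

for n in (10, 25, 50, 100, 200):
    lcl, ucl = p_chart_limits(0.80, n)
    print(f"n = {n:3d}: LCL = {lcl:.1f}%, UCL = {ucl:.1f}%")
```

A full p-chart also needs a center line estimated from pooled data and handles varying subgroup sizes; Provost and Murray's Health Care Data Guide covers that construction.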

Any evidence that the unit's Overall Rating in 2014 is worse than the 2012-2013 baseline?

Overall Rating
Quarter    n     Top Box %
2012 Q1    12    58.33
2012 Q2    16    75
2012 Q3    10    90
2012 Q4    10    80
2013 Q1    3     100
2013 Q2    10    80
2013 Q3    7     42.86
2013 Q4    10    90
2014 Q1    6     50
2014 Q2    12    50

In a review of survey data, an executive and a unit leader were concerned that the 2014 Top Box scores were lower than 75%, the 2012-2013 average. What actions are called for based on these data?

Knowledge of the n effect should:
- Dampen or eliminate management cycles of despair or celebration based on a single reported percent
- Cause you to interpret unit-level results with great caution (unit-level n may be 5 or fewer)
- Help make the case for plotting survey results in time order
- Inspire you to learn and use control charts; see Provost and Murray (2011)
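One way to put a number on the executives' question, without replacing the time-ordered plot argued for next, is a binomial tail probability. Pooling the two 2014 quarters (n = 6 + 12 = 18 surveys, 9 of them top box) is our own illustrative choice here, and the calculation assumes independent responses with a true rate equal to the 75% baseline:

```python
from math import comb

def binom_tail_le(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p): how surprising is a result this low?"""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# 2014 pooled: 9 top-box responses out of 18 surveys, against a 75% baseline.
prob = binom_tail_le(9, 18, 0.75)
print(f"P(9 or fewer top box out of 18 | p = 0.75) = {prob:.3f}")
```

Pooled over the two quarters, the result is fairly unlikely under the baseline rate, while either quarter alone (n = 6 or n = 12) would be unconvincing; a control chart of the whole series remains the better tool.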

Plot survey data in time order
Start with run charts; move on to control charts.

Baseline: January 2011 to August 2012, median = 60.5. Is there any evidence that a management intervention begun in September 2012 (dashed vertical line) had any impact on HCAHPS Overall Recommendation? Do you have any questions about the baseline period?

- In 2011, 9 consecutive values below the baseline median is highly unusual; there appears to be a shift to better performance before the intervention starts.
- In 2013, 8 consecutive values above the reference median signal further (incremental) improvement.
For more details on run chart rules, see Perla et al. (2011).

Take-Home Exercise: What interpretations should leaders of this hospital draw as they assess the apparent impact of the intervention begun in September 2013? (n ~ 30 each month)
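The shift rule quoted here (a long run of consecutive points on one side of the median) is mechanical enough to sketch in a few lines. The run-length threshold is a parameter, since the appropriate value depends on chart length (see Perla et al.), and the series below is invented:

```python
def shift_signals(values, median, run_length=6):
    """Report (index, side) where `run_length` consecutive points fall on one
    side of the median: a common run-chart shift rule. Points exactly on the
    median are skipped, per the usual convention."""
    signals, run, side = [], 0, 0
    for i, v in enumerate(values):
        if v == median:
            continue
        s = 1 if v > median else -1
        run = run + 1 if s == side else 1
        side = s
        if run == run_length:  # fire once, at the point that completes the run
            signals.append((i, "above" if s > 0 else "below"))
    return signals

# Invented monthly percents around a baseline median of 60.5:
series = [62, 58, 59, 57, 60, 55, 58, 59, 61, 63, 64, 66, 65, 67, 62]
print(shift_signals(series, 60.5))  # one low run, then one high run
```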

Recap: Why plot data in time order?
- Single survey numbers provide no useful guidance for improvement
- You need time order to make before-and-after comparisons to assess progress
- If n is about the same for each survey number in the series, you can look for striking patterns over time that signal improvement or decay
- Again: see Perla et al. (2011) for run chart rules

A quick look at qualitative data

Accessible Qualitative Sources
- Patient stories and letters
- Formal leader patient rounding (alone, in pairs, with physicians, etc.)
- Informal rounding
- Conversations and discussion in staff meetings

Patient stories need interpretation
RULE: Always have a formal or informal leader prepared to interpret the meaning of a patient story: "This is what the story means to me. This is what the story means to our organization."

Fast versus Slow Feedback

FEEDBACK IS FUNDAMENTAL
"Feedback is vital to Sex." Science 299(5615): 2054, 2 July 2003.

FAST AND SLOW FEEDBACK: PHYSICIAN COMMUNICATIONS

3.1 Fast Feedback, Paper Method
- The feedback form: behaviors and patient perceptions
- How the form is administered (steps)
- Feedback to physicians: how it is handled
- Summary feedback table and graph
- Interpretation of the data summary
- Use in engaging physicians

Fast Feedback
1. Feedback form on paper
2. First four questions: yes or no on behaviors in the bundle
3. Next four questions: patient perception of the conversation
Note: "slow feedback" means the monthly or quarterly numbers from formal HCAHPS or 3rd-party vendors.

Individual Events History

Example: Logistics of the Feedback Form (visit -> feedback -> spreadsheet summary). Patients are REQUESTed at RANDOM:
1. Physician asks the UC/Charge nurse; 3 patients are selected over 1 day or over 1 week
2. Visit occurs, or is already done
3. Feedback within 1 hour (usually minutes)
4. Catch the physician and share the feedback
5. Check boxes and send to Q
6. Enter and update every 2 weeks; review as a group
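The "spreadsheet summary" step can be sketched as a simple tally. The field names below are invented stand-ins for the form's four yes/no behavior questions, whose actual wording is not reproduced here:

```python
from collections import defaultdict

# Hypothetical behavior items for the paper feedback form.
BEHAVIORS = ["sat_down", "asked_worries", "explained_plan", "checked_understanding"]

def summarize(forms):
    """Tally yes/no answers from completed feedback forms into a
    percent-yes summary per behavior; unanswered items are skipped."""
    counts = defaultdict(lambda: [0, 0])  # behavior -> [yes, total]
    for form in forms:
        for b in BEHAVIORS:
            answer = form.get(b)
            if answer is None:
                continue
            counts[b][0] += 1 if answer else 0
            counts[b][1] += 1
    return {b: 100 * yes / total for b, (yes, total) in counts.items()}

forms = [
    {"sat_down": True, "asked_worries": False, "explained_plan": True, "checked_understanding": True},
    {"sat_down": True, "asked_worries": True, "explained_plan": True},
]
print(summarize(forms))
```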

3.2 Link Fast Feedback to PDSA

Summary

Patient Experience Data: Summary
- There is no single perfect measure for patient experience and the experience of care
- There are multiple sources of good enough data that can drive improvement
- Formal survey data (quantitative) advice:
  1. Use Top Box
  2. Understand percentiles
  3. Interpret correlations
  4. Remember that the n does matter
  5. Plot your data in time order
- Formal survey data are too slow to guide specific PDSA tests
- Patient stories (qualitative) never stand by themselves

OK, back to our symptoms of trouble
- "We pretty much just look at our performance internally, and overall we feel pretty good about it." (Survey percentiles: benchmark)
- "We look only at organizational numbers, rolled up; that's what matters at the end of the day." (Stratify by service lines)
- "We regularly review our data and form teams around the lowest scores." (Correlations)
- "Every month we review our scores; if we drop down, we form a team to fix it, and if it's up, we get a pizza party." (n, and plot data in time order)
- "It's all so overwhelming; it's just so hard to know where to start." (Start with proper analysis of formal surveys and direct observations)
- "CAHPS has really changed our focus; it's really the only thing we are focusing on now in my organization." (Remember: multiple sources are vital; CAHPS by itself is not enough)

Best Practice Data Ideas
1. Opportunities from the Data Self-Assessment
2. Surveys: focus on Top Box
3. Surveys: understand percentiles
4. Surveys: interpret correlations
5. Surveys: remember how n matters
6. Plot and interpret measure values in time order
7. Deploy fast(er) feedback
8. Assure patient stories connect to meaning
9. Reflection exercise

For each idea: Can you use it? Could other colleagues or groups use it?
PICK chart: map the ideas to the Impact-by-Difficulty grid. Ideas with relatively high impact and low difficulty are immediate candidates for testing.

Reflections and Discussion: Set a Goal
Don't be afraid to be bold and courageous IF the organization is ready. The conversation is important; you have to know where you are heading.

So what?
- "I attribute my success to this: I never took or gave an excuse."
- Don't be afraid of what's real.
- Don't focus on why we're special; each organization has a unique set of challenges.

To learn more
- Lee, F. (2004), If Disney Ran Your Hospital: 9 ½ Things You Would Do Differently, Second River Healthcare Press.
- Perla, R., Provost, L., and Murray, S. (2011), "The run chart: a simple analytical tool for learning from variation in healthcare processes," BMJ Quality & Safety, 20(1): 46-51.
- Provost, L. and Murray, S. (2011), The Health Care Data Guide: Learning from Data for Improvement, Jossey-Bass, especially chapter 12.
- Kevin Little screencast, explanation of variation (effect of n): http://www.screencast.com/t/csuvvlbfr
- Data Tools Self-Assessment with Answers

Appendix 1: Informative Displays Build on Multiple Run Charts
Small multiples: looking across units
[Chart: run charts for several units on a common scale, with the collaborative start date and the baseline median marked.]

A compressed percentile scale is good news/bad news. Do you know which is which?

Appendix 2: More on Fast Feedback, Physician Communications Example

Lessons and Key Points
- CMS compliance
- Patients can observe the presence or absence of physician behaviors
- Should patient responses be anonymous?
- Cognitively impaired patients: options
- Accept and work around positive bias

CMS Compliance
Activities and encounters that are intended to provide or assess clinical care or promote patient/family well-being are permissible. However, activities and encounters that are primarily intended to influence how patients, or which patients, respond to HCAHPS survey items must be avoided.
- Avoid wording questions so that they closely resemble HCAHPS questions. In particular, you may not use the HCAHPS response categories ("Always," "Usually," "Sometimes," "Never").
- Yes/No questions such as "Did the physician ask 'What are you most worried about?'" do not violate HCAHPS protocols (see additional examples in the HCAHPS Quality Assurance Guidelines).
Here is the link: http://www.hcahpsonline.org/files/hcahps%20qag%20v9%200%20march%202014.pdf (examples and language guidance are on p. 22 of the CMS document).

Patients as observers
- Based on 2012-2013 experience, most patients can score YES/NO on behaviors with good reliability
- Keep the time between the encounter and use of the form short (within the same hour)
- Patients with cognitive impairment are a challenge; family members or observers can score
- Initial practice cycles for physicians should focus on patients without cognitive issues
- Teams in 2012-2013 tested both anonymous and identified responses. If anonymous, it is difficult to engage a specific physician, but the responses can be used for group monitoring and analysis.

What about positive bias?
Our belief: patients are vulnerable and loath to criticize the care team, so positive bias is likely. Focus on top-box responses only.