Are National Indicators Useful for Improvement Work? Exercises & Worksheets

Session L5: Are National Indicators Useful for Improvement Work? Exercises & Worksheets. Robert Lloyd, PhD, and Göran Henriks. December 8, 2013, 1:00–4:30 PM. These presenters have nothing to disclose.

Where do you stand on transparency? Level and Frequency of Transparency. Rate each statement: Strongly Agree, Agree, Not Sure, Disagree, or Strongly Disagree.
1. Greater transparency is needed across all healthcare settings and providers.
2. Patients should be able to compare hospitals as easily as they do cars and other products.
3. Results on hospital outcomes (mortality, infections, falls, medication errors, etc.) should be made public once a year.
4. Results on hospital outcomes (mortality, infections, falls, medication errors, etc.) should be made public twice a year.
5. Results on hospital outcomes (mortality, infections, falls, medication errors, etc.) should be made public four times a year.
6. Results on groups of doctors (surgeons, GPs, intensivists, dentists, etc.) should be made public once a year.
7. Results on individual doctors should be made public once a year.

Where do you stand on transparency? Content Topics for Transparency. Rate each statement: Strongly Agree, Agree, Not Sure, Disagree, or Strongly Disagree.
8. All clinical outcomes on hospital performance should be made available to the public.
9. Operational outcomes on hospital performance (wait times, referral times, access) should be made available to the public.
10. Patient satisfaction results for each hospital should be made available to the public.
11. Financial results (including salaries) for each hospital should be made available to the public.
12. Mortality rates for individual surgeons should be made available to the public.
13. Infection rates for individual physicians should be made available to the public.
14. Errors and harm rates for individual physicians should be made available to the public.
15. Salaries of individual physicians should be made available to the public.

Dialogue #1: Measurement Madness & Transparency
Do you know your data better than anyone else?
Do you use data that are made available to the public to identify opportunities for improvement? Or do you look for ways to deny the data and develop rationalizations as to why you think you are actually better than the reported outcomes?
Do you share all your results openly with staff?
Do you share all your results openly with patients, family members and caregivers? If not, why not?

Dialogue #2: Assessing the Messiness of Life!
Do people within your organization regularly view issues as being rather messy and complex, or do they see them as simple problems that should be resolved quickly and easily?
List a few of the messy problems you are currently addressing and why they are this way.
On a scale of 1-3, how messy is each of these problems?
Do you have measures for these messy problems that allow you to determine just how complex and challenging each problem is?
If you are measuring, do you feel that the measures you have are valid, reliable and appropriate?

Dialogue #2 Worksheet: Assessing the Messiness of Life! For each problem, record:
- What is the topic of this messy problem?
- How messy is this problem? (1 = not very messy, 2 = moderately messy, 3 = very messy)
- List the measures you have for this messy problem.
- Do you feel that these measures are valid, reliable and appropriate?

Dialogue #3: Why are you measuring?
1. What percentage of your organization's time and effort is aimed at measuring for improvement, for accountability and for research?
2. Does one form of performance measurement dominate your journey and discussions?
3. Is your organization building silos or a Rubik's cube when it comes to data collection and measurement?

Exercise: Building a Driver Diagram
Make a list of potential improvement drivers for your system of care.
Create a driver diagram for your project: the aim and outcome(s), and the key drivers of improvement in the outcome(s).
For each driver, ask: is it a primary or a secondary driver?
Copyright 2013 IHI/R. Lloyd

Driver Diagram Worksheet
Aim:
Outcome Measures: 1. 2. 3.
Primary Drivers:
Secondary Drivers:
Copyright 2013 IHI/R. Lloyd
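For teams that keep their improvement work in spreadsheets or scripts, the same worksheet can also be captured as a simple data structure. The sketch below shows one minimal way to do that in Python, assuming a plain aim / primary driver / secondary driver hierarchy; the class names and the example content are illustrative and are not part of the session materials.

```python
# A minimal sketch (not part of the original worksheet) of a driver diagram
# captured as data, assuming a simple aim -> primary -> secondary hierarchy.
from dataclasses import dataclass, field
from typing import List


@dataclass
class PrimaryDriver:
    name: str
    secondary_drivers: List[str] = field(default_factory=list)


@dataclass
class DriverDiagram:
    aim: str
    outcome_measures: List[str]
    primary_drivers: List[PrimaryDriver]

    def outline(self) -> str:
        """Render the diagram as an indented text outline."""
        lines = [f"Aim: {self.aim}"]
        lines += [f"  Outcome measure: {m}" for m in self.outcome_measures]
        for p in self.primary_drivers:
            lines.append(f"  Primary driver: {p.name}")
            lines += [f"    Secondary driver: {s}" for s in p.secondary_drivers]
        return "\n".join(lines)


# Illustrative (hypothetical) content, not taken from the session materials.
diagram = DriverDiagram(
    aim="Reduce unplanned readmissions within 30 days",
    outcome_measures=["30-day readmission rate"],
    primary_drivers=[
        PrimaryDriver("Reliable discharge planning",
                      ["Medication reconciliation", "Follow-up call within 48 hours"]),
    ],
)
print(diagram.outline())
```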

Dialogue #4: Cascading Systems
Does your organization approach improvement as an interrelated cascading system or as a bunch of singular events that are unrelated and fragmented?
Do senior managers and the Board or Governance (Non-Execs) regularly discuss how your systems of care are driven by many interrelated factors? Or do they approach issues of quality and safety as if one solution will produce better results?
Does your organization have dashboards of measures that cascade from the macro, through the meso and down to the micro levels?
Do your measures cascade down from the top or percolate up from the places where patient care is actually delivered (the inverted pyramid)?

Exercise: Building a Cascading Set of Driver Diagrams
Review the driver diagram you just made to improve a particular outcome.
Review the secondary drivers you identified on this initial driver diagram.
Select one of the secondary drivers and make it the outcome of your new driver diagram.
Identify the primary and secondary drivers of this new outcome.

What's the Status of This Driver/Process? Assess how far the driver or process has progressed along these stages:
The driver is documented.
The driver description includes all required participants (including families where appropriate).
The driver is understood by all.
Driver outcomes are predictable.
The driver is fully embedded in operational systems.
The driver consistently meets the needs and expectations of all families and/or providers.
Copyright 2013 IHI/R. Lloyd

What Is Its Predicted Impact?
This driver has no impact or does not apply to our system of care.
This driver has only minimal or indirect impact on patient services and outcomes.
This driver will improve services for our patients, but other drivers are more important.
This driver has significant impact on outcomes for our patients.
This driver is necessary for delivering patient services; it has a major, direct impact on the outcomes.
This driver is absolutely essential for achieving results; improvement in this driver alone will have a direct, immediate impact on outcomes.
Copyright 2013 IHI/R. Lloyd

Exercise: Prioritizing Drivers
Use the Prioritizing Drivers Worksheet. Plot your secondary drivers on the grid based on your assessment of (1) how well the process is defined and (2) the level of impact the driver can have. Discuss and select the drivers that are most important for improving your system of care.
Copyright 2013 IHI/R. Lloyd

Prioritizing Drivers Worksheet
[Scatter grid: vertical axis = how well the process is defined, from "Process NOT defined" (0) to "Process WELL defined" (4); horizontal axis = impact, from lower impact to high impact (roughly 2.5 to 5). Example secondary drivers plotted on the grid: Risk Assess, Prev Care, Pt Ed, Self Mgmt Status, Ability to Pay, Scheduling, Pt Involved Tx, Timely Results, Diet, Popn Mgmt.]
Copyright 2013 IHI/R. Lloyd
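The grid pairs two scores for each secondary driver: how well the underlying process is defined and how much impact the driver is predicted to have. The sketch below shows one way those two scores could be used to rank candidate drivers; the driver names and scores are hypothetical, and the ranges simply mirror the grid axes above.

```python
# A minimal sketch (an assumption, not from the worksheet) of ranking secondary
# drivers by the two grid dimensions: how well the process is defined (0-4)
# and its predicted impact (1-5). Driver names and scores are hypothetical.
drivers = [
    {"name": "Risk assessment",       "definition": 3.5, "impact": 3.0},
    {"name": "Patient education",     "definition": 2.5, "impact": 4.0},
    {"name": "Scheduling",            "definition": 1.5, "impact": 4.5},
    {"name": "Population management", "definition": 0.5, "impact": 5.0},
]

# High-impact drivers come first; among those, poorly defined processes
# surface as the biggest improvement opportunities.
ranked = sorted(drivers, key=lambda d: (-d["impact"], d["definition"]))

for d in ranked:
    quadrant = "well defined" if d["definition"] >= 2 else "NOT defined"
    print(f'{d["name"]}: impact {d["impact"]}, process {quadrant}')
```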

Exercise: Hanging Measures on Your Driver Diagram!
On your driver diagram, hang the outcome and process measures you will need to track improvement in your system or project. Make sure that these are the most appropriate measures for the concepts you wish to measure.

Exercise: Hanging Measures on Your Driver Diagram!
Use the driver diagram you developed as a reference and guide to build your measures. At this point, focus your attention on the process and outcome measures. If you come up with a few balancing measures, that is good but not necessary at this point. Use the Measurement Plan Worksheet to record your work. Then select several of your identified measures and develop a clear operational definition for each measure. Use the Operational Definition Worksheet to record your work. The Questions for Building Operational Definitions will provide guidance on the specific issues that need to be addressed if you want to develop clear and concise operational definitions.

Measurement Plan Worksheet
Columns: Measure Name | Type of Measure (Process, Outcome or Balancing) | Driver Addressed by This Measure
Rows 1-9 for listing your measures.
Source: R. Lloyd 2013

Operational Definition Worksheet
Name of team: Date:
Measure Name (be sure to indicate if it is a count, percent, rate, days between, etc.)
Operational Definition (define the measure in very specific terms; provide the numerator and the denominator if a percentage or rate; be as clear and unambiguous as possible)
Data Collection Plan (How will the data be collected? Who will do it? Frequency? Duration? What is to be excluded?) You do not need to complete this column right now; you will complete it when you develop a data collection plan (see Appendix E).
Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett, 2004.

Dialogue #5: Common and Special Causes of Variation
Select several measures your organization tracks on a regular basis.
[Control chart: Percent of Cesarean Sections Performed, Dec 95 - Jun 99, plotted monthly; UCL = 27.7018, CL = 18.0246, LCL = 8.3473]
[Control chart: Number of Medication Errors per 1,000 Patient Days, plotted weekly; UCL = 13.39461, CL = 4.42048, LCL = 0.00000]
Do you and the leaders of your organization evaluate these measures according to the criteria for common and special causes of variation?
If not, what criteria do you use to determine if data are improving or getting worse?
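The control charts referenced above each show a center line (CL) with upper and lower control limits (UCL/LCL), typically placed three sigma from the center line. As one illustration, the sketch below computes p-chart limits for a percentage measure such as the monthly C-section rate, using hypothetical counts; it only flags points outside the limits, whereas full special-cause analysis also applies run and trend rules.

```python
# A minimal sketch (not from the session materials) of p-chart control limits
# for a percentage measure such as a monthly C-section rate.
# The data values here are hypothetical.
import math

# (numerator, denominator) per month: C-sections, total deliveries
monthly = [(18, 95), (22, 110), (15, 100), (30, 105), (19, 98)]

total_events = sum(n for n, _ in monthly)
total_cases = sum(d for _, d in monthly)
p_bar = total_events / total_cases  # center line (overall proportion)

for i, (events, cases) in enumerate(monthly, start=1):
    p = events / cases
    sigma = math.sqrt(p_bar * (1 - p_bar) / cases)  # depends on subgroup size
    ucl = min(1.0, p_bar + 3 * sigma)
    lcl = max(0.0, p_bar - 3 * sigma)
    flag = "special cause?" if (p > ucl or p < lcl) else "common cause"
    print(f"Month {i}: {p:.1%} (CL {p_bar:.1%}, LCL {lcl:.1%}, UCL {ucl:.1%}) -> {flag}")
```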

How prepared is your organization? Rate each of the key components* with a self-assessment of Low, Medium or High:
Will (to change): Low / Medium / High
Ideas: Low / Medium / High
Execution: Low / Medium / High
*All three components MUST be viewed together. Focusing on one or even two of the components will guarantee suboptimized performance. Systems thinking lies at the heart of CQI!

Appendix D: Operational Definition Worksheet
Team name: Date: Contact person:
WHAT PROCESS DID YOU SELECT?
WHAT SPECIFIC MEASURE DID YOU SELECT FOR THIS PROCESS?
OPERATIONAL DEFINITION: Define the specific components of this measure. Specify the numerator and denominator if it is a percent or a rate. If it is an average, identify the calculation for deriving the average. Include any special equipment needed to capture the data. If it is a score (such as a patient satisfaction score), describe how the score is derived. When a measure reflects concepts such as accuracy, completeness, timeliness, or errors, describe the criteria to be used to determine them.
Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett, 2004.

Operational Definition Worksheet (cont'd)
DATA COLLECTION PLAN: Who is responsible for actually collecting the data? How often will the data be collected (e.g., hourly, daily, weekly or monthly)? What are the data sources (be specific)? What is to be included or excluded (e.g., only inpatients are to be included in this measure, or only stat lab requests should be tracked)? How will these data be collected: manually, from a log, or from an automated system?
BASELINE MEASUREMENT: What is the actual baseline number? What time period was used to collect the baseline?
TARGET(S) OR GOAL(S) FOR THIS MEASURE: Do you have target(s) or goal(s) for this measure? Yes / No. Specify the external target(s) or goal(s) (the number, rate or volume, etc., as well as the source of the target/goal). Specify the internal target(s) or goal(s) (the number, rate or volume, etc., as well as the source of the target/goal).
Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett, 2004.

Dashboard Worksheet
Name of team: Date:
Columns:
Measure Name (provide a specific name such as medication error rate)
Operational Definition (define the measure in very specific terms; provide the numerator and the denominator if a percentage or rate; indicate what is to be included and excluded; be as clear and unambiguous as possible)
Data Source(s) (indicate the sources of the data; these could include medical records, logs, surveys, etc.)
Data Collection: schedule (daily, weekly, monthly or quarterly) and method (automated systems, manual, telephone, etc.)
Baseline: period and value
Goals: short term and long term
Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett, 2004.

NON-SPECIFIC CHEST PAIN PATHWAY MEASUREMENT PLAN
Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett, 2004.

Measure 1: Percent of patients who have MI or unstable angina as diagnosis
Operational Definition: Numerator = patients entered into the NSCP path who have acute MI or unstable angina as the discharge diagnosis. Denominator = all patients entered into the NSCP path.
Data Source(s): 1. Medical records 2. Midas 3. Variance tracking form
Data Collection: Discharge diagnosis will be identified for all patients entered into the NSCP pathway. QA-UR will retrospectively review charts of all patients entered into the NSCP pathway; data will be entered into the MIDAS system.
Baseline: Currently collecting baseline data; baseline will be completed by the end of the 1st quarter of 2010.
Goals: Since this is essentially a descriptive indicator of process volume, goals are not appropriate.

Measure 2: Number of patients who are admitted to the hospital or seen in an ED due to chest pain within one week of discharge
Operational Definition: A patient we saw in our ED reports during the call-back interview that they have been admitted or seen in an ED (ours or some other ED) for chest pain during the past week. Applies to all patients who have been managed within the NSCP protocol throughout their hospital stay.
Data Collection: Patients will be contacted by phone one week after discharge; the call-back interview will be the method.
Baseline: Currently collecting baseline data; baseline will be completed by the end of the 1st quarter of 2010.
Goals: Ultimately the goal is to have no patients admitted or seen in the ED within a week after discharge. The baseline will be used to help establish initial goals.

Measure 3: Total hospital costs per cardiac diagnosis
Operational Definition: Numerator = total costs per quarter for hospital care of NSCP pathway patients. Denominator = number of patients per quarter entered into the NSCP pathway with a discharge diagnosis of MI or unstable angina.
Data Source(s): 1. Finance 2. Chart review
Data Collection: Can be calculated every three months from financial and clinical data already being collected.
Baseline: Calendar year 2010; will be computed in June 2010.
Goals: The initial goal will be to reduce the baseline by 5% within the first six months of initiating the project.
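To make the first measure's operational definition concrete, the sketch below turns the stated numerator and denominator into a percentage; the record layout and example data are hypothetical and are not drawn from the pathway documentation.

```python
# A minimal sketch (an assumption, not part of the pathway documentation) of
# computing Measure 1: percent of NSCP-pathway patients whose discharge
# diagnosis is acute MI or unstable angina. Field names and records are
# hypothetical.
nscp_patients = [
    {"id": "A01", "discharge_dx": "acute MI"},
    {"id": "A02", "discharge_dx": "GERD"},
    {"id": "A03", "discharge_dx": "unstable angina"},
    {"id": "A04", "discharge_dx": "musculoskeletal pain"},
]

CARDIAC_DX = {"acute mi", "unstable angina"}

numerator = sum(1 for p in nscp_patients
                if p["discharge_dx"].lower() in CARDIAC_DX)
denominator = len(nscp_patients)  # all patients entered into the NSCP path

percent = 100 * numerator / denominator if denominator else 0.0
print(f"{numerator}/{denominator} = {percent:.1f}% with MI or unstable angina")
```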