Catherine Porto, MPA, RHIA, CHP (Executive Director, HIM)
Madelyn Horn Noble (3M HIM Data Analyst)
University of New Mexico Hospitals
» The state's only academic medical center
» The primary teaching hospital for the university's School of Medicine
» UNM Hospital is the state's only Level I Trauma Center
» Locations:
  UNM Hospital (692 beds)
  UNM Sandoval Regional Medical Center (72 beds)
  UNM Children's Hospital (beds included with UNMH)
  UNM Cancer Center
  UNM Psychiatric Center (47 beds)
  UNM Children's Psychiatric Center (53 beds)
  UNM Carrie Tingley Hospital Outpatient Services
  UNM Hospital-Based Clinics (80+ outpatient clinics)
» UNM Health System statistics include:
  71,000 inpatient discharges
  11,000 observation discharges
  93,000 emergency visits
  7,000 trauma cases
  18,000 surgeries
  568,016 outpatient visits
» UNMH cares for a large, diverse population with complex and urgent health needs, providing more than $135 million of uncompensated care per year.
» Thousands of patients receive advanced treatments in clinical trials.
» Many more patients benefit from our telehealth network, which gives providers audio and video access to rural communities statewide.
The Genie is out of the Bottle
» CMS Star Rating System: Issues and Concerns
» UNM Hospital Identified Areas for Improvement
» Main Focus for UNM Hospitals: Patient Safety
» UNM Hospital's Initial Focus on Eight Patient Safety Events
» How Accurate Is Our Documentation, Coding, and Publicly Reported Data?
» Severe Patient Harm Elimination Teams (SPHEE Teams)
» SPHEE Team Use of 3M 360 Encompass Software
» Quality Department Process
» How We Put the Genie Back in the Bottle with 360 Encompass
» How UNM Hospitals Use 360 Encompass to Affect Quality Measures
» July 2015: CMS announced a single rating system to help millions of patients and their families learn about the quality of hospitals, compare facilities in their area side by side, and ask important questions about care quality when visiting a hospital or other healthcare provider
» Star Rating methodology (Medicare patients):
  Started in 2015 with 10 measures; at 64 measures in 2016
  Focus on individual performance measures, with comparisons to state and national averages
  Data is 9-12 months old when posted (covers 1-3 years prior) and does not reflect improvements made during that 1-3 year period
» Star Rating system ratings were first released July 27, 2016
» Lumps numerous and disparate hospital services into a single star score
» Some hospitals report all measures but are compared to hospitals that report fewer (as few as 9) measures
» Hospitals across our state transfer high-acuity, complex patients to our facility for care available here
» No adjustment for social, economic, or demographic challenges:
  These challenges could influence health care outcomes
  These patients may require more resources and may present greater challenges for care
  Could bias ratings against tertiary care and referral hospitals
» Replication of the data is difficult
» A one-size-fits-all approach to measuring quality of care may mislead patients and/or steer them away from hospitals that can provide the best care for their specific conditions
» America's Essential Hospitals (AEH) is disappointed in CMS for releasing the ratings when so many questions remain about the data behind the ratings and their value to consumers
» The ratings incorporate measures that miss clinically relevant data and fail to adjust for patient circumstances that influence health and health care outcomes, circumstances outside a hospital's control
» "Consumers deserve accurate, comprehensive, and relevant information to make health care decisions. Hospitals deserve to be evaluated on a level playing field. The star ratings accomplish neither." (Bruce Siegel, MD, president and CEO of America's Essential Hospitals)
» The AAMC believes the underlying methodology behind the rating system is deeply flawed: these new CMS ratings are misleading
» Overall bottom line: CMS's Hospital Compare Star Ratings are bad for patients and bad for the hospitals that care for them (Association of American Medical Colleges)
CMS Star Ratings: Our Academic Medical Center (and several others)
» University of New Mexico, Albuquerque, NM
» University of Mississippi, Jackson, MS
» University of Illinois, Chicago, IL
» University of Colorado, Denver, CO*
» Tufts Medical Center, Boston, MA*
» University of Arizona Medical Center, Tucson, AZ
» Denver Health, Denver, CO
» Kansas University Medical Center, Kansas City, KS*
» Cleveland Clinic, Cleveland, OH*
» University of Utah Medicine, Salt Lake City, UT*
*Top decile, 2015 UHC Q&A scorecard
[Chart: Vizient comparison of monthly patient harm events, January 2015 through September 2016, across three scorecards: Vizient (PSI 3, 6, 9, 11, 13); CMS PSI 90 (PSI 3, 6, 8, 9, 10, 11, 12, 13, 14, 15); and Leapfrog (PSI 3, 4, 6, 11, 12, 14, 15, plus FT, FO, AE)]
» Patient Safety: serious complications
» Readmissions: 6 specific diagnoses, all-patient composite
» Timeliness of Care: core measures, ED wait times
» Mortality: 6 specific diagnoses
» Patient Harm Event defined:* an unintended physical injury resulting from, or contributed to by, medical care (including the absence of necessary medical treatment) that requires additional monitoring, treatment, or hospitalization, or results in death
  *Institute for Healthcare Improvement (IHI)
» May also be HACs (hospital-acquired conditions)
» Can include hospital-acquired infections
» According to CMS, approximately 40,000 incidents of harm occur in US hospitals every single day
  Tracked nationally; statistics are compiled and published in scorecards for public review (e.g., the CMS Star Rating System)
» PSI-90 Composite: aggregate scoring for the most frequent occurrence rates of 8 metrics in our organization (rates are risk-adjusted against expected rates calculated from coded diagnoses based on EMR documentation):
  Stage 3 and 4 pressure ulcers
  Falls and hip fracture
  Post-op sepsis
  Accidental puncture/laceration
  Iatrogenic pneumothorax
  Peri-op DVT/PE
  Post-op wound dehiscence
  CLABSI (coded)
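The risk-adjustment idea above can be sketched as a weighted composite of observed-to-expected (O/E) ratios per component. This is a minimal illustration, not CMS's actual PSI-90 weighting or rates; all numbers below are hypothetical.

```python
def composite_score(components):
    """Weighted average of observed/expected (O/E) ratios.

    components: list of (observed_rate, expected_rate, weight) tuples.
    A composite > 1.0 means more harm events than risk-adjustment predicts.
    """
    total_weight = sum(w for _, _, w in components)
    return sum((obs / exp) * w for obs, exp, w in components) / total_weight

# Illustrative components (weights and rates are made up):
score = composite_score([
    (0.8, 1.0, 0.3),   # e.g., pressure ulcers: fewer events than expected
    (1.2, 1.0, 0.5),   # e.g., peri-op DVT/PE: more events than expected
    (1.0, 1.0, 0.2),   # e.g., iatrogenic pneumothorax: at the expected rate
])
```

Because the second component is both over-expected and heavily weighted, the composite lands slightly above 1.0, illustrating how one frequent metric can dominate the score.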
» How accurate is your DOCUMENTATION?
» How accurate is your CODING?
» How accurate is your PUBLICLY REPORTED DATA?
» Chartered in fall of 2015 by the Medical Executive Committee
» Under the leadership of the Health System Chief Quality Officer
» Core SPHEE Team members included:
  Office of Quality & Safety: CQO, RN Advisor for Excellence Projects, and Sr. Quality Administrator
  HIM leadership: Executive Director, 3M Data Analyst, CDI Director, and CDI Educator
  Individual clinical department provider champions (clinical expertise and documentation improvements)
» Learning and understanding CMS's and AHRQ's definitions of patient harm events
» Understanding quality metrics on the publicly reported scorecards (e.g., Leapfrog and Hospital Compare Star Rating)
» Hours reviewing cases to gain a better understanding of opportunities for improvement:
  Improvements in patient care processes;
  Improvements in documentation; and
  Improvements in coding
Initial review findings
» 25-50% of cases reviewed by the SPHEE team were inaccurately identified as harm events due to incomplete or confusing documentation!
» Inaccurate and confusing/conflicting documentation leads to coding errors!
» Each SPHEE team is led by a provider champion
» Each team performs concurrent and retrospective reviews:
  EMR alerts for acute VTE events trigger real-time reviews;
  Concurrent CDI reviews; and
  Retrospective coding reviews
» CDI and Coding use 360 Encompass indicators to identify observed events
» The HIM SPHEE team creates a notification in 360 Encompass for communication between the HIM team and the provider champions
» The provider champion reviews the event(s) using an internally developed case review template to determine:
  Whether a true harm event occurred, and communicates this back to the HIM SPHEE team
  Whether there is an undocumented diagnosis that should be captured to exclude the flagged harm event
» The provider champion also:
  Consults/queries the attending provider for additional documentation if needed
  Communicates true harm vs. the need for additional documentation clarification back to the appropriate provider
  Communicates findings back to the HIM Data Analyst
  Identifies and communicates clinical/documentation process improvement opportunities and action plans to effect change
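The review path above (flagged event, champion review, then either true harm, a provider query, or exclusion via added documentation) can be sketched as a small state model. The states, transitions, and identifiers are our illustration for discussion, not 3M 360 Encompass functionality.

```python
# Hypothetical state model for a flagged harm-event review.
ALLOWED = {
    "flagged": {"provider_review"},
    "provider_review": {"true_harm", "query_sent", "excluded"},
    "query_sent": {"true_harm", "excluded"},
}

class HarmEventReview:
    def __init__(self, encounter_id, indicator):
        self.encounter_id = encounter_id
        self.indicator = indicator          # e.g., "PSI-12 peri-op DVT/PE"
        self.state = "flagged"
        self.history = ["flagged"]

    def advance(self, new_state):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)

# Example path: the champion queries the attending provider, and the
# added documentation excludes the flagged event.
review = HarmEventReview("ENC-001", "PSI-12")
review.advance("provider_review")
review.advance("query_sent")
review.advance("excluded")
```

Keeping the history list mirrors the deck's emphasis on communicating each determination back to the HIM team.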
» CDI team
  Responsible for sending concurrent PSI/HAC notifications
» Coding team
  Responsible for sending retrospective PSI/HAC notifications
  Ensures final coding reflects provider documentation for accurate patient harm event reporting
» HIM teams
  Maintain updates and revisions of internally developed SPHEE Team tip sheets (inclusion and exclusion criteria) to keep provider champions, providers, and coding staff aligned
  Responsible for timely transfer of information between CDI & Coding and the physician and quality teams
  Track SPHEE team efforts using 3M 360 Encompass software
» Quality team
  Facilitates provider champion teams
  Collects aggregate data for the quality office
  Determines whether PSI/HAC observed events are true or false
  Retrospectively compares internal findings to Vizient's report card
We put the Genie back in the bottle!
360 Encompass is used to:
» Document SPHEE team activity
» Monitor daily workflow between teams
» Produce reports comparing 3M 360 Encompass to Vizient's PSI/HAC activity
» Verify accurate data for Quality
Add a Finding
Add a Follow Up
Add a Notification
Three reports are used to compare 3M observed encounters to Vizient reported events:
» 3M Report 10: Notification Listing and Summary
» AHRQ 2: AHRQ Patient Listing
» HAC 2: HAC Patient Listing
Report 10: Notification Listing and Summary
AHRQ 2: AHRQ Patient Listing (for observed encounters)
HAC 2: HAC Patient Listing
AHRQ 2: AHRQ Patient Listing -and- HAC 2: HAC Patient Listing
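The comparison these reports support amounts to reconciling two encounter listings by identifier. A minimal sketch with set operations follows; the encounter IDs are made up, and the triage comments are our interpretation of the workflow, not a 3M or Vizient feature.

```python
# Hypothetical encounter IDs flagged by each system for a review period.
threem_observed = {"ENC-101", "ENC-102", "ENC-103"}   # from 3M 360 Encompass
vizient_reported = {"ENC-102", "ENC-103", "ENC-104"}  # from Vizient's report card

matched = threem_observed & vizient_reported       # both systems agree
only_3m = threem_observed - vizient_reported       # check coding/exclusions
only_vizient = vizient_reported - threem_observed  # investigate the data feed
```

Encounters appearing in only one listing are exactly the cases the SPHEE team would want to review for documentation, coding, or data-transfer discrepancies.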
» Ability to further automate SPHEE team outcomes in 360 Encompass, specifically the ability to aggregate data from 360 Encompass:
  Number of team reviews for a review period, with final outcomes:
    Observed (True)
    At Risk (False)
    Pending
  Reasons for the above outcomes:
    Met Inclusion Criteria
    Technical Inconsistency/Undetermined
    No Provider Response
    Documentation Amended
    Coding Amended
    In Progress
  Date the visit ID was final coded (currently have to get this from 3M HDM)
» Pending enhancement requests:
  Add a follow-up date column to the focus dashboard
  Add "final coded" as a parameter in the AHRQ 2 report
  Capability to schedule reports to run on a daily, weekly, or monthly basis
  Show AHRQ, HAC & PPC report indicator variables as a column in the report
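The aggregation requested above (counting final outcomes and their reasons for a review period) is straightforward once the review records can be exported. A minimal sketch, assuming a flat list of records with the outcome and reason labels from the slide; the records themselves are illustrative.

```python
from collections import Counter

# Illustrative review records for one review period.
reviews = [
    {"outcome": "Observed (True)", "reason": "Met Inclusion Criteria"},
    {"outcome": "At Risk (False)", "reason": "Documentation Amended"},
    {"outcome": "At Risk (False)", "reason": "Coding Amended"},
    {"outcome": "Pending",         "reason": "No Provider Response"},
]

# Tally reviews by final outcome and by reason.
by_outcome = Counter(r["outcome"] for r in reviews)
by_reason = Counter(r["reason"] for r in reviews)
```

The same two counters, run per month or per quarter, would give the review-period summary the team currently assembles by hand.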
» Questions
Catherine Porto, MPA, RHIA, CHP
Executive Director, HIM
cporto@salud.unm.edu
505-925-4454
- and -
Madelyn C. Horn Noble
HIM Data Analyst
mhornnoble@salud.unm.edu
505-272-7038