Journey Towards Automated Data Abstraction of CMS Core Measures at NYP
Scott W. Possley, PA-C, MPAS
Objectives
- Describe our hospital
- Discuss rationale behind automation of perinatal care CMS core measures
- Describe the process of our automation project
- Address barriers
- Discuss next steps for future automation projects
Automation Team
- Scott W. Possley, PA-C, MPAS (Director, QPS)
- Peggy Liu, RN, MS (Core Measures Manager, QPS)
- Elsie Binns, MS (Performance Improvement Specialist, Women's Health)
- Linda Georges, RN (Data Abstractor)
- Hillary Shaw, MPA (Administrative Director, Women's Health)
- Karthik Natarajan, PhD (Programmer)
About NewYork-Presbyterian Hospital (NYP)
- Six main facilities located in and around New York City
- 2,600 patient beds (one license)
- 125,000 annual discharges; 2 million outpatient visits
- Affiliated with the Columbia University and Weill Cornell medical schools
- 15,000 live births at 4 of our 6 campus hospitals
- Inpatient EMR is Allscripts SCM
Terminology
- TJC: The Joint Commission
- NYS DOH: NY State Dept of Health
- CMS: Centers for Medicare & Medicaid Services
- NDNQI: National Database of Nursing Quality Indicators
- NHSN: National Healthcare Safety Network (CDC)
- NSQIP: National Surgical Quality Improvement Program
- STS: The Society of Thoracic Surgeons
- NCDR: National Cardiovascular Data Registry
- PCI: Percutaneous Coronary Intervention
- AJRR: American Joint Replacement Registry
Databases and Registries at NYP
- >35 RN/PA abstractors, many with advanced degrees and certifications, abstracting for >25 databases and registries
- Database: a sampling of cases is abstracted
- Registry: 100% of cases are abstracted
- Growth of data is greater than the retirement of metrics
- Barrier: staffing does not always grow with the increase in reporting
Data Burden
- Mandated reporting to TJC, NYS DOH and CMS
  - E.g. CMS Core Measures; NDNQI; NHSN; Stroke Certification; NYS Sepsis; Solid Organ Transplant (UNOS)
- Voluntary reporting
  - E.g. NSQIP; STS; NCDR: PCI & ACTION; Cardiac Transfer Initiative; AJRR
- Senior leadership tasked quality with automation of new and existing measures
Data Burden
- Perinatal Care Core Measure (PCCM) set: Perinatal Care Mom measures (PCM) & Perinatal Care Baby measures (PCB) identified as the first project
- Starting in Jan 2014, PCCM went from 1 measure to 5 measures
- Gaining an additional FTE was not an option
- This created an opportunity to automate the entire measure set
- Goal was 100% automation of this measure set within 5 months
Data Burden
- PC-01 Elective Delivery
- PC-02 Cesarean Birth
- PC-03 Antenatal Steroids
- PC-04 Blood Stream Infections in Newborns
- PC-05 Exclusive Breast Milk Feeding
Data Burden for PCM
- Patient on a clinical trial?
- How many weeks of gestation?
- Was patient in labor?
- Documentation that patient had undergone prior uterine surgery?
- How many live births did the patient have prior to this delivery?
- Administration of antenatal steroids?
- If applicable and no, reason for not administering antenatal steroids?
Data Burden for PCB
- Discharge disposition?
- Patient on a clinical trial?
- Assessment of whether patient received treatment for a blood stream infection within 48 hrs of admission (birth)?
- Newborn weight at delivery?
- Was the patient admitted to the NICU?
- Exclusive breastfeeding? Mother's choice?
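The mom- and baby-level data points above can be pictured as one structured abstraction record per delivery. A minimal sketch in Python follows; the class and field names are illustrative, not the actual PMT schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerinatalMomRecord:
    """PCM data points abstracted per delivery (illustrative field names)."""
    on_clinical_trial: bool
    gestational_age_weeks: Optional[int]
    in_labor: bool
    prior_uterine_surgery: bool
    prior_live_births: int
    antenatal_steroids_given: Optional[bool]
    steroids_not_given_reason: Optional[str] = None  # only if applicable

@dataclass
class PerinatalBabyRecord:
    """PCB data points abstracted per newborn (illustrative field names)."""
    discharge_disposition: str
    on_clinical_trial: bool
    bsi_treatment_within_48h: bool
    birth_weight_grams: Optional[int]
    admitted_to_nicu: bool
    exclusive_breast_milk_feeding: bool
    mothers_choice_documented: bool
```

Capturing each chart as one record like this makes the later steps (querying the data warehouse, translating to allowable values, validating) operate on a fixed set of fields rather than free-text notes.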
Data Burden
- Case volume for PCM/PCB doubled from 1,000 cases to over 2,000 cases/year
- 1 measure to 5 measures
- 7 individual metrics for PCM (15-20 min/case)
- 7 individual metrics for PCB (20-30 min/case)
- No single place to find all of the information
Abstraction Tool
Source: Nuance Clintegrity 360 PMT
Pre-Automation State
- Increase in volume and more detailed abstraction needed
- New measures for the organization, e.g. documentation of mother's choice
- Complexity of chart
  - Hundreds of pages of e-documentation to review to find a data point
  - Pinpointing in which note and by which provider (RN vs. MD/PA/NP)
  - Not consistently in the same note for each metric
Documentation
Automation Approach
- First steps: identified a multidisciplinary team
  - QPS, Clinical Service Line (SL), MDs, RNs and IT
- QPS, SL and IT met weekly to review metrics and identify where each was documented in the EMR
- The Quality Management Specialist (QM) who does the case abstraction was instrumental in the process
Automation Approach
- Each metric was identified
- CMS definitions guided where metrics were abstracted from (RN vs. MD/PA/NP notes)
- Quickly realized there were many redundancies in documentation
- Needed one source of truth
  - What is the actual gestational age?
  - What fields feed other fields?
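The "one source of truth" question above, picking a single gestational age when several redundant EMR fields exist, can be sketched as a simple field-precedence rule. The field names and their ordering below are assumptions for illustration, not NYP's actual EMR fields:

```python
from typing import Optional

# Redundant EMR fields that may hold gestational age, in precedence
# order: the first populated source wins. Names are illustrative.
GESTATIONAL_AGE_PRECEDENCE = [
    "delivery_note_ga_weeks",    # structured delivery note (preferred)
    "admission_note_ga_weeks",   # admission assessment
    "prenatal_record_ga_weeks",  # transcribed prenatal record
]

def resolve_gestational_age(chart_fields: dict) -> Optional[int]:
    """Return the first populated gestational-age field, or None."""
    for field in GESTATIONAL_AGE_PRECEDENCE:
        value = chart_fields.get(field)
        if value is not None:
            return value
    return None
```

For example, a chart with both a delivery note and an admission note would resolve to the delivery-note value; a chart with neither resolves to None and would surface as incomplete during validation.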
Issue #1: Where's the documentation?
- Quickly realized the numerous places where documentation occurred
- First task was to standardize where information was documented and by whom
- Each campus documented in different notes
- Standardized cross-campus note templates
Issue #1 Resolution
- Bring the campuses together to adopt standardized notes
- Define what would be documented where
- Define who would document (MD/PA/NP, RN, or either), which helped avoid redundancies
- Two-pronged approach
  - Standardize notes between campuses
  - Automate with current notes while awaiting standardization
Issue #1 Resolution
- To standardize documentation, needed buy-in from key stakeholders at each campus
  - More real-time data
  - Avoid redundancies
  - Alignment of work being done at each campus
  - Ability to share best practices with documentation
Issue #1 Resolution
- One campus had a large structured note that contained the necessary elements
- The document was reviewed with all of the campuses
- Modifications were made so all 4 sites agreed on a parent note
- Decision was made to adopt the note system-wide
Documentation
Automation Approach
IT documentation note modification → EMR documents → EMR data warehouse → IT report build (per PMT specs, query for each measure; translate to allowable values) → PMT data file → PMT submits on behalf of NYP → Validation
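The "translate to allowable values" step in this flow can be sketched as a lookup from raw EMR documentation values to a measure's coded allowable values. The mapping and codes below are illustrative only, not the actual Clintegrity 360 PMT specification:

```python
# Illustrative mapping from raw EMR feeding documentation to coded
# allowable values for a breast-milk-feeding measure. The codes here
# are placeholders, not the vendor's real specification.
FEEDING_ALLOWABLE_VALUES = {
    "breast milk only": "1",         # exclusive breast milk feeding
    "breast milk and formula": "2",
    "formula only": "2",
    "unable to determine": "3",
}

def to_allowable_value(raw: str, mapping: dict, default: str = "3") -> str:
    """Normalize a raw EMR value and translate it; unmapped values
    fall back to the measure's 'unable to determine' code."""
    return mapping.get(raw.strip().lower(), default)
```

Falling back to a conservative default for unmapped text is one way to make sure unexpected documentation surfaces during validation instead of silently passing.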
Automation Go-Live
- Started automation initiative in August 2013
- February 24, 2014 was identified as the date on which we would no longer abstract manually
- The built report would be submitted electronically and validated by an RN abstractor
- First upload identified Issue #2
Issue #2
- Parity was not being picked up in a majority of the charts
- A drill-down with the help of the abstractor revealed a work-around and redundant fields
- This was identified and the medical teams were updated on where to document
Automation Validation
- RN abstractor validated 100% of charts listed as passing or failing
- 100% of charts flagged as incomplete or not meeting standards were manually reabstracted
- Sometimes inaccurate or missing information needed to be updated
- Trend over time improved, with a decrease in missing/inaccurate documentation
- Still checking passing cases for accuracy
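A validation pass like the one described, where complete charts go to RN confirmation and incomplete charts are queued for manual reabstraction, might be triaged like this. The required field names and queue labels are assumptions for illustration:

```python
def triage_chart(automated_record: dict) -> str:
    """Route an automatically abstracted chart for validation.

    Charts with every required field populated go to RN confirmation;
    anything incomplete is queued for manual reabstraction.
    The required-field list below is illustrative.
    """
    required = ["gestational_age_weeks", "parity", "delivery_type"]
    missing = [f for f in required if automated_record.get(f) is None]
    return "manual_reabstraction" if missing else "rn_confirmation"
```

Under this scheme, the parity gap from Issue #2 would show up directly as a spike in the manual-reabstraction queue, which is roughly how the drill-down found it.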
Current State
[Chart] PCM percent automation by quarter, improving from 67% in 1Q 2014 to 97% in 2Q 2015 (values: 67%, 73%, 77%, 84%, 95%, 97%)
Current State
[Chart] PCB percent automation by quarter, improving from 43% in 2Q 2014 to 89% in 2Q 2015 (intermediate quarters ranged 70-77%)
Current State
- Standardization of notes cross-campus went live Jan 2015
- Strengthened partnerships more evident and data more timely
- Charts now uploaded with metrics pre-populated
- Validation of charts confirms that automation is pulling data correctly
- Time to review and validate charts is now about 5 minutes
- Error chart review is also quick due to standardization of documentation and of where the report pulls data from
Time Savings
- 15-30 min/chart time savings x 2,000 charts
- PCM & PCB now take ~5 min/chart
- RN abstractor now able to do improvement work and additional abstraction for other databases
- Goal is to perform spot checks of charts while the patient is still in-house to ensure quality of care and correct documentation of metrics
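A back-of-the-envelope calculation with the slide's own figures shows what the per-chart savings add up to over a year:

```python
# Annual time savings implied by the slide's figures:
# 15-30 minutes saved per chart across ~2,000 charts/year.
charts_per_year = 2000
minutes_saved_low, minutes_saved_high = 15, 30

hours_saved_low = charts_per_year * minutes_saved_low / 60    # 500 hours
hours_saved_high = charts_per_year * minutes_saved_high / 60  # 1000 hours

print(f"Estimated savings: {hours_saved_low:.0f}-{hours_saved_high:.0f} hours/year")
```

That 500-1,000 hours per year is roughly a quarter to a half of an FTE, which is why the RN abstractor could be redirected to improvement work and other databases.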
What we learned
- Still not at 100%
- By-product: documentation standardization
- Growing pains
  - Blood stream infections
  - Clinical trial documentation is still scanned paper
What we learned
- Need an automation project manager
- Project charter with early buy-in from respective parties
- Need clinical and IT buy-in/support
- Issue #3: priority list of data to be automated
  - Everyone wants their data automated, and it pulls on resources
  - Weekly meeting to discuss high priorities and status
What we learned
- Be realistic with goals
  - 100% vs. partial automation of a measure
- Timeline: 5 months was a lofty goal for an organization of our size
  - Note standardization changes the timeline
  - Programming notes and reports changes the timeline
  - Validation and tweaks to the process change the timeline
Next Steps
- >25 databases and registries
- Influenza Vaccination, ED, and Psychiatry Core Measures
- Neurosciences Core Measures
  - ~50% of 260 measures for ~2,000 cases/year (4.5 FTE RN/PA QM abstractors)
- Cardiac reporting registries and databases
Questions?