Quality of Electronic Pathology (E-path) Records: A Function of Time, X Factors and One Constant

Quality of Electronic Pathology (E-path) Records: A Function of Time, X Factors and One Constant Jovanka Harrison, Ph.D. New York State Cancer Registry North American Association of Central Cancer Registries Annual Conference, Albuquerque, New Mexico, June 16-23, 2017

July 13, 2017 2 Objective 1 Utilize a previously developed data quality assessment tool, the Quality Report Card, to evaluate change in data quality of pathology records received in the NAACCR Standards Volume V format, submitted by five hospital-based laboratories* to the NYSDOH Electronic Clinical Laboratory Reporting System (ECLRS). * Receiving e-path services from the same vendor. The services included daily electronic transmission of HL7 messages from the reporting facility to ECLRS.

Objective 2 Utilize the same assessment tool, covering the same time period, to evaluate change in data quality of pathology records received in the NAACCR Standards Volume V format, submitted by five independent laboratories.

Background ECLRS is a NYSDOH-developed, Web-based reporting system within the Health Commerce System (HCS), a secure infrastructure for Mandatory Reportable Conditions (e.g., HIV/AIDS, communicable diseases, lead poisoning, and cancer). Supported (cancer) formats: HL7 (v. 2.x), NAACCR ASCII, and Web data entry (HTML).

Background Continued The NYSCR Cancer-ECLRS Initiative started in 2001. Its mission: to recruit labs for e-path reporting and thereby improve compliance and timeliness. High-volume reporters (>1,000 reportable cancer cases per year) are encouraged to use the NAACCR Standards for Cancer Registries Volume V (HL7 v. 2.x) format.

Background Cont'd Initial recruiting efforts focused on onboarding independent pathology laboratories, with the aim of capturing cancer cases usually diagnosed in a nonhospital setting, such as malignant melanoma of the skin or prostate cancer. In recent years, the scope has been expanded to include hospital-based laboratories.

At a Glance: Number of E-Path Lab Reports Received by NYSCR Cancer-ECLRS, by Data Format (Web, File, Total), 2004-2016. [Chart: annual report counts, y-axis 0 to 140,000]

Onboarding of Hospital-based Labs: The Timeline. Lab A: 12/17/2014; Lab B: 1/22/2015; Lab C: 3/17/2015; Lab D: 4/9/2015; Lab E: 6/15/2015.

Background Cont'd An important part of the onboarding of labs is the certification process. The goal is to ensure good-quality data from the go-live date (the date a lab is certified for reporting into production). Subsequent monitoring (including through the QA Report Card) assures high data quality.

Certification Process The certification process involves two phases: 1) pre-certification: Is the HL7 message structure correct? (may use dummy data in the first test file(s)); 2) certification: validation of content and structure (real data), optionality/usage, and reportability. Specifically, electronic and paper submission of pathology reports covering a sample of sites (e.g., prostate; breast, including FISH and ER/PR analyses; colon; malignant melanoma of skin; hematologic malignancies).

Dissecting the Path Report Looking for LOINCs (Logical Observation Identifiers Names and Codes) in OBX-3 (the "question code") of the HL7 message:
(1) Relevant (Clinical) History 22636-5
(2) Site of Origin 22633-2
(3) Gross Description 22634-0
(4) Microscopic Observation 22635-7
(5) Final Diagnosis (narrative) 22637-3
(6) Pathologic Findings (by CAP) 33746-9
(7) Comments (Interpretation) 22638-1
(8) Supplemental Reports 22639-9
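The section lookup above can be sketched as a small parser: pull the LOINC "question code" from OBX-3 of each OBX segment and label it. This is a minimal illustration, not the registry's actual processing pipeline; the sample message below is fabricated.

```python
# Map the LOINC question codes listed above to path-report sections.
SECTION_BY_LOINC = {
    "22636-5": "Relevant (Clinical) History",
    "22633-2": "Site of Origin",
    "22634-0": "Gross Description",
    "22635-7": "Microscopic Observation",
    "22637-3": "Final Diagnosis (narrative)",
    "33746-9": "Pathologic Findings (by CAP)",
    "22638-1": "Comments (Interpretation)",
    "22639-9": "Supplemental Reports",
}

def obx3_sections(hl7_message: str) -> list:
    """Return (LOINC code, section name) for each OBX segment."""
    sections = []
    # HL7 v2 segments are carriage-return delimited; fields use '|'.
    for segment in hl7_message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] != "OBX":
            continue
        # OBX-3 is the observation identifier; its first component
        # (before '^') carries the coded value, e.g. a LOINC code.
        code = fields[3].split("^")[0]
        sections.append((code, SECTION_BY_LOINC.get(code, "unknown")))
    return sections

# Fabricated two-segment example (not real patient data).
sample = ("OBX|1|TX|22636-5^Clinical History^LN||...\r"
          "OBX|2|TX|22637-3^Final Diagnosis^LN||...")
print(obx3_sections(sample))
# [('22636-5', 'Relevant (Clinical) History'), ('22637-3', 'Final Diagnosis (narrative)')]
```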

Evaluation of Cancer Lab Reporting: The Quality Report Card The QA Card is a tool to monitor and review the data quality of specific required and recommended data items in a systematic way. For each reporting lab, and for a selected period, the Card shows the number of reports received and calculates the percentage of missing (no value submitted) and invalid values per data item.

The Quality Report Card Required data items: Patient Demographics (patient name, address, DOB, gender, race, SSN); Tumor-related Information (specimen collection date, relevant/clinical history, nature of specimen/site of origin, final dx). Recommended data items: Ordering Provider/Facility Information (name, address).
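The Card's core calculation (percent missing and percent invalid per data item) could look roughly like the sketch below. The field names, the toy batch, and the SSN validity rule are assumptions for illustration, not the registry's actual algorithm.

```python
import re

def report_card(reports, item, is_valid):
    """Per-item summary: counts reports, % missing, % invalid."""
    n = len(reports)
    # Missing = no value submitted for the data item.
    missing = sum(1 for r in reports if not r.get(item))
    # Invalid = a value was submitted but fails the validity rule.
    invalid = sum(1 for r in reports if r.get(item) and not is_valid(r[item]))
    return {
        "item": item,
        "n_reports": n,
        "pct_missing": round(100 * missing / n, 1),
        "pct_invalid": round(100 * invalid / n, 1),
    }

# Toy batch: one missing SSN, one malformed SSN, one well-formed SSN.
batch = [{"ssn": ""}, {"ssn": "12-34"}, {"ssn": "123456789"}]
ssn_ok = lambda v: re.fullmatch(r"\d{9}", v) is not None
print(report_card(batch, "ssn", ssn_ok))
# {'item': 'ssn', 'n_reports': 3, 'pct_missing': 33.3, 'pct_invalid': 33.3}
```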

Methods The onboarding of five hospital-based labs within a short period, all using the same vendor ("the constant") to create their HL7 feed, was treated as a controlled experiment, providing an opportunity to evaluate whether data quality for specific data items changes over time. For each lab, the Card was run at four time points: 30 days, 90 days, one year, and 18 months from certification. This process was repeated for the independent* labs. * Timeline starting point: the middle point of the onboarding of hospital-based labs.

Select Results: Percentage Missing Clinical History, Five Hospital-based Labs Reporting to NYSCR, at Four Time Points from Go-Live (30 days, 90 days, one year, 18 months). [Chart: percent missing, 0-20%, for Labs A-E]

Select Results: Percentage Missing Clinical History, Five Independent Labs Reporting to NYSCR, at Four Time Points from Go-Live (30 days, 90 days, one year, 18 months). [Chart: percent missing, 0-100%, for Labs AA-EE]

Select Results: Percentage Missing SSN, Five Hospital-based Labs Reporting to NYSCR, at Four Time Points from Go-Live (30 days, 90 days, one year, 18 months). [Chart: percent missing, 0-80%, for Labs A-E]

Select Results: Percentage Missing SSN, Five Independent Labs Reporting to NYSCR, at Four Time Points from Go-Live (30 days, 90 days, one year, 18 months). [Chart: percent missing, 0-80%, for Labs AA-EE]

Select Results: Percentage Invalid Values for SSN, Five Hospital-based Labs Reporting to NYSCR, at Four Time Points from Go-Live (30 days, 90 days, one year, 18 months). [Chart: percent invalid, 0-35%, for Labs A-E]

Select Results: Percentage Invalid Values for Ordering Facility Name, Five Independent Labs Reporting to NYSCR, at Four Time Points from Go-Live (30 days, 90 days, one year, 18 months). [Chart: percent invalid, 0-40%, for Labs AA-EE]

Summary of Select Results There is great variability among labs in data quality. Patient demographics, especially address and race/ethnicity, are more often provided by hospital-based laboratories than by independent path labs. Reference laboratories (e.g., those that perform genomic tests) tend to provide fewer patient demographics. For example, 3 of 5 independent path labs had 100% missing patient address; two of the three are genomic labs and one is a dermatopathology lab. Labs sending a local version of the ordering facility name, e.g., "Lab B at Union Street 52," will get dinged by the algorithm.
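One way a local facility-name variant like "Lab B at Union Street 52" could trip an invalid-value check is a validator that only accepts names found on a reference list after light normalization. The reference list and the normalization rule here are assumptions for the example, not the registry's actual algorithm.

```python
# Hypothetical reference list of recognized ordering facility names.
KNOWN_FACILITIES = {"lab b", "general hospital"}

def normalize(name):
    # Light normalization: lowercase and collapse whitespace.
    return " ".join(name.lower().split())

def facility_name_valid(name):
    """True only if the normalized name matches a recognized facility."""
    return normalize(name) in KNOWN_FACILITIES

print(facility_name_valid("Lab B"))                     # True
print(facility_name_valid("Lab B at Union Street 52"))  # False: local variant
```

An exact-match rule like this explains the "dinged" behavior: any locally decorated variant fails, so a fuzzier match (or lab-side standardization) would be needed to accept it.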

Lessons Learned Quality Assurance (QA) processes should be re-evaluated and updated, if necessary, at least yearly. The number and nature of onboarding facilities/clients will drive the QA process. The more varied the facilities in terms of data quality (and type of information submitted), the more complex the evaluation (e.g., algorithms for invalid values). The more timely the submission of pathology reports (e.g., the path report is received by the central cancer registry within a day of being entered into the submitting lab's LIS), the greater the need for frequent QA.

Lessons Learned Continued The higher the e-path reporting volume (even with generally good data quality), the greater the need for resources at the registry to assure data quality and consistency; for example, 2% of 100 path reports missing patient address is trivial, but 2% of 120,000 is 2,400 reports, which require a different type of attention and resources. More frequent QA feedback to reporting labs, not just error-driven re-education, might facilitate improved buy-in by labs.

Next Steps Improve the algorithm for handling invalid values; evaluate patterns of variable and/or decreasing data quality among labs; continue collaboration with NYS DOH in developing a comprehensive Timeliness and Compliance Report Card for path labs; conduct targeted QA education of labs with known issues; consider de-certification of non-compliant labs.

Special Thanks To my colleagues at the NYSCR & NYSDOH: Maria Schymura, Ph.D. (Director, Bureau of Cancer Epidemiology); Amy R. Kahn, M.S., CTR; Todd Szwetkowski, CTR; Jovan Ormsby, CTR; the Analysis and Output Unit; NYSCR staff (our invaluable CTRs); and the NYS DOH Electronic Clinical Laboratory Reporting System staff.

Acknowledgement This work was funded in part by CDC's Cooperative Agreement U58/DP003879 awarded to the New York State Department of Health. The contents are solely the responsibility of the New York State Department of Health and do not necessarily represent the official views of the Centers for Disease Control and Prevention.