
Chapter 1 INTRODUCTION TO THE ACS NSQIP PEDIATRIC

1.1 Overview

A highly visible and important issue facing the medical profession and the healthcare industry today is the quality of care provided to patients. To that end, numerous stakeholders (providers, payors, and private industry) are investing substantial resources in efforts to measure, report, and improve clinical care. The National Surgical Quality Improvement Program (NSQIP), which was singled out by the Institute of Medicine's report Leadership by Example as one of the most highly regarded VHA initiatives employing performance measures, is well aligned to meet these demands. The American College of Surgeons is confident that the ACS NSQIP is one of the best ways to benchmark and improve surgical care.

Surgical procedures differ from medical treatments of disease in ways that lend themselves to observational outcome studies. Whether patients live or die, have complications, are cured, have their symptoms relieved, return to work or play, and are satisfied with their care are all vital to the assessment of the quality of surgical care. The Veterans Health Administration (VHA) addressed surgical quality improvement by developing the National Surgical Quality Improvement Program (NSQIP), a prospective, peer-controlled, validated methodology that computes and reports 30-day risk-adjusted surgical outcomes. After years of study and testing in the private sector, the American College of Surgeons has made the ACS NSQIP available to eligible hospitals in the United States and Canada.

1.2 Background and History

NSQIP started in the Veterans Health Administration (VHA). In the mid-1980s the VHA came under a barrage of criticism, from various media sources, regarding high mortality rates associated with operative procedures. As a result, in December 1985 the United States Congress passed Public Law 99-166, stating that, among other measures, the VHA should report its outcomes in comparison to national averages and that those outcomes must be risk-adjusted to account for the severity of illness of the VHA surgical patient population. Initial efforts focused on analyzing existing administrative databases and on conducting limited reviews of data solicited from the field. An ad-hoc advisory group recognized that (1) it was impossible to make meaningful comparisons of VA surgical results to national standards because those standards did not exist, and (2) any attempt to compare surgical results should take into account the preoperative surgical risk of the patients whose outcomes are being compared. A Steering Committee was established and charged with developing a system of outcome reporting for all types of surgery in the VHA that would account for a patient's risk factors and would allow for an assessment of the quality of surgical results.

Additionally, the Steering Committee recognized the need for meaningful comparison of these results between VA medical centers and, potentially, other health care sectors in the US. The development of patient risk models was the necessary first step in this endeavor. The VHA was in a unique position to develop the NSQIP because of its centralized authority over all VHA hospitals and the existence of a uniform electronic medical record in the VHA. As of December 31, 1993, information about patient preoperative risk and postoperative outcomes had been recorded for 500,000 non-cardiac surgical procedures. Risk assessment models were developed for surgical mortality and morbidity for groups of common surgical procedures as well as for seven surgical subspecialties (general, vascular, urology, orthopedics, neurosurgery, ENT, and thoracic surgery). In 1994 the NVASRS (National VA Surgical Risk Study) expanded to all 128 VHA hospitals that performed surgery and became the National Surgical Quality Improvement Program (NSQIP). In 1995 a validation study was conducted to determine the validity of the risk-adjusted surgical morbidity and mortality rates as measures of quality of care; this study assessed the processes and structures of care in surgical services at sites with higher- or lower-than-expected risk-adjusted mortality and morbidity rates. As of 2003, there were over 1.3 million major surgical cases in the VHA database. Impressive results from the NSQIP in the VHA have demonstrated a 27% decrease in 30-day surgical mortality and a 45% decrease in 30-day surgical morbidity. [1]

Beginning in 1999, the NSQIP expanded into the private sector with an initiative involving three alpha sites: Emory University in Atlanta, the University of Kentucky in Lexington, and the University of Michigan in Ann Arbor. The goal of this initiative was to determine the feasibility and applicability of the NSQIP in the private sector. The alpha test determined that the risk-adjustment models were predictive of outcomes in the private sector. The NSQIP then expanded to a beta test involving 18 private sector sites: Emory University in Atlanta, the University of Kentucky in Lexington, the University of Michigan in Ann Arbor, Washington University in St. Louis, the University of Florida in Gainesville, St. Louis University, the University of Utah, Brigham & Women's Hospital in Boston, the University of California, San Francisco, the University of Maryland, Baltimore, New York Presbyterian Hospital (Columbia and Cornell campuses), the University of Virginia, and Massachusetts General Hospital, as well as four community hospitals: Faulkner Hospital, Newton-Wellesley Hospital, Salem Hospital, and Union Hospital. Results from studies to date have demonstrated that the NSQIP models and data collection methods are readily applicable to the private sector patient population. [2] Since September 2004, the American College of Surgeons has supported and directed the private sector expansion of the ACS NSQIP to all qualified sites.

1.3 Pediatric Pilot Program

The ACS began a collaborative effort with the American Pediatric Surgical Association (APSA) in 2007 to develop the ACS Pediatric NSQIP. The alpha phase of the Pediatric NSQIP began in 2008, with the beta phase commencing in 2009. The objective of the pilot program was to assess all aspects of the program including, but not limited to, the development of data definitions, data collection worksheets, the workstation for data entry, and the data analysis methodology.

1.4 Data Collection

The ACS NSQIP-P collects over 1,300 data points from nine pediatric surgical specialties on patients less than 18 years of age. The ACS NSQIP-P collects the following data points on every patient:

Demographics - 7 variables (includes IDN)
Surgical Profile - 11 variables
Pre-operative Data - 45 clinical variables and 13 laboratory variables
Intra-operative Data - 11 clinical variables and 4 occurrence variables
Post-operative Occurrence Data - 21 clinical variables
Discharge Data - 12 clinical variables
Neonatal Data - 10 clinical variables

At each participating hospital, a specially trained, dedicated Surgical Clinical Reviewer (SCR) collects the preoperative, intraoperative, and postoperative outcome data on all cases that meet program criteria and enters this information into a secure database. These data are transmitted to Outcome Sciences, where, after checks for valid data entry, they are subjected to statistical analysis. [1]
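For illustration only, the sketch below shows one way a single case record spanning the variable categories listed above might be organized before secure transmission. The field names, types, and groupings are hypothetical simplifications and are not the actual ACS NSQIP-P variable names or counts.

```python
# Hypothetical, simplified sketch of a per-case record grouped by the
# ACS NSQIP-P variable categories listed above. Field names are
# illustrative only; they are not the actual program variables.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CaseRecord:
    # Demographics (e.g., identifier, age, sex)
    idn: str                      # de-identified case identifier
    age_days: int
    sex: str

    # Surgical profile (e.g., principal procedure, ASA class)
    cpt_code: str
    asa_class: int

    # Pre-operative clinical and laboratory variables
    preop_clinical: dict = field(default_factory=dict)
    preop_labs: dict = field(default_factory=dict)

    # Intra-operative variables and occurrences
    intraop: dict = field(default_factory=dict)

    # Post-operative occurrences within 30 days
    postop_occurrences: list = field(default_factory=list)

    # Discharge and neonatal variables (neonatal only when applicable)
    discharge: dict = field(default_factory=dict)
    neonatal: Optional[dict] = None

# Example: a record an SCR might assemble before secure transmission.
case = CaseRecord(idn="SITE001-0001", age_days=420, sex="female",
                  cpt_code="44950", asa_class=2,
                  postop_occurrences=["superficial SSI"])
```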

1.5 Risk-Adjustment (detailed description)

Risk-adjusted models for ACS NSQIP participating hospitals are computed every 6 months, using data from the previous 12-month period (with a 6-month lag due to case locking and the time required for the analysis), and are reported semi-annually in the SAR (Semi-Annual Report). Risk-adjusted results are computed for 30-day postoperative Mortality and Morbidity for Overall (Non-Multispecialty), Overall (Multispecialty), General Surgery, Colorectal Surgery, Vascular Surgery, additional surgical subspecialty cases, and other models as new ACS NSQIP programs come online (e.g., Targeted). Also included in the SAR are risk-adjusted results focusing on specific surgical occurrences. Risk adjustment is important because it takes into account differences in patient and procedure mix between hospitals, thus permitting fair comparisons of hospitals. ACS NSQIP modeling techniques have evolved and improved over time as new statistical techniques have become available. Our current risk-adjustment modeling generally follows the process described below, though certain models might be approached somewhat differently.

For each outcome, a linear risk is assigned to each clinical grouping of CPT codes. This is usually accomplished by modeling CPT group as a categorical predictor for each binary outcome (e.g., death). The logit-transformed predicted probability associated with each CPT category is then used in follow-on models. Forward stepwise logistic regression is then used to develop a predictive model for each outcome. At the first step the predictor that is most strongly associated with the outcome is selected. After that, associations between all remaining unselected variables and the outcome are adjusted for predictors already in the model, and the variable with the next strongest association is entered. This continues until no remaining variable is capable of significantly improving the model. This process chooses a useful predictive set from 50 or more potential predictive variables.

The variables selected by the logistic regression are then used in a hierarchical model. One important advantage of hierarchical models, as we have implemented them, is that they pool what is known about all hospitals with what is known about a specific hospital to achieve the best prediction about hospital quality. Particularly when the hospital sample size is small, this pooling results in the hospital quality estimate being shrunk towards the mean of all hospitals.

For technical reasons, the best hospital quality metric derived from a hierarchical model is the odds ratio. The odds ratio is the ratio of the odds (number of patients exhibiting the event / number of patients not exhibiting the event) for a target group to the odds for a comparison group. An odds ratio of 1.0 implies that the event is equally likely in both groups. An odds ratio greater than 1.0 implies that the event is more likely in the first group, and an odds ratio less than 1.0 implies that the event is less likely in the first group. For example, an odds ratio of 1.5 for ASA class 3 versus ASA class 1 means that there is a 50% increase in the odds of the event for patients with ASA class 3.

For ACS NSQIP hierarchical modeling, an odds ratio is also constructed for the effect of hospital on the outcome. For these odds ratios, the numerator is the odds for the hospital (where the observed numbers of events and nonevents are adjusted by the statistical model) and the denominator is the average odds for all hospitals. If the odds ratio is equal to 1.0, the hospital is performing as expected. If the odds ratio is greater than 1.0, the hospital is performing worse than expected. If the odds ratio is less than 1.0, the hospital is performing better than expected.
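The sketch below illustrates the general idea of forward stepwise selection for a binary outcome, using ordinary (non-hierarchical) logistic regression and a likelihood-ratio stopping rule. It is a simplified sketch of the technique described above, not the ACS NSQIP production code; the column names, significance threshold, and use of the statsmodels library are assumptions made for illustration.

```python
# Simplified illustration of forward stepwise logistic regression:
# at each step, add the candidate predictor that most improves the
# model fit, stopping when no candidate yields a significant
# improvement. Sketch of the general technique, not ACS NSQIP code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

def forward_stepwise_logit(df: pd.DataFrame, outcome: str,
                           candidates: list, alpha: float = 0.05):
    selected = []
    # Log-likelihood of the intercept-only model.
    current_ll = sm.Logit(df[outcome], np.ones(len(df))).fit(disp=0).llf
    while True:
        best_var, best_ll = None, current_ll
        for var in candidates:
            if var in selected:
                continue
            X = sm.add_constant(df[selected + [var]])
            ll = sm.Logit(df[outcome], X).fit(disp=0).llf
            if ll > best_ll:
                best_var, best_ll = var, ll
        if best_var is None:
            break
        # Likelihood-ratio test for the improvement from adding best_var.
        lr_stat = 2 * (best_ll - current_ll)
        p_value = stats.chi2.sf(lr_stat, df=1)
        if p_value >= alpha:
            break  # no remaining variable significantly improves the model
        selected.append(best_var)
        current_ll = best_ll
    return selected

# Hypothetical usage: 'cpt_logit' stands in for the logit-transformed
# CPT-group risk described above; the other columns are candidate
# preoperative variables (all names are invented for illustration).
# predictors = forward_stepwise_logit(cases_df, outcome="death30",
#                                     candidates=["cpt_logit", "asa_class",
#                                                 "age_days", "sepsis_preop"])
```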
1.6 Use of ACS NSQIP for Quality Improvement

Bi-annually, the statistical results of the data are reported back to the participating sites for their review and use in quality improvement. This comprehensive report includes the ORs from the hierarchical regression models. These ORs provide an overall assessment of the quality of care in the surgical service. Through detailed study of the additional reports, the comparative outcome statistics guide the user towards areas for process improvement. Additionally, a suite of continuously updated, continuously available online reports is on the website for each site to use and review.

Some of the reports currently available cover postoperative mortality and morbidity, as well as preoperative risk factors, wound class, and CPT codes, all of which are benchmarked against the average of all the private sector sites participating in the ACS NSQIP. These reports are used by the participating sites for early and ongoing identification of areas for quality improvement.
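As a small illustration of how the hospital odds ratios described in Section 1.5 can be read when they appear in a report, the sketch below applies the as-expected / worse-than-expected / better-than-expected rule to a hypothetical set of ORs. The hospital labels and values are invented, and the actual SAR presentation may differ.

```python
# Illustrative only: applying the interpretation rule from Section 1.5
# to a hypothetical table of risk-adjusted hospital odds ratios (ORs).
# Hospital labels and OR values are invented.
def interpret_hospital_or(odds_ratio: float) -> str:
    """Classify a hospital's risk-adjusted odds ratio for an outcome."""
    if odds_ratio > 1.0:
        return "worse than expected"
    if odds_ratio < 1.0:
        return "better than expected"
    return "as expected"

hypothetical_report = {"Hospital A": 0.82, "Hospital B": 1.00, "Hospital C": 1.37}
for hospital, or_value in hypothetical_report.items():
    print(f"{hospital}: OR={or_value:.2f} -> {interpret_hospital_or(or_value)}")
```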

The ACS NSQIP also provides best practice guidelines and hospital case studies to all participating sites for use in quality improvement initiatives at the hospital level.

1.7 Reliable Data

The ACS NSQIP's strength is reliable data. Appropriate decisions cannot be made on data if the reliability of that data is questionable. The ACS NSQIP goes to great lengths to assure the reliability of the data through a variety of means, such as consistent, detailed initial training, continuing education modules, inter-rater reliability site visits and testing, regular online case studies, regular conference calls, and an annual conference to review all aspects of the Program and the data collection process. Participants in the Program have come to trust the reliability of the data and use the online reports in conjunction with the bi-annual reports to identify opportunities to improve the processes and outcomes of surgical care.

1.8 Structure of the ACS Pediatric NSQIP

The American College of Surgeons has overall responsibility for the ACS Pediatric NSQIP and specifically manages all of the contractual, financial, and marketing efforts of the Program. The ACS provides stewardship of and education about the Program to interested participants and regulatory bodies and provides oversight of all of the functions of the Program. Oversight for the ACS Pediatric NSQIP is provided by the Pediatric Measurement and Evaluation and Steering Committees, in conjunction with the ACS. The role of these governing bodies is to maintain and oversee the processes, structures, and functions of the ACS NSQIP-P, including but not limited to the reliable collection of the data, annual review of the risk-adjusted outcomes with provision of feedback, strategic planning, scientific mining of the data, and peer review.

The day-to-day management of the ACS NSQIP-P resides within the American College of Surgeons. The American College of Surgeons provides programmatic and clinical oversight of the SCRs in the field, while Outcome Sciences provides the IT infrastructure. Each SCR has access to a dedicated team of Clinical Support Specialists available to assist with clinical issues. Technical support staff from Outcome Sciences are available to provide support for all information technology issues. Statistical analysis of all Program data is performed by the American College of Surgeons.

1.9 About this Operations Manual

This operations manual is intended to provide detailed guidelines for the SCR to assess cases and collect data for the ACS NSQIP-P. While not all questions can be answered within a single source, this manual is a compendium of years of experience with the NSQIP-P. Questions not answered in this source should be addressed with the Clinical Support Specialists.

We hope that you find this manual a useful reference to guide your practice, and we welcome feedback for future improvements.

1. Khuri SF, Daley J, Henderson WG. (2002) The Comparative Assessment and Improvement of Quality of Surgical Care in the Department of Veterans Affairs. Archives of Surgery, 137:20-27.

2. Khuri SF, Daley J, Henderson WG, Hur K, Demakis J, Aust JB, Chong V, Fabri PJ, Gibbs JO, Grover F, Hammermeister K, Irvin G, McDonald G, Passaro E, Phillips L, Scamman F, Spencer J, Stremple JF, and the participants in the National VA Surgical Quality Improvement Program. (1998) The Department of Veterans Affairs NSQIP: The First National, Validated, Outcome-Based, Risk-Adjusted, and Peer-Controlled Program for the Measurement and Enhancement of the Quality of Surgical Care. Annals of Surgery, 228(4):491-507.

3. Daley J, Forbes MG, Young GJ, Charns MP, Gibbs JO, Hur K, Henderson WG, Khuri SF. (1997) Validating Risk-Adjusted Surgical Outcomes: Site Visit Assessment of Process and Structure. Journal of the American College of Surgeons, 185:341-351.

4. Fink AS, Campbell DA, Mentzer RM, Henderson WG, Daley J, Bannister J, Hur K, Khuri SF. (2002) The National Surgical Quality Improvement Program in Non-Veterans Administration Hospitals: Initial Demonstration of Feasibility. Annals of Surgery, 236(3):344-354.