Hospital Service Accountability Agreement. Indicator Technical Specifications

1 Hospital Service Accountability Agreement Indicator Technical Specifications October 2015

2 TABLE OF CONTENTS

PATIENT EXPERIENCE - ACCESS, EFFECTIVE, SAFE, PERSON-CENTERED
PERFORMANCE
90th Percentile Emergency Department (ED) Length of Stay for Complex Patients
90th Percentile ED Length of Stay for Minor/Uncomplicated Patients
Percent of Priority 2, 3 and 4 Cases Completed within Access Targets for Hip Replacements
Percent of Priority 2, 3 and 4 Cases Completed within Access Targets for Knee Replacements
Percent of Priority 2, 3 and 4 Cases Completed within Access Targets for MRI
Percent of Priority 2, 3 and 4 Cases Completed within Access Targets for CT Scans
Rate of Hospital Acquired Clostridium Difficile Infections
EXPLANATORY
Percent of Stroke/TIA Patients Admitted to a Stroke Unit During Their Inpatient Stay
Hospital Standardized Mortality Ratio (HSMR)
Rate of Ventilator-Associated Pneumonia
Central Line Infection Rate
Rate of Hospital Acquired Methicillin Resistant Staphylococcus Aureus Bacteremia
Percent of Priority 2, 3, and 4 Cases Completed within Access Targets for Cardiac By-Pass Surgery
Percent of Priority 2, 3, and 4 Cases Completed within Access Targets for Cancer Surgery
Percent of Priority 2, 3 and 4 Cases Completed within Access Targets for Cataract Surgery
CLASSIFICATIONS TO BE DETERMINED
Readmissions within 30 Days for Selected HBAM Inpatient Grouper (HIG) Conditions

ORGANIZATIONAL HEALTH - EFFICIENT, APPROPRIATELY RESOURCED, EMPLOYEE EXPERIENCE, GOVERNANCE
PERFORMANCE
Current Ratio (Consolidated all sector codes and fund types)
Total Margin (Consolidated all sector codes and fund types)
EXPLANATORY
Total Margin (Hospital Sector Only)
Adjusted Working Funds / Total Revenue %

SYSTEM PERSPECTIVE - INTEGRATION, COMMUNITY ENGAGEMENT, EHEALTH
PERFORMANCE
Alternate Level of Care (ALC) Rate
EXPLANATORY
Percentage of Alternate Level of Care (ALC) Days

2016/17 HSAA Technical Specifications Page 2

3 CLASSIFICATIONS TO BE DETERMINED
Repeat Unscheduled Emergency Visits Within 30 Days for Mental Health Conditions
Repeat Unscheduled Emergency Visits Within 30 Days for Substance Abuse Conditions

APPENDIX: SERVICE VOLUME METRICS
GLOBAL VOLUMES (SCHEDULE C2 PART I)
Ambulatory Care Visits
Complex Continuing Care Weighted Patient Days
Day Surgery Weighted Visits
Elderly Capital Assistance Program (ELDCAP) Inpatient Days
ED Weighted Cases
Emergency Department and Urgent Care Visits
Inpatient Mental Health Weighted Days
Inpatient Mental Health Days
Inpatient Rehabilitation Days
Rehabilitation Separations
Total Inpatient Acute Weighted Cases
HOSPITAL SPECIALIZED SERVICES (SCHEDULE C2 PART II)
Cochlear Implants (Cases)
Sexual Assault/Domestic Violence Treatment Clinics (Patients)
WAIT TIME VOLUMES (SCHEDULE C2 PART III)
General Surgery (Base & incremental)
Paediatric Surgery (Base & incremental)
Hip & Knee Replacement - Revisions (Cases)
Magnetic Resonance Imaging (MRI) Total Hours
Ontario Breast Screening Program (OBSP) Magnetic Resonance Imaging (MRI) Total Hours
Computed Tomography (CT) Total Hours
PROVINCIAL PROGRAMS (SCHEDULE C2 PART IV)
Automatic Implantable Cardiac Defib's (# of New Implants)
Bariatric Surgery (Procedures)
QUALITY BASED PROCEDURES (SCHEDULE C2 PART V)
Congestive heart failure (CHF)
Stroke - Hemorrhage
Stroke - Ischemic or Unspecified
Stroke - Transient Ischemic Attack (TIA)
Non-Cardiac Vascular Aortic Aneurysm (AA)
Non-Cardiac Vascular Lower Extremity Occlusive Disease (LEOD)
Acute Primary Unilateral Hip Replacement
Inpatient Rehab for Primary Unilateral Hip Replacement
Acute Primary Unilateral Knee Replacement
Inpatient Rehab for Primary Unilateral Knee Replacement
Acute Primary Bilateral Joint Replacement (Hip/Knee)
Inpatient Rehab Primary Bilateral Joint Replacement (Hip/Knee)

2016/17 HSAA Technical Specifications Page 3

4 Hip Fracture
Knee Arthroscopy
Neonatal Jaundice (Hyperbilirubinemia)
Tonsillectomy
Chronic Obstructive Pulmonary Disease (COPD)
Pneumonia
Cataract

Copyright 2015, Queen's Printer for Ontario. All rights reserved.

2016/17 HSAA Technical Specifications Page 4

5 PATIENT EXPERIENCE Access, Effective, Safe, Person-Centered Performance INDICATOR NAME Detailed description of indicator INDICATOR CLASSIFICATION PERFORMANCE STANDARD 90TH PERCENTILE EMERGENCY DEPARTMENT (ED) LENGTH OF STAY FOR COMPLEX PATIENTS The total ED length of stay* where 9 out of 10 complex patients completed their visits. *ED Length of Stay defined as the time from triage or registration, whichever comes first, to the time the patient leaves the ED. Performance Target: (i) For hospitals performing at the provincial target or better: Performance target= maintain or improve current performance. Historical performance should be reviewed prior to target setting, and plans to maintain or improve the performance target are to be discussed. (ii) For hospitals performing below the provincial target: Performance target= provincial target or better. Historical performance should be reviewed prior to target setting, and plans to achieve the performance target are to be discussed. INDICATOR CALCULATION Corridor: Upper corridor = performance target + 10% Step 1: Calculate ED length of stay in hours for each patient. Step 2: Apply inclusion and exclusion criteria. Step 3: Sort the cases by ED length of stay from shortest to longest. Step 4: The 90th percentile is the case where 9 out of 10 complex patients have completed their visits. NACRS, Canadian Institute for Health Information (CIHI) via Ontario s ER NACRS Initiative (ERNI-Level 1). Inclusion Criteria: Admitted patients Disposition Codes 06 and 07 Non-Admitted Patients (Disposition Codes 01, and 08 15) with assigned CTAS I, II, or III 2016/17 HSAA Technical Specifications Page 5

6 Exclusion Criteria: 1. ED visits where Registration Date/Time and Triage Date/Time are both blank/unknown (9999) 2. ED visits where the MIS functional centre is under Emergency Trauma, Observation or Emergency Mental Health Services (as of January 2015 data) 3. Duplicate cases within the same functional center where all ER data elements have the same values except for Abstract ID number 4. ED visits where the ED visit Indicator is = '0' 5. ED visits where patient has left without being seen by a physician during his/her visit (Disposition Code 02 and 03) 6. ED Length of Stay is greater than or equal to minutes (1666 hours) 7. Non-Admitted Patients (Disposition Codes and 08 15) with assigned CTAS IV or V 8. Non-Admitted Patients (Disposition Codes and 08 15) with missing CTAS TIMING/FREQUENCY OF RELEASE How often, and when, are data being released GEOGRAPHY & TIMING ADDITIONAL INFORMATION LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, interpretation, data source, etc. Access to Care at Cancer Care Ontario (CCO) has ongoing data quality improvement process in place for ER Level 1 NACRS. Timeliness and accuracy are key data quality dimensions that are monitored closely on a regular basis. Access to Care has developed a data quality matrix to assess the quality of ED data at three levels: Province, LHIN and facility. ED data in Ontario via CIHI-NACRS have been collected since July However, NACRS was modified to accept different reporting levels as of April The calculated 90th percentile ED length of stay at the provincial level, for the latest month can be compared with the baseline of April 2008 in the Emergency Room Wait Times Government of Ontario website and in the ED Reports provided to LHINs and hospitals every month (Provincial ER Highlights Report, ER LHIN Highlight Report, and the LHIN ER Pay for Results Report). Historical trend data from April 2008 onwards for all ED facilities in Ontario are available on request to ATCDataRequest@cancercare.on.ca. A small percentage of records are excluded from the analysis every month, due to missing/invalid values for the relevant wait time fields (such as Time patient left ED or Registration time etc.). Calculated indicator value is based on ED visits submitted by 126 sites participating in the ER National Ambulatory Care Reporting System (NACRS) Initiative (ERNI) reporting to the NACRS database. Approximately 90% of ED Visits in Ontario are captured by hospital sites participating in ERNI (based on NACRS 13/14 data released July 2014). As of April 2009, patient s stay in a designated Clinical Decision Unit (CDU) will be excluded in the total time spent in ED. 2016/17 HSAA Technical Specifications Page 6

7 Due to the introduction of newly designated CDUs in select hospitals, there may be a difference in the calculation methodology of the baseline and the quarterly indicators. Access to Care (ATC) Informatics regularly informs the LHINs of the CDU impact on the overall time spent in the Emergency Department through the monthly ED reports. LHINs are provided with the list of hospitals with designated CDUs. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING Cancer Care Ontario DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 7
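The four calculation steps for this indicator translate into a short script. The sketch below is illustrative only: it assumes Step 2 (the inclusion and exclusion criteria) has already been applied to the visit list, it uses a simple nearest-rank convention for "the case where 9 out of 10 patients have completed their visits", and the function names and sample values are hypothetical rather than taken from the specification.

```python
import math
from datetime import datetime
from typing import List, Optional


def ed_los_hours(triage: Optional[datetime],
                 registration: Optional[datetime],
                 left_ed: datetime) -> Optional[float]:
    """Step 1: ED length of stay in hours, from triage or registration
    (whichever comes first) to the time the patient left the ED.
    Returns None when both start times are missing (such visits are
    excluded in Step 2)."""
    starts = [t for t in (triage, registration) if t is not None]
    if not starts:
        return None
    return (left_ed - min(starts)).total_seconds() / 3600.0


def percentile_90(los_hours: List[float]) -> float:
    """Steps 3-4: sort visits from shortest to longest and report the ED
    length of stay of the case where 9 out of 10 patients have completed
    their visits (nearest-rank: the value at ceiling(0.9 * n), 1-based)."""
    if not los_hours:
        raise ValueError("no eligible ED visits")
    ordered = sorted(los_hours)
    idx = math.ceil(0.9 * len(ordered))  # 1-based rank
    return ordered[idx - 1]


if __name__ == "__main__":
    # Hypothetical lengths of stay (hours) for complex patients that
    # already satisfy the inclusion/exclusion criteria above.
    sample = [2.1, 3.4, 4.0, 4.8, 5.2, 6.7, 7.1, 8.9, 10.3, 12.6]
    print(f"90th percentile ED LOS: {percentile_90(sample):.1f} hours")
```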

8 INDICATOR NAME Detailed description of indicator INDICATOR CLASSIFICATION PERFORMANCE STANDARD 90TH PERCENTILE ED LENGTH OF STAY FOR MINOR/UNCOMPLICATED PATIENTS The total ED length of stay* where 9 out of 10 minor/uncomplicated patients completed their visits. *ED Length of Stay defined as the time from triage or registration, whichever comes first, to the time the patient leaves the ED. Performance Target: (i) For hospitals performing at the provincial target or better: Performance target= maintain or improve current performance. Historical performance should be reviewed prior to target setting, and plans to maintain or improve the performance target are to be discussed. (ii) For hospitals performing below the provincial target: Performance target= provincial target or better. Historical performance should be reviewed prior to target setting, and plans to achieve the performance target are to be discussed. CALCULATION Corridor: Upper corridor = performance target + 10% Step 1: Calculate ED length of stay in hours for each patient. Step 2: Apply Inclusion and Exclusion Criteria. Step 3: Sort the cases by ED length of stay from shortest to highest. Step 4: The 90th percentile is the case where 9 out of 10 minor/uncomplicated patients have completed their visits. NACRS, CIHI via Ontario s ER NACRS Initiative (ERNI-Level 1). INDICATOR Inclusion Criteria: Non-Admitted Patients Disposition Codes 01, and with assigned CTAS IV and V. Exclusion Criteria: 1. ED visits where Registration Date/Time and Triage Date/Time are both blank/unknown (9999) 2. ED visits where the MIS functional centre is under Emergency Trauma, Observation or Emergency Mental Health Services (as of January 2015 data) 2016/17 HSAA Technical Specifications Page 8

9 TIMING/FREQUENCY OF RELEASE How often, and when, are data being released 3. Duplicate cases within the same functional center where all ER data elements have the same values except for Abstract ID number 4. ED visits where the ED visit Indicator is = '0' 5. ED visits where patient has left without being seen by a physician during his/her visit (Disposition Code 02 & 03) 6. ED Length of Stay is greater than or equal to minutes (1666 hours) 7. Admitted Patients (Disposition Codes 06 and 07) 8. Non-Admitted Patients (Disposition Codes 01, and 08 15) with assigned CTAS I, II and III 9. Non-Admitted Patients (Disposition Codes 01, and 08 15) with missing CTAS GEOGRAPHY & TIMING ADDITIONAL INFORMATION LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, interpretation, data source, etc. Access to Care at CCO has an ongoing data quality improvement process in place for ER Level 1 NACRS. Timeliness and accuracy are key data quality dimensions that are monitored closely on a regular basis. Access to Care has developed a data quality matrix to assess the quality of ED data at three levels: Province, LHIN and facility. ED data in Ontario via CIHI-NACRS have been collected since July However, NACRS was modified to accept different reporting levels as of April The calculated 90th percentile ED LOS at the provincial level, for the latest month can be compared with the baseline of April 2008 in the Emergency Room Wait Times Government of Ontario website and in the ED Reports provided to LHINs and hospitals every month (Provincial ER Highlights Report, ER LHIN Highlight Report, and the LHIN ER Pay for Results Report). Historical trend data from April 2008 onwards for all ED facilities in Ontario are available on request to ATCDataRequest@cancercare.on.ca. A small percentage of records are excluded from the analysis every month, due to missing/invalid values for the relevant wait time fields (such as Time patient left ED or Registration time etc.). Calculated indicator value is based on ED visits submitted by 126 sites participating in the ER NACRS Initiative (ERNI) reporting to the NACRS database. Approximately 90% of ED Visits in Ontario are captured by hospital sites participating in ERNI (based on NACRS 13/14 data released July 2014). As of April 2009, a patient's stay in a designated CDU will be excluded from the total time spent in the ER. Due to the introduction of newly designated CDUs in select hospitals, there may be a difference in the calculation methodology of the baseline and the quarterly indicators. ATC Informatics regularly informs the LHINs of the CDU 2016/17 HSAA Technical Specifications Page 9

10 impact on the overall time spent in the ED through the monthly ED reports. LHINs are provided with the list of hospitals with designated CDUs. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING Cancer Care Ontario DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 10

11 INDICATOR NAME Detailed description of indicator PERCENT OF PRIORITY 2, 3 AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR HIP REPLACEMENTS Descriptions of priority levels can be found in the link within the Reference section. INDICATOR CLASSIFICATION PERFORMANCE STANDARD Performance Target: (i) For hospitals performing at the provincial target or better: Performance target = maintain or improve current performance. Historical performance should be reviewed prior to target setting, and plans to maintain or improve the performance target are to be discussed. (ii) For hospitals performing below provincial target: Performance target= provincial target or better. Historical performance should be reviewed prior to target setting, and plans to achieve the performance target are to be discussed. INDICATOR CALCULATION Corridor: Greater than or equal to the performance target to 100%. Step 1: Count the total number of cases completed for the reporting period by priority level (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below. Step 2: Of the total count in step 1, count the number of cases where wait times are less than or equal to the provincial targets. Do this for all of the three priority levels (Priority 2, 3, and 4). Step 3: The weighted percent of cases completed within Priority 2, 3 and 4 access targets = sum of the counts by Priority in step 2 divided by the sum of the counts by Priority in step 1 x 100 to get the percentage. Sample calculation: WTIS, ATC, Cancer Care Ontario. 2016/17 HSAA Technical Specifications Page 11
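The sample calculation for this indicator can be illustrated with a brief, hypothetical sketch of Steps 1 to 3. The counts below are invented for illustration only and are not drawn from the specification; the point is that completed cases and within-target cases are summed across Priority 2, 3 and 4 before the division, which is what makes the result a weighted percent.

```python
# Step 1: hypothetical completed-case counts by priority level.
completed = {"Priority 2": 40, "Priority 3": 150, "Priority 4": 60}

# Step 2: of those, the hypothetical counts completed within the
# provincial access target for each priority level.
within_target = {"Priority 2": 36, "Priority 3": 120, "Priority 4": 54}

# Step 3: weighted percent = sum(within target) / sum(completed) x 100.
total_completed = sum(completed.values())        # 250
total_within = sum(within_target.values())       # 210
weighted_percent = total_within / total_completed * 100

print(f"{weighted_percent:.1f}% of Priority 2, 3 and 4 cases "
      f"completed within access targets")        # 84.0%
```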

12 GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending Access to Care has implemented a robust data quality and compliance framework to assess data quality under four key dimensions: timeliness, validity, reliability and usability. The complete framework is available in the Wait Time Conditions of Funding and further information is available upon request at: ATCsupport@cancercare.on.ca. All closed wait list entries with procedure dates within date range submitted by hospitals through the WTIS. Patient age greater than or equal to 18 years old on the day the procedure was completed. Procedures no longer required (or cancelled cases) are excluded from wait time calculation. Procedures assigned as priority level 1 cases are excluded from wait time calculation. Cases with missing priority levels are also excluded. Wait list entries identified by hospitals as data entry errors are excluded. If unavailable days fall outside the decision to treat date up to procedure date, unavailable days are not deducted from patients wait days. These are considered data entry errors. As part of ATC s on-going data quality and compliance processes with hospitals, accuracy is one of the key areas that have been monitored closely over time and assessed on a regular basis along with other data quality dimensions. ATC monitors data quality with hospitals on a weekly basis and works closely with hospitals WTIS coordinators to ensure data quality indicators and thresholds are met. If hospitals have challenges meeting data quality thresholds they may be excluded from wait time reporting calculations as well as public reporting. In these circumstances hospital, LHIN and/or ministry leadership is engaged to ensure hospital data quality is enhanced to become reportable as soon as possible. The calculated percent of cases completed within priority target can be compared with the historical trend published in the Government of Ontario wait time s website. All inclusions/exclusions criteria used are similar. Also, historical wait times trend for low volume hospitals/lhins will show as NV (no or low volume) instead of a calculated percent of cases completed within priority target. ADDITIONAL INFORMATION LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, Hospitals submitting wait time data voluntarily (not required to report) are included in wait time calculation. Calculated percent of cases completed within priority targets is based only on the number of cases entered in the system. Logically, hospitals not reporting cases promptly are excluded at the 2016/17 HSAA Technical Specifications Page 12

13 interpretation, data source, etc. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING time of data extraction. Volumes submitted by hospitals are checked monthly for completeness. Hospital volume is compared against the expected monthly average. Outliers are validated with hospitals if the wait days are not accurate. px Cancer Care Ontario DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 13

14 INDICATOR NAME Detailed description of indicator PERCENT OF PRIORITY 2, 3 AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR KNEE REPLACEMENTS Descriptions of priority levels can be found in the link within the Reference section. INDICATOR CLASSIFICATION PERFORMANCE STANDARD Performance Target: (i) For hospitals performing at the provincial target or better: Performance target = maintain or improve current performance. Historical performance should be reviewed prior to target setting, and plans to maintain or improve the performance target are to be discussed. (ii) For hospitals performing below provincial target: Performance target= provincial target or better. Historical performance should be reviewed prior to target setting, and plans to achieve the performance target are to be discussed. INDICATOR CALCULATION Corridor: Greater than or equal to the performance target to 100%. Step 1: Count the total number of cases completed for the reporting period by priority level (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below. Step 2: Of the total count in step 1, count the number of cases where wait times are less than or equal to the provincial targets. Do this for all of the three priority levels (Priority 2, 3, and 4). Step 3: The weighted percent of cases completed within Priority 2, 3 and 4 access targets = sum of the counts by Priority in step 2 divided by the sum of the counts by Priority in step 1 x 100 to get the percentage. Sample calculation: WTIS, ATC, Cancer Care Ontario. 2016/17 HSAA Technical Specifications Page 14

15 Access to Care has implemented a robust data quality and compliance framework to assess data quality under four key dimensions: timeliness, validity, reliability and usability. The complete framework is available in the Wait Time Conditions of Funding and further information is available upon request at: ATCsupport@cancercare.on.ca. All closed wait list entries with procedure dates within date range submitted by hospitals through the WTIS. Patient age greater than or equal to 18 years old on the day the procedure was completed. Procedures no longer required (or cancelled cases) are excluded from wait time calculation. Procedures assigned as priority level 1 cases are excluded from wait time calculation. Cases with missing priority levels are also excluded. Wait list entries identified by hospitals as data entry errors are excluded. If unavailable days fall outside the decision to treat date up to procedure date, unavailable days are not deducted from patients wait days. These are considered data entry errors. TIMING/FREQUENCY OF RELEASE How often, and when, are data being released GEOGRAPHY & TIMING ADDITIONAL INFORMATION LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, As part of ATC s on-going data quality and compliance processes with hospitals, accuracy is one of the key areas that have been monitored closely over time and assessed on a regular basis along with other data quality dimensions. ATC monitors data quality with hospitals on a weekly basis and works closely with hospitals WTIS coordinators to ensure data quality indicators and thresholds are met. If hospitals have challenges meeting data quality thresholds they may be excluded from wait time reporting calculations as well as public reporting. In these circumstances hospital, LHIN and/or ministry leadership is engaged to ensure hospital data quality is enhanced to become reportable as soon as possible. The calculated percent of cases completed within priority target can be compared with the historical trend published in the Government of Ontario wait time s website. All inclusions/exclusions criteria used are similar. Also, historical wait times trend for low volume hospitals/lhins will show as NV (no or low volume) instead of a calculated percent of cases completed within priority target. Hospitals submitting wait time data voluntarily (not required to report) are included in wait time calculation. Calculated percent of cases completed within priority targets is based only on the number of cases entered in the 2016/17 HSAA Technical Specifications Page 15

16 interpretation, data source, etc. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING system. Logically, hospitals not reporting cases promptly are excluded at the time of data extraction. Volumes submitted by hospitals are checked monthly for completeness. Hospital volume is compared against the expected monthly average. Outliers are validated with hospitals if the wait days are not accurate. px Cancer Care Ontario DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 16

17 INDICATOR NAME Detailed description of indicator PERCENT OF PRIORITY 2, 3 AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR MRI Descriptions of priority levels can be found in the link within the Reference section. INDICATOR CLASSIFICATION PERFORMANCE STANDARD Performance Target: (i) For hospitals performing at the provincial target or better: Performance target = maintain or improve current performance. Historical performance should be reviewed prior to target setting, and plans to maintain or improve the performance target are to be discussed. (ii) For hospitals performing below provincial target: Performance target= provincial target or better. Historical performance should be reviewed prior to target setting, and plans to achieve the performance target are to be discussed. INDICATOR CALCULATION Corridor: Greater than or equal to the performance target to 100%. Step 1: Count the total number of cases completed for the reporting period by priority level (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below. Step 2: Of the total count in step 1, count the number of cases where wait times are less than or equal to the provincial targets. Do this for all of the three priority levels (Priority 2, 3, and 4). Step 3: The weighted percent of cases completed within Priority 2, 3 and 4 access targets = sum of the counts by Priority in step 2 divided by the sum of the counts by Priority in step 1 x 100 to get the percentage. Sample calculation: 2016/17 HSAA Technical Specifications Page 17

18 Wait Time Information System (WTIS), Access to Care (ATC), Cancer Care Ontario. GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending Access to Care has implemented a robust data quality and compliance framework to assess data quality under four key dimensions: timeliness, validity, reliability and usability. The complete framework is available in the Wait Time Conditions of Funding and further information is available upon request at: ATCsupport@cancercare.on.ca. All closed wait list entries with procedure dates within date range submitted by hospitals through the Wait Time Information System. Patient age greater than or equal to 18 years old on the day the procedure was completed. Procedures no longer required (or cancelled cases) are excluded from wait time calculation. Procedures assigned as priority level 1 cases are excluded from wait time calculation. Cases with missing priority levels are also excluded. Wait list entries identified by hospitals as data entry errors are excluded. If unavailable days fall outside the decision to treat date up to procedure date, unavailable days are not deducted from patients wait days. These are considered data entry errors. Diagnostic Imaging (DI) cases classified as specified date procedures (timed procedures) are excluded from wait time calculation. As part of ATC s on-going data quality and compliance processes with hospitals, accuracy is one of the key areas that have been monitored closely over time and assessed on a regular basis along with other data quality dimensions. ATC monitors data quality with hospitals on a weekly basis and works closely with hospitals WTIS coordinators to ensure data quality indicators and thresholds are met. If hospitals have challenges meeting data quality thresholds they may be excluded from wait time reporting calculations as well as public reporting. In these circumstances hospital, LHIN and/or ministry leadership is engaged to ensure hospital data quality is enhanced to become reportable as soon as possible. The calculated percent of cases completed within priority target can be compared with the historical trend published in the Government of Ontario wait time s website. All inclusions/exclusions criteria used are similar. Also, historical wait times trend for low volume hospitals/lhins will show as NV (no or low volume) instead of a calculated percent of cases completed within priority target. 2016/17 HSAA Technical Specifications Page 18

19 ADDITIONAL INFORMATION LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, interpretation, data source, etc. Hospitals submitting wait time data voluntarily (not required to report) are included in wait time calculation. Calculated percent of cases completed within priority targets is based only on the number of cases entered in the system. Logically, hospitals not reporting cases promptly are excluded at the time of data extraction. Volumes submitted by hospitals are checked monthly for completeness. Hospital volume is compared against the expected monthly average. Outliers are validated with hospitals if the wait days are not accurate. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) px Cancer Care Ontario /17 HSAA Technical Specifications Page 19

20 INDICATOR NAME PERCENT OF PRIORITY 2, 3 AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR CT SCANS Detailed description of indicator Descriptions of priority levels can be found in the link within the Reference section. INDICATOR CLASSIFICATION Performance Target: (i) For hospitals performing at the provincial target or better: Performance target = maintain or improve current performance. Historical performance should be reviewed prior to target setting, and plans to maintain or improve the performance target are to be discussed. PERFORMANCE STANDARD (ii) For hospitals performing below provincial target: Performance target= provincial target or better. Historical performance should be reviewed prior to target setting, and plans to achieve the performance target are to be discussed. INDICATOR CALCULATION Corridor: Greater than or equal to the performance target to 100%. Step 1: Count the total number of cases completed for the reporting period by priority level (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below. Step 2: Of the total count in step 1, count the number of cases where wait times are less than or equal to the provincial targets. Do this for all of the three priority levels (Priority 2, 3, and 4). Step 3: The weighted percent of cases completed within Priority 2, 3 and 4 access targets = sum of the counts by Priority in step 2 divided by the sum of the counts by Priority in step 1 x 100 to get the percentage. Sample calculation: 2016/17 HSAA Technical Specifications Page 20

21 Wait Time Information System (WTIS), Access to Care (ATC), Cancer Care Ontario. Access to Care has implemented a robust data quality and compliance framework to assess data quality under four key dimensions: timeliness, validity, reliability and usability. The complete framework is available in the Wait Time Conditions of Funding and further information is available upon request at: ATCsupport@cancercare.on.ca. All closed wait list entries with procedure dates within date range submitted by hospitals through the Wait Time Information System. Patient age greater than or equal to 18 years old on the day the procedure was completed. Procedures no longer required (or cancelled cases) are excluded from wait time calculation. Procedures assigned as priority level 1 cases are excluded from wait time calculation. Cases with missing priority levels are also excluded. Wait list entries identified by hospitals as data entry errors are excluded. If unavailable days fall outside the decision to treat date up to procedure date, unavailable days are not deducted from patients wait days. These are considered data entry errors. Diagnostic Imaging (DI) cases classified as specified date procedures (timed procedures) are excluded from wait time calculation. TIMING/FREQUENCY OF RELEASE How often, and when, are data being released GEOGRAPHY & TIMING LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending As part of ATC s on-going data quality and compliance processes with hospitals, accuracy is one of the key areas that have been monitored closely over time and assessed on a regular basis along with other data quality dimensions. ATC monitors data quality with hospitals on a weekly basis and works closely with hospitals WTIS coordinators to ensure data quality indicators and thresholds are met. If hospitals have challenges meeting data quality thresholds they may be excluded from wait time reporting calculations as well as public reporting. In these circumstances hospital, LHIN and/or ministry leadership is engaged to ensure hospital data quality is enhanced to become reportable as soon as possible. The calculated percent of cases completed within priority target can be compared with the historical trend published in the Government of Ontario wait time s website. All inclusions/exclusions criteria used are similar. Also, historical wait times trend for low volume hospitals/lhins will show as NV 2016/17 HSAA Technical Specifications Page 21

22 (no or low volume) instead of a calculated percent of cases completed within priority target. ADDITIONAL INFORMATION LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, interpretation, data source, etc. Hospitals submitting wait time data voluntarily (not required to report) are included in wait time calculation. Calculated percent of cases completed within priority targets is based only on the number of cases entered in the system. Logically, hospitals not reporting cases promptly are excluded at the time of data extraction. Volumes submitted by hospitals are checked monthly for completeness. Hospital volume is compared against the expected monthly average. Outliers are validated with hospitals if the wait days are not accurate. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) px Cancer Care Ontario /17 HSAA Technical Specifications Page 22

23 INDICATOR NAME RATE OF HOSPITAL ACQUIRED CLOSTRIDIUM DIFFICILE INFECTIONS Detailed description of indicator INDICATOR CLASSIFICATION PERFORMANCE STANDARD The rate of hospital acquired Clostridium difficile infections (CDI) is a measure of the incidence of disease and is the number of CDI cases per 1,000 patient days. Performance Target: 0 or hospital Quality Improvement Plan (QIP) target Corridor: Upper corridor = 10% improvement on current rate or the submitted Health Quality Ontario QIP target, whichever is greater LHINs and hospitals should review current rates and identify achievable performance improvement or maintenance of current performance levels CALCULATION The total number of new nosocomial (i.e. hospital acquired) CDI cases in the reporting period multiplied by 1,000 Self Reporting Initiative (SRI), Ontario Ministry of Health and Long-Term Care (MOHLTC) NUMERATOR Includes: 1. All publicly funded hospitals 2. Inpatient beds 3. Laboratory-confirmed CDI cases (i.e. confirmation of a positive toxin assay (A/B) for Clostridium difficile together with diarrhea OR visualization of pseudomembranes on sigmoidoscopy or colonoscopy, or histological/pathological diagnosis of pseudomembranous colitis) 4. New nosocomial cases associated with the reporting facility is where the infection was not present on admission (i.e., onset of symptoms > 72 hours after admission) or the infection was present at the time of admission but was related to a previous admission to the same facility within the last 4 weeks and the case has not had CDI in the past 8 weeks. Excludes: 1. Patients less than 1 year of age 2. Long-term care beds 2016/17 HSAA Technical Specifications Page 23

24 CALCULATION The total number of patient days spent in-hospital in a reporting period DENOMINATOR Self Reporting Initiative (SRI), Ontario Ministry of Health and Long-Term Care (MOHLTC) Includes: 1. All publicly funded hospitals 2. Inpatient beds Excludes: 1. Patients less than 1 year of age 2. Long-term care beds GEOGRAPHY & TIMING ADDITIONAL INFORMATION TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are released annually in mid- May LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending LIMITATIONS Specific limitations COMMENTS Additional information regarding the calculation, interpretation, data source, etc. Data are available each month for the previous month s data. Data may be aggregated across reporting periods to generate more stable rates. Data are available at provincial, LHIN and hospital levels Data are available from September 2008 Data are self-reported by hospital No individual patient data are available, therefore indicator cannot be broken down by age, gender, income or education Baseline data should be generated on 1-years worth of data since CDI is expected to fluctuate seasonally Trending and comparisons are most valid by hospital type (e.g. small, large community, acute teaching, chronic care and rehab and mental health). This is in order to make limited adjustment for patient case mix. The CDI rate calculation allows the level of hospital activity to be taken into account because this will fluctuate over time and is different across hospitals. Hospital rates can also fluctuate significantly from one reporting 2016/17 HSAA Technical Specifications Page 24

25 period to another for a variety of reasons. For example, a small hospital with relatively few patient days when compared to larger institutions could see its rates vary dramatically based on one or two cases in any given month. These types of fluctuations will level out over a longer period of time. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING Patient Safety Website Health Service Providers DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 25
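A minimal sketch of the rate calculation described above: the numerator (new nosocomial CDI cases, multiplied by 1,000) is divided by the denominator (total patient days in the reporting period). The function name and the monthly figures are hypothetical.

```python
def cdi_rate_per_1000_patient_days(new_nosocomial_cases: int,
                                   patient_days: int) -> float:
    """Rate of hospital acquired CDI = (new nosocomial cases x 1,000)
    divided by total patient days in the reporting period."""
    if patient_days <= 0:
        raise ValueError("patient days must be positive")
    return new_nosocomial_cases * 1000 / patient_days


# Hypothetical monthly figures for a mid-sized hospital:
# 4 new nosocomial cases over 9,500 patient days.
print(round(cdi_rate_per_1000_patient_days(4, 9500), 2))  # 0.42 per 1,000 patient days
```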

26 Explanatory INDICATOR NAME Detailed description of indicator INDICATOR CLASSIFICATION PERCENT OF STROKE/TIA PATIENTS ADMITTED TO A STROKE UNIT DURING THEIR INPATIENT STAY All stroke/tia patients should be admitted to a stroke unit for acute stroke management for improved outcomes Explanatory Stroke/TIA Patients (Most Responsible Diagnosis = I60, I61, I63, I64, G45, H34.0, H34.1) admitted to a Stroke unit at any point during their inpatient stay, multiplied by 100 NUMERATOR CALCULATION A stroke unit is a geographical unit with identifiable co-located beds (e.g., 5A-7, 5A-8, 5A-9) that are occupied by stroke patients 75% of the time and has a dedicated interprofessional team with expertise in stroke care including, at a minimum, nursing, physiotherapy, occupational therapy and speech-language pathology. Ontario Stroke Registry, Ontario Stroke Audit Hospitals participating in CIHI Special Project #340 via the Discharge Abstract Database (DAD), Canadian Institute for Health Information (CIHI) Excludes: 1. Diagnostic code G45.4, I63.6, I60.8 CALCULATION Stroke/TIA Patients (Most Responsible Diagnosis = I60, I61, I63, I64, G45, H34.0, H34.1) admitted Ontario Stroke Registry, Ontario Stroke Audit DENOMINATOR Hospitals participating in CIHI Special Project #340 via the DAD, CIHI Excludes: 1. Diagnostic code excludes I63.6, I60.8, G Patients with palliative measures as part of the initial treatment plan. 3. OSA: In-hospital strokes 4. OSA 2012/13 excludes Subarachnoid hemorrhages. Includes adult patients only (18 and over) 2016/17 HSAA Technical Specifications Page 26

27 GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are released annually in mid- May LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending Data are available from Ontario Stroke Audit Data are available annually in December for all hospitals beginning in 2012/13. Data are available at the level of the facility LHIN Data are available as of 2004 LIMITATIONS Specific limitations There may be limitations as a result of the DAD being the only data source ADDITIONAL INFORMATION COMMENTS Additional information regarding the calculation, interpretation, data source, etc. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING The Ontario Stroke Network s Ontario Stroke Report Cards report stroke unit at a population level. Analysis is based on a patient s postal code, not facility. 1. Canadian Best Practices SU section ( 2. QBP Handbook ( page 91) 3. Ontario Stroke Report Cards, indicator 8 ( Stroke-Report-Cards) ICES DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 27
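A minimal sketch of the numerator/denominator logic above, assuming a simple list of inpatient records carrying a most responsible diagnosis (MRDx) code and a flag for admission to a stroke unit at any point during the stay. The record fields and sample data are hypothetical, and the additional OSA/DAD exclusions (palliative patients, in-hospital strokes, 2012/13 subarachnoid hemorrhages) are not reproduced here.

```python
# Eligible stroke/TIA MRDx prefixes and the excluded diagnostic codes
# listed in the specification.
STROKE_TIA_PREFIXES = ("I60", "I61", "I63", "I64", "G45", "H34.0", "H34.1")
EXCLUDED_CODES = {"G45.4", "I63.6", "I60.8"}


def is_stroke_tia(mrdx: str) -> bool:
    """True when the most responsible diagnosis is an eligible stroke/TIA code."""
    return mrdx.startswith(STROKE_TIA_PREFIXES) and mrdx not in EXCLUDED_CODES


def percent_admitted_to_stroke_unit(records) -> float:
    """Numerator: eligible patients admitted to a stroke unit at any point
    in the stay; denominator: all eligible stroke/TIA patients; x 100."""
    eligible = [r for r in records if is_stroke_tia(r["mrdx"])]
    on_unit = [r for r in eligible if r["stroke_unit"]]
    return len(on_unit) / len(eligible) * 100 if eligible else 0.0


# Hypothetical records: MRDx code and whether the patient reached a stroke unit.
records = [
    {"mrdx": "I63.9", "stroke_unit": True},
    {"mrdx": "I61.0", "stroke_unit": False},
    {"mrdx": "G45.9", "stroke_unit": True},
    {"mrdx": "G45.4", "stroke_unit": True},   # excluded code, not counted
]
print(f"{percent_admitted_to_stroke_unit(records):.1f}%")  # 66.7%
```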

28 INDICATOR NAME HOSPITAL STANDARDIZED MORTALITY RATIO (HSMR) The hospital standardized mortality ratio (HSMR) is a big-dot summary measure that is used to track a hospital's mortality over time. The HSMR is a tool that allows hospitals to measure and monitor their progress in quality of care. HSMR is a ratio of the actual number of in-hospital deaths in a region or hospital to the number that would have been expected based on the types of patients a region or hospital treats. It focuses on the diagnosis groups that account for the majority of in-hospital deaths. Detailed description of indicator HSMR = (Observed deaths / Expected deaths) x 100 Using a logistic regression model, HSMR is adjusted for several factors that affect in-hospital mortality, including age, sex, length of stay, admission category, diagnosis group, comorbidity and transfer from another acute care institution. A ratio equal to 100 suggests that there is no difference between a local mortality rate and the average national experience, given the types of patients cared for. An HSMR greater than or less than 100 suggests that a local mortality rate is higher or lower, respectively, than the national experience. The confidence intervals describe the precision of the HSMR estimate. HSMR values are estimated to be accurate within the upper and lower confidence interval, 19 times out of 20. The 95% confidence interval is calculated using Byar's approximation. A confidence interval that includes 100 suggests that the HSMR is not statistically different from the baseline of 100. HSMR results whose confidence interval does not include 100 are statistically different from the baseline. INDICATOR CLASSIFICATION Explanatory NUMERATOR CALCULATION Observed deaths, or actual number of in-hospital deaths that occurred in a hospital or region (among patients who satisfy HSMR inclusion and exclusion criteria). Discharge Abstract Database (DAD), Canadian Institute for Health Information (CIHI) 2016/17 HSAA Technical Specifications Page 28

29 Inclusion criteria: 1. Discharge between April 1 of a given year and March 31 of the following year 2. Admission to an acute care institution 3. Discharge with diagnosis group of interest (that is, one of the diagnosis groups that account for approximately 80% of in-hospital deaths) 4. Age at admission between 0 and 120 years 5. Sex recorded as male or female 6. Length of stay of up to 365 consecutive days 7. Admission category is elective or emergent/urgent 8. Canadian resident Exclusion criteria: 1. Cadavers 2. Stillborns 3. Sign-outs (that is, discharged against medical advice) 4. Patients who do not return from a pass 5. Neonates, with age at admission less than or equal to 28 days 6. Records with brain death as most responsible diagnosis code 7. Records with palliative care as most responsible diagnosis code Expected deaths, or number of deaths that would have occurred in a hospital or region had the mortality of these patients been the same as the mortality of similar patients across the country, based on the reference year ( ). DENOMINATOR CALCULATION The HSMR logistic regression model is fitted with age, sex, length-of-stay (LOS) group, admission category, diagnosis group, co morbidity group and transfers as independent variables and is based on data from all acute hospitals in Canada (excluding Quebec). Coefficients derived from a logistic regression model are used to calculate the probability of inhospital death. The expected number of deaths for a hospital, corporation or region is based on the sum of the probabilities of inhospital death for eligible discharges from that organization. DAD, CIHI Includes: 1. Discharge between April 1 of a given year and March 31 of the following year 2. Admission to an acute care institution 3. Discharge with diagnosis group of interest (that is, one of the diagnosis groups that account for approximately 80% of in-hospital deaths) 4. Age at admission between 0 and 120 years 5. Sex recorded as male or female 6. Length of stay of up to 365 consecutive days 7. Admission category is elective or emergent/urgent 8. Canadian resident 2016/17 HSAA Technical Specifications Page 29

30 Excludes: 1. Cadavers 2. Stillborns 3. Sign-outs 4. Patients who did not return from a pass 5. Neonates, with age at admission less than or equal to 28 days 6. Records with brain death as most responsible diagnosis code 7. Records with palliative care as most responsible diagnosis GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are released annually in mid- May LEVELS OF COMPARABILITY Levels of geography for comparison Results are available on a quarterly (Q1 and Q2 in February, Q3 in May and Q4 in September) and annual (in September, together with Q4 reports) basis. Results are available for hospitals or hospital corporations (where applicable) and Local Health Integration Networks (LHINs). TRENDING Years available for trending LIMITATIONS Specific limitations FY till present Currently no information has been provided. ADDITIONAL INFORMATION COMMENTS Additional information regarding the calculation, interpretation, data source, etc. REFERENCES The reference year for HSMR calculations is To allow for comparisons over time, the coefficients derived from the model using the reference year are used to determine expected deaths for all reported years. While HSMR adjusts for a number of factors affecting the risk of in-hospital mortality, it does not control for every factor. Therefore, HSMR results are most useful in tracking trends over time. More information about HSMR calculation can be found at the Methodology section of HSMR web-page. 1. HSMR web-page Provide URLs of any key references E.g. Diabetes in Canada, 2. Canadian Institute for Health Information. HSMR: A New Approach for Measuring Hospital Mortality Trends in Canada. Ottawa, Ont.: CIHI, /17 HSAA Technical Specifications Page 30

31 3. Jarman, B., A. Bottle and P. Aylin. Monitoring Changes in Hospital Standardised Mortality Ratios. BMJ 330 (2005): p Breslow, N. E. and N. E. Day. Statistical Methods in Cancer Research: Volume II The Design and Analysis of Cohort Studies. Lyon, France: International Agency for Research on Cancer, Quan, H., V. Sundararajan, P. Halfon, A. Fong, B. Burnand, J. C. Luthi, L. D. Saunders, C. A. Beck, T. E. Feasby and W. A. Ghali. Coding Algorithms for Defining Comorbidities in ICD-9-CM and ICD-10 Administrative Data. Medical Care 43, 11 (2005): pp RESPONSIBILITY FOR REPORTING CIHI DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 31
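The ratio and its 95% confidence interval described above can be sketched as follows. The expected-death count would come from CIHI's logistic regression model; here it is simply an input, and the Byar's approximation shown is the standard form for a Poisson-distributed observed count, which is an assumption since this document does not reproduce CIHI's exact implementation. The counts in the usage line are hypothetical.

```python
import math


def hsmr_with_byar_ci(observed: int, expected: float, z: float = 1.96):
    """HSMR = observed / expected x 100, with an approximate 95% confidence
    interval based on Byar's approximation for a Poisson-distributed
    observed death count."""
    if observed <= 0 or expected <= 0:
        raise ValueError("observed and expected deaths must be positive")
    hsmr = observed / expected * 100
    o = observed
    lower = (o / expected) * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3 * 100
    upper = ((o + 1) / expected) * (1 - 1 / (9 * (o + 1)) + z / (3 * math.sqrt(o + 1))) ** 3 * 100
    return hsmr, lower, upper


# Hypothetical counts: 120 observed versus 100 expected in-hospital deaths.
ratio, lo, hi = hsmr_with_byar_ci(120, 100.0)
print(f"HSMR {ratio:.0f} (95% CI {lo:.0f}-{hi:.0f})")
# A confidence interval that includes 100 would suggest no statistically
# significant difference from the national baseline.
```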

32 INDICATOR NAME RATE OF VENTILATOR-ASSOCIATED PNEUMONIA INDICATOR DESCRIPTION Detailed description of indicator INDICATOR CLASSIFICATION Pneumonia occurring in patients requiring mechanical ventilation, intermittently or continuously, through a tracheostomy or endotracheal tube for more than 48 hours Explanatory CALCULATION Total number of VAP cases age 18 and older that have required at least 48 hours of mechanical ventilation during the reporting period NUMERATOR Critical Care Information System (CCIS), Ontario Ministry of Health and Long- Term Care (MOHLTC) Includes: 1. All publicly funded hospitals 2. ICU beds 3. Patients diagnosed with VAP and being treated with antibiotics for VAP CALCULATION Excludes: 1. Patients age 17 and younger Total number of ventilator days for Intensive Care Unit (ICU) patients age 18 and older during the reporting period DENOMINATOR Critical Care Information System (CCIS), Ontario Ministry of Health and Long- Term Care (MOHLTC) Includes: 1. All publicly funded hospitals 2. ICU beds Excludes: 1. Patients age 17 and younger GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are Data are available each quarter for the previous quarter s data 2016/17 HSAA Technical Specifications Page 32

33 released annually in mid- May LEVELS OF COMPARABILITY Levels of geography for comparison Data are available at provincial, LHIN and hospital levels TRENDING Years available for trending LIMITATIONS Specific limitations Data are available for the previous quarter as of April 2009 Data are self-reported by hospital. No individual patient data are available; therefore this indicator cannot be broken down by socio-demographic characteristics. ADDITIONAL INFORMATION COMMENTS Additional information regarding the calculation, interpretation, data source, etc. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, Trending and comparisons are most valid by hospital type (e.g. small, large community, acute teaching, chronic care and rehab and mental health). This is in order to make limited adjustment for patient case mix. Patient Safety Website RESPONSBILITY FOR REPORTING Health Service Providers DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 33

34 INDICATOR NAME PROGRAM SPECIFIC INDICATOR NAME(S) How the indicator is named by specific programs Detailed description of indicator INDICATOR CLASSIFICATION CENTRAL LINE INFECTION RATE Central Line-Associated Primary Bloodstream Infection (CLI) Rate Number of intensive care unit (ICU) patients with new central line bloodstream infection (BSI)(CLI) per 1,000 central line days Explanatory CALCULATION Total number of laboratory confirmed BSI developing in patients age 18 and older in the ICU after 48 hours of placement of a central line NUMERATOR DENOMINATOR GEOGRAPHY & TIMING CALCULATION TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are released annually in mid- May LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending Critical Care Information System (CCIS), Ministry of Health and Long-Term Care (MOHLTC) Includes: 1. Patients in the ICU 2. Patients age 18 and older Total number of central line days for patients age 18 and older in the ICU with a central line in place CCIS, MOHLTC Includes: 1. Patients in the ICU 2. Patients age 18 and older Data are available quarterly Data are collected at hospital institution level; can be aggregated up to Local Health Integration Network (LHIN) and provincial levels. Initial reporting started April 30, 2009 and included cumulative data for the three-month period January 01 to March 31, /17 HSAA Technical Specifications Page 34

35 LIMITATIONS Specific limitations Currently, no information has been provided ADDITIONAL INFORMATION COMMENTS Additional information regarding the calculation, interpretation, data source, etc. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, 1. O'Grady NP, Alexander M, Dellinger EP, et al. Guidelines for the prevention of intravascular catheter-related infections. Centers for Disease Control and Prevention. MMWR Recomm Rep. 2002 Aug; 51(RR-10). 2. Pittet D, Tarara D, Wenzel RP. Nosocomial bloodstream infection in critically ill patients. Excess length of stay, extra cost, and attributable mortality. JAMA 1994; 271(20). RESPONSIBILITY FOR REPORTING Health Service Providers DATE CREATED (YYYY-MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) 2016/17 HSAA Technical Specifications Page 35

36 INDICATOR DESCRIPTION INDICATOR NAME Detailed description of indicator INDICATOR CLASSIFICATION RATE OF HOSPITAL ACQUIRED METHICILLIN RESISTANT STAPHYLOCOCCUS AUREUS BACTEREMIA The rate of MRSA bacteremia is a measure of the incidence of laboratory confirmed bloodstream MRSA infection per 1,000 patient days Explanatory CALCULATION The total number of new nosocomial (i.e. hospital acquired) MRSA bacteremia cases in the reporting period multiplied by 1,000 NUMERATOR Self Reporting Initiative (SRI), Ontario Ministry of Health and Long-Term Care (MOHLTC) Includes: 1. All publicly funded hospitals 2. Inpatient beds 3. Laboratory-confirmed MRSA bacteremia cases (i.e. confirmation through a single positive blood culture for MRSA) 4. New nosocomial cases associated with the reporting facility is where the infection was not present on admission (i.e., onset of symptoms > 72 hours after admission) or the infection was present at the time of admission but was related to a previous admission to the same facility within the last 72 hrs. CALCULATION Excludes: 1. Long-term care beds The total number of patient days spent in-hospital in a reporting period DENOMINATOR GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are Self Reporting Initiative (SRI), Ontario Ministry of Health and Long-Term Care (MOHLTC) Includes: 1. All publicly funded hospitals 2. Inpatient beds Excludes: 1. Long-term care beds Data are available each quarter for the previous quarter s data 2016/17 HSAA Technical Specifications Page 36

37 released annually in mid- May LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending Data are available at provincial, LHIN and hospital levels Data are available for the previous quarter as of December 2008 Data are self-reported by hospital. LIMITATIONS Specific limitations No individual patient data are available; therefore indicator cannot be broken down by socio-demographic characteristics. ADDITIONAL INFORMATION COMMENTS Additional information regarding the calculation, interpretation, data source, etc. REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSIBILITY FOR REPORTING Trending and comparisons are most valid by hospital type (e.g. small, large community, acute teaching, chronic care and rehab and mental health). This is in order to make limited adjustment for patient case mix. Patient Safety Website Health Service Providers DATE CREATED (YYYY- MM-DD) DATE LAST REVIEWED (YYYY-MM-DD) /17 HSAA Technical Specifications Page 37

38 INDICATOR NAME INDICATOR DESCRIPTION Detailed description of indicator INDICATOR CLASSIFICATION PERCENT OF PRIORITY 2, 3, AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR CARDIAC BY-PASS SURGERY Descriptions of priority levels can be found in the link within the Reference section. Explanatory Step 1: Count the total number of cases completed for the reporting period by urgency category (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below. Step 2: Of the total count in step 1, count the number of cases where wait times are less than or equal to the provincial access target. Do this for all of the three urgency categories (Priority 2, 3, and 4). Step 3: The weighted percent of case completed within Priority 2, 3 and 4 access targets = sum of the counts by Urgency in step 2 divided by the sum of the counts by Priority in step 1 x 100 to get the percentage. CALCULATION Sample calculation: INDICATOR CCN Cardiac Registry. All Wait List Entries, off-listed as Procedure Started with procedure dates within date range submitted by hospitals to the CCN Cardiac Registry. Only isolated CABG surgery cases are included in the calculation. CABG surgery cases done in conjunction with other cardiac surgery procedures (i.e., valve surgery) are excluded. Cases with missing priority levels are excluded from the wait time calculation. Dates during which time a procedure is unable to take place for patient related and/or clinical reasons (Dates Affecting Readiness to Treat [DARTs]) are excluded from the wait time calculation. 2016/17 HSAA Technical Specifications Page 38

39 GEOGRAPHY & TIMING TIMING/FREQUENCY OF RELEASE How often, and when, are data being released E.g. Be as specific as possible..data are released annually in mid-may LEVELS OF COMPARABILITY Levels of geography for comparison TRENDING Years available for trending CCN has provided the MOHLTC with historical data on this indicator quarterly for FYs 2013/14 and 2014/15 to allow analysis of historical trends. LIMITATIONS Specific limitations ADDITIONAL INFORMATION COMMENTS Additional information regarding the calculation, interpretation, data source, etc. All hospitals performing CABG surgery in Ontario submit their wait time data to the Cardiac Care Network (CCN) Cardiac Registry and are included in the wait time calculation. Calculated percent of cases completed within access targets is based only on the number of cases entered in the system on data cut date scheduled on the 3rd business day after month end. CCN applies rigorous compliance, data quality and resubmission processes to check on completeness, validity and accuracy. For any questions, please contact Garth Oakes, Senior Lead Knowledge Translation and Privacy, CCN As part of CCN s on-going data quality improvement effort, accuracy is one of the key areas monitored and assessed on a regular basis along with other data quality dimensions. CCN provides monthly missing data reports to assist hospital data entry staff to ensure the completeness of data at the facility level. Data are reviewed at each month end by CCN staff to assess data quality. For questions regarding data quality, please contact Garth Oakes, Senior Lead Knowledge Translation and Privacy, CCN (goakes@ccn.on.ca). REFERENCES Provide URLs of any key references E.g. Diabetes in Canada, RESPONSBILITY FOR REPORTING Cardiac Care Network of Ontario 2016/17 HSAA Technical Specifications Page 39

DATE CREATED (YYYY-MM-DD):
DATE LAST REVIEWED (YYYY-MM-DD):

INDICATOR NAME: PERCENT OF PRIORITY 2, 3, AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR CANCER SURGERY

INDICATOR DESCRIPTION (detailed description of indicator): Descriptions of priority levels can be found in the link within the Reference section.

INDICATOR CLASSIFICATION: Explanatory

CALCULATION:
Step 1: Count the total number of cases completed for the reporting period by priority level (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below.
Step 2: Of the total count in Step 1, count the number of cases where wait times are less than or equal to the provincial targets. Do this for each of the three priority levels (Priority 2, 3 and 4).
Step 3: The weighted percent of cases completed within Priority 2, 3 and 4 access targets = (sum of the counts by priority in Step 2 divided by the sum of the counts by priority in Step 1) x 100.

Sample calculation:

INDICATOR DATA SOURCE: Cancer Care Ontario (ATC, 2015). Access to Care has implemented a robust data quality and compliance framework to assess data quality under four key dimensions: timeliness, validity, reliability and usability. The complete framework is available in the Wait Time Conditions of Funding, and further information is available upon request at ATCsupport@cancercare.on.ca. Wait time is calculated based on closed cases submitted by hospitals through the WTIS.

INCLUSION/EXCLUSION CRITERIA:
1. All closed wait list entries with procedure dates within the date range.
2. Patients must be 18 or older on the day the procedure was completed.
3. Procedures no longer required are excluded from the wait time calculation.
4. Includes treatment-for-cancer procedures only. Procedures classified as NA are currently included. Diagnostic, palliative and reconstructive cancer procedures are excluded. Procedures on skin carcinoma, skin melanoma and lymphomas are also excluded.
5. Procedures assigned priority level 1 are excluded from the wait time calculation. Cases with missing priority levels are also excluded.
6. Wait list entries identified by hospitals as data entry errors are also excluded.
7. If unavailable days fall outside the period from the decision-to-treat date up to the procedure date, those unavailable days are not deducted from the patient's wait days; they are considered data entry errors.

GEOGRAPHY & TIMING

TIMING/FREQUENCY OF RELEASE (how often, and when, are data being released; e.g. data are released annually in mid-May):

LEVELS OF COMPARABILITY (levels of geography for comparison):

TRENDING (years available for trending): The calculated percent of cases completed within the priority target can be compared with the historical trend published on the Government of Ontario wait times website; all inclusion/exclusion criteria used are similar. Historical wait time trends for low-volume hospitals/LHINs will show as NV (no or low volume) instead of a calculated percent of cases completed within the priority target.

LIMITATIONS (specific limitations):

ADDITIONAL INFORMATION / COMMENTS (additional information regarding the calculation, interpretation, data source, etc.): Hospitals submitting wait time data voluntarily (i.e., not required to report) are included in the wait time calculation. The calculated percent of cases completed within the priority target is based only on the cases entered in the system; hospitals not reporting cases promptly are therefore excluded at the time of data extraction. Volumes submitted by hospitals are checked monthly for completeness, and hospital volume is compared against the expected monthly average. Outliers are validated with hospitals if the wait days are not accurate. As part of ATC's ongoing data quality and compliance processes with hospitals, accuracy is one of the key areas monitored closely over time and assessed on a regular basis, along with other data quality dimensions. ATC monitors data quality with hospitals on a weekly basis and works closely with hospitals' WTIS coordinators to ensure data quality indicators and thresholds are met. If hospitals have challenges meeting data quality thresholds, they may be excluded from wait time reporting calculations as well as from public reporting. In these circumstances, hospital, LHIN and/or ministry leadership is engaged to ensure hospital data quality is enhanced so the data become reportable as soon as possible.

REFERENCES (URLs of any key references):

RESPONSIBILITY FOR REPORTING: Cancer Care Ontario

DATE CREATED (YYYY-MM-DD):
DATE LAST REVIEWED (YYYY-MM-DD):
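The unavailable-days rule in item 7 of the inclusion/exclusion criteria for this indicator is easier to see worked through in code. The sketch below is an illustration only, not the WTIS/ATC implementation; the field names and date handling are assumptions. It computes a case's wait as the days from the decision-to-treat date to the procedure date, deducting unavailable days only when they fall inside that window, as item 7 implies.

```python
# Illustrative sketch only -- not the WTIS/ATC implementation. The field
# names and date handling are assumptions for the example. It mirrors item 7
# above: unavailable days are deducted from a patient's wait only when they
# fall between the decision-to-treat date and the procedure date; unavailable
# days recorded outside that window are ignored (treated as data entry errors).
from datetime import date

def adjusted_wait_days(decision_date, procedure_date, unavailable_periods):
    """Return the wait in days from decision to treat to procedure, minus any
    unavailable days that fall inside that window.

    unavailable_periods: list of (start, end) date pairs during which the
    patient was unavailable for treatment.
    """
    total = (procedure_date - decision_date).days
    deducted = 0
    for start, end in unavailable_periods:
        # Clip each unavailable period to the decision-to-procedure window;
        # periods wholly outside the window contribute nothing.
        overlap = (min(end, procedure_date) - max(start, decision_date)).days
        if overlap > 0:
            deducted += overlap
    return total - deducted

# Example: a 60-day span with 10 unavailable days inside the window and a
# 5-day period before the decision date (ignored) -> 50 adjusted wait days.
decision = date(2016, 4, 1)
procedure = date(2016, 5, 31)
unavailable = [(date(2016, 4, 10), date(2016, 4, 20)),   # inside window: deducted
               (date(2016, 3, 20), date(2016, 3, 25))]   # before window: ignored
print(adjusted_wait_days(decision, procedure, unavailable))
```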

INDICATOR NAME: PERCENT OF PRIORITY 2, 3 AND 4 CASES COMPLETED WITHIN ACCESS TARGETS FOR CATARACT SURGERY

INDICATOR DESCRIPTION (detailed description of indicator): Descriptions of priority levels can be found in the link within the Reference section.

INDICATOR CLASSIFICATION: Explanatory

CALCULATION:
Step 1: Count the total number of cases completed for the reporting period by priority level (Priority 2, 3 and 4). Please refer to the inclusion/exclusion criteria listed below.
Step 2: Of the total count in Step 1, count the number of cases where wait times are less than or equal to the provincial targets. Do this for each of the three priority levels (Priority 2, 3 and 4).
Step 3: The weighted percent of cases completed within Priority 2, 3 and 4 access targets = (sum of the counts by priority in Step 2 divided by the sum of the counts by priority in Step 1) x 100.

Sample calculation:

INDICATOR DATA SOURCE: WTIS, ATC, Cancer Care Ontario. Access to Care has implemented a robust data quality and compliance framework to assess data quality under four key dimensions: timeliness, validity, reliability and usability. The complete framework is available in the Wait Time Conditions of Funding, and further information is available upon request at ATCsupport@cancercare.on.ca.

INCLUSION/EXCLUSION CRITERIA:
- All closed wait list entries with procedure dates within the date range, submitted by hospitals through the Wait Time Information System.
- Patient age greater than or equal to 18 years on the day the procedure was completed.
- Procedures no longer required (or cancelled cases) are excluded from the wait time calculation.
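A small worked example of the Step 3 arithmetic used by these wait time indicators may help clarify the term "weighted percent": the overall figure is the single ratio across all three priority levels, which is equivalent to weighting each priority level's own percentage by its completed-case volume. The counts below are invented for illustration only.

```python
# Illustrative arithmetic only; the counts are made up. Step 3's "weighted
# percent" is the overall ratio across priority levels, which equals the
# volume-weighted average of the per-priority percentages.
completed = {2: 40, 3: 100, 4: 60}   # Step 1: completed cases by priority
within    = {2: 36, 3:  80, 4: 57}   # Step 2: of those, completed within target

per_priority = {p: 100.0 * within[p] / completed[p] for p in completed}
overall = 100.0 * sum(within.values()) / sum(completed.values())   # Step 3

print(per_priority)       # {2: 90.0, 3: 80.0, 4: 95.0}
print(round(overall, 1))  # 86.5
```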
