SHEFFIELD TEACHING HOSPITALS NHS FOUNDATION TRUST EXECUTIVE SUMMARY REPORT TO THE TRUST BOARD HELD ON 18 NOVEMBER 2015

Subject: Data Quality Baseline Assessment
Supporting TEG Member: Kirsten Major, Director of Strategy and Operations
Authors: Balbir Bhogal, Performance and Information Director; Paul Buckley, Deputy Director of Strategy and Planning
Status: D

PURPOSE OF THE REPORT:
The report outlines the approach and methodology that will be used to assess data quality in line with the Trust's Data Quality Policy. It also sets out the progress of the self-assessment of the data quality of the performance indicators within the Integrated Performance Report (IPR).

KEY POINTS:
- The Trust has recently revised its Data Quality Policy, which was ratified by the Board on 21 October 2015.
- The new policy emphasises that ensuring good data quality is everyone's business, and departments will be expected to have standard procedures for the collection, validation and entry of data, which will be subject to future audit.
- As part of the implementation of the Data Quality Policy, regular reviews of data quality will be undertaken to provide assurance on the quality of the data. It was agreed that the indicators contained in the IPR would be the initial area of assessment.
- All 71 indicators in the IPR have been self-assessed. The current findings indicate that there are three indicators where an element of data quality is rated as red. These relate to:
  - Serious Untoward Incidents - Number of serious untoward incidents (SUI)
  - Serious Untoward Incidents - Approved SUI Report submitted within timescales
  - Complaints - Percentage of complaints answered within 25 working days
- The stage 1 assessment has been completed and all areas that are red rated will be followed up with indicator leads to develop improvement actions. The stage 2 and 3 assessments will be completed by March 2016.

IMPLICATIONS - AIM OF THE STHFT CORPORATE STRATEGY 2012-2017 (TICK AS APPROPRIATE):
1 Deliver the Best Clinical Outcomes
2 Provide Patient Centred Services
3 Employ Caring and Cared for Staff
4 Spend Public Money Wisely
5 Deliver Excellent Research, Education & Innovation

RECOMMENDATION(S):
The Board is asked to:
a) Debate the initial assessment of the data quality of the performance indicators within the Integrated Performance Report.

APPROVAL PROCESS:
Meeting                                        Date               Approved Y/N
Trust Executive Group                          4 November 2015    Y
Finance, Workforce and Performance Committee   9 November 2015    Y
Board of Directors                             18 November 2015

Data Quality Baseline Assessment

1. Introduction

The importance of good data quality is widely acknowledged across the NHS as critical to service delivery. Good data underpins successful organisational performance management. It is possibly of even greater value in that it provides an evidence base for the quality of care being provided by an organisation. The Trust has recently updated the Data Quality Policy and is introducing a framework to assess the quality of data used across the organisation and ensure that it is fit for purpose. This paper presents the initial assessment of the data quality of the performance indicators within the Integrated Performance Report (IPR).

2. Data Quality Assurance Framework

Previously, the Trust used a local, subjective method to assess the quality of data used within the IPR. It is recognised that data quality is made up of a number of inter-related components. The Trust's Data Quality Policy outlines that, in order to be meaningful, data must meet the following criteria:

- Accuracy: Is the data sufficiently accurate for the intended purposes?
- Validity: Is the data recorded and used in compliance with relevant requirements?
- Consistency: Does the data reflect stable and consistent collection processes across collection points and over time?
- Timeliness: Is the data up to date, and has it been captured as quickly as possible after the event or activity?
- Relevance: Is the data captured applicable to the purposes for which it is used? Is all the relevant data included?

The Trust has adopted the Data Quality Diamond methodology (recommended by the National Audit Office) as an approach to assess data quality; the methodology is outlined in Appendix 1. To fully assure data quality it is important that the assessment is captured at the various stages of information management:

- Stage 1: Data collection
- Stage 2: Data extraction
- Stage 3: Data upload

3. Data Quality Assessment

The data quality assessment undertaken has focused on stage 1, the data collection process. The Trust IPR contains 71 separate indicators, which are sourced from 12 separate systems. Indicator leads were requested to complete the self-assessment using the diamond methodology. A full assessment of responses received to date is contained within Appendix 2.
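
As an illustration only, the minimal sketch below (in Python) shows one way a single stage 1 self-assessment return could be represented for collation. The structure, field names, source system and sample answers are assumptions made for this illustration; they do not reproduce the Trust's actual assessment forms, systems or results.

```python
# Illustrative sketch only: a hypothetical representation of one indicator's
# stage 1 (data collection) self-assessment. Names and answers are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

DIMENSIONS = ["Accuracy", "Validity", "Consistency", "Timeliness", "Relevance",
              "Completeness & coverage"]
STAGES = ["Data collection", "Data extraction", "Data upload"]

@dataclass
class SelfAssessment:
    indicator: str
    measure: str
    source_system: str
    stage: str
    # For each dimension, the answers to its assessment statements:
    # True = Yes, False = No, None = Not applicable (see Appendix 1).
    answers: Dict[str, List[Optional[bool]]] = field(default_factory=dict)

# Example return for one indicator (answers are made up for illustration).
example = SelfAssessment(
    indicator="Complaints",
    measure="Percentage of complaints answered within 25 working days",
    source_system="Local complaints system",  # hypothetical system name
    stage="Data collection",
    answers={
        "Accuracy": [True, False, True, None],
        "Validity": [True, True, True, True],
    },
)
```

Collating the 71 returns in a structure of this kind would allow red or amber dimensions to be routed to the relevant indicator leads for improvement actions.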

A baseline self-assessment has been undertaken on all 71 indicators in the IPR. The findings from the assessment indicate that there are three indicators where an element of data quality is rated as red (Figure 1).

Figure 1 - Indicators that have been assessed with a RAG rating of red in one or more dimension
- Serious Untoward Incidents: Number of serious untoward incidents (SUI)
- Serious Untoward Incidents: Approved SUI Report submitted within timescales
- Complaints: Percentage of complaints answered within 25 working days

Figure 2 below identifies all the indicators that have been self-assessed as green across all dimensions. It is worth noting that some of these indicators previously had an amber rating for data quality.

Figure 2 - Indicators that have been assessed with a RAG rating of green across all dimensions
- Hospital Mortality: HSMR
- Hospital Mortality: SHMI
- MRSA bacteraemia: Actual numbers
- MSSA bacteraemia: Actual numbers
- C Diff: Actual numbers
- Staff Friends & Family: Recommend as a place to be treated
- Staff Friends & Family: Recommend as a place to work
- Sickness Absence: All days lost as a percentage of those available
- Appraisals: Completed appraisals in last year
- Mandatory Training: Overall percentage of completed mandatory training
- Registered Nurses/midwives during the day
- Registered Nurses/midwives during the night
- Clinical Support Workers during the day
- Clinical Support Workers during the night
- Agency spend: Agency and bank spend as a percentage of total pay budget
- I&E: Variance from plan
- Contract performance: Variance from plan
- Cash: Actual
- Capital expenditure: Variance from plan
- Recruitment to trials: Total number of patient accruals to portfolio studies
- Recruitment to trials: 70 Day Benchmark for recruitment of first patient to a clinical trial
- Safety Thermometer: Harm free
- Quality recommendation: % staff who would recommend STH to a friend / relative for treatment
- Work recommendation: % staff who would recommend STH as a place to work
- Staff Engagement: Staff engagement score

The table below (Figure 3) identifies the areas where there is a concern regarding the accuracy of the information being captured.

Figure 3 - Indicators where accuracy of the data is not robust
- A&E 4-hour wait: Patients seen within 4 hours
- 18 week waits referral to treatment time: Percentage of admitted patients treated within 18 weeks
- 18 week waits referral to treatment time: Percentage of non-admitted patients treated within 18 weeks
- 18 week waits referral to treatment time: Percentage of patients on incomplete pathways waiting less than 18 weeks
- 52 week waits: Actual numbers
- Patient seen within 2 weeks
- Breast symptomatic seen within 2 weeks
- 62 days from referral to treatment (GP referral)
- 31 day first treatment
- 31 day subsequent treatment (Surgery)
- 31 day subsequent treatment (Radiotherapy)
- 31 day subsequent treatment (Drugs)
- FFT Response Rates: Increased response rates for inpatient areas
- FFT Response Rates: Increased response rates for A&E
- Community care information completeness: RTT information completeness
- Community care information completeness: Referral information completeness
- Community care information completeness: Activity information completeness

4. Key themes emerging from baseline assessment

The baseline assessment has identified a number of areas where there is strong assurance on data quality (25 of the 71 indicators assessed). There are also a number of areas where gaps in assurance have been identified.

a) SUIs and complaints data
The system for recording SUIs is Datix, and the team have identified that not all SUIs are initially notified through the Datix system. A significant amount of effort is currently expended reconciling the data from different sources. All formal complaints are managed centrally in the Patient Partnership department, with the exception of Emergency Care, SYRS and LEGION, which are managed locally. For centrally managed complaints, there are no concerns with regard to reliability and consistency. For complaints that are managed by the other three care groups, processes vary depending on who has entered the data. There are no standard operating procedures for complaints received in those departments, and this results in the Patient Partnership department having to follow up complaints logged with incomplete or inconsistent data. It is for these reasons that this set of indicators has aspects of data quality rated as red.

b) Indicators derived from PAS data
The accuracy of data quality from Lorenzo has been temporarily compromised as users familiarise themselves with the system, affecting A&E reporting and Referral to Treatment indicators. The accuracy of the cancelled operations indicators is rated amber, as there is a need to revisit the data definitions to ensure data is being collected in accordance with the national measures.

c) Cancer data accuracy
The accuracy of cancer data has been rated as amber: whilst there are national guidelines in place, the Trust does not have fully comprehensive local procedures for all areas. The cancer team are working to establish these.

d) Reliability and consistency
The reliability and consistency of 35 indicators was rated as either red or amber. In a number of these cases the procedures required to support data collection were not robust enough.

5. Next Steps

The stage 1 assessment has been completed and all areas that are red rated will be followed up with indicator leads to develop improvement actions. The stage 2 and 3 assessments will then be completed by March 2016. In parallel, an ongoing review of the accuracy of data quality following the implementation of Lorenzo will take place. The findings from the completed assessment in March 2016 will form the basis of a data quality improvement plan. It will include a proposal on changes to the current RAG rating of data quality for the indicators in the IPR, a new rating methodology for each of the data quality elements, and a timeline for the review of all other data returns that are submitted externally.

6. Recommendations

The Board is asked to:
a) Debate the initial assessment of the data quality of the performance indicators within the Integrated Performance Report.

APPENDIX 1 - SCORING OF DATA QUALITY FOR THE INTEGRATED PERFORMANCE REPORT INDICATORS

The owner of the indicator will be asked to provide details of the following against the criteria set out above, responding Yes, No or Not applicable to each statement. The statements cover data collection, computer systems, national standards and local standards.

Accuracy
- Data reflects what actually happened
- Procedures are available to assist with data entry
- Local reference tables are validated and updated regularly
- Every opportunity is taken to ensure data accuracy (e.g. checking with the patient)

Validity
- Data is collected to a pre-defined code-set
- The system collects only valid codes
- There is validation to ensure conflicting data cannot be entered
- Codes comply with national standards, and rules and definitions are applied correctly

Consistency
- Relationships between data items are correct (e.g. sequential, correct context)
- Progress towards performance targets reflects real changes rather than variations in data collection
- All local categories are mapped to a distinct national category
- Staff have procedures and data collection is not subject to personal interpretation

Timeliness
- Data is collected as close to real time as possible (timely data is beneficial to the treatment of the patient)
- Timely data recording makes information widely available
- Data is collected to meet the deadlines for statutory returns
- Timeliness takes priority over accuracy for urgent treatment of patients

Relevance
- Data is relevant to the purpose for which it is used
- Data collection is periodically reviewed to ensure changing needs are accommodated
- There is an understanding of the bigger picture of why certain data is required

Completeness & coverage
- At record level all mandatory data is collected
- Data reflects all work done
- Mandatory data items cannot be by-passed
- Electronic records are an accurate reflection of manual records
- The NHS Number is used in all identifiable references to patients
- External data submissions are an accurate reflection of work done
- Default codes are used where appropriate and not to cover missing data

These responses will then be scored as set out below. The scores will then be used to determine the quality of the data, and any dimension where the score is amber or red will require an action plan to improve the data quality. For each dimension the score is the number of statements answered positively (Yes, or Not applicable where a statement does not apply):

- Accuracy: scored out of 4 (from 4 Yes answers down to 0 Yes, 4 No/Not applicable)
- Validity: scored out of 4 (from 4 Yes answers down to 0 Yes, 4 No)
- Consistency: scored out of 4 (from 4 Yes/Not applicable answers down to 0 Yes/Not applicable, 4 No)
- Timeliness: scored out of 4 (from 4 Yes/Not applicable answers down to 0 Yes/Not applicable, 4 No)
- Relevance: scored out of 3 (from 3 Yes/Not applicable answers down to 0 Yes/Not applicable, 3 No)
- Completeness & coverage: scored out of 7 (from 7 Yes/Not applicable answers down to 0 Yes/Not applicable, 7 No)
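
To make the scoring above concrete, the sketch below converts a dimension's count of positive answers into a RAG rating. The score-to-colour mapping is presented as colour-coded cells in the source appendix rather than stated numerically, so the thresholds used in `rag_rating` are assumptions for illustration only.

```python
# Illustrative sketch of the Appendix 1 scoring: each dimension is scored by
# the number of positive (Yes / Not applicable) answers to its statements.
# The score-to-RAG thresholds below are assumptions for illustration; the
# actual mapping is shown as colour-coded cells in the appendix.
STATEMENTS_PER_DIMENSION = {
    "Accuracy": 4,
    "Validity": 4,
    "Consistency": 4,
    "Timeliness": 4,
    "Relevance": 3,
    "Completeness & coverage": 7,
}

def rag_rating(dimension: str, positive_answers: int) -> str:
    """Return an assumed RAG rating for one data quality dimension."""
    total = STATEMENTS_PER_DIMENSION[dimension]
    if not 0 <= positive_answers <= total:
        raise ValueError(f"{dimension} is scored out of {total}")
    proportion = positive_answers / total
    if proportion == 1.0:
        return "Green"   # every statement answered positively
    if proportion >= 0.5:
        return "Amber"   # assumed threshold
    return "Red"         # assumed threshold

# Example: 3 of 4 Accuracy statements answered Yes -> Amber under these assumed
# thresholds, which would trigger an improvement action plan under the policy.
print(rag_rating("Accuracy", 3))
```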

APPENDIX 2 - FULL LIST OF SELF-ASSESSMENTS COMPLETED AS AT 29 OCTOBER 2015

Each indicator and measure below was RAG rated against the data quality dimensions alongside its current IPR RAG rating.

- Hospital Mortality: HSMR
- Hospital Mortality: SHMI
- MRSA bacteraemia: Actual numbers
- MSSA bacteraemia: Actual numbers
- C Diff: Actual numbers
- Serious Untoward Incidents: Number of serious untoward incidents (SUI)
- Serious Untoward Incidents: Approved SUI Report submitted within timescales
- Incidents: Increase in incident reporting levels
- Incidents: Incidents not approved after 35 days
- Average Length of Stay (by discharges): Average LOS Elective
- Average Length of Stay (by discharges): Average LOS Non Elective
- Staff Friends & Family: Recommend as a place to be treated
- Patient Falls: Number of patient falls
- Never Events: Number of never events
- Sickness Absence: All days lost as a percentage of those available
- Appraisals: Completed appraisals in last year
- Mandatory Training: Overall percentage of completed mandatory training
- Registered Nurses/midwives during the day
- Registered Nurses/midwives during the night
- Clinical Support Workers during the day
- Clinical Support Workers during the night
- Staff Friends & Family: Recommend as a place to work
- Agency spend: Agency and bank spend as a percentage of total pay budget
- I & E: Variance from plan
- Contract performance: Variance from plan
- Efficiency: Variance from plan
- Cash: Actual
- Capital expenditure: Variance from plan
- A&E 4-hour wait: Patients seen within 4 hours
- >12 hr Trolley waits in A&E: No. of patients waiting > 12 hours
- Ambulance turnaround: Time taken for ambulance handover of patient
- Ambulance turnaround: Time taken for ambulance handover of patient
- 18 week waits referral to treatment time: Percentage of admitted patients treated within 18 weeks
- 18 week waits referral to treatment time: Percentage of non-admitted patients treated within 18 weeks
- 18 week waits referral to treatment time: Percentage of patients on incomplete pathways waiting less than 18 weeks
- 52 week waits: Actual numbers
- 6 week diagnostic waiting: Percentage of patients seen within 6 weeks
- Cancelled Operations: Number of operations cancelled on the day for non-clinical reasons
- Cancelled Operations: Number of patients cancelled on the day and not readmitted within 28 days
- Cancelled Outpatient appointments: Percentage of out-patient appointments cancelled by hospital
- Cancelled Outpatient appointments: Percentage of out-patient appointments cancelled by patient
- DNA rate: Percentage of new out-patient appointments where patients DNA
- DNA rate: Percentage of follow-up out-patient appointments where patients DNA
- Patient seen within 2 weeks
- Breast symptomatic seen within 2 weeks
- 62 days from referral to treatment (GP referral)
- 31 day first treatment
- 31 day subsequent treatment (Surgery)
- 31 day subsequent treatment (Radiotherapy)
- 31 day subsequent treatment (Drugs)
- Choose & Book Utilisation: Percentage appointments booked through C&B
- Ethnic Origin data collection: % valid ethnic group
- Elective Inpatient activity: Variance from contract schedules
- Non elective inpatient activity: Variance from contract schedules
- New outpatient attendances: Variance from contract schedules
- Follow up op attendances: Variance from contract schedules
- A&E attendances: Variance from contract schedules
- Complaints: Percentage of complaints answered within 25 working days
- FFT Response Rates: Increased response rates for inpatient areas
- FFT Response Rates: Increased response rates for A&E
- Community care information completeness: RTT information completeness
- Community care information completeness: Referral information completeness
- Community care information completeness: Activity information completeness
- Day surgery rates: BADS - day surgery rates
- Mixed Sex Accommodation: Number of breaches of Mixed Sex Accommodation standard
- Recruitment to trials: Total number of patient accruals to portfolio studies
- Recruitment to trials: 70 Day Benchmark for recruitment of first patient to a clinical trial
- Safety Thermometer: Harm free
- Quality recommendation: % staff who would recommend STH to a friend / relative for treatment
- Work recommendation: % staff who would recommend STH as a place to work
- Staff Engagement: Staff engagement score