
NHS Patient Survey Programme Statement of Administrative Sources: quality of sample data January 2016

About this document

This document sets out our Statement of Administrative Sources, confirming the administrative data used to support delivery of the NHS Patient Survey Programme and how quality assurance of this data takes place. It contributes to a wider quality statement which outlines how quality is protected within the NHS Patient Survey Programme. This document accompanies a portfolio of supporting materials about the survey programme, which can be viewed at: www.cqc.org.uk/surveys

Statement of Administrative Sources

The NHS Patient Survey Programme uses pre-existing administrative data to generate survey samples. In this document we set out how the administrative data is used, how we assure the quality of this data for the purposes of the survey programme, and how we endeavour to ensure the administrative data provided matches requirements for use (both for our use and for external users of survey results). To focus discussions around quality arrangements and their subsequent impact on data, the Administrative Data Quality Assurance Toolkit produced by the UK Statistics Authority is used as a guide.

The NHS Patient Survey Programme uses data drawn from NHS trust Patient Administration Systems (PAS) to generate samples of patients who will receive questionnaires. PAS records patients' administrative information, allowing NHS trusts to communicate with and continue caring for their patients after discharge from hospital. For the survey programme, it is the source of the contact information and demographics (e.g. name, home address and date of birth) used to compile patient samples. Other sample variables are also requested, which are sometimes used for secondary analysis by stakeholders; e.g. the Clinical Commissioning Group (CCG) variable is sometimes used by NHS England.

PAS are devolved, locally run systems, meaning NHS trusts may use one of a number of different systems supplied by different international suppliers. As such, there is no single owner of PAS with whom the survey programme may collaborate regarding our information requirements. This makes assurance around the quality of the data more challenging than if there were one provider, and means auditing PAS systems would be a resource-intensive exercise. However, whenever we seek to collect new (i.e. previously uncollected) variables within survey sample files, contact is made with a sample of trusts to check their understanding of the intended variables and to identify issues that might arise during the collection. For new surveys or a change in methodology, a sampling pilot involving all trusts will be undertaken. We set out within this document the assurance that is available from other PAS data users, which helps provide confidence that PAS data is of sufficient quality to meet our needs for the survey programme.

Recording and quality assurance of PAS data

Data used for the purposes of the NHS Patient Survey Programme from PAS are first recorded at trust level by admin and clerical staff. The process map in Figure 1 below shows which data is input at each stage as the information passes through the trust, and where quality assurance takes place. The process map has been developed via consultation with a number of acute NHS trusts. Variables included in the map are those used to select patients for patient survey samples. Staffing issues and the environment at each stage of data entry can impact on data quality.

Fig 1. Process map key:
- When data is captured, and by whom
- What quality assurance is conducted, and by whom
- How records are continually managed and updated
- When data is extracted

Fig 1. PAS Process Map

When is patient information captured and input into PAS?

At registration, referral or admission (captured in real time):
- Name ❶❷❹❻
- Address ❶❷❹❻
- Postcode ❶❷❸❹❻
- Date of birth ❶❷❹❻
Input and captured by: admin and clerical staff, ward staff, clinical site practitioners.

During the care episode:
- Date of admission ❺❼
- Date of discharge ❺❼
Input and captured by: nurses, admin and clerical staff.

On discharge:
- Main specialty code on discharge ❶❺❼
Input and captured by: admin and clerical staff.

Auto-generated:
- Length of stay ❺❼

Patient information for surveys is extracted shortly before questionnaires are sent. Trusts wait for clinical coding to be completed on the data that is to be used.

Quality assurance key:
❶ System validation
❷ Maximum number of characters checks
❸ Address validation software
❹ Personal Demographics Service updates
❺ Data quality reports
❻ Cross checks with patients
❼ Clinical coding

Quality assurance is conducted by the following:
- Information / Data Quality teams
- Clinical Coding teams
- Registration staff
- Admissions and ward staff

There is a continual update of contact information while contact with a patient is ongoing. It is possible that if a patient has not been seen or treated at a trust for some time their data may start to go out of date, but patient survey samples are extracted within a maximum of three months of discharge. Records will also be subject to a Demographics Batch Service (DBS) trace to ensure that patient demographics are up to date. If a member of trust staff is told by a patient that their details have changed, or are incorrect, then this would be updated by administrative staff in the clinical setting.
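As an illustration only, the field-level checks in the quality assurance key (system validation, maximum-character checks, address validation) might be sketched as below. The field names, length limits and simplified postcode pattern are assumptions made for this sketch, not the rules of any actual PAS product.

```python
import re
from datetime import date

# Illustrative field-level checks modelled on the quality assurance key:
# system validation (1), max-character checks (2) and address validation (3).
# Field names and length limits are assumptions, not actual PAS rules.
MAX_LENGTHS = {"name": 70, "address": 175, "postcode": 8}

# Simplified UK postcode pattern (outward code, optional space, inward code).
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}[0-9][0-9A-Z]? ?[0-9][A-Z]{2}$")

def validate_record(record: dict) -> list[str]:
    """Return a list of validation failures for one illustrative PAS record."""
    errors = []
    for field, limit in MAX_LENGTHS.items():
        value = record.get(field, "")
        if not value:
            errors.append(f"{field}: missing")                 # system validation
        elif len(value) > limit:
            errors.append(f"{field}: exceeds {limit} characters")
    postcode = record.get("postcode", "")
    if postcode and not POSTCODE_RE.match(postcode.upper()):
        errors.append("postcode: invalid format")              # address validation
    dob = record.get("date_of_birth")
    if dob is not None and dob > date.today():
        errors.append("date_of_birth: in the future")
    return errors
```

In a real PAS these rules run at the point of data entry; a sketch like this simply makes explicit what each symbol in the key is checking.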

Patient contact information

Contact details for patients are input upon first contact with a trust. From the perspective of the survey programme, these are the most essential pieces of information we use from PAS, as they allow for the distribution of questionnaires to patients. If there were large-scale problems with the data recorded by trusts at this stage, response rates would be affected and bias could be introduced into results.

The map in Figure 1 shows that name, address and postcode undergo a number of quality assurance checks at trust level. When staff first enter details for a patient, they should check whether the patient is already on the trust PAS system owing to a previous hospital attendance, to ensure that the patient records are linked and the correct NHS number is used. If a match cannot be made within trust records, the system should attempt a match for the patient with the national NHS Spine Personal Demographics Service (PDS), prompting an opportunity to confirm whether the patient has moved address and, again, to ensure the correct NHS number is used. PDS generates a record for a patient upon their very first contact with the NHS, assigning an NHS number which they should retain throughout all future contacts with the NHS. This helps minimise the potential for clerical staff to make input errors when entering patient contact details in busy reception areas or similar settings. It means the risk of input error is greater only for those patients who have moved address or have not had a previous contact with the NHS (and are therefore not registered with the PDS Spine). Just one data field used within the programme (length of stay) is auto-generated, though this is cross-checked against date of admission and date of discharge by the Survey Co-ordination Centre.

The ability of trusts to cross-check records internally and with the Spine provides some protection against the inaccurate recording of contact details. More generally, we expect that trusts have a strong motivation to record accurate contact details for their patients, given that this is their means of contacting them about their ongoing care, and our surveys will usually be sent out to patients within a maximum of two months of their receiving care, meaning most records should still be up to date. Trusts also require postcode information for Payment by Results (discussed under 'Assurance sought by other users of PAS data' later in this document), which adds impetus to record this information accurately in the first place. We acknowledge, however, that the recording of addresses is unlikely to be completely error free, as there is always the opportunity for typographical errors when entering new information. We take steps to mitigate these errors during survey implementation. Approved contractors with experience of delivering large-scale surveys work with trusts across all surveys within the programme, and will check the validity of address fields as far as they are able (please see 'Quality assurance of data supplied by NHS trusts'). Additionally, as an outcome of sending questionnaires, we record whether questionnaires are returned 'not known at this address', which suggests either that people have moved since their care episode or that there were errors in recording their address that were not remedied. Levels of such returns at trust level are monitored by approved contractors. The A&E and maternity surveys have slightly higher proportions of questionnaires returned in this way, and these are the surveys where you might expect patient addresses to change more frequently (for example, some patient groups using A&E services might move home more frequently, and women using maternity services may move home after their babies are born).

Data used to identify eligible patients

Beyond contact details, incorrect sample frame information could result in incorrect inclusion in, or exclusion from, the sample. For example, an incorrect date of birth would affect eligibility where patients must be older than 15 years to participate, and date of discharge could have an impact where patients must have stayed at least one night in hospital or been discharged during a certain month. The process map in Figure 1 shows the data fields that are used for the purposes of selecting patients for samples for most surveys. For example: date of birth, date of admission, date of discharge, and main specialty on discharge are used to ensure patients are the appropriate age, stayed in hospital for the required period of time, and were not treated for conditions deemed exempt from the survey.
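The eligibility rules of this kind (aged over 15, at least one night in hospital, discharged within the sampling month) can be sketched as a simple filter. The field names and function shape here are assumptions for illustration, not the programme's actual sampling code.

```python
from datetime import date

# Sketch of inpatient-style eligibility rules: aged 16 or over at discharge,
# at least one night in hospital, and discharged within the sampling month.
# Field names and thresholds are illustrative assumptions.
def age_on(dob: date, on: date) -> int:
    """Whole years of age on a given date."""
    return on.year - dob.year - ((on.month, on.day) < (dob.month, dob.day))

def is_eligible(dob: date, admitted: date, discharged: date,
                sample_month: tuple) -> bool:
    old_enough = age_on(dob, discharged) >= 16          # older than 15 years
    overnight = (discharged - admitted).days >= 1       # stayed at least one night
    in_period = (discharged.year, discharged.month) == sample_month
    return old_enough and overnight and in_period
```

A day-case patient, or one aged under 16 at discharge, would be filtered out before the sample file is compiled; an error in date of birth or date of discharge therefore feeds directly into who is included.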
Incorrect exclusion is unlikely to have much impact on the sample because, in most contexts, the sampling frame is large enough to prevent bias that might result from ad hoc recording errors, and we would not normally expect systematic issues to occur; the possible exception is staffing issues in a particular location leading to long-term data entry errors on a scale that might have an impact on results (though we have not to date uncovered any evidence of this happening). Incorrect inclusion will be identified at the data cleaning stage for most sample variables (e.g. with date of birth issues, where the primary data source is patient-provided information), and the volume of ad hoc recording errors is small. In the 2014 adult inpatient survey, year of birth was mismatched between sample data and response data in 1.4% of responses, and only a very small proportion of these would affect eligibility (i.e. whether a patient was younger than 16 and therefore ineligible). Incorrect inclusion errors should be detected by the approved contractor, or by the Survey Co-ordination Centre; anomalies are investigated with trusts and corrective action taken (i.e. trust contacts are asked to supply correct information or resample). The errors found during sample checks are detailed in the routine sampling error reports produced for each survey in the programme. For example, the 2014 Adult Inpatient Survey Sampling Errors Report notes 25 minor errors made across 15 trusts stemming from incorrect coding (not all of these would have affected inclusion criteria, but they might have had an impact for data users wishing to conduct secondary analysis).

Systematic errors that might introduce bias into results would also usually be identified through checks undertaken by the Co-ordination Centre, whereby eligible population size is checked against figures for previous years for recurrent surveys, as well as being broken down by demographic group for comparison. We have not to date found evidence of wide-scale problems with the information recorded by trusts that would have a major impact on their data; rather, any problems have stemmed from incorrect coding used to extract the records to generate samples, i.e. the inclusion of incorrect patient groups contrary to survey guidance. We find examples of these problems during the checking process and are able to allow the trust to correct mistakes if time permits; however, if fieldwork has progressed too far, there are occasions where we would have to exclude their results from the survey.

There are two routes by which errors in sample information related to eligibility or contact details can affect the results of a trust, or those for England as a whole. The first is by reducing the achieved sample size, which will be reflected in increased margins of error, i.e. the results will be less precise than those from a sample unaffected by sample data errors. Given what we know about the extent of these kinds of error, the impact is likely to be small at trust level and very small at England level, and where we become aware of such errors during checking, we exclude data for the trusts concerned. The other way that errors could affect results is through bias in the achieved sample.

Bias would be introduced if there were a relationship between the likelihood of individuals having incorrect contact details and of their reporting positive experiences of care; that is, if people with particularly good or poor experiences to report were more likely to have incorrect contact information. Apart from deliberate falsification on the part of trust staff to improve survey results, there are some plausible mechanisms that might produce such bias, such as communication difficulties between patient and staff resulting in both a poor experience and a failure to record the correct contact details. It does not seem likely that the extent of this would be great enough to impact perceptibly on trust-level results; however, we have no evidence that such events are not occurring. We have looked at the correlation between rates of questionnaires returned undelivered (as a proxy for incorrect contact details) and overall survey results at trust level. We found a weak inverse correlation (-0.324), which suggests that any such bias is small at most. The negative relationship suggests that if there is bias, it acts to exclude patients with more positive experiences. However, an alternative and more plausible explanation is that this is not bias at all, but that areas with worse experiences are also areas with greater population mobility.

Fig 2. Correlation between overall experience score and percentage of questionnaires returned undelivered

At this point it is worth reflecting again on the main use of patient contact details, which is for the NHS trust to maintain contact with its patients; such errors are therefore more likely to appear during the sampling process than within the underlying administrative data. To rehearse the scenario above, whereby we might assume that patients with communication difficulties are having incorrect contact details recorded: we ask respondents to tell us about long-term conditions and consistently find that approximately 11-12% of responses in the Adult Inpatient and A&E surveys are received from patients who say they have a long-term condition which causes difficulty with communicating, mixing with others or socialising. The relative constancy of this figure suggests no large-scale attempt to suppress responses from this group, and no large-scale omissions; however, this does not mean that very small numbers of individual cases are not affected, and as long-term conditions information is not available in the sample data, we cannot check this.

Based on what we do check and monitor, as reported within the sampling errors reports, we do sometimes find that coding instructions in guidance manuals have been misinterpreted and certain patient groups included or excluded inappropriately; e.g. during the 2015 Community Mental Health Survey, one trust had removed patients with dementia from their sample and was asked to redraw its sample. Although trusts have a vested interest in holding accurate contact details for their patients, we are exploring mechanisms for assessing the accuracy of this information further. For the 2016 Adult Inpatient Survey, when trusts submit their sample declaration form before submitting their patient samples, we will pilot an additional check which asks how many patients (what proportion of their sample) could not be matched by the Demographics Batch Service. As noted previously, the Demographics Batch Service allows NHS staff to verify patients' NHS numbers by checking against the NHS Spine Personal Demographics Service (PDS), and we ask that trusts use it to check for deceased patients prior to confirming their sample for mail-out of questionnaires. The PDS is the national electronic database of NHS patient demographic details, such as name, address and date of birth; records are normally created, and an NHS number allocated, on or shortly after a patient's very first contact with an NHS service. A high proportion of untraced records could indicate an issue with the validity of the trust sample, which might introduce sample bias. As Figure 1 showed, however, trusts should already have checked their records with PDS to verify the accuracy of demographic information held.
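The untraced-rate check described above can be sketched in miniature: build the matching keys the Demographics Batch Service accepts (NHS number plus full date of birth, or surname, first name, gender, date of birth and postcode) and report what proportion of a sample could not be traced. The in-memory `pds` lookup here is a stand-in assumption for the real NHS Spine, which this code cannot call, and the field names are illustrative.

```python
# Illustrative sketch of an untraced-rate check. The "pds" set stands in for
# the NHS Spine Personal Demographics Service; in reality tracing is done by
# submitting a batch file to the Demographics Batch Service.
def trace_keys(record: dict):
    """Yield the DBS-style matching keys available for one record."""
    if record.get("nhs_number") and record.get("dob"):
        yield ("nhs", record["nhs_number"], record["dob"])
    demo = ("surname", "first_name", "gender", "dob", "postcode")
    if all(record.get(f) for f in demo):
        yield ("demo",) + tuple(record[f] for f in demo)

def untraced_rate(sample: list, pds: set) -> float:
    """Proportion of sample records with no key matching the (mock) PDS."""
    untraced = sum(1 for r in sample
                   if not any(k in pds for k in trace_keys(r)))
    return untraced / len(sample)
```

A trust-level figure computed this way, reported on the sample declaration form, is the kind of signal the pilot check is intended to surface: a high rate would prompt investigation of the sample before mail-out.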
Quality assurance of data supplied by NHS trusts

Trusts participating in the survey programme are advised to use the services of one of a small number of approved contractors vetted to work on the programme, with permission to work with limited patient-identifiable data. The work of trusts and contractors on the survey programme is underpinned by a survey instruction manual which sets out the standard methodology for administering the survey, including definitions for the variables we ask trusts to supply in their sample files. Amongst other tasks, contractors are responsible for undertaking the first set of quality assurance checks on the data received from trusts, e.g. checking patient addresses, ages, ethnicity, route of admission and discharge codes, and checking for duplications. These checks are effective in identifying sampling errors made by trusts: for example, in the 2014 survey of children and young people, contractors identified 635 errors in the way sample data was drawn (301 of these being name or address errors). Importantly, contractors will check the validity of addresses using a Royal Mail database. Once contractors have approved samples, these are submitted to the Survey Co-ordination Centre for final checks (without patient name and address). The diagram below depicts the minimum checking process:

Fig 2. Data checks on trust samples

1. A member of staff at the NHS trust draws a sample from PAS to the specifications in the survey instruction manual.
2. NHS trust checks include: incomplete information, distribution of patient age and gender, route of admission, exclusion of ineligible patient groups, and date of discharge/attendance.
3. The revised list is sent to the Demographics Batch Service (DBS) to check for deceased patients. DBS matches patient records in the file against the NHS Spine Personal Demographics Service (PDS). DBS requires either:
   - NHS number and full date of birth (yyyymmdd), or
   - surname, first name, gender, date of birth and postcode.
4. Deceased patients are removed by the trust before the sample is sent to its approved contractor (if no contractor is in place, the trust removes them before sending the sample to the Co-ordination Centre).
5. The NHS trust creates the sample file using a pre-designed spreadsheet provided by the Co-ordination Centre and downloaded from the NHS Surveys website: http://www.nhssurveys.org/surveys/833
6. The person preparing the sample completes the sample declaration form, available to download from the NHS Surveys website at http://www.nhssurveys.org/surveys/863. This form asks trusts to confirm the number of dissenting patients removed from samples.
7. The NHS trust shares the patient sample file with an approved contractor in encrypted format.
8. The approved contractor undertakes its own checks on samples, ensuring variables meet the requirements set out in the instruction manual, e.g. discharge dates and patient exclusions, with addresses checked against a Royal Mail database. Errors are fed back to the trust and samples re-drawn if necessary; errors are logged with the Co-ordination Centre and published for future learning, e.g. http://www.nhssurveys.org/survey/1548
9. The approved sample file is submitted to the Co-ordination Centre, which checks sample files before approval is given to mail out to patients.

The Co-ordination Centre undertakes checks on each of the sample files, using the data provided by the trust (all sample variables used for the purpose of the survey can be checked, with the exception of name and address details, which the Co-ordination Centre does not receive for confidentiality reasons). The checks undertaken on the sample file are survey specific, owing to the type of data received and the requirements of the sampling approach for the survey. The following checks are conducted for every trust for every survey:

- Age ranges: checking that sample members are within the eligible range for each survey and that the distribution of ages is as expected, using data from previous samples submitted by the trust where applicable.
- Gender: checking that the split between males and females is relatively even (exceptions include the Maternity Survey and samples submitted by certain specialist trusts, e.g. women's trusts) and broadly in line with previous samples submitted by the trust.
- Ethnicity: checking that the distribution is as expected and broadly in line with previous samples.
- Sample period dates: checking that the trust has used the correct sample period and the correct sampling approach (for example, checking whether a trust has accidentally drawn a random sample instead of consecutive discharges for the Inpatient Survey).
- Route of admission and main specialty codes: checking that only eligible patients are included (e.g. for the Inpatient Survey, particular groups of patients, such as those admitted for conditions related to mental health or psychiatry, are excluded) and that distributions are broadly in line with previous submissions.
- Diagnosis, CCG and site codes: checked for validity.
- Checking for unlikely combinations of data:
  - You would not expect to find males having babies, nor elderly people having babies.
  - You would not expect to see large differences between the number of mothers delivering babies and the number of babies being born.

As previously noted, the Co-ordination Centre produces a survey sampling report which lists evidence of any errors the Co-ordination Centre finds, and trusts will be asked to redraw faulty samples. The survey sampling report is published on www.nhssurveys.org for every survey that is undertaken, and trusts and survey contractors are asked to review it before drawing samples for new surveys. Trusts also receive direct feedback on identified issues with their samples and are asked to explain or remedy these.

The outcome of questionnaires mailed out to patients is recorded for every survey, including whether questionnaires have been returned as 'not known at this address'. Outcomes are monitored on a weekly basis by contractors and in-house trusts to allow investigation of anomalies. It is recognised that some questionnaires mailed out to the wrong address would not be returned and would instead show in our records as 'outcome unknown'. Generally, for each survey, approximately 1-2% of records are recorded as 'not known at this address' (though this rises to 2-3% for the A&E survey).

Source data is explored in advance of any newly designed survey, or if the collection of a new sample variable is proposed. The Co-ordination Centre works with contacts within trusts to discuss the sample data that can be acquired and the accuracy of that data, and usually a sampling pilot exercise will be run with a sample of all NHS trusts. In testing the sampling method we look for key differences against existing data sources: for example, we check against Hospital Episode Statistics (HES) data on key demographics where possible; this has been done for the A&E, Inpatient and Outpatient surveys. For surveys that have been running for several cycles, we also look across trend data during sample checking. We compare sample and response data during the data cleaning stage of the process (at the close of fieldwork), which can identify discrepancies and therefore suggest that data quality on the PAS (for some fields) is not up to standard.
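The "broadly in line with previous samples" checks described above can be sketched as a simple tolerance comparison of category proportions between survey years. The 5-percentage-point threshold below is an assumption for illustration, not the Co-ordination Centre's actual rule.

```python
# Sketch of a distribution check: compare a sample's category proportions
# (e.g. gender or age band) with the previous survey year and flag categories
# that have shifted by more than a tolerance. The 0.05 threshold is an
# illustrative assumption.
def proportions(counts: dict) -> dict:
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def flag_shifts(current: dict, previous: dict, tolerance: float = 0.05) -> list:
    """Return categories whose proportion moved by more than the tolerance."""
    cur, prev = proportions(current), proportions(previous)
    return sorted(k for k in set(cur) | set(prev)
                  if abs(cur.get(k, 0) - prev.get(k, 0)) > tolerance)
```

A flagged category would not by itself prove an error; as in the manual process, it would prompt a query back to the trust to explain or redraw the sample.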
It is on the basis of this consultation work that we know email addresses are not currently routinely collected, nor reliable enough, to run a national survey, and that for accuracy purposes it is better to ask respondents to state their ethnicity than to rely solely on information recorded within trusts.

Assurance sought by other users of PAS data

We have noted that the primary purpose of data recorded on PAS is for the trust to provide patient care. However, data recorded by trusts in their local PAS systems are used for a host of commissioning, healthcare planning and public policy purposes. These are summarised in Fig. 3. Perhaps the most well-established use of the data is within Hospital Episode Statistics (HES). HES is a data warehouse containing details of all admissions, outpatient appointments and A&E attendances at NHS hospitals in England. HES provides data for a wide range of healthcare analysis for the NHS, government and others.

Fig 3. Data flows from trust PAS systems

Data is sent by NHS trusts to the Secondary Uses Service (SUS), a data warehouse that provides data used for HES and for Payment by Results (PbR). The Health and Social Care Information Centre (HSCIC) has documented its Processing Cycle and Data Quality, which sets out how the HES Data Quality Team validates and approves data obtained from trusts. Data quality reports presented on the HSCIC website occasionally cover fields relevant to patient surveys. However, it is patient name and address that are integral to the success of the survey, with other variables commonly used in our analysis taken directly from patient responses in questionnaires (e.g. age, gender, ethnicity, how many babies women have had in the maternity survey, long-term conditions and route of admission). HSCIC has reported that national completeness for postcodes in its admitted patient care and outpatient care datasets is 99.8% and 98.5% respectively (taken from the SUS Data Quality Dashboards). However, this does not tell us about the accuracy of the codes.
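The distinction drawn here between completeness and accuracy can be made concrete: a postcode field may be 99.8% complete while still containing malformed or wrong values. The sketch below separates the two measures. It is illustrative only, and uses a deliberately simplified postcode pattern rather than the full UK format rules or the checks SUS actually applies; even a format-valid postcode can still be inaccurate for the patient concerned.

```python
import re

# Simplified UK postcode shape for illustration only (outward code,
# optional space, inward code). Real validation rules are more involved.
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}[0-9][A-Z0-9]?\s?[0-9][A-Z]{2}$")

def postcode_quality(postcodes):
    """Return (completeness, format_validity) for a list of postcode fields.
    Completeness: share of non-empty fields.
    Format validity: share of non-empty fields matching the pattern."""
    present = [p for p in postcodes if p and p.strip()]
    completeness = len(present) / max(len(postcodes), 1)
    valid = [p for p in present if POSTCODE_RE.match(p.strip().upper())]
    validity = len(valid) / max(len(present), 1)
    return completeness, validity

# One empty field and one malformed value: complete-but-wrong entries
# count towards completeness without counting towards validity.
completeness, validity = postcode_quality(["SW1A 1AA", "LS1 4AP", "", "NOT A PC"])
```

Neither measure detects a well-formed postcode attached to the wrong patient, which is why patient-reported data and sample checking remain necessary.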

Consistency of definitions across trusts is facilitated by the use of an NHS Data Model and Dictionary. This dictionary ensures trusts use a standard set of codes when they send data to SUS, and SUS reports data quality against these in its Data Quality Dashboards.

The importance of HES and Payment by Results (PbR) means there are assurance processes and checks in place to maximise the quality and completeness of data. This will have some benefit for the survey programme and the assurance we require around robust data, assuming there is an emphasis placed on ensuring accuracy of data when first input at a clerical level, rather than relying on errors being picked up later within the trust or via SUS quality assurance mechanisms. However, as Fig 3. shows, survey programme data is extracted separately, and as it does not come via the Secondary Uses Service (SUS) we cannot assume that all of the same data checks will have been undertaken, or that quality checks undertaken by SUS will automatically, and for every trust, extend benefits to our extracted data. That said, the PAS Process Map (Fig 1.) did show a number of checks taking place before survey sample data are extracted.

We know that there are pressures at trust level which will impact on the way information is entered; for example, clinical coding departments are affected by the experience of staff members, capacity and vacancy problems, and coding system issues. As PbR ensures that acute hospitals are paid for the patients who are admitted, or who attend A&E or outpatient appointments, there are financial implications that could potentially make certain data fields vulnerable to distortive effects. However, work is regularly undertaken to audit the quality of clinical coding at trusts as part of the Payment by Results data assurance framework programme.
This provides assurance over the quality of the data that underpin payments as part of PbR, promoting improvement in data quality and supporting the accuracy of payment within the NHS.1 It is undertaken on an ongoing basis. In 2013/14, the quality of clinical coding was reported to be variable, with an average error rate of 7% of patient spells changing payment code in audited trusts, though the trusts selected for audit were already considered at risk of poor quality coding. The conclusion was that such an average error level was low. Accuracy of name and address is not included within these audits, though age on admission, admission method, sex and length of stay were. Data quality for these variables was much higher, with over half of trusts showing no data item errors. Of the remaining trusts with errors, these mainly related to errors in length of stay or age. Others have reported that, since PbR was instigated, the accuracy of administrative data (diagnosis codes) has increased.2

We consider it less likely that there would be incentives to enter patient name and address information incorrectly at trust level, given what has previously been noted about the need for ongoing contact with patients. For NHS trusts participating in the survey programme, there are incentives for questionnaires to be correctly dispatched to patients using an accurate name and address, given the potential penalties that might be applied if a trust were to yield a low response rate. CQC in its regulatory function uses patient survey data as part of assessing risk in trusts and informing lines of enquiry during inspections. If trust data cannot be used because of too few responses, the trust would be deemed to be at greater risk for these indicators.

1 https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/364476/the_quality_of_clinical_coding_in_the_nhs.pdf
2 Burns, EM; Rigby, E; Mamidanna, R; Bottle, A; Aylin, P; Ziprin, P and Faiz, OD (2011) Systematic Review of Discharge Coding Accuracy. Journal of Public Health. http://jpubhealth.oxfordjournals.org/content/34/1/138

Administrative Data Quality Assurance Toolkit

The Administrative Data Quality Assurance Toolkit3 is published by the UK Statistics Authority and is designed to guide quality assurance arrangements around administrative data used for official statistics purposes. It helps data producers consider the risks around administrative data and the assurances they require. The toolkit contains a QA Matrix which helps determine the types of assurance and documentation required to inform users about the quality assurance arrangements for administrative data (in our case, PAS data). Four practice areas are identified, and there are four levels of assurance that can be sought against each: A0 no assurance; A1 basic assurance; A2 enhanced assurance; and A3 comprehensive assurance. As a data producer, we are responsible for determining which level of assurance is required, though when assessed by the UK Statistics Authority against the Code of Practice for Official Statistics we must defend our decisions based on our perception of the level of risk of quality issues and the public interest profile of the statistics.

3 http://www.statisticsauthority.gov.uk/assessment/monitoring/administrative-data-and-official-statistics

Table 1. Areas of quality assurance practice against level of assurance

Practice area: Operational context and administrative data collection
Level of assurance: A1 - basic
Examples of assurance: We have illustrated the process by which PAS data is entered within trusts and have begun to outline the operational context in which data is entered. We have specified the actions we take to identify and minimise risks to quality through a two-stage sample checking process. We have also identified and summarised the implications for accuracy and quality of data. We have considered how we can monitor any changes in collection arrangements. We aspire towards enhanced assurance by confirming the process map for data collection processes during our data collection exercise with a greater number of trusts (see below) and working with stakeholders to define data standards for relevant PAS fields.

Practice area: Communication with data supply partners
Level of assurance: A1 - basic
Examples of assurance: We provide a survey instruction manual that outlines the need for the data and sets strict criteria for the timing and format of data supply. All trusts must sign off their data via their Caldicott Guardian before submitting samples, to ensure adherence to confidentiality and information security principles. Identified errors are also fed back to trusts supplying data, and they are consulted prior to the use of new data fields or during the development of new surveys/methodologies. This is a higher level of quality assurance than is required.

Practice area: QA principles, standards and checks applied by data suppliers
Level of assurance: A1 - basic
Examples of assurance: We have begun to articulate our knowledge of trusts' QA checks, recognising that there may be variation in trusts' processes which we will seek to understand during our data collection exercise (see below). We have also identified audits which are conducted on the admin data by other users making more substantive use of the data, e.g. HES and PbR. We have detailed the QA checks that are undertaken on data once it is passed outside of the trust, and the steps taken to work with providers to remedy any errors through the sample checking process.

Practice area: Producer's QA investigations and documentation
Level of assurance: A2 - enhanced
Examples of assurance: This document sets out the main stages covering QA checks on the admin data, details the general approach and findings for postcode quality indicators, identifies the strengths and limitations of the admin data, and explores the likely degree of risk to quality. We have considered the quality issues arising from the PAS data that may affect the quality of our statistics, and the nature of the public interest served by the statistics. We consider quality concerns around our statistics to be low, based on the information cited in this document, including the extensive and integral use trusts make of the data, which are essential for trust purposes (patient name and address); we also consider public interest in our statistics to be intermediate rather than low or high. We mitigate risks to the quality of admin data in our processes by using patient-reported variables wherever possible (and these cover the variables we use for standardising patient data reported back to trusts).

Next steps

Currently our quality assurance of PAS data is focused on the data once it has been extracted for the purpose of the survey programme (except when developing new surveys or requesting new variables).
Although for the purpose of the survey programme we regard the impact of error on our work as small, and expect that, as name and address information for patients is essential for continuing patient care and payment, trusts have a strong interest in ensuring the accuracy of these data, we will explore the environment in which data is collected

further. We will seek to understand better the quality of PAS data through a data collection exercise which will confirm how data is input to PAS systems across all trusts, as well as confirming the quality assurance processes that take place on PAS data at source. We will ask trusts to tell us about the training in place for staff using PAS, to build our understanding of the issues that might impact on the administrative data at source. This feedback will be sought as part of a sampling pilot for the 2016 Children's Survey and can be shared with organisations that also work with PAS data. This Statement of Administrative Sources will be updated once the exercise is complete.

We will also continue to work with other users of PAS/SUS data to protect the integrity of the data. CQC in its capacity as regulator is currently working with HSCIC, NHS England, Monitor and the NHS Trust Development Authority to publish a set of data quality standards for all NHS care providers. These will facilitate improvement in the punctuality, validity and completeness of data entered into electronic records. Once these Data Quality Standards are published, CQC will consider performance against them as part of its regulatory regime. A number of the fields used within the survey programme are currently being considered, with patient postcode a candidate for inclusion.

We will continue to undertake sampling exercises when developing new surveys and collecting new data fields, ensuring the fields we collect are consistently interpreted by all providers. For example, during the 2016 Adult Inpatient Survey we will collect an additional discharge variable and compare this against the one we already collect, to see which is the more suitable for our ongoing needs. Trusts are being consulted on this change.
We will trial an adaptation of our data collection tools (our sampling checklist) to include a check on whether trusts have a high proportion of untraced records, which could indicate an issue with the validity of the trust sample and a risk of sample bias. This will allow us to monitor issues with tracing patients more effectively going forward. We are also facilitating greater transparency and monitoring of questionnaires that are returned undelivered, by reporting the numbers of questionnaires returned in this way in public-facing reports and exploring the reasons for this with trusts where the numbers are higher.

Further questions

This document has been produced by CQC's Survey Team. If you have any questions regarding the programme, please contact the Team directly at Patient.Survey@cqc.org.uk.
