ASPIRE for quality: a new evidence-based tool to evaluate clinical service performance in South Australian Local Health Networks


ASPIRE for quality: a new evidence-based tool to evaluate clinical service performance in South Australian Local Health Networks

Prepared for: Allied and Scientific Health Office, Department of Health, South Australia

Prepared by: Dr Lucylynn Lizarondo, Postdoctoral Research Fellow, International Centre for Allied Health Evidence, University of South Australia, Adelaide SA 5000

Table of Contents
- Project Brief
- Project governance
- Executive Summary
- Setting the scene
- Introduction
- Objectives
- Methods
- Underpinning Evidence (Phase I, Part 1: Systematic Review)
  - Methods
  - Results
  - Key summary points
- Underpinning Evidence (Phase I, Part 2: Descriptive Study)
  - Methods
  - Results
  - Key summary points
- Synthesis of Part 1 and Part 2 (Genesis of ASPIRE)
  - Synthesis of findings
- Pilot evaluation of ASPIRE (Phase II)
  - Methods
  - Results of quantitative evaluation
  - Findings from qualitative evaluation
  - Key summary points
- Appendices
- References

Project brief: This report describes the conceptualisation, development, and evaluation of an evidence-based performance evaluation system (ASPIRE) that can be used by allied health practitioners to assess their clinical service performance.

PROJECT GOVERNANCE

International Centre for Allied Health Evidence
School of Health Sciences, City East Campus
University of South Australia
Adelaide, South Australia 5000
Website:

Centre Director: Professor Karen Grimmer
Phone: (08)  Fax: (08)

Team Leader, Evidence Translation: Dr Saravana Kumar
Phone: (08)  Fax: (08)

Research Fellow: Dr Lucylynn Lizarondo
Phone: (08)  Fax: (08)

Project administered by: Ms Madeleine Mallee, Business Services Officer, Business Development Unit, Division of Health Sciences, University of South Australia
Phone: (08)  Fax: (08)

EXECUTIVE SUMMARY

ASPIRE for quality: a new evidence-based tool to evaluate clinical service performance in South Australian Local Health Networks

The ASPIRE for quality framework is the only available evidence-based approach that facilitates evaluation of clinical performance in allied health.

What is ASPIRE?

Allied health clinical performance evaluation should be underpinned by processes that are based on research and informed by the perspectives of different stakeholders (i.e. allied health practitioners, managers/directors, consumers). It should be reinforced by a long-term vision to improve overall health outcomes, health service delivery, the allied health workforce, and healthcare utilisation and cost. The ASPIRE for quality framework was developed to assist allied health practitioners to evaluate their clinical service performance as a means of improving the quality of allied health services. The framework is based on a systematic review of the literature on performance evaluation systems, layered with a local snapshot of current practice in performance evaluation in South Australian health networks. Figure 1 describes the approach.

Figure 1: Developmental process for ASPIRE tool

The ASPIRE model captures the core elements of performance evaluation: prioritisation of a clinical area for evaluation, upfront articulation of goals, careful identification of performance measures, mapping of measures to information sources, analysis of performance data and reporting of results, and evaluation of the performance evaluation system itself. The implementation of an effective performance evaluation, however, is hindered by an interplay of factors, primarily lack of time, limited resources, and limited understanding of the evaluation process.
Therefore, in recognition of these barriers, ASPIRE utilises a collaborative approach between allied health practitioners and experienced researchers who are skilled in providing evaluation training and also in undertaking performance evaluation. The ASPIRE model divides the core tasks between researchers and allied health practitioners

from the health site, as outlined in Table 1. The researchers provide strong initial support and guidance, which gradually reduces to enable practitioners to establish and maintain independence and to promote a sense of ownership of the performance evaluation system. The ASPIRE model is ideal for building capacity that can increase the likelihood of allied health practitioners conducting performance evaluation in the future.

Table 1: ASPIRE for quality model
- Area for evaluation: The evaluation team from the allied health site identifies and prioritises the clinical area for performance evaluation.
- Set goals: Based on the identified clinical area, the evaluation team sets the goals for performance evaluation.
- Performance indicators: The evaluation team, assisted by experienced researchers, identifies performance measures or indicators.
- Information sources: The evaluation team maps the performance measures to information sources.
- Report results: The researchers and evaluation team collaboratively analyse the results and report to stakeholders.
- Evaluate: The researchers and evaluation team collaboratively evaluate the performance evaluation process and its outcomes.

*** Tasks in blue text boxes are responsibilities of allied health practitioners; those in red are shared responsibilities of researchers and practitioners.

How to operationalise the ASPIRE tool

Integral to the ASPIRE model is the formation of an evaluation team from the allied health site prior to undertaking performance evaluation. The team may include any of the following personnel: manager, senior staff, an independent accreditor, or human resource personnel. Each team member must understand and be clear on what will be measured and what the foundations for evaluation are. The team should actively contribute to, and agree on, the processes, methods and tools for evaluation, and each member should be clear on their roles and responsibilities.
The evaluation team works closely with the researchers, and regular meetings should be organised to monitor the process. The ASPIRE for quality framework includes a set of tools (Appendices 1-6 in this report) which allied health practitioners (i.e. the evaluation team) can use to facilitate the process of evaluation. The following outlines the steps involved in ASPIRE.

ASPIRE: Area for evaluation

The evaluation team from the allied health site identifies a clinical area for performance evaluation. The following questions may be used as a guide in determining a priority area for evaluation:
- Is it important and relevant to the group for which the performance measurement system is being produced?

- Is it problem-prone and with a high frequency of occurrence, or is it suspected of overuse, underuse, or misuse?
- Does it have a strong financial impact?
- Does it have potential to improve health care delivery and outcomes?
- Has it recently undergone major changes?
- Does it have proven and significant variation in quality of service among health care providers?
- Is it considered high risk for patients?

ASPIRE: Set goals for evaluation

The goal for performance evaluation should be clearly articulated by the evaluation team before the measurement process is commenced. It is typically targeted at improving more than one of the following domains: acceptability, accessibility, appropriateness, care environment and amenities, continuity, competence or capability, effectiveness, improving health or clinical focus, expenditure or cost, efficiency, equity, governance, patient-centeredness, safety, sustainability, timeliness, and utilisation of care.

ASPIRE: Performance indicators

The evaluation team and researchers work collaboratively to identify performance indicators. A performance measure or indicator is used to assess a health care structure, process or outcome. Structure measures evaluate the means and resources used by the health system to deliver allied health services. Process measures assess what the allied health practitioner did for the patient and how well it was done. Outcome measures examine the change in patients' health status which can be attributed to the effectiveness of the treatment. Performance measures are based on standards of care, which can be evidence-based or, in the absence of scientific evidence, determined by an expert panel of health practitioners. They must be comprehensible, valid, reliable, reproducible, discriminative and easy to use. Performance evaluation typically involves multiple measures, rather than a single measure, in order to obtain a comprehensive assessment of performance.
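An indicator set of the kind described above can be represented as a simple data structure that tags each measure with its structure/process/outcome category. The following sketch is illustrative only: the indicator names and sources are hypothetical examples, not measures prescribed by the ASPIRE framework.

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    STRUCTURE = "structure"   # means and resources used to deliver the service
    PROCESS = "process"       # what the practitioner did and how well
    OUTCOME = "outcome"       # change in patient health status

@dataclass
class Indicator:
    name: str
    category: Category
    information_source: str   # where the performance data will come from

# Hypothetical examples only - actual indicators are chosen by the evaluation team.
indicators = [
    Indicator("Staff-to-patient ratio", Category.STRUCTURE, "administrative data"),
    Indicator("Initial assessment completed within 48 h", Category.PROCESS, "medical records"),
    Indicator("Change in functional score at discharge", Category.OUTCOME, "patient surveys"),
]

# A comprehensive evaluation uses multiple measures spanning all three categories.
assert {i.category for i in indicators} == set(Category)
```

Listing indicators this way also makes the later mapping step straightforward, since each measure already carries its information source.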
ASPIRE: Information sources

The evaluation team reflects on the data they need to collect to measure structure, process or outcomes. Common sources of performance data are medical records, administrative data, and patient surveys. Other information systems may also be sourced, such as incident reporting systems, documentation on clinical and professional supervision, and staff feedback. Additional data collection may be undertaken if required. The evaluation team should map the identified performance measures to the information sources.

ASPIRE: Report results

The evaluation team and researchers collaboratively analyse the data obtained from the various information systems. Descriptive and inferential statistics may be required for quantitative data, and thematic analysis for qualitative data. Presentation of results and findings should be concise, easy to understand and tailored to the needs of the stakeholders (e.g. clients, allied health practitioners, managers, human resources department). Performance evaluation reports typically include a combination of text, tables and graphs or charts.
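As a minimal illustration of the descriptive analysis step, the sketch below summarises a hypothetical quantitative indicator (waiting time from referral to first appointment, with an invented 7-day target). All figures are illustrative; real analyses may also require inferential statistics and thematic analysis of qualitative data.

```python
import statistics

# Hypothetical performance data: days from referral to first appointment.
waiting_days = [3, 5, 4, 8, 2, 6, 5, 7, 4, 5]

# Descriptive summary of the indicator.
summary = {
    "n": len(waiting_days),
    "mean": statistics.mean(waiting_days),
    "median": statistics.median(waiting_days),
    "sd": round(statistics.stdev(waiting_days), 2),
}

# Proportion meeting a (hypothetical) 7-day target - a typical way to
# report an indicator result to stakeholders.
met_target = sum(1 for d in waiting_days if d <= 7) / len(waiting_days)

print(summary)
print(f"{met_target:.0%} of patients seen within 7 days")
```

A result phrased as a proportion against a target, alongside the mean and spread, is usually easier for a mixed stakeholder audience to read than raw data.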

ASPIRE: Evaluate the performance evaluation process and its outcomes

The evaluation team and researchers work collaboratively to evaluate the performance evaluation process and its outcomes. The evaluation will focus on the following: practice changes that have occurred as a result of the evaluation; how well the new model is accepted by staff and management; the extent to which the model has improved the quality of health care (i.e. health impact and quality of service); and what improvements can be made to the evaluation system to facilitate its effective and sustainable uptake by allied health practitioners.

How allied health practitioners perceived ASPIRE

The ASPIRE framework was evaluated for its feasibility (i.e. acceptability, usefulness, appropriateness) in allied health practice evaluation. Overall, the participants were positive about ASPIRE and felt that performance evaluation using a structured framework was a worthwhile experience. The evaluation findings suggest that participants found ASPIRE a useful, appropriate and easy-to-implement model for evaluating clinical performance in allied health, and that the current structure is acceptable and convenient for clinicians. They agreed that ASPIRE addressed evaluation difficulties encountered in the past, and identified the partnership with researchers as an effective strategy for encouraging allied health practitioners to evaluate performance. Participants reported that ASPIRE improved their level of confidence and motivation to conduct evaluation. The time needed to gather evaluation information and difficulty in identifying performance indicators were described as barriers to performance evaluation. Strategies such as a longer period for evaluation planning, face-to-face consultations with researchers (as opposed to teleconference), and wider team involvement in the identification of performance indicators could potentially address these barriers.
The current situation in allied health performance evaluation in South Australia

This study found few instances of formal performance evaluation in allied health in South Australia. Most performance evaluation is ad hoc and is characterised by a lack of practical standards or best practice guidance to assist practitioners through the evaluation process, and limited time and skills to conduct evaluation. iCAHE recommends that the ASPIRE for quality framework be used as a standard tool to guide and support allied health practitioners when undertaking clinical performance evaluation.

Rolling out the ASPIRE for quality framework

The ASPIRE for quality framework is ready to be implemented now. However, to facilitate effective and sustainable uptake of ASPIRE within South Australian Local Health Networks, the following strategies must occur:
1. Develop a training package to assist allied health evaluation teams to undertake performance evaluation using ASPIRE
2. Promote and offer ASPIRE to all South Australian (SA) Health allied health clinicians to facilitate routine clinical performance evaluation

3. Organise performance evaluation plans (e.g. templates for data collection, a step-by-step guide to data abstraction, report templates) for every allied health department in the different local health networks in SA
4. Increase consumer involvement by engaging with consumer advocate groups during the process of developing the performance evaluation plan
5. Finally, evaluate the impact of performance evaluation using ASPIRE on overall health outcomes, health service delivery, the allied health workforce, and healthcare utilisation and cost

iCAHE has the skills and expertise to undertake these strategies, and can be contacted to discuss further the implementation of ASPIRE.

Understanding and closing the evidence-practice gap: A blueprint to assist in building capacity in, and providing ongoing support to, South Australian Health Allied Health Clinicians, to evaluate clinical service performance in Local Health Networks

SETTING THE SCENE

INTRODUCTION

The sustainability of Australia's current health system and the level of service it provides is an increasing concern for Federal and state governments. Multiple factors are involved, such as increasing availability of, and demand for, advanced technology services; an ageing population (more older people surviving, but with the chronic and multi-morbid diseases of ageing impacting on their independence and quality of life); rapid advances in technology; and the ongoing issues in matching supply of, with demand for, healthcare providers. It is projected that under the current system, health care expenditure will increase to between 12 and 15% of gross domestic product over the next 30 years (Novak & Judah 2011). In response to these challenges, changes are being implemented to the way Australia's health care system operates, for instance an increasing focus on primary healthcare, and workforce redesign to provide more practitioners with specific skills to deal with specific health issues (Gilmore et al 2011). Allied health services have been increasingly highlighted over the last five years as essential primary, sub-acute and tertiary services which could contribute significantly more to Australia's healthcare system than they currently do. Not only do these services offer much in terms of management of morbidities and improving quality of life in ageing populations, but there have also been encouraging initiatives to explore advanced and extended scope of allied health practice to fill the workforce shortages in medicine and nursing specialty areas (Morris & Grimmer 2013; Stanhope et al 2012; Morris et al 2011). Generally underpinning healthcare system changes are pressures to reduce costs, increase access and affordability of services, and provide greater accountability. Consequently, healthcare practitioners are regularly challenged to examine how they practise, and to justify their service performance and productivity.
Performance measurement seeks to monitor, evaluate and communicate the extent to which various aspects of the health system meet their key objectives (Smith et al 2008). Productivity, on the other hand, refers to the extent to which the resources used by the health system are used efficiently in the pursuit of effectiveness (Smith et al 2008). Performance and productivity are therefore important indicators of the extent to which scarce resources should be allocated. They provide hard evidence about how well a healthcare system is progressing towards its goals, and help identify strengths and weaknesses to improve future performance (Purbey 2006). There is international evidence to suggest that organisations which do not integrate ongoing performance evaluation and productivity measurement into their systems tend to experience lower than expected performance improvements, as well as higher dissatisfaction and turnover of staff (Longenecker & Fink 2001). The design of an effective performance and productivity measurement model is fundamental to aligning a healthcare organisation's operations with its strategic direction. It involves an ongoing cyclical process of information gathering, analysis and action at different levels: the workforce, consumers of care, and the organisation in which the services are provided. It also includes the appropriate selection of measures and approaches for analysing the results. Performance and productivity

measurement raises several challenges for allied health practitioners in relation to the selection of measures and the implementation of an effective assessment strategy. The need for this research has been identified as urgent by the Allied and Scientific Health Office, Department of Health South Australia. The aim of this research was to establish a blueprint which will assist allied health clinicians in South Australia to better understand and close the evidence-practice gap, and assist in building capacity in the evaluation of their clinical service performance. The framework proposed for this project addresses current theories of performance management and is tailored to the needs and contexts of allied health practitioners in local health networks.

OBJECTIVES

1. To scope potential systems/models that can be used to deliver the following:
   - Embedded understanding of how to evaluate performance and productivity of clinical services in local health networks (LHNs) by Allied Health, in line with contemporary key SA Health and LHN plans and strategies
   - Embedded understanding of how to promulgate the results of evaluation to achieve business change (including resources such as templates/toolkits)
   - Increased levels of evaluated Allied Health services in LHNs
   - Increased instances of Allied Health service improvement resulting from such evaluation
   - A suite of measures such as key performance indicators, clinical indicators, outcomes, inputs and outputs, accessible to allied health to evaluate and improve their clinical services
2. To recommend one system/model, including a robust system of evaluation of the model, in consultation with the Chief Allied and Scientific Health Advisor or delegate
3. To pilot the recommended system/model in a controlled area (e.g. Division, Department) within a LHN
4.
To report results to the Allied and Scientific Health Office with detailed recommendations for further phases/rollout as appropriate

METHODS

This project was undertaken in sequential stages using different research designs to comprehensively address the objectives of the project, as shown in Figure 2. In Phase I, a systematic review of the literature and a survey describing current practices and perspectives were undertaken to address objectives 1 and 2. In Phase II, a mixed-methods approach utilising quantitative and qualitative strategies was used to address objectives 3 and 4.

Phase I
- Objectives:
  1. To scope potential models of performance evaluation
  2. To recommend one system/model, including a robust system of evaluation of the model, in consultation with the Chief Allied and Scientific Health Advisor or delegate
- Research design: Systematic review of the literature (Part 1); descriptive study (Part 2)
- Research methods: Systematic search and analysis of the literature; survey questionnaire
- Timeline: June to October 2013

Phase II
- Objectives:
  1. To pilot the recommended system/model in a controlled area within a local health network
  2. To report the results to the Allied & Scientific Health Office with detailed recommendations for further phases/rollout as appropriate
- Research design: Mixed methods
- Research methods: Questionnaires; focus group interview
- Timeline: October 2013 to March 2014

Outcome: Clinical Performance Evaluation Framework for Local Health Networks in South Australia

Figure 2: Research strategy

UNDERPINNING EVIDENCE

Phase I (Part 1): Systematic review of the literature

Methods

Criteria for considering studies in the review

The following section describes the methodology of the systematic review component of this project.

Types of studies: All peer-reviewed publication types, including systematic reviews, literature reviews, evaluation studies, surveys and discussion papers, were included in the review.

Types of participants: Publications which focused on any allied health performance measurement, or those which examined performance in health care in general, were considered in the review. Studies which focused only on nurses or physicians were excluded, as were studies which described assessment of student-related performance or those which focused on improving educational curricula.

Types of exposure: Publications which described performance measurement for individual practitioners, for organisations, or at a national level were reviewed.

Search strategy: All publications were retrieved using a combination of keywords 1 and keywords 2:

Keyword 1 (performance measurement; performance evaluation; performance assessment; performance monitoring; performance appraisal) AND Keyword 2 (healthcare; health care)

The following electronic databases were searched: MEDLINE, Embase, CINAHL, PsycINFO, and Academic Search Premier. Limits were used to include only articles written in the English language and published from 2000 to the present. The titles and abstracts identified by the search strategy were assessed for eligibility by two iCAHE researchers. Full-text copies of the studies considered potentially relevant for the review were retrieved for further examination.

Data extraction and synthesis: Data extraction was undertaken using a purpose-built data extraction tool. A narrative synthesis of the findings is presented in this report.
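The two-group keyword combination in the search strategy can be expressed programmatically. The sketch below is illustrative only: the actual searches were executed in each database's own interface, and the Boolean syntax shown here is generic rather than database-specific.

```python
# Illustrative sketch of the Boolean search string used in the review:
# (any Keyword 1 term) AND (any Keyword 2 term).
keywords_1 = [
    "performance measurement",
    "performance evaluation",
    "performance assessment",
    "performance monitoring",
    "performance appraisal",
]
keywords_2 = ["healthcare", "health care"]

def build_query(group_1, group_2):
    """Combine two keyword groups with OR within a group and AND between groups."""
    block_1 = " OR ".join(f'"{k}"' for k in group_1)
    block_2 = " OR ".join(f'"{k}"' for k in group_2)
    return f"({block_1}) AND ({block_2})"

print(build_query(keywords_1, keywords_2))
```

Real database syntax differs (e.g. field tags and truncation vary between MEDLINE, Embase and CINAHL), so a string like this would be adapted per database.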

Results

Summary of search results

The database search yielded 720 articles, of which 645 were excluded as duplicates or against the selection criteria. Full-text copies were retrieved for the remaining 75 articles for further comparison against the selection criteria. To date, 38 articles have been reviewed and included in this interim report, and 37 articles have been excluded against the selection criteria (nursing- and physician-focused papers). The selection flow is shown in Figure 3.

Figure 3: CONSORT diagram of the selection procedure (720 articles identified; 645 excluded as duplicates or at screening; 75 full-text articles assessed; 38 included; 37 excluded)

Note: The reference lists of all relevant articles will be reviewed for items that were not identified in the electronic search.

Characteristics of included studies

Purpose of performance evaluation

Of the 38 articles that were reviewed, only five are specific to allied health, with the remaining publications reporting on health care practitioners in general. The majority of the reviewed articles are discussion papers, followed by literature reviews and primary studies (e.g. survey, qualitative study, post-test-only design, cohort, cross-sectional, retrospective audit, case studies). The information presented in most articles is based on experience in the USA (United States of America) and the UK (United Kingdom), with only a few describing performance measurement in the Australian context and other European countries. Different terminologies have been used in the literature to describe the process of monitoring and evaluating the performance and productivity of health services. These include performance measurement, performance evaluation, performance assessment, performance monitoring, and performance appraisal. For consistency, the term performance evaluation will be used in this report to refer to this concept.
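The selection counts reported above can be tallied as a quick arithmetic consistency check. The figures are taken directly from the report; the sketch simply verifies that the flow balances.

```python
# Consistency check on the article-selection counts reported in Figure 3.
identified = 720            # records returned by the database search
excluded_initial = 645      # duplicates plus title/abstract screening
full_text_retrieved = identified - excluded_initial
assert full_text_retrieved == 75

included = 38               # reviewed and included in the interim report to date
excluded_full_text = 37     # nursing- and physician-focused papers
assert included + excluded_full_text == full_text_retrieved
```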
The literature reports numerous reasons for undertaking performance evaluation and for the majority of the reviewed studies performance evaluation denotes measurement of health care quality. Obtaining an accurate insight about the quality

of care and promoting improvement in health service delivery (Manderscheid 2006; Mant 2001; Mannion & Goddard 2002; Jolley 2003; Derose & Petitti 2003; Mainz 2003a; Sund et al 2007; Tawfik-Shukor et al 2007; Nuti et al 2013), administration, and operational and financial management have been identified as key roles of performance evaluation (Jolley 2003). The ultimate goal is to improve health outcomes by stimulating improvements in health care. The premise of performance evaluation is to assist health practitioners or organisations to identify issues that require attention or opportunities for improvement, and to recognise exemplary performance and effective practices. Once these are identified, strategies can be implemented to foster improvement and achieve the desired outcomes. In addition to improving the quality of health care, there are many other reasons for undertaking performance evaluation, and these can be categorised according to the perspectives of different stakeholders. From a practitioner perspective, performance evaluation can be an effective tool for providing objective feedback in order to validate practitioners' skills or trigger corrective action if poor skills are demonstrated, or act as a medium to correct or reward performance (Arnold & Pulich 2003; Koss et al 2002; Doherty 2004; Geddes & Gill 2012; Chandra & Frank 2004; Gregory 2000; Mant 2001). It can also identify professional development needs and assist in fulfilling professional regulatory body obligations (Geddes & Gill 2012). At the consumer level, performance evaluation provides clients with information that can facilitate choice of health care provider (Mant 2001; Mainz 2003a) and allows them to participate in the improvement of the care delivered to them (Doherty 2004). At the organisational level, performance evaluation can assist in meeting accreditation standards and third-party contractual standards (Geddes & Gill 2012; Mainz 2003a).
It can also support leadership development and inform human resources decisions (e.g. pay increases, termination of practitioners, promotions) (Geddes & Gill 2012; Arnold & Pulich 2003; Chandra & Frank 2004). Moreover, performance evaluation data can be used to promote market competition for privately funded health services or public accountability for state-funded health care (Mant 2001). Kollberg et al (2005) define performance evaluation as "the process of collecting, computing, and presenting quantified constructs for the managerial purposes of following up, monitoring, and improving organizational performance". Finally, at the national level, performance evaluation data can inform policy making and assist with formulating strategies at a regional or national level (Mant 2001). Figure 4 summarises the different areas, stakeholders and domains which have an interest in performance evaluation.

Figure 4: Key domains of performance evaluation

Core elements of a performance evaluation system

Various methods or approaches to performance evaluation are described in the literature. However, no one best method can be recommended, as every health care delivery system is unique and has different performance needs. Performance evaluation approaches are context dependent and should be designed to meet the unique requirements of the health care system. Nevertheless, a critical examination of the literature suggests that there are key steps (as shown in Figure 5) and elements to a successful performance evaluation system.

Figure 5: Steps involved in performance evaluation

Prioritising clinical areas for performance evaluation

Undertaking performance evaluation can be a laborious and time-consuming process, and for it to be meaningful, carefully selecting a clinical area for evaluation is very important. The literature proposes several criteria for selecting aspects of care suitable for performance evaluation, including areas which:
- Are important and relevant to the group for which the performance evaluation system is being produced (this criterion recognises that quality is viewed in various ways by different groups of stakeholders) (Marshall & Davies 2000; Geraedts et al 2003)
- Are problem-prone and occur with high frequency, or are suspected of overuse, underuse, or misuse (Marshall & Davies 2000; Geraedts et al 2003; Mainz 2003b)
- Have a strong financial impact (Marshall & Davies 2000; Geraedts et al 2003; Mainz 2003b)
- Have the potential to improve health care delivery and outcomes (Geraedts et al 2003; Mainz 2003b)
- Have recently undergone major changes (Geraedts et al 2003)
- Have proven and significant variation in quality of service among health care providers, or where there is evidence that the quality of service is substandard (Marshall & Davies 2000; Geraedts et al 2003; Mainz 2003b)
- Are considered high risk for patients (Geraedts et al 2003)

Setting the goals for performance evaluation

The basic tenet of good performance evaluation is the upfront development of strategic measurement goals (Loeb 2004; Mannion & Goddard 2002; Sibthorpe & Gardner 2007). Performance evaluation can be viewed as a mechanism that facilitates monitoring and promotion of progress towards performance goals, rather than an end in itself. Therefore, these goals should be clearly articulated before the evaluation process is commenced.
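The prioritisation criteria from the literature can be turned into a simple screening checklist for candidate clinical areas. The sketch below is hypothetical scoring for illustration only; it is not part of the published framework, and the example area and its answers are invented.

```python
# Hypothetical screening sketch: count how many of the literature-derived
# prioritisation criteria a candidate clinical area satisfies.
CRITERIA = [
    "important and relevant to stakeholders",
    "problem-prone / high frequency / suspected misuse",
    "strong financial impact",
    "potential to improve delivery and outcomes",
    "recently underwent major changes",
    "proven variation in quality among providers",
    "high risk for patients",
]

def priority_score(answers):
    """answers: dict mapping each criterion to True/False for a candidate area."""
    return sum(1 for c in CRITERIA if answers.get(c, False))

# Invented example: a falls-prevention service meeting two of the criteria.
falls_service = dict.fromkeys(CRITERIA, False)
falls_service["high risk for patients"] = True
falls_service["problem-prone / high frequency / suspected misuse"] = True

print(priority_score(falls_service))  # higher counts suggest higher priority
```

A raw count is the simplest possible scheme; an evaluation team could equally weight the criteria to reflect local priorities.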
The goal of evaluation is typically targeted at improving the following domains: acceptability, accessibility, appropriateness, care environment and amenities, continuity, competence or capability, effectiveness, improving health or clinical focus, expenditure or cost, efficiency, equity, governance, patient-centeredness, safety, sustainability, timeliness, and utilisation (Veillard et al 2005; Tawfik-Shukor et al 2007; Beyan & Baykal 2012; Nuti et al 2013). A performance evaluation activity usually targets more than one dimension, and is generally designed to address the needs of the stakeholders. The literature reports that it is quite common to begin the measurement process by determining what data are available within the practice or organisation (Marshall & Davies 2000; Loeb 2004). This is then followed by the identification of goals which align with the available performance data, an approach that seeks to minimise the

collection of further data. While this would be cost-effective, it violates the basic principle of establishing goals prior to the development of the evaluation system or process. Loeb (2004) claims that a more rational approach would be to define the evaluation goals first, and then determine whether reliable data exist to support those goals. Additional data collection can then be undertaken if required. However, it is also important to determine whether the advantages of additional data collection outweigh the time and costs associated with the process (Loeb 2004).

Selecting the unit of analysis

Performance evaluation can be examined at different levels of the health care system, ranging from individual practitioners to geographical (e.g. regional or state) locations. An analysis of the performance of an individual practitioner can promote ownership of the quality of care provided to clients (Marshall & Davies 2000). However, "differences in performance between practitioners are often the result of random fluctuations rather than real differences in quality of care" (Marshall & Davies 2000, page 308). On the other hand, when examining performance at the state level, the statistical problems common in individual-level assessment are not an issue. At this higher level, however, practitioners do not feel a sense of ownership of their performance and are therefore not likely to be motivated to change or improve their practice (Marshall & Davies 2000). Marshall & Davies (2000) suggest a mid-level assessment which involves small functional groups of health professionals, ranging from individual hospitals to groups of practices. This unit of analysis can encourage ownership of professional performance and help promote effective teamwork (Marshall & Davies 2000).
In a mid-level assessment, it is always important to obtain the commitment of the chief executive and management team, as this is pivotal to the successful implementation of performance evaluation systems (Colton 2007). Management plays a key role in performance evaluation as they articulate the system or organisation's vision of quality, ensure that there is an infrastructure and a systematic approach, and make resources available to support the process (Colton 2007).

Harp (2004) operationalised performance for a group of health professionals (i.e. physiotherapists) into five distinct criteria: output, quality, productivity, costs and customer satisfaction. Output was determined from the revenue generated for the patients in the program; revenue is generated from the services billed to either the individual patient or an insurance company. Quality of the service was determined by examination of patient outcomes. Productivity was measured using the billable patient care time of the staff. Costs involved both equipment and personnel costs (e.g. salaries, benefits, training and education). Customer satisfaction was measured using satisfaction surveys. Performance of the physiotherapy program was measured using these variables.

Selecting performance measures

A performance measure or indicator, also known as a quality indicator, refers to a quantitative measure that can be used to monitor and evaluate the quality of important governance, management, clinical and support functions that affect patient
outcomes (Joint Commission on Accreditation of Health Care Organisations 1990, cited in Bannigan 2000 and Mainz 2003a). It measures the extent to which set goals or targets are achieved. Figure 6 summarises the basic factors to consider when selecting performance measures.

Figure 6: Factors to consider when selecting measures

The performance measure should correspond to one or more of the target dimensions (i.e. acceptability, accessibility, appropriateness, etc.) and is determined by the level of the health system being evaluated. At the practice or individual practitioner level, performance measures can be developed from goals and objectives, which should be in line with the individual's work duties and the strategic and operational goals of the organisation (Jolley 2003; Arnold & Pulich 2003; Geddes & Gill 2012; Chandra & Frank 2004). Individual practitioners' goals should be jointly established by the manager and the practitioner, as this provides the opportunity for the manager to engage in interim planning with the practitioner (Arnold & Pulich 2003). At the organisational level, performance measures should be linked to the strategic planning of the service and the overall organisational values and standards (Jolley 2003; Chandra & Frank 2004; Purbey et al 2007). At the national level, performance measures should capture outcomes which are broad in scope (Jolley 2003).

Performance measures relate to structure, process and outcomes (Donabedian 1988), and these quality concepts have been reported in performance evaluation studies (Mannion & Goddard 2002; Mainz 2003a; Derose & Petitti 2003; Johansen et al 2004; Tawfik-Shukor et al 2007; Sibthorpe & Gardner 2007; Kilbourne et al 2010; Beyan & Baykal 2012). Structure measures evaluate the means and resources used by the health system to deliver services (Mainz 2003a; Beyan & Baykal 2012). The quantity and quality of health personnel (staffing), physical facilities, equipment and infrastructure, and the existence of regulatory programs are considered structural measures (Sibthorpe & Gardner 2007). Process measures examine the interaction between service providers and consumers (Beyan & Baykal 2012). They are concerned with activities carried out in relation to diagnosis, treatment, rehabilitation and care; process measures assess what the health practitioner did for the patient and how well it was done (Sibthorpe & Gardner 2007). Outcome measures examine the change in patients' health status which can be attributed to the effectiveness of the treatment. They comprise both physical and perceived benefits, such as improvement in health status, satisfaction with the health service, receiving health-related information, and changing habits in maintaining personal health (Sibthorpe & Gardner 2007; Beyan & Baykal 2012).

There are relative merits associated with using structure, process and outcome measures as performance indicators in health care (Marshall & Davies 2000; Roper & Mays 2000; Mant 2001; Geddes & Gill 2012; Perrin 2002; Johansen et al 2004; Kilbourne et al 2010). While structure, process and outcome measures all have a role to play in performance evaluation, each has its own advantages and limitations:

Structure
- Advantages: relatively easy to measure.
- Disadvantages: subject to response bias or vagueness in terminology or question; do not indicate whether good care happened; there is often an indirect link between the structural variable and the desired outcome.

Process
- Advantages: sensitive to changes in the quality of care; cheap, quick and simple; few measurement difficulties; easy to interpret and readily actionable; fewer problems with confounding variables.
- Disadvantages: can be too prescriptive; variable link to outcomes (not necessarily predictors of outcomes).

Outcomes
- Advantages: capture what matters to patients; intuitive; avoid prescription over healthcare activities.
- Disadvantages: measurement concerns; causal ambiguity between activities and outcomes; time interval between some interventions and outcomes; insensitive to changes in underlying quality.

(Sources: Marshall & Davies 2000; Derose & Petitti 2003; Kilbourne et al 2010)

Mant (2001) argued that in instances where health services have a major impact on outcome, the use of outcome measures as performance indicators is appropriate, provided that the data collected can be interpreted reliably. Conversely, in situations where factors such as lifestyle, co-morbidities and socioeconomic circumstances, rather than health care, play a major role in health outcomes, process measures are preferred (Mant 2001; Bente 2005). This does not mean, however, that outcome data should not be collected, just that they should be collected in context. In other words, the best solution seems to be a combination of process and outcome measures tailored to local circumstances and priorities (Mannion & Goddard 2002; van der Geer et al 2009). Performance measures are based on standards of care, which can be evidence-based or, in the absence of scientific evidence, determined by an expert panel of health practitioners (Mainz 2003a; Bente 2005). Performance measures must be clear, valid, reliable, reproducible, discriminative and easy to use (Bannigan 2000; Roper & Mays 2000; Marshall & Davies 2000; Geraedts et al 2003; Purbey et al 2007; Derose & Petitti 2003; Mainz 2003a; Veillard et al 2005). They should be comprehensive yet practically relevant and meaningful (Mainz 2003a; Geddes & Gill 2012). The use of multiple measures is favoured over a single measure in order to obtain a comprehensive picture of health care (Loeb 2004).
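The idea that a performance measure captures "the extent to which set goals or targets are achieved", combined across structure, process and outcome indicators, can be sketched in code. This is a hypothetical illustration only: the indicator names, targets and values below are invented for the example, not drawn from the literature reviewed here.

```python
# Hypothetical sketch: a small mixed set of structure, process and outcome
# indicators checked against pre-set targets. All names and numbers are
# invented for illustration.
indicators = [
    # (type, name, target, actual)
    ("structure", "physiotherapists per 100 beds", 5.0, 4.0),
    ("process", "assessments completed within 48h (%)", 90.0, 93.0),
    ("outcome", "clients improved at discharge (%)", 75.0, 71.0),
]

for kind, name, target, actual in indicators:
    # A target is "met" when the observed value reaches or exceeds it.
    status = "met" if actual >= target else "not met"
    print(f"{kind:9} {name}: target {target}, actual {actual} -> {status}")
```

Reporting the three indicator types side by side, rather than any one alone, reflects the multiple-measures principle attributed to Loeb (2004) above.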
Single measures are, in most cases, limited in scope, thereby reducing their utility to relevant stakeholders.

Identifying types and sources of information

Performance evaluation should obtain information or data from several perspectives (i.e. multi-source feedback), as this provides a richer assessment of performance compared with a single source (Jolley 2003; Hamilton 2007; Purbey et al 2007; Kilbourne et al 2010; Nuti et al 2013). This should involve representatives from specific stakeholder groups, depending on the level of the health system being evaluated (Geraedts et al 2003; Jolley 2003). Information can be obtained from various sources, such as information systems, reports, surveys and records. Data types are usually categorised as clinical data, administrative data and patient-based data (Derose & Petitti 2003; Beyan & Baykal 2012). Clinical data include information which can be obtained from all types of medical records or medically oriented sources, such as outcome measurements reported in patient charts, discharge reports and diagnostic reports. Administrative data are related to health costs, including billing and claims. Patient-based data refer to information collected directly from patients through questionnaires or interviews.

Undertaking performance evaluation

The objectives, procedures, participants (target groups), materials (e.g. training materials, interpretation guides) and premises for the performance measurement should be clearly identified and documented (Geraedts et al 2003). A schedule for performance evaluation that works well with the practice is recommended (Doherty 2004). Evaluator training is a key factor in conducting effective performance evaluation (Arnold & Pulich 2003; Chandra & Frank 2004). Training has been reported to improve consistency and develop confidence in the use of the evaluation instrument (Chandra & Frank 2004). All evaluators, and anyone completing the measurement, must be instructed about the performance measurement process (Arnold & Pulich 2003).

Reporting of results

Reporting of results should be built into the performance evaluation system (Bannigan 2000; Nuti et al 2013). The results serve as feedback to health practitioners, either as recognition for good performance or as a prompt for further improvement or development, which can increase service performance or work motivation (Vasset et al 2011). Graphic presentation of data provides an attractive and more easily understood approach to presenting results; diagrams or graphs may reveal patterns and associations which would not otherwise be evident or easy to interpret without statistical analysis. While much of the data collected in health care organisations can stand alone in providing useful information, additional insight can be obtained when comparative data are also presented.
When available, norms, standards and benchmarks provide opportunities to compare data with external sources (Colton 2007).

Tools for evaluating performance

A wide range of tools for measurement or evaluation is reported in the literature, often involving the use of more than one instrument. The choice of tools depends on several factors, including but not limited to the level of the health system being evaluated, the objectives of the evaluation and the target participants. Figure 7 illustrates the different performance evaluation tools described in the literature.

Figure 7: Tools for measuring performance

Practice-focused tools

Benchmarking, also known as "standards": a process in which the performance data collected are compared with pre-determined standards of practice, locally or nationally (Bannigan 2000; Hamilton et al 2007). These standards define the level of performance required for the successful achievement of work expectations and specify what consumers can expect from practitioners (Hamilton et al 2007).

Practitioner-focused tools

Direct clinical observation of the clinician in the patient's setting (Doherty 2004; Geddes & Gill 2012; Chandra & Frank 2004): a process in which the practice manager (or supervisor) observes the practitioner's interactions with each client and documents behaviour (e.g. consent, client-centred care, professionalism, communication, time management, and safety). This process can be time-consuming, and validity and reliability are common concerns (Hamilton et al 2007).

Interview with practitioner: a collaborative discussion between the practice manager (or supervisor) and the individual practitioner in which they share their assessment of the strengths and areas of concern of the practitioner and the organisation (Geddes & Gill 2012). The individual practitioner provides information about his/her current workload, describes how the practice aligns with the organisation's mission and vision, and can then express interest in participating in other work opportunities within the organisation (Geddes & Gill 2012).

Critical incident reporting: a process which involves the investigation and review of adverse events with the aim of identifying poor performance (Bannigan 2000). The practice manager (or supervisor) compiles a list of outstanding and unsatisfactory performance, and if no critical incidents are recorded, the practitioner's performance is considered acceptable (Arnold & Pulich 2003). Face-to-face interviews are the preferred method; however, critical incident reports can also be gathered through questionnaires, telephone interviews, workshops, group interviews and observation (Hamilton et al 2007).

Self-reflection or self-appraisal: the practitioner summarises how professional goals were achieved during the year and sets goals for the following year (Geddes & Gill 2012). This process is useful in validating the objectivity of the supervisor or manager, although significant differences in ratings can become an issue (Arnold & Pulich 2003). Another advantage of this process is that it can decrease the defensiveness of the practitioners being appraised (Chandra & Frank 2004).

Peer review or appraisal: this method of assessment requires input from other team members (Doherty 2004; Bannigan 2000; Arnold & Pulich 2003). Bannigan (2000) describes this process as a review of the clinical caseload away from the clinical setting with a group of colleagues or an individual colleague. Peer review may be helpful in revealing issues regarding objectivity, although it may also create conflict among peers (Arnold & Pulich 2003).

Chart-stimulated recall (CSR): a peer review process that combines traditional chart audit with personal interview and discussion (Salvatori et al 2008). It involves a random selection of patient records, followed by an interview with a colleague or peer to review the charts and discuss the cases. The CSR process measures clinical competence and identifies areas for improvement in practice behaviour. Salvatori et al (2008) implemented and examined the CSR process in a pilot study involving occupational therapists. Results indicated that this assessment process can discriminate between occupational therapists in terms of clinical competence and identify specific areas of concern that could be targeted for professional development at both the individual and practice-area level.
Patient-focused tools

The use of outcome measures to collect information about patient health status has been reported in the literature (Johansen et al 2004). Outcome measures are used to determine change in patients' status over time. In addition to providing clinicians with feedback on patient outcomes, they allow progress to be communicated effectively to patients and support treatment planning (Duncan & Murray 2012). Routine outcome measurement can also support or justify the interventions administered to patients and provide supporting evidence to funding bodies (Duncan & Murray 2012). Patient-focused evaluation could also take the form of a patient/client satisfaction questionnaire (Bannigan 2000), patient reports (verbal reporting to the practice team; Doherty 2004), or actual complaints from clients (Bannigan 2000). Koch et al (2011) suggest that patient data may be used to demonstrate accountability, provide feedback to individual practitioners, support staff supervision, meet accreditation requirements, enhance staff morale, and support budget requests.

Barriers and challenges to implementation of performance evaluation

While there are significant benefits associated with performance evaluation, the literature also highlights barriers and challenges to its implementation. These include:

Resources
- Time requirements: for example, Geddes & Gill (2012) reported that five to six hours were spent on chart audit and four hours on individual interview and completion of an action plan. Colton (2007) argued that the time and manpower needed to support performance measurement may be constrained by a health care financial system that places limitations on reimbursements.
- Costs associated with the process of undertaking performance measurement (Chandra & Frank 2004; Loeb 2004; Hamilton 2007).

Personality
- Personality conflicts between managers (or supervisors) and individual practitioners (Arnold & Pulich 2003; Geddes & Gill 2012).
- Resistance from personnel: practitioners may be skeptical about the validity and usefulness of performance measurement data (Colton 2007; Kollberg et al 2005).
- Difficulty in motivating personnel and heads of department (Kollberg et al 2005).
- Different understandings of the meaning and language used in performance assessment between those collecting the measurements and those using them (Jolley 2003).

KEY SUMMARY POINTS

- Performance measurement is an integral part of health care. Its primary aim is to measure the quality of health services, the ultimate goal being to improve health outcomes by stimulating improvements in health care.
- In addition to improving quality, there are other reasons for undertaking performance measurement, and these depend on the perspectives of different stakeholders. From a practitioner perspective, performance measurement serves as a medium for feedback that can validate a clinician's performance or trigger corrective actions if poor performance is demonstrated. From a consumer perspective, performance measurement allows clients to participate in the care delivered to them. From an organisational perspective, performance measurement assists in meeting accreditation standards and facilitates human resources decisions. At the national level, performance measurement plays an important role in policy making.
- There is no one-size-fits-all approach to performance measurement that can be recommended for all allied health care settings. There are, however, essential elements and key steps involved in performance measurement, which include: prioritising clinical areas for measurement, setting goals, selecting the unit of analysis, selecting performance measures, identifying sources of feedback, undertaking performance measurement, and reporting the results to relevant stakeholders.
- Performance measurement is multi-dimensional. It requires information or data from more than one perspective to provide a rich assessment of performance.
- Fundamental to an effective performance measurement system is the establishment of strategic measurement goals which are aligned with the practice's or organisation's values and strategies. It should involve measurements from multiple, interrelated perspectives, depending on the level of the health care system involved in the analysis. A multi-method strategy that captures various perspectives is therefore required in order to gather a comprehensive picture of health care quality and performance.
- The quality of performance and clinical care is generally assessed using structure, process and outcome measures. Selection of appropriate measures, also known as quality indicators, depends on the level of the health system being evaluated. Comprehensibility, validity, reliability, reproducibility, discrimination, and ease of use are quality requirements that guide the choice of performance measures.
- A variety of tools and methods to assess performance or quality of care have been reported in the literature.
- While performance measurement is imperative to improving health care quality, there are barriers and challenges to its implementation, including cost and time constraints, and conflicts between health care staff and their superiors. These need to be considered for effective uptake and sustainability of the performance measurement system.

UNDERPINNING EVIDENCE

Phase I (Part 2): Descriptive study: survey of current practice

Methods

The following section describes the methodology of the descriptive study component of this project.

Development and validation of the survey

Based on the results of the systematic review, a draft questionnaire was developed by the research team to examine current practice in performance evaluation in local health networks in SA. The 24 items in the questionnaire covered topics related to current employment, experiences in performance evaluation and outcome measurement, and perspectives about how performance evaluation should be undertaken. Content validity of the questionnaire was established through formal feedback from a panel of experienced performance evaluators from different allied health disciplines. Changes were made to the questionnaire based on comments from the expert panel, and it was then finalised as a 22-item questionnaire (the Performance Evaluation Questionnaire), which was converted to an electronic format via SurveyMonkey. A copy of the questionnaire is presented in Appendix 7.

Implementation of the survey

Approval for the survey process was obtained from the University of South Australia Human Research Ethics Committee and the South Australia Health Human Research Ethics Committee. Survey respondents were allied health managers working under SA Health. An invitation was sent to all allied health managers through the State-wide Allied Health Executive for dissemination throughout the health networks. The participant information sheet and a link to the electronic questionnaire were attached to the invitation. The survey period was 5th September to 27th September. Descriptive statistics and graphs were used to summarise the responses to the questionnaire.
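The kind of descriptive summary used for the questionnaire responses can be sketched in a few lines of code. This is a hypothetical illustration only: the helper name and the sample responses below are invented, not taken from the actual survey data.

```python
from collections import Counter

def summarise(responses):
    """Tabulate categorical responses as (count, rounded percentage) pairs."""
    counts = Counter(responses)
    total = len(responses)
    return {option: (n, round(100 * n / total)) for option, n in counts.items()}

# Invented sample data for one categorical questionnaire item.
frequency_item = ["Annually"] * 4 + ["Monthly", "Biannually"]
print(summarise(frequency_item))
```

A table of counts and percentages like this is the raw material for the bar and pie charts used in the Results section.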

Results

Characteristics of the survey participants

A total of 42 allied health managers completed the Performance Evaluation Questionnaire. Figure 8 shows that the majority of respondents were social workers (SW), followed by occupational therapists (OT), radiographers, physiotherapists (physio) and podiatrists (pod), with a few speech pathologists (SP) and psychologists (psych), one dietician (DN) and one orthotist. A large proportion of the managers hold either an AHP2 (29%) or AHP4 (39%) classification; 24% are AHP3, and only a few are AHP5 (5%) or AHP6 (2%). Many are employed in a primary or community health service (38%), and the remainder work in a variety of clinical settings such as mental health services (25%), acute tertiary hospitals (23%), acute general hospitals (10%) and subacute or rehabilitation hospitals (5%).

Figure 8: Distribution of respondents according to profession (SW 33%, OT 17%, Radiographer 14%, Physio 12%, Pod 10%, SP 5%, Psych 5%, DN 2%, Orthotist 2%)

The majority of respondents (83%) reported undertaking performance evaluation in their department. As shown in Figure 9, all managers from five allied health professions, namely orthotics, physiotherapy, podiatry, psychology and speech pathology, stated that they routinely conduct performance evaluation. Seven managers (17% of respondents), consisting of one dietician, two occupational therapists, two radiographers and two social workers, reported that no such evaluation exists in their department. Lack of management direction, heavy clinical workload, and fear of evaluation outcomes were reported as reasons for not undertaking a formal performance evaluation of the health service. One of the managers explained that informal benchmarking in specific areas, opportunistic evaluation of client satisfaction, and staff appraisals have occurred in their department, as opposed to a formal evaluation.
The managers, however, reported positive attitudes toward performance evaluation and believed that a structured process can identify areas of learning that will benefit clients and overall service delivery. It can also help monitor patient care, health services and overall performance.

Figure 9: Percentage of professions which conduct performance evaluation

Purposes of performance evaluation

Improving the quality of health services, identifying areas for professional development, and meeting accreditation standards were identified as the top three drivers for performance evaluation, as shown in Figure 10. Providing feedback to staff following evaluation was the next most common reason for undertaking evaluation. Fewer than half of the survey respondents thought that the outcomes of performance evaluation can assist in promoting the services of the department, informing human resources decisions, and allowing clients to participate in the improvement of the care delivered to them. Analysis of individual professions showed similar results across disciplines.

Figure 10: Purpose of performance evaluation

Frequency of performance evaluation

More than half of the respondents reported undertaking performance evaluation annually, while very few run evaluations monthly, biannually, quarterly or on an ad-hoc basis (Figure 11). There is agreement among professions that, given the heavy clinical workload of allied health practitioners, evaluations should be conducted only annually.

Figure 11: Frequency of performance evaluation (Annually 66%, Monthly 16%, Biannually 9%, Ad hoc 6%, Quarterly 3%)

Key areas of evaluation

Performance measures relate to structure (assessment of the resources or means used by the health system to deliver services), process (assessment of what the health practitioners did for the client and how well it was done), and outcomes (assessment of health outcomes or changes in the client's health status which can be attributed to the effectiveness of health services). Overall, these three key areas are considered in the performance evaluation of allied health services, as shown in Figure 12.

Figure 12: Key areas for performance evaluation

There is variability in the focus of evaluation among professions. Occupational therapists consider structure, process and outcomes equally important in performance evaluation. Physiotherapists deem process measures the most important, followed by outcome measures and then structure measures. Social workers and psychologists consider process and outcomes equally and focus less on structure measures. Radiographers, podiatrists and speech pathologists consider process measures more than outcome and structure measures. For the orthotist respondent, only outcomes are regarded as performance measures.

Approaches to performance evaluation

A variety of approaches to performance evaluation were described by the respondents, as shown in Figure 13. Self-appraisal (66%) was the most commonly reported method for performance evaluation, followed by direct observation of staff (54%) and client/patient satisfaction surveys (54%). Standards (51%), a process in which the performance data collected are compared with pre-determined standards of practice, was also a popular method for examining performance in allied health. Almost half (49%) of the respondents reported using peer review and critical incident reporting as means of evaluating performance. Less commonly used methods were interviews (46%), outcome measures (40%) and chart-stimulated recall (3%).

Figure 13: Approaches to performance evaluation

When individual professions were examined, self-appraisal remained the most common method of evaluation for occupational therapists, physiotherapists and social workers, but not for the other disciplines. Standards and direct observation are commonplace for radiographers and podiatrists, while client/patient surveys and peer review are popular among speech pathologists.

The largest proportion of respondents (36%) reported that managers and senior staff are responsible for implementing the performance evaluation, while 24% said only managers are involved, and another 18% reported that only senior staff evaluate performance. A few of the respondents (12%) mentioned the involvement of independent assessors in the evaluation team, and only 9% indicated that the human resources department was also part of the evaluation.
For most respondents (66%), performance information is obtained from administrative data, and an almost equal number of managers reported using medical records (49%) and patient surveys (51%) as sources of data. Only a few (9%) mentioned the use of other information systems, such as incident reporting systems, documentation on clinical and professional supervision, procedure reviews, and staff feedback.

Challenges to performance evaluation

While there are benefits to undertaking performance evaluation, the respondents also reported barriers and challenges to its implementation, as shown in Figure 14.

Figure 14: Challenges to performance evaluation

Lack of time was the most frequently cited barrier to performance evaluation, followed by lack of understanding of the process and limited funding. A few of the respondents reported personality differences and lack of managerial interest as significant barriers. Reluctance of staff and the lack of guidelines or a framework to undertake the process were also mentioned by a few managers.

Strategies to address the barriers

The respondents were asked to identify strategies which could potentially address the barriers associated with performance evaluation. The following themes emerged from the responses, with verbatim quotations from respondents:

A. A formal process for evaluation
- "Formal process for evaluation and adequate line of management to follow up on the outcomes"
- "Guidelines / framework for allied health leads / managers"
- "A standard tool to do evaluations that has managerial support would be very useful"

B. Training or education about the process
- "training for all staff"
- "Education re the processes"

C. Support from an external evaluator
- "I believe you require someone who has expertise in this area who is brought in outside of the team to conduct research as opposed to 'adding on' more work for staff who already have a lot of work."
- "Partnerships with universities to have students evaluate our group programs as part of degrees (i.e. Honours, Masters and PhDs)"

D. Allocating a position dedicated to performance evaluation
- "Managers are so stretched with HR responsibilities and day to day operational requirements of the service. Senior staffs have high clinical load and other responsibilities. Allocating time - or a position dedicated to performance evaluation - would be very helpful."
- "A key issue with the time is not only the development of the appropriate tool, but also the actual use - audit, observation, interview etc., but most time consuming is the data analysis. There is a total lack of data/quality support."
- "There are clinicians who are particularly interested in performance evaluation and it would be good to have funded positions available for them to rotate through (e.g. a 6mth period) to undertake performance evaluation activities"

Changes to practice associated with the outcomes of performance evaluation

Eighty percent (80%) of allied health managers reported that the outcomes of performance evaluation are used to improve and strengthen current health services. When asked how they have used the results to improve service delivery, the following themes emerged from the responses, with verbatim quotations from respondents:

A. Identification of training or professional development needs
- "highlight areas of clinical skill which has not been developed/implemented and make these goals for further CPD/up skilling"
- "use to identify training needs for staff"

B. Identification of service gaps
- "identified service gaps addressed: family therapy; working with people with a borderline personality; home equipment for clients"

C. Determination of allocation of resources and funding
- "Decisions regarding where to put services, how to prioritise resource allocation"
- "The data is used to support equipment requests"

D. Improvement of clinical services
- "Data from evaluations is used to improve service delivery"

KEY SUMMARY POINTS

- The majority of survey respondents reported undertaking regular performance evaluation in their department. The primary drivers for conducting evaluation are improving the quality of health services, identifying areas for professional development, and meeting accreditation standards. Local practices are generally based on widely accepted methods and principles.
- Key areas for performance evaluation across professions include structure, process and outcomes; however, the focus of evaluation varies between professions. Occupational therapists consider structure, process and outcomes equally important. Physiotherapists deem process measures the most important, followed by outcome measures and then structure measures. Social workers and psychologists give equal weight to process and outcomes and focus less on structure measures. Radiographers, podiatrists and speech pathologists emphasise process measures over outcome and structure measures. The orthotist respondent regarded only outcomes as performance measures.
- Performance evaluation is undertaken using a variety of approaches, mainly self-appraisal, direct observation of staff, and surveys of patient satisfaction. Other methods such as audit against standards, peer review, and critical incident reporting are also popular in allied health evaluation. Less commonly used methods are interviews, outcome measures, and chart-stimulated recall.
- While all respondents value performance evaluation, the majority reported various challenges associated with the process. These include lack of time, lack of understanding of the process, limited funding, lack of managerial interest, personality differences, and lack of a standard framework for undertaking the process.
- When asked about strategies that could potentially address these barriers, allied health managers believed that a formal process for evaluation and training for evaluators would be useful. Support from an external evaluator or a position dedicated to performance evaluation were also identified as potential strategies.
- For most professions, the outcomes of performance evaluation lead to identification of professional development needs, identification of service gaps, determination of resource or funding allocation, and improvements in clinical services.

Synthesis of findings from Phase I (Parts 1 and 2): The genesis of the ASPIRE tool

FRAMEWORK FOR PERFORMANCE EVALUATION

This section of the interim report describes how the ASPIRE tool was produced from a synthesis of the findings of the literature review and the user survey. To our knowledge, this is the first tool to assist in structured performance evaluation that has been developed within an evidence-based framework.

Synthesis of review findings and survey results

Performance evaluation denotes assessment of the quality of allied health services. It is context dependent, and should be designed to meet the unique requirements of the health system. A variety of approaches to performance evaluation have been reported in the literature, and local practices are generally based on widely accepted methods and principles. The literature suggests a mid-level assessment involving small functional groups of health professionals or practices, as this promotes a sense of ownership of professional performance and effective teamwork. Underpinning an effective evaluation system are core elements or processes that include prioritisation of the clinical area for evaluation, upfront articulation of goals, careful identification of performance measures, mapping of measures to information sources, analysis of performance data and reporting of results, and evaluation of the performance evaluation system itself. The literature also highlighted barriers and challenges to performance evaluation similar to those identified in the survey: lack of time, limited resources and lack of understanding of the process of evaluation. Therefore, based on findings from the systematic review and the survey of current practice, a model for performance evaluation that can be used across allied health professions in South Australia is proposed. This model captures best practice in performance evaluation of health services, guided by the local practice context in South Australia.
An acronym, ASPIRE, was formulated to describe the core elements of performance evaluation (as shown in Table 2). The ASPIRE model utilises a collaborative approach between allied health practitioners and experienced researchers who are skilled not just in providing training but also in undertaking performance evaluation. The collaboration recognises the barriers related to lack of time, resources and understanding of the evaluation process, and therefore divides the core tasks between researchers and allied health practitioners from the health site. The researchers provide strong initial support and guidance which gradually reduces, enabling practitioners to establish and maintain independence and promoting a sense of ownership of the performance evaluation system. The ASPIRE model is ideal for building capacity that can increase the likelihood of allied health practitioners conducting performance evaluation in the future.

Table 2: ASPIRE for quality model

Area for evaluation: The evaluation team from the allied health site identifies and prioritises the clinical area for performance evaluation.
Set goals: Based on the identified clinical area, the evaluation team sets the goals for performance evaluation.
Performance indicators: The evaluation team, assisted by experienced researchers, identifies performance measures or indicators.
Information sources: The evaluation team maps the performance measures to information sources.
Report results: The researchers and evaluation team collaboratively analyse the results and report to stakeholders.
Evaluate: The researchers and evaluation team collaboratively evaluate the performance evaluation process and its outcomes.

Integral to the ASPIRE model is the formation of an evaluation team from the allied health site prior to undertaking performance evaluation. The team may include any of the following personnel: manager, senior staff, an independent accreditor or human resources personnel. Each team member must understand and be clear on what will be measured and what the foundations for evaluation are. The team should actively contribute to, and agree on, the processes, methods and tools for evaluation, and each member should be clear on their roles and responsibilities. The evaluation team is expected to work closely with the researchers, and regular meetings should be organised to monitor the process.

ASPIRE: Area for evaluation

The evaluation team from the allied health site identifies a clinical area for performance evaluation. The following questions may be used as a checklist in determining a priority area for evaluation:

- Is it important and relevant to the group for which the performance measurement system is being produced?
- Is it problem-prone and with a high frequency of occurrence, or is it suspected of overuse, underuse, or misuse?
- Does it have strong financial impact?
- Does it have potential to improve health care delivery and outcomes?
- Has it recently undergone major changes?
- Does it have proven and significant variation in quality of service among health care providers?
- Is it considered high risk for patients?

ASPIRE: Set goals for evaluation

The goal for performance evaluation should be clearly articulated by the evaluation team before the measurement process is commenced.
It is typically targeted to improve more than one of the following domains: acceptability, accessibility, appropriateness, care environment and amenities, continuity, competence or capability, effectiveness, improving health or clinical focus, expenditure or cost, efficiency, equity, governance, patient-centeredness, safety, sustainability, timeliness, and utilisation of care.

ASPIRE: Performance indicators

The evaluation team and the researchers work collaboratively to identify performance indicators. A performance measure or indicator is used to assess a health care structure, process or outcome. Structure measures evaluate the means and resources used by the health system to deliver allied health services. Process measures assess what the allied health practitioner did for the patient and how well it was done. Outcome measures examine the change in patients' health status which can be attributed to the effectiveness of the treatment. Performance measures are based on standards of care, which can be evidence-based or, in the absence of scientific evidence, determined by an expert panel of health practitioners. They must be comprehensible, valid, reliable, reproducible, discriminative and easy to use. Performance evaluation typically involves multiple measures, rather than a single performance measure, in order to obtain a comprehensive assessment of performance.

ASPIRE: Information sources

The evaluation team reflects on the data they need to collect to measure structure, process or outcomes. Common sources of information or performance data are medical records, administrative data, and patient surveys. Other information systems may also be sourced, such as incident reporting systems, documentation on clinical and professional supervision, and staff feedback. Additional data collection may be undertaken if required. The evaluation team should map the identified performance measures to the information sources.

ASPIRE: Report results

The evaluation team and researchers collaboratively analyse the data obtained from the various information systems. Descriptive and inferential statistics may be required for quantitative data, and thematic analysis for qualitative data. Presentation of results and findings should be concise, easy to understand and tailored to the needs of the stakeholders (e.g. clients, allied health practitioners, managers, human resources department). Performance evaluation reports typically include a combination of text, tables and graphs or charts.

ASPIRE: Evaluate the performance evaluation process and its outcomes

The evaluation team and researchers work collaboratively to evaluate the performance evaluation process and its outcomes. The evaluation focuses on: practice changes that have occurred as a result of the evaluation, how well the new model is accepted by staff and management, the extent to which the model has improved the quality of health care (i.e. health impact and quality of service), and what improvements can be made to the evaluation system to facilitate its effective and sustainable uptake by allied health practitioners.

The ASPIRE for quality model includes a set of tools (Appendices 1-6) which allied health practitioners (i.e. the evaluation team) can use to facilitate the process of evaluation.
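The six ASPIRE elements lend themselves to a simple structured representation. The Python sketch below (all clinical content, field values and indicator names are invented for illustration, not taken from the report) shows one way an evaluation team might record a plan and check that every performance indicator has been mapped to an information source before data collection begins:

```python
from dataclasses import dataclass, field

# A minimal sketch of an ASPIRE evaluation plan.
# All clinical content below (area, goal, indicators, sources)
# is hypothetical and not drawn from the report.

@dataclass
class AspirePlan:
    area: str                   # A: area for evaluation
    goals: list                 # S: set goals
    indicators: dict            # P: indicator name -> measure type
    sources: dict               # I: indicator name -> information source
    results: dict = field(default_factory=dict)        # R: report results
    review_notes: list = field(default_factory=list)   # E: evaluate

    def unmapped_indicators(self):
        """Indicators not yet mapped to an information source."""
        return [name for name in self.indicators if name not in self.sources]

plan = AspirePlan(
    area="Diabetes foot screening",
    goals=["Assess compliance with the national foot-screening guideline"],
    indicators={"risk category documented": "process",
                "monofilament test recorded": "process"},
    sources={"risk category documented": "medical records"},
)
print(plan.unmapped_indicators())  # -> ['monofilament test recorded']
```

A check like `unmapped_indicators` corresponds to the mapping step of the model: any indicator it returns still needs an agreed data source before the audit can proceed.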

Phase II: Pilot Evaluation of ASPIRE

Results of evaluation

Pilot sites

Three sites, representing a metropolitan rehabilitation hospital, a metropolitan acute tertiary hospital and a regional general hospital, participated in the pilot evaluation of the ASPIRE framework.

Metropolitan Rehabilitation Hospital: This site identified rehabilitation following unilateral below-knee and above-knee amputation as its area for performance evaluation. The goal was to examine practice compliance against established clinical guidelines for amputation in order to stimulate improvements in allied health services, which could potentially improve patients' functional outcomes and decrease their length of stay in rehabilitation. Three allied health practitioners, namely a physiotherapist, an occupational therapist and a social worker, volunteered to participate in the pilot evaluation.

Metropolitan Acute Tertiary Hospital: This site aimed to determine the impact of implementing a structured mood tool in identifying patients who are likely to be depressed or are experiencing mood disturbance following a stroke episode. The mood tool was administered to comply with the National Stroke Guidelines, which recommend the use of a structured and psychometrically sound instrument to detect early mood changes (i.e. depression) and therefore facilitate timely referral to psychological assessment and treatment. Three social workers volunteered to assist in the performance evaluation using the ASPIRE framework.

Regional General Hospital: This site examined compliance of current practice in foot screening against the National Evidence-Based Guideline for the Prevention, Identification and Management of Foot Complications in Diabetes. Two podiatrists volunteered to undertake the clinical performance evaluation using the ASPIRE framework.

Methods

Clinical performance evaluation was undertaken by the three participating sites over a 2-month period using the ASPIRE framework.
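The compliance audits undertaken at the pilot sites reduce, at their core, to counting how many applicable case notes meet each process indicator. A minimal sketch of that calculation (the indicator names and audit values below are hypothetical, not the pilot data):

```python
# Sketch: per-indicator compliance from a case-note audit.
# 1 = documented, 0 = not documented, None = not applicable.
# The records below are invented for illustration.

audit = {
    "foot inspection recorded": [1, 1, 0, 1, None],
    "risk category assigned":   [1, 0, 0, 1, 1],
}

def compliance(results):
    """Percentage of applicable case notes meeting each indicator."""
    rates = {}
    for indicator, values in results.items():
        applicable = [v for v in values if v is not None]
        rates[indicator] = round(100 * sum(applicable) / len(applicable), 1)
    return rates

print(compliance(audit))
# -> {'foot inspection recorded': 75.0, 'risk category assigned': 60.0}
```

Excluding not-applicable records from the denominator matters: counting them as non-compliant would understate performance for indicators that only apply to a subset of patients.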
A mixed-methods approach was used to evaluate the framework. A 15-item survey questionnaire (Appendix 8) was developed and administered to examine the extent to which ASPIRE was considered useful, acceptable, and appropriate to allied health clinical practice evaluation, including overall satisfaction with the framework. Semi-structured group interviews were also undertaken to validate and complement the results of the survey. The following broad questions were used as a guide during the interview:

1. What are your perceptions regarding ASPIRE as a framework for routine performance evaluation in your department?
2. What are your impressions of how well your team embraced ASPIRE to facilitate performance evaluation?
3. What are your perceptions of what works well and what does not work well in the ASPIRE framework?

4. What difference did ASPIRE make in the conduct of your performance evaluation?

Percentages and graphs were used to analyse the responses in the survey questionnaire. Using content analysis, data from the interviews were coded and distilled into content-related categories, and themes were identified.

Results of quantitative evaluation

Characteristics of the evaluators

Six allied health practitioners (i.e. two from each site) completed the survey questionnaire and participated in a group interview with the investigator. The majority of evaluators were AHP2 (66%), with one AHP3 (17%) and one AHP1 (17%). Their years of experience varied from 1.5 to 32 years; all spent 60% or more of their time on direct patient care. Results are presented using percentages and graphs.

A. The extent to which ASPIRE has helped in the process of undertaking performance evaluation

B. Usefulness of the ASPIRE framework to the department
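The percentage-based summaries used for the quantitative survey items above amount to a frequency breakdown of the response options. A minimal sketch with invented responses (not the actual pilot data; n = 6 to mirror the number of evaluators):

```python
from collections import Counter

# Sketch: percentage breakdown of Likert-style survey responses.
# The responses below are invented for illustration.

responses = ["agree", "strongly agree", "agree", "agree",
             "neutral", "strongly agree"]

def percent_breakdown(answers):
    """Percentage of respondents choosing each response option."""
    counts = Counter(answers)
    n = len(answers)
    return {option: round(100 * count / n, 1)
            for option, count in counts.items()}

print(percent_breakdown(responses))
```

With only six respondents, each evaluator shifts an item's percentage by roughly 17 points, which is worth bearing in mind when reading the pilot charts.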

C. Appropriateness to the department and organisation, ease of implementation, and pace of delivery of the ASPIRE framework (the same ratings were provided by the participants)

D. Evaluation issues in the past that evaluators had hoped to address with ASPIRE

- "Lack of patient-centred outcomes (i.e. look at length of stay, but not necessarily what the patient achieved in that time)"
- "Motivation and impetus to undertake a new project of any significance due to lack of confidence (and perceived competence). Time restrictions / limitations; Administrative Support within the department, and beyond to facilitate the processes of running such activities/studies."
- "Objective data to measure performance against beyond the global Length of Stay measure."
- "Lack of time dedicated to performance improvement"

E. Evaluation issues in the past which were addressed by ASPIRE

- "The performance indicators were based on guidelines for patient care, rather than funding-related measures, so they tended to be more patient-centred"
- "Both issues were helped by the framework and support and guidance of the ASPIRE team; together with a staff member with strong admin and organisational skills to perform these tasks."
- "The project forced us to spend time working on performance evaluation. This format helped us to evaluate specific measures and hopefully, once data is analysed, give us scope for improving practice and medico-legal compliance."

F. How well did ASPIRE meet the department's expectations

G. Extent of improvement in the level of confidence in undertaking performance evaluation

H. Extent of improvement in the level of motivation in undertaking performance evaluation

I. Support received from researchers in the critical aspects of performance evaluation

J. Areas of performance evaluation (using ASPIRE) in which evaluators feel they need more information/training

- "Nil, just more practice to become familiar with the tool."
- "Targeting areas suitable to study and SMART; need to make sure the topic will be relevant and serve a purpose and be [a] valid activity"
- "Setting up our own spreadsheet. (Though this may be more an IT software competence issue)"
- "Data analysis"

K. Areas of performance evaluation (using ASPIRE) evaluators feel confident about and therefore do not require assistance from external evaluators (i.e. researchers)

- "Case note auditing"
- "Auditing, creating process measures"

L. Areas of performance evaluation (using ASPIRE) in which evaluators feel they need more help from external evaluators (i.e. researchers)

- "Interpreting data"
- "A key benefit of the ASPIRE program was the access to and guidance of Lucy and Alvin to lead us along the journey; providing oversight, research expertise, assurance and keep[ing] the project on track and within timelines. Provided us with the feedback and confidence to maintain momentum."
- "Data analysis"

M. Changes evaluators anticipate as a result of the performance evaluation

- "Mainly around processes - i.e. we know we are doing what is recommended, but not consistently documenting it."
- "I'll take lessons from the results and change clinical practice, modify forms to help ensure compliance and performance moving forward."
- "Ensuring we meet 100% compliance with identified process measures."

N. Likelihood of using ASPIRE in the next round of performance evaluation

O. Overall rating for ASPIRE

Findings from qualitative evaluation

The experiences of allied health practitioners regarding performance evaluation using ASPIRE were classified into: previous evaluations, strengths of the framework, challenges associated with performance evaluation using ASPIRE, and refinements to the ASPIRE framework.

Previous performance evaluations

When asked to compare the current performance evaluation using ASPIRE with previous evaluations, the participants reported different views. One of the sites commented that the process was similar, except for the support received from researchers who organised and facilitated the evaluation using a structured framework.

"Apart from the external support from a researcher who helped us in the data analysis, the process is pretty similar. Yours (i.e. ASPIRE) is more organised and you prepared everything and it flowed better. Some of the data that need to be collected you have also identified."

Another site described their process as informal: it did not measure a specific area of clinical practice, and there was no form of evaluation documentation.

"Tend to be organised by the organisation as a quality driven activity but not in terms of actually measuring our clinical performance in a specific area as what we've done in this ASPIRE project... I don't know if we've done something like this before or even if we did, it hasn't been documented or not as comprehensive as we are too busy. Too busy delivering client services."

One other site recalled their previous evaluations as purely outcome assessment, without regard for process evaluation.

"It's always been about checking the length of stay rather than looking at processes... it's all about the outcome for the 3 and ½ years that I've been working there. Very much driven by the hospital, the system, the funding and the beds pressure about are you moving patients out of the system... It's about you've got longer length of stay, get it down."

Strengths of the ASPIRE framework

The participants agreed that working together with researchers from iCAHE is an effective strategy to encourage allied health to evaluate their clinical performance. They found the framework useful in providing them with a structure or a step-by-step guide for undertaking a performance evaluation. The participants felt that the partnership between allied health evaluators and iCAHE researchers is a blending of expertise, with iCAHE facilitating the research component (e.g. development of data abstraction forms, analysis of data) while clinicians provide an understanding of the work environment and clinical context.

"One thing I found daunting is taking on the task of developing a whole structure and how it's going to happen, what's going to be meaningful... but the fact that you did all of that I found empowering. There was an organised structure... it was very good. Being involved in the process gave us a sense of ownership."

"It's good to have someone to come on board and kind of narrow down what to look at... otherwise, it would be too broad an area and now how do we actually get that information out."

One of the participants commented:

"It saved us quite a bit of time. It was a different way of thinking. You simplified it and it didn't seem to be cumbersome because you can be frightened about the evaluation process but you made us feel that we can do this... it's that encouragement that we got because it didn't seem like a complex process, and you guide us through."

One of the sites recognised the value of including process measures in the evaluation and how these can be linked to outcomes.

"Going through those process measures is a good way of making sure that we do improve those things, which could potentially affect the outcome."
"It's more objective going back to the case notes and examining those process measures. You have to look at what was documented and without that you can never prove it was done. Some of those process measures would be higher than what the notes reflected but even so it's still an issue. It's also about legal stuff about documentation that we need to be mindful of."

One of the sites also noted that going through the clinical guidelines as part of the process of identifying key performance indicators was a useful exercise for reflective practice. The participants recognised the value of evidence-based recommendations; however, they are not always up to date with scientific information.

"Being made aware of the clinical guidelines was very useful because we're not always aware of the breadth of things that are out there... which makes you think, ahhh we're doing these but maybe we don't."

All participants agreed that undertaking performance evaluation using ASPIRE created an environment for change and made them think of ways to improve the quality of their services. It also offered them an opportunity to reflect on their own clinical performance and discuss as a team potential strategies to correct or improve practice behaviour. One of the participants commented:

"This evaluation identified that many of the assessments that we do are not properly or adequately documented. We know that a lot of us do this but we don't necessarily write them in the notes, which in itself is a legal issue. We need to revisit our documentation and because we have this report we can say, look this is what's happening and we have to do something about it."

Challenges associated with performance evaluation using the ASPIRE framework

The challenges raised by the participants were not specific to the use of ASPIRE but rather common to any process of performance evaluation. One of the participants reported that identification of process indicators that are relevant to their outcome of interest was quite challenging, particularly when there are several process recommendations in the clinical guidelines.

"I found it difficult to know which of those processes from the guidelines would affect the outcomes."

Time to collect or abstract data from clinical case records was also a concern for some participants.

"The resources available, personnel to abstract the data, on top of all the work that we need to do can be quite challenging."
"Should we include this or should we consider that in the data abstraction... you have to look back at the instructions and indicators every now and then, which can be quite time consuming but it was a learning process."

Refinements to the ASPIRE framework to facilitate effective and sustainable uptake in allied health

Overall, the participants were positive about ASPIRE, and felt that performance evaluation using a framework was a worthwhile experience. However, they believed that there are still opportunities for improvement which could increase its effectiveness. The most telling comments came from participants who felt that the evaluation process could have been more effective if more time had been spent on planning the evaluation.

"Longer planning time especially when developing the data abstraction sheet to develop a common understanding of what should be abstracted."

Participants from the regional site suggested that a face-to-face consultation, rather than a teleconference, is beneficial particularly during the early stages of planning.

"Face-to-face contact and a visit to the site by the researchers during the planning process, rather than a teleconference, would be preferred."

Some participants felt that distilling performance indicators from evidence-based clinical guidelines could have been an easier process if a wider team had been involved.

"The idea of having a wider team to discuss the guidelines to identify the indicators."

KEY SUMMARY POINTS

- Overall, the participants were positive about ASPIRE, and felt that performance evaluation using a structured framework was a worthwhile experience.
- The evaluation findings suggest that participants found ASPIRE a useful, appropriate, and easy-to-implement model for evaluating clinical performance in allied health, and that the current structure is acceptable and convenient to clinicians.
- Participants agreed that ASPIRE addressed evaluation difficulties encountered in the past, and identified the partnership with researchers as an effective strategy for encouraging allied health practitioners to evaluate performance.
- The participants reported that ASPIRE improved their level of confidence and motivation to conduct performance evaluation.
- Time to gather evaluation information and difficulty in identifying performance indicators were described as barriers to performance evaluation. Strategies such as a longer evaluation planning period, face-to-face consultations with researchers (as opposed to teleconferences), and wider team involvement in the identification of performance indicators could potentially address these barriers.

APPENDICES

Appendix 1 (Worksheet 1)

Appendix 2 (Worksheet 2)

Appendix 3 (Worksheet 3)

Appendix 4 (Worksheet 4)

Appendix 5 (Worksheet 5)

Appendix 6 (Worksheet 6)

Appendix 7 (Performance Evaluation Questionnaire)

Appendix 8 (ASPIRE for quality Evaluation Questionnaire)


Assessing competence during professional experience placements for undergraduate nursing students: a systematic review University of Wollongong Research Online Faculty of Science, Medicine and Health - Papers Faculty of Science, Medicine and Health 2012 Assessing competence during professional experience placements for

More information

CAREER & EDUCATION FRAMEWORK

CAREER & EDUCATION FRAMEWORK CAREER & EDUCATION FRAMEWORK FOR NURSES IN PRIMARY HEALTH CARE ENROLLED NURSES Acknowledgments The Career and Education Framework is funded by the Australian Government Department of Health under the Nursing

More information

AMC Workplace-based Assessment Accreditation Guidelines and Procedures. 7 October 2014

AMC Workplace-based Assessment Accreditation Guidelines and Procedures. 7 October 2014 AMC Workplace-based Assessment Accreditation Guidelines and Procedures 7 October 2014 Contents Part A: Workplace-based assessment accreditation procedures... 1 1. Background information... 1 2. What is

More information

Supporting information for appraisal and revalidation: guidance for pharmaceutical medicine

Supporting information for appraisal and revalidation: guidance for pharmaceutical medicine Supporting information for appraisal and revalidation: guidance for pharmaceutical medicine Based on the Academy of Medical Royal Colleges and Faculties Core for all doctors. General Introduction The purpose

More information

Organisational factors that influence waiting times in emergency departments

Organisational factors that influence waiting times in emergency departments ACCESS TO HEALTH CARE NOVEMBER 2007 ResearchSummary Organisational factors that influence waiting times in emergency departments Waiting times in emergency departments are important to patients and also

More information

Occupational Therapist. Andrew Maglaras Occupational Therapy Manager.

Occupational Therapist. Andrew Maglaras Occupational Therapy Manager. SA Health Job Pack Job Title Occupational Therapist Job Number 507918 Applications Closing Date 31/12/15 Region / Division Health Service Location Classification Job Status Indicative Total Remuneration

More information

Supporting information for appraisal and revalidation: guidance for Occupational Medicine, April 2013

Supporting information for appraisal and revalidation: guidance for Occupational Medicine, April 2013 Supporting information for appraisal and revalidation: guidance for Occupational Medicine, April 2013 Based on the Academy of Medical Royal Colleges and Faculties Core for all doctors. General Introduction

More information

Carving an identity for allied health

Carving an identity for allied health Carving an identity for allied health DOMINIC DAWSON Dominic Dawson developed the Division of Allied Health at Lottie Stewart Hospital and was the director of Allied Health until January 2001. Abstract

More information

Allied Health Review Background Paper 19 June 2014

Allied Health Review Background Paper 19 June 2014 Allied Health Review Background Paper 19 June 2014 Background Mater Health Services (Mater) is experiencing significant change with the move of publicly funded paediatric services from Mater Children s

More information

Physiotherapy UK 2018 will take place on October, at the Birmingham ICC.

Physiotherapy UK 2018 will take place on October, at the Birmingham ICC. Call for abstracts Physiotherapy UK 2018 will take place on 19-20 October, at the Birmingham ICC. The Chartered Society of Physiotherapy is inviting abstract submissions for platform and poster presentations.

More information

A Primer on Activity-Based Funding

A Primer on Activity-Based Funding A Primer on Activity-Based Funding Introduction and Background Canada is ranked sixth among the richest countries in the world in terms of the proportion of gross domestic product (GDP) spent on health

More information

Initial education and training of pharmacy technicians: draft evidence framework

Initial education and training of pharmacy technicians: draft evidence framework Initial education and training of pharmacy technicians: draft evidence framework October 2017 About this document This document should be read alongside the standards for the initial education and training

More information

NATIONAL TOOLKIT for NURSES IN GENERAL PRACTICE. Australian Nursing and Midwifery Federation

NATIONAL TOOLKIT for NURSES IN GENERAL PRACTICE. Australian Nursing and Midwifery Federation NATIONAL TOOLKIT for NURSES IN GENERAL PRACTICE Australian Nursing and Midwifery Federation Acknowledgements This tool kit was prepared by the Project Team: Julianne Bryce, Elizabeth Foley and Julie Reeves.

More information

Related publication: Uy J, Lizarondo L, Atlas A. ASPIRE for quality: a new evidence-based tool to evaluate clinical service performance. BMC Research Notes. DOI: 10.1186/s13104-016-2109-0