Understanding change processes
Halle, July 12th 2016
Sascha Köpke, Professor of Nursing Research, University of Lübeck, Nursing Research Group
sascha.koepke@uksh.de

The MRC framework
MRC 2008

Complex interventions
I have developed a fantastic (complex) intervention.
My main question: Does it work? -> RCT
Result:
- Yes -> Implement it!
- No -> Forget about it! Or maybe, thinking about it, we could perform subgroup analyses to assess whether we could find an effect in male nursing home residents over 75, where there might be ...
What's the problem?
"Introduction of such a system did not significantly reduce the incidence of our study outcomes. Possible explanations for our findings are that the MET system is an ineffective intervention; the MET is potentially effective but was inadequately implemented in our study; we studied the wrong outcomes; control hospitals were contaminated as a result of being in the study; the hospitals we studied were unrepresentative; or our study did not have adequate statistical power to detect important treatment effects."

Understanding (change) processes
- Negative or unexpected outcomes are (more) common with complex interventions
- Process evaluations aim to understand such outcomes

MRC Framework: Understanding processes
"A process evaluation is often highly valuable, providing insight into why an intervention fails unexpectedly or has unanticipated consequences, or why a successful intervention works and how it can be optimised."
MRC 2008
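The "inadequate statistical power" explanation in the excerpt above can be made concrete with a rough calculation. The sketch below (hypothetical numbers, normal approximation, Python standard library only) estimates the power of a two-sided test comparing event proportions between two trial arms:

```python
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_proportions(p1: float, p2: float, n_per_arm: int) -> float:
    """Approximate power of a two-sided z-test (alpha = 0.05) comparing two
    proportions with equal group sizes (normal approximation)."""
    z_alpha = 1.959963984540054                     # two-sided 5% critical value
    pbar = (p1 + p2) / 2.0
    se0 = sqrt(2.0 * pbar * (1.0 - pbar) / n_per_arm)                    # SE under H0
    se1 = sqrt(p1 * (1 - p1) / n_per_arm + p2 * (1 - p2) / n_per_arm)    # SE under H1
    z = (abs(p1 - p2) - z_alpha * se0) / se1
    return normal_cdf(z)

# Hypothetical: baseline event rate 10%, intervention reduces it to 7%.
# With 500 patients per arm the trial is badly underpowered; with 2000 it is not.
print(round(power_two_proportions(0.10, 0.07, n_per_arm=500), 2))
print(round(power_two_proportions(0.10, 0.07, n_per_arm=2000), 2))
```

With plausible effect sizes for a complex intervention, even seemingly large trials can have power well below the conventional 80%, so "no effect found" does not mean "no effect exists".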
New MRC Framework: Process evaluation of complex interventions
MRC 2015; Moore et al. 2015

New MRC Framework: Process evaluation of complex interventions
Moore et al. 2015, in Richards & Rahm Hallberg 2015

Process evaluation
Oakley et al. BMJ 2006
The RIPPLE study
- Pupil peer-led sex education
- Aim: Decreasing risky sexual behaviour
- English secondary schools
- Pupils aged 16-17 years (given a brief training) deliver the programme to 13-14 year olds
- Control intervention: Teacher-based
Oakley et al. BMJ 2006

Process evaluation: Aims
- Views of participants on the intervention
- Implementation of the intervention
- Distinguish components of the intervention
- Contextual factors affecting the intervention
- Required dose to reach the aim of the intervention
- Variation of effects in subgroups
- Distinguish ineffective interventions from badly delivered interventions
How to collect this information?
Oakley et al. BMJ 2006

Process evaluation: Aims
- Document the implementation of both interventions
- Describe and compare processes in the two forms of sex education
- Collect information from study participants about the experience of taking part
- Collect data on individual school contexts
How to collect this information?
Oakley et al. BMJ 2006
Methodological issues
- Process data should be collected from all sites (intervention and control)
- Data should be both qualitative and quantitative
- Process data should be analysed before outcome data to avoid bias in interpretation
- Steps should be taken to minimise the possibility of bias and error in interpreting the findings from statistical approaches: regression analyses, subgroup analyses
Oakley et al. BMJ 2006

Methods
- Interview studies
- Focus group studies
- Observations
- Case studies
- Documentary analysis
- Diaries
- Surveys / Questionnaires
- Structured interviews
- Resource use
Oakley A et al. BMJ 2006; O'Cathain A et al. BMC HSR 2007

Qualitative methods: Before - During - After
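A quick numerical illustration of why subgroup analyses need the safeguards mentioned above (and why the "male nursing home residents over 75" joke from the opening slide bites): if k subgroup tests are run independently at alpha = 0.05, the chance of at least one spurious "significant" result grows quickly. A minimal sketch, standard library only:

```python
# Family-wise error: probability of at least one false-positive finding
# across k independent tests, each at significance level alpha.

def familywise_error(alpha: float, k: int) -> float:
    """P(at least one false positive) across k independent tests at level alpha."""
    return 1.0 - (1.0 - alpha) ** k

def bonferroni_alpha(alpha: float, k: int) -> float:
    """Per-test level that keeps the family-wise error rate at or below alpha."""
    return alpha / k

for k in (1, 5, 10, 20):
    print(f"{k:2d} subgroup tests: "
          f"P(>=1 false positive) = {familywise_error(0.05, k):.2f}, "
          f"Bonferroni per-test alpha = {bonferroni_alpha(0.05, k):.4f}")
```

At ten unplanned subgroup looks the chance of a spurious "effect" is already about 40%, which is why such analyses should be pre-specified or corrected for multiplicity.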
Qualitative Process Evaluation
Atkins et al. 2015, in Richards & Rahm Hallberg 2015

Testing Treatment Fidelity
Hasson 2015, in Richards & Rahm Hallberg 2015

Reporting
- TIDieR checklist
- CONSORT Extension (in progress): http://www.spi.ox.ac.uk/research/centre-for-evidence-basedintervention/consort-study.html

Components (1)
- Implementation: what is implemented, and how?
- Mechanisms of impact: how does the delivered intervention produce change?
- Context: how does context affect implementation and outcomes?
Key recommendations

Components (2)
Adapted from: Linnan L & Steckler A 2002
1. Recruitment
2. Reach
3. Fidelity
4. Satisfaction
5. Dose delivered
6. Dose received
Driessen M et al. Implement Sci 2010
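Several of the Linnan & Steckler components can be operationalised as simple process indicators. The sketch below shows one common way to compute reach, dose delivered, dose received and fidelity from count data; the field names and numbers are hypothetical illustrations, not taken from the lecture, and exact definitions vary from study to study:

```python
from dataclasses import dataclass

@dataclass
class ProcessData:
    """Hypothetical process-evaluation counts for one study site."""
    eligible: int                 # people eligible for the intervention
    participated: int             # actually took part in at least one component
    sessions_planned: int         # sessions the protocol prescribed
    sessions_delivered: int       # sessions actually held
    sessions_attended: int        # person-sessions attended, summed over participants
    components_per_protocol: int  # core components delivered as intended
    components_total: int         # core components in the protocol

def indicators(d: ProcessData) -> dict:
    """One possible operationalisation of four Linnan & Steckler components."""
    return {
        "reach": d.participated / d.eligible,
        "dose_delivered": d.sessions_delivered / d.sessions_planned,
        # mean attended fraction of the sessions that were actually offered
        "dose_received": d.sessions_attended / (d.participated * d.sessions_delivered),
        "fidelity": d.components_per_protocol / d.components_total,
    }

site = ProcessData(eligible=120, participated=90, sessions_planned=6,
                   sessions_delivered=5, sessions_attended=315,
                   components_per_protocol=4, components_total=5)
print({k: round(v, 2) for k, v in indicators(site).items()})
```

Indicators like these make it possible to distinguish an ineffective intervention (high fidelity and dose, no effect) from a badly delivered one (low fidelity or dose).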
Own example (1)
Since we aim to implement a complex intervention programme intervening in a complex system, more insight into nurses' comprehension of the restraint reduction approach and into the dissemination and delivery of the intervention is needed. Collection of process data will allow us to draw conclusions about potential barriers and facilitators of the intervention.
1. Recruitment
2. Reach
3. Fidelity
4. Satisfaction
5. Dose delivered
6. Dose received
Own example (2): Process evaluation
DEcision Coaching In MS: Cluster-RCT to enhance patient involvement in decision making in Multiple Sclerosis

Decision coaching
[Diagram: the decision coach (MS nurse) mediates between patient and physician, drawing on patient preferences & values and evidence-based patient information]
Decision coaching (in MS)
- Goal: To offer individually adapted (timing, content) interactive information and counselling
- Concept: Change of the information-provision structure in immunotherapy decision making
- Hypothesis: Patients will be more empowered to reflect on their preferences and values, leading to:
  - More informed patient choices
  - Increased patient involvement
  - Increased adherence
  - Higher satisfaction

Intervention components
- 3-day nurse training programme (evidence, patient information, shared decision making, communication skills, ...)
- Wiki-based online information system
- Patients facing an immunotherapy decision are referred by the physician to the MS nurse (decision coach)
- 1-3 sessions with the MS nurse to prepare for decision making with the physician
- Final decision within the physician encounter
Components (3)
DECIMS - Process Evaluation
Group work
- Strenuous
- Difficult to decide on adequate methods
- Data hard to interpret
- Commensurability of mixed methods

Conclusions
- Complex interventions require adequate methods to understand change processes
- Mixed-method approaches focusing on qualitative methods
- Aim to identify mechanisms, facilitators & barriers of efficacy
- Analyse reasons for inefficacy
- Find strategies for implementation

References (1)
Bleijlevens M, Hendriks M, van Haastregt J, van Rossum E, Kempen G, Diederiks J, Crebolder H, van Eijk J (2008). Process factors explaining the ineffectiveness of a multidisciplinary fall prevention programme: a process evaluation. BMC Public Health 8: 332
Driessen M, Proper K, Anema J, Bongers P, van der Beek A (2010). Process evaluation of a participatory ergonomics programme to prevent low back pain and neck pain among workers. Implement Sci 5: 65
Gillespie L, Robertson M, Gillespie W, Lamb S, Gates S, Cumming R, Rowe B (2009). Interventions for preventing falls in older people living in the community. Cochrane Database Syst Rev (2): CD007146
Grant A, Treweek S, Dreischulte T, Foy R, Guthrie B (2013). Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 14: 15
Hawe P, Shiell A, Riley T (2004). Complex interventions: how "out of control" can a randomised controlled trial be? BMJ 328: 1561-1563
Haut A, Köpke S, Gerlach A, Mühlhauser I, Haastert B, Meyer G (2009). Evaluation of an evidence-based guidance on the reduction of physical restraints in nursing homes: a cluster-randomised controlled trial (ISRCTN34974819). BMC Geriatrics 9: 42
Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, Altman DG, Barbour V, Macdonald H, Johnston M, Lamb SE, Dixon-Woods M, McCulloch P, Wyatt JC, Chan AW, Michie S (2014). Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 348: g1687
Hendriks M, Bleijlevens M, van Haastregt J, Crebolder H, Diederiks J, Evers S, Mulder W, Kempen G, van Rossum E, Ruijgrok J, Stalenhoef P, van Eijk J (2008). Lack of effectiveness of a multidisciplinary fall-prevention program in elderly people at risk: a randomized, controlled trial. JAGS 56: 1390-1397
Köpke S, Mühlhauser I, Gerlach A, Haut A, Haastert B, Möhler R, Meyer G (2012). Effect of a guideline-based multicomponent intervention on use of physical restraints in nursing homes: a randomized controlled trial. JAMA 307: 2177-2184
References (2)
Linnan L, Steckler A (2002). Process evaluation for public health interventions and research: an overview. In: Process Evaluation for Public Health Interventions and Research. Jossey-Bass: 1-23
Medical Research Council (2008). Developing and evaluating complex interventions: new guidance. http://www.mrc.ac.uk/consumption/idcplg?idcservice=get_file&did=15585&ddocname=mrc004871&allowinterrupt=1
Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, Moore L, O'Cathain A, Tinati T, Wight D, Baird J (2015). Process evaluation of complex interventions. In: Richards D, Rahm Hallberg I (eds). Complex Interventions in Health. Routledge: London & New York, 222-231
Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, Moore L, O'Cathain A, Tinati T, Wight D, Baird J (2015). Process evaluation of complex interventions: Medical Research Council guidance. BMJ 350: h1258
Möhler R, Köpke S, Meyer G (2015). Criteria for Reporting the Development and Evaluation of Complex Interventions in healthcare: revised guideline (CReDECI 2). Trials 16: 204
Munro A, Bloor M (2010). Process evaluation: the new miracle ingredient in public health research. Qualitative Research 10: 699-713
Oakley A, Strange V, Bonell C, Allen E, Stephenson J; RIPPLE Study Team (2006). Process evaluation in randomised controlled trials of complex interventions. BMJ 332: 413-416
O'Cathain A, Murphy E, Nicholl J (2007). Why, and how, mixed methods research is undertaken in health services research in England: a mixed methods study. BMC Health Serv Res 7: 85
Wells M, Williams B, Treweek S, Coyle J, Taylor J (2012). Intervention description is not enough: evidence from an in-depth multiple case study on the untold role and impact of context in randomised controlled trials of seven complex interventions. Trials 13: 95