Report on a QI Project Eligible for Part IV MOC

University of Michigan Health System Part IV Maintenance of Certification Program [Form 12/1/14]


Instructions

Determine eligibility. Before starting to complete this report, go to the UMHS MOC website [ocpd.med.umich.edu], click on Part IV Credit Designation, and review sections 1 and 2. Complete and submit a QI Project Preliminary Worksheet for Part IV Eligibility. Staff from the UMHS Part IV MOC Program will review the worksheet with you to explain any adjustments needed to be eligible. (The approved worksheet provides an outline for completing this report.)

Completing the report. The report documents completion of each phase of the QI project. Final confirmation of Part IV MOC for a project occurs when the full report is submitted and approved. An option for preliminary review (recommended) is to complete a description of activities through the intervention phase and submit the partially completed report (complete at least items 1-16 and 27a-b). Staff from the UMHS Part IV MOC Program will provide a preliminary review, checking that the information is sufficiently clear but not overly detailed. This simplifies completion and review of descriptions of the remaining activities.

Questions are in bold font and answers should be in regular font (generally immediately below the questions). To check boxes electronically, either put an X in front of a box or copy and paste over the blank box.

For further information and to submit completed applications, contact either:
Grant Greenberg, MD, UMHS Part IV Program Lead, 763-936-1671, ggreenbe@med.umich.edu
R. Van Harrison, PhD, UMHS Part IV Program Co-Lead, 763-1425, rvh@umich.edu
Chrystie Pihalja, UMHS Part IV Program Administrator, 763-936-1671, cpihalja@umich.edu

Report Outline

Section | Items
A. Introduction | 1-6. Current date, title, time frame, project leader, specialties/subspecialties involved, funding
B. Plan | 7-10. General goal, patient population, IOM quality dimensions addressed, experimental design
 | 11-12. Baseline measures of performance, specific performance objectives
 | 13. Data review and identifying underlying (root) causes
C. Do | 14-16. Intervention(s), who is involved, initiated when
D. Check | 17-18. Post-intervention performance measurement, data collection, performance level
E. Adjust/Replan | 19. Review, continuing/new underlying causes
F. Redo | 20. Second intervention
G. Recheck | 21-22. Post-adjustment performance measurement, data collection, performance level
H. Readjust plan | 23. Review, continuing/new underlying causes to address
I. Future plans | 24-26. Subsequent PDCA cycles, standardize processes, spread to other areas
J. Physician involvement | 27-31. Physician's role, requirements, reports, reflections, participation, number
K. Project organization | 32-34. Part of larger initiative, organizational structure, resources, oversight, Part IV opportunity

A. Introduction

QI Project Report for Part IV MOC Eligibility

1. Date (this version of the report): November 26, 2014

2. Title of QI project: Improving Pharyngitis Management by Incorporating Guideline Recommendations into an EMR SmartSet

3. Time frame
a. At what stage is the project? x Completed
b. Time period
(1) Date physicians began participating (may be in design phase): March 1, 2014
(2) End date: October 21, 2014

4. QI project leader [responsible for attesting to the participation of physicians in the project]:
a. Name: Heather Burrows and Kelly Orringer
b. Title: Director of Education and Interim Division Director
c. Institutional/organizational unit/affiliation: UMHS Division of General Pediatrics
d. Phone number: 647 3552
e. Email address: armadill@umich.edu, korringe@umich.edu
f. Mailing address:

5. What specialties and/or subspecialties are involved in this project? Pediatrics

6. Will the funding and resources for the project come only from internal UMHS sources? Yes, only internal UMHS sources.

The Multi-Specialty Part IV MOC Program requires that projects engage in change efforts over time, including at least three cycles of data collection with feedback to physicians and review of project results. Some projects may have only three cycles while others, particularly those involving rapid-cycle improvement, may have several more. The items below are intended to provide some flexibility in describing project methods. If the items do not allow you to reasonably describe the methods of your specific project, please contact the UMHS Part IV MOC Program office.

B. Plan

7. General goal
a. Problem/need. What is the gap in quality that resulted in the development of this project? Why is this project being undertaken?
Pharyngitis is a common presenting symptom in pediatrics. Only approximately 10% of patients with pharyngitis have a bacterial etiology (Group A Streptococcus). It is important to accurately diagnose and treat these patients to prevent short- and long-term sequelae. Recommendations regarding antibiotic dosing for Group A strep have recently changed to permit once-daily dosing. Overuse of antibiotics, and unnecessary use of broader-spectrum antibiotics, increases the incidence of antibiotic resistance in the population as well as the potential for antibiotic side effects in the individual patient. A HEDIS measure evaluating the quality of care of patients with pharyngitis has been developed. This measure evaluates the number of patients diagnosed with pharyngitis, strep throat, or tonsillitis who have had a test for Group A strep prior to antibiotic prescribing. As with other HEDIS measures, it can be used by insurance companies and consumers to assess the quality of care provided by University of Michigan physicians, which can impact reimbursement.

Although the UMHS clinical care guideline was updated in 2014, many faculty were not aware of the updates and were not using the updated medication schedules. As part of the implementation of the Epic EMR, our division developed SmartSets for well-child care to allow efficient documentation (diagnosis, level of service, orders) for these types of visits. Although many faculty have incorporated these into daily work patterns, not all faculty have become facile in the use of the EMR SmartSets. We developed a SmartSet for pharyngitis based on the institutional clinical care guideline to provide decision support in the diagnosis and treatment of pharyngitis as well as to improve efficiency. Ideally this should help to educate faculty about the utility of SmartSet use in general and will spur interest in developing other SmartSets.

b. Project goal. What outcome regarding the problem should result from this project?
Improve the accuracy of diagnosis and treatment of patients presenting with pharyngitis.

8. Patient population. What patient population does this project address?
Pediatric primary care clinic patients seen in the office with a presenting symptom of pharyngitis and no other diagnosis requiring antibiotic treatment.

9. Which Institute of Medicine Quality Dimensions are addressed? [Check all that apply.]
x Safety   Equity   x Timeliness   x Effectiveness   x Efficiency   Patient Centeredness

10. What is the experimental design for the project?
x Pre-post comparisons (baseline period plus two or more follow-up measurement periods)
Pre-post comparisons with control group
Other:

11. Baseline measures of performance:
a. What measures of quality are used? If rate or %, what are the denominator and numerator?
Clinical care measures

HEDIS compliance. Testing: For patients with a final visit diagnosis of strep throat/scarlet fever, % with a rapid strep test.
Numerator: patients with a rapid strep test
Denominator: patients with a final visit diagnosis of strep throat/scarlet fever

UMHS guideline compliance. Prescribing:
Positive rapid strep test: % with a positive rapid strep test who were treated with recommended antibiotics (amoxicillin liquid, PCN tablet, or cephalexin).
Numerator: patients treated with recommended antibiotics
Denominator: patients with a positive rapid strep test
Negative rapid strep test: % with a negative rapid strep test and no antibiotic prescription.
Numerator: patients without an antibiotic prescription
Denominator: patients with a negative rapid strep test
Followed guideline: % whose treatment followed the UM clinical care guideline.
Numerator: patients whose treatment followed the UM clinical care guideline*
Denominator: patients seen with a presenting symptom of pharyngitis
*Those who followed the pharyngitis guideline are those with: a negative strep test and no antibiotic, PLUS a positive rapid strep test and an appropriate antibiotic prescribed, PLUS patients who did not meet criteria for rapid strep testing because they had a clear viral etiology (e.g., hand-foot-mouth).

Process measure

SmartSet use: For patients with a presenting symptom of pharyngitis, % with the SmartSet used for management and documentation.
Numerator: patients seen for whom the physician utilized the SmartSet
Denominator: patients seen with a presenting symptom of pharyngitis

b. Are the measures nationally endorsed? If not, why were they chosen?
The HEDIS measure is a national metric for quality. The additional measures based on the UMHS clinical care guideline are based on local expertise and evaluation of the relevant literature. We used both because we felt the HEDIS measure did not completely address all components of quality care for patients with pharyngitis.

c. What is the source of data for the measure (e.g., medical records, billings, patient surveys)?
Medical records

d. What methods were used to collect the data (e.g., abstraction, data analyst)?
Each physician will abstract their own data from their clinic patients seen with any complaint of sore throat, pharyngitis, or strep.

e. How reliable are the data being collected for the purpose of this project?
Very reliable, as faculty have clearly defined worksheets for entering data. Data are reported to the group as de-identified, combined information, so there is no repercussion for individual faculty to misrepresent performance if it differs from the goal.

f. How are data to be analyzed over time, e.g., simple comparison of means, statistical test(s)?
Simple comparison of performance rates.

g. To whom are data reported?
The project leaders, the participating physicians, and the MA staff at the participating clinics. The final report will be shared with:
Terrill Bravender, MD, Associate Chair for Quality Improvement, Dept. of Pediatrics
Valerie Castle, MD, Chair of Pediatrics
David Spahlinger, MD, Executive Director, Faculty Group Practice

h. For what time period is the sample collected for baseline data?
Baseline: August 1, 2013 to January 31, 2014

12. Specific performance objectives
a. What is the overall performance level(s) at baseline? (E.g., for each measure: number of observations or denominator, numerator, percent. Can display in a data table, bar graph, run chart, or other method. Can show here or refer to attachment with data.)
See charts attached (baseline data from initial division meeting). In summary (measures calculated as described above):

Measure | Baseline (8/1/13 to 1/31/14)
Clinical care measures
Testing: final diagnosis of strep/scarlet fever and rapid strep test performed | 94% (130/139)
Prescribing: positive rapid test and appropriate antibiotic prescribed | 80% (104/130)
Prescribing: negative rapid test and no antibiotic prescribed | 93% (339/369; 6 with + cultures)
Presenting symptom followed UMHS guideline | 90% (518/576)
Process measure
SmartSet used | 13% (75/576)

b. Specific aim: What is the target for performance on the measure(s) and the timeframe for achieving the target?
By the conclusion of the post-adjustment period for this project (10/1/2014):
Testing: 95% of patients with a final diagnosis of strep throat/scarlet fever will have a positive rapid strep test.
Positive test and antibiotic: 95% of patients with positive rapid strep tests will be treated with recommended antibiotics (amoxicillin liquid, PCN tablet, cephalexin).
Negative test and no antibiotic: 95% of patients with negative rapid strep tests will not be treated with antibiotics.
Followed UMHS guideline: 95% of patients with a presenting symptom of pharyngitis will have treatment that followed the UM clinical care guideline (tested if indicated, and treated with appropriate antibiotics if positive).
SmartSet use: 60% of encounters with a presenting symptom of pharyngitis will use the EMR pharyngitis SmartSet to assist in making clinical decisions for diagnosis and treatment.

c. How were the performance targets determined, e.g., regional or national benchmarks?
Performance targets were determined based on national benchmarks for the HEDIS measure for pharyngitis. The national 75th percentile benchmark for this HEDIS measure in our region is 83%. Project leads felt that the University of Michigan should be performing above the 75th percentile. We used the same target for appropriate treatment of patients. Clinical leadership established goals for SmartSet use based on expectations of physicians working at UMHS.

13. Data review and identifying underlying (root) causes
a. Who will be/was involved in reviewing the baseline data, identifying underlying (root) causes of the problem(s), and considering possible interventions ("countermeasures") to address the causes? Briefly describe who is involved, how (e.g., in a meeting of clinic staff), and when.

Who: General pediatrics faculty who chose to participate; 35 started the project.
How: Met at a regular division meeting to discuss baseline data and initial root cause analysis.
When: March 18, 2014

b. What are the primary underlying/root causes for the problem(s) that the project can address? (Causes may be aspects of people, processes, information infrastructure, equipment, environment, etc. List each primary cause separately. How the intervention(s) address each primary underlying cause will be explained in #14.c.)
Knowledge:
Faculty unaware of information conveyed in the updated UMHS pharyngitis guideline
Faculty may not recognize that SmartSet use can increase the efficiency of their clinic encounter
Process:
Faculty unfamiliar with EMR and SmartSet use
Faculty uncomfortable with EMR and SmartSet use

C. Do

14. Intervention(s)
a. Describe the interventions implemented as part of the project.
EMR changes: Create an EMR SmartSet based on the revised UMHS guideline (Division Epic Champion with input from faculty involved in developing the UMHS guideline).
Education: Review the updated UMHS pharyngitis guideline at a designated division meeting (March 18, 2014). Train pediatric faculty on use of the EMR SmartSet at a designated division meeting (April 15, 2014).
Process changes: Faculty discussed as a group ways to efficiently incorporate the use of the pharyngitis SmartSet into clinical care of patients with a presenting symptom of pharyngitis.

b. How are underlying/root causes (see #13.b) addressed by the intervention(s)? (List each cause, whether it is addressed, and if so, how it is addressed.)
Knowledge:
Faculty may be unaware of information conveyed in the updated UMHS pharyngitis guideline: We offered CME for the session that reviewed the updated pharyngitis guideline (April 15, 2014).
Faculty may not realize that SmartSet use can increase the efficiency of their clinic encounter: We demonstrated that the SmartSet has all necessary components of a complete encounter and can be used quickly and efficiently in clinic. Faculty who routinely use SmartSets discussed the advantages of this tool for efficient management and documentation.
Process:
Faculty may be unfamiliar with EMR and SmartSet use: We offered a live demo of the new SmartSet for pharyngitis in a follow-up division meeting.
Faculty uncomfortable with EMR and SmartSet use: Through use of the SmartSet after the demo and education, and with reinforcement after using it for appropriate visits, faculty gained comfort with SmartSets.

15. Who is involved in carrying out the intervention(s) and what are their roles?
Project leads:
Heather Burrows: developed interactive data collection worksheet, compiled data for presentation at division meetings.

Kelly Orringer: presented updated information from the UMHS clinical care guideline, led discussion for root cause analysis of initial data.
Division Epic Champions: Sharon Kileny/David Hanauer developed the EMR SmartSet.
Participating physicians: data collection; self-reflection on personal data as compared to group data; participation in root cause analysis and planning of interventions; participation in educational sessions on the UMHS clinical care guideline and the SmartSet demonstration.
Other clinic staff: MA staff swab patients at some sites based on a protocol for sore throat.

16. The intervention will be/was initiated when? (For multiple interventions, initiation date for each.)
March 18, 2014: review of UMHS clinical care guideline
April 15, 2014: demonstration of SmartSet and group discussion of ways to incorporate it into daily workflow

D. Check

17. Post-intervention performance measurement. Is this data collection to follow the same procedures as the initial collection of data described in #11: population, measure(s), and data source(s)?
x Yes

18. Performance following the intervention
a. The collection of the sample of performance data following the intervention occurred for the 2-month period 5/1/2014 through 7/1/2014.
b. If the data collection has occurred, what is the post-intervention performance level? (E.g., for each measure: number of observations or denominator, numerator, percent. Can display in a data table, bar graph, run chart, or other method. Can show here or refer to attachment with data.)
See attached data chart (round 1). Overall, we noted improved compliance with the clinical care guideline as well as increased use of the SmartSet. In summary:

Measure | Baseline (8/1/13 to 1/31/14) | Post-intervention (5/1/14 to 7/1/14)
Testing: final diagnosis of strep/scarlet fever and rapid strep test performed | 94% (130/139) | 95% (164/172)
Prescribing: positive rapid test and appropriate antibiotic prescribed | 80% (104/130) | 81% (135/166)
Prescribing: negative rapid test and no antibiotic prescribed | 93% (339/369; 6 with + cultures) | 98% (245/250; 5 with + cultures)
Followed UMHS guideline | 90% (518/576) | 92% (410/446)
SmartSet used | 13% (75/576) | 45% (199/446)

E. Adjust/Replan

19. Review of post-intervention data and identifying continuing/new underlying causes
a. Who will be/was involved in reviewing the post-intervention data, identifying underlying (root) causes of the continuing/new problem(s), and considering possible adjustments to interventions ("countermeasures") to address the causes? Briefly describe who is involved, how (e.g., in a meeting of clinic staff), and when.
Who: The participating pediatricians and project leads.
How: All faculty had the opportunity to reflect on their own data during data collection. The group met at a regular monthly division meeting to review summary group data and identify other issues impacting pharyngitis management and SmartSet adoption/ease of use.
When: 7/15/2014

b. What are the primary underlying/root causes for the continuing/new problem(s) that the project can address? (Causes may be aspects of people, processes, information infrastructure, equipment, environment, etc. List each primary cause separately. How the intervention(s) address each primary underlying cause will be explained in #20.c.)
Process factors: Some patients were swabbed for strep by the medical assistant when it was not indicated per the MD.
EMR SmartSet factors: Chewable amoxicillin cannot be ordered within the SmartSet orders. Not all related diagnoses are in the SmartSet diagnosis list.

F. Redo

20. Second intervention
a. The second intervention will be/was initiated when? (For multiple interventions, initiation date for each.)
July 15, 2014
b. If the second intervention has occurred, what interventions were implemented?
Process: Providers worked with their clinic MA staff to educate them on which patients should have a strep test performed, and reviewed the standing protocols in each clinic.
EMR SmartSet: Requested that the development team add chewable amoxicillin to the SmartSet medication options. Discussed the diagnoses currently included in the SmartSet; the group elected not to add others because more than 80% of typical sore throat diagnoses appear in the current SmartSet diagnosis list.

c. How are continuing/new underlying/root causes (see #19.b) addressed by the intervention(s)? (List each cause, whether it is addressed, and if so, how it is addressed.)
Process factors:
Some patients swabbed for strep by the medical assistant when it was not indicated per the MD: Improving appropriate strep swabbing will improve compliance with the UMHS guideline.
EMR SmartSet factors:
Chewable amoxicillin cannot be ordered within the SmartSet orders: Adding it to the SmartSet will enable easier prescribing of recommended antibiotics.
Not all related diagnoses are in the SmartSet diagnosis list: Not addressed, by group decision.

G. Recheck

21. Post-second-intervention performance measurement. Is this data collection to follow the same procedures as the initial collection of data described in #11: population, measure(s), and data source(s)?
x Yes   No. If no, describe how this data collection differs.

22. Performance following the second intervention
a. The collection of the sample of performance data following the intervention(s): August 1, 2014 to October 1, 2014
b. If the data collection has occurred, what is the performance level?
See attached charts (final data). In summary:

Measure | Baseline (8/1/13 to 1/31/14) | Post-intervention (5/1/14 to 7/1/14) | Post-adjustment (8/1/14 to 10/1/14)
Testing: final diagnosis of strep/scarlet fever and rapid strep test performed | 94% (130/139) | 95% (164/172) | 86% (66/77)
Prescribing: positive rapid test and appropriate antibiotic prescribed | 80% (104/130) | 81% (135/166) | 88% (60/68)
Prescribing: negative rapid test and no antibiotic prescribed | 93% (339/369; 6 with + cultures) | 98% (245/250; 5 with + cultures) | 98% (263/269; 1 with + culture)
Followed UMHS guideline | 90% (518/576) | 92% (410/446) | 96% (369/383)
SmartSet used | 13% (75/576) | 45% (199/446) | 45% (174/383)

H. Readjust

23. Review of post-second-intervention data and identifying continuing/new underlying causes
a. Who will be/was involved in reviewing the data, identifying underlying (root) causes of the continuing/new problem(s), and considering additional possible adjustments to interventions
("countermeasures") to address the causes? Briefly describe who is involved, how (e.g., in a meeting of clinic staff), and when.
Who: The 31 physicians who completed the entire project.
How: Reviewed all three rounds of data collection and analysis during the monthly division meeting.
When: October 21, 2014 division meeting (attendance required)

b. What are the primary underlying/root causes for the continuing/new problem(s) that the project can address? (Causes may be aspects of people, processes, information infrastructure, equipment, environment, etc. List each primary cause separately.)
To understand the lower apparent performance on the testing measure, we examined the individual cases. Extenuating clinical factors and misclassification accounted for this change.
Extenuating clinical factors: One family of three was treated for clinically suspicious strep because of a child with rheumatic fever in the household. A family with international travel plans had a child treated before the culture result was final. (No action is needed for these unusual cases.)
Misclassification: Four children were classified as having strep although the rapid strep test was negative. These children should not have been classified as having strep. No antibiotics were prescribed for them, consistent with the negative rapid strep tests.
Other important causes underlying the results are:
SmartSet factors: The SmartSet was more likely to be used when the final diagnosis was in fact strep throat/scarlet fever and less likely with other final diagnoses. Specifically, hand-foot-mouth is a common summer illness with a presenting symptom of pharyngitis that is not included in the SmartSet diagnosis list and is not easily documented within the note associated with the SmartSet. There was also concern that liquid amoxicillin is not included on the SmartSet list for heavier (>27 kg) children; children often request liquid formulations even at higher weights.
Process factors: MA staff continue to run strep screens on patients who do not meet criteria for testing.

If no additional cycles of adjustment are to be documented for the project for Part IV credit, go to item #24. *** If a few additional cycles of adjustments, data collection, and review are to be documented as part of the project, document items #20-#23 for each subsequent cycle: copy the set of items #20-#23, paste them following the last item #23, and provide the information. When the project to be documented for Part IV credit has no additional adjustment cycles, go to item #24. If several more cycles are included in the project for Part IV credit, contact the UM Part IV MOC Program to determine how the project can be documented most practically.

I. Future Plans

24. How many subsequent PDCA cycles are to occur, but will not be documented as part of the project for which Part IV credit is designated?
No additional formal data cycles are planned, given our sufficiently high performance scores. Although the target aim was met for only two of the five measures, performance is sufficiently high on the other three measures, some reinforcing activities are planned, and our priority for formal improvement is shifting to other clinical areas. We have learned that many faculty are not using SmartSets for pharyngitis and other common pediatric conditions (e.g., otitis media). We will continue to offer faculty support and training to encourage use as we work with SmartSets for other clinical topics. Additionally, we are working with our MA staff to educate them on the appropriate indications for rapid strep testing. Finally, we are encouraging faculty to use the diagnosis of strep/scarlet fever when the rapid test is positive and the child needs antibiotics, and conversely, to use the diagnosis of pharyngitis when another etiology for sore throat is present.
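The analysis throughout the project is a simple comparison of performance rates (item 11.f). As an illustrative sketch only, not part of the project's actual tooling, the summary percentages and the item 12.b target checks can be reproduced from the counts reported above:

```python
# Illustrative sketch: recompute the reported performance rates from the
# numerator/denominator counts in this report and check the final
# (post-adjustment) round against the item 12.b targets.
# Note: cells involving positive follow-up cultures may differ by ~1 point
# from the report, which credits those few cases differently.

counts = {
    # measure: {period: (numerator, denominator)}
    "Testing: strep dx with rapid test": {
        "baseline": (130, 139), "round 1": (164, 172), "round 2": (66, 77)},
    "Positive test, recommended antibiotic": {
        "baseline": (104, 130), "round 1": (135, 166), "round 2": (60, 68)},
    "Negative test, no antibiotic": {
        "baseline": (339, 369), "round 1": (245, 250), "round 2": (263, 269)},
    "Followed UMHS guideline": {
        "baseline": (518, 576), "round 1": (410, 446), "round 2": (369, 383)},
    "SmartSet used": {
        "baseline": (75, 576), "round 1": (199, 446), "round 2": (174, 383)},
}

# Item 12.b targets: 95% for the clinical measures, 60% for SmartSet use.
targets = {m: (60 if m == "SmartSet used" else 95) for m in counts}

def rate(num, den):
    """Performance rate as a whole-number percent, as reported."""
    return round(100 * num / den)

for measure, periods in counts.items():
    trend = " -> ".join(f"{rate(*periods[p])}%" for p in ("baseline", "round 1", "round 2"))
    met = rate(*periods["round 2"]) >= targets[measure]
    print(f"{measure}: {trend} (target {targets[measure]}%, met: {met})")
```

Run against the reported counts, only the negative-test and followed-guideline measures meet their targets in the post-adjustment round, consistent with the conclusion in item 24 that the target aim was met for two of the five measures.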

25. How will the project sustain processes to maintain improvements?
To promote more consistent use of the SmartSet, further improvements have been initiated to address additional root causes identified in the post-adjustment cycle:
Add liquid amoxicillin dosing for heavier children to the SmartSet
Add hand-foot-mouth disease to the diagnosis list in the SmartSet
Continue live demonstrations of SmartSets at division meetings to encourage their use in busy clinics
Continue to work with clinic MA staff on appropriate strep testing
Use lessons learned from this SmartSet when developing future SmartSets (i.e., making them symptom based rather than final-diagnosis based)

26. Do other parts of the organization(s) face a similar problem? If so, how will the project be conducted so that improvement processes can be communicated to others for spread across applicable areas?
It is likely that other UMHS clinics seeing pediatric patients face the same decisions regarding testing and treating patients with a presenting symptom of pharyngitis. We will share the SmartSet, the dissemination process, and the educational tools found to be effective in this project with other faculty who provide primary care for children (family medicine and med/peds). We will continue to include education around these issues in discussions with residents working in continuity clinics. We included lessons learned from the use of this SmartSet in the recent development of another SmartSet based on a UMHS clinical care guideline (otitis media). Understanding of the process used in this project can help inform the development and integration of other guideline-based SmartSets into clinical workflow in Pediatrics and in other departments/divisions as well.

J. Physician Involvement
Note: To receive Part IV MOC credit, a physician must both:
a.
Be actively involved in the QI effort, including at a minimum:
Work with care team members to plan and implement interventions
Interpret performance data to assess the impact of the interventions
Make appropriate course corrections in the improvement project
b. Be active in the project for the minimum duration required by the project

27. Physician's role. What are the minimum requirements for physicians to be actively involved in this QI effort?
a. Data collection and reporting for three collection periods: baseline data 8/1/13–1/31/14, round 1 data 5/1/14–7/1/14, round 2 data 8/1/14–10/1/14
b. Interpreting baseline data and planning the intervention: attendance at the March 18, 2014 and April 15, 2014 meetings for review of the UMHS pharyngitis guideline and SmartSet demonstration, and involvement in discussions around root cause analysis
c. Implementing the intervention by incorporating the SmartSet into clinical workflow
d. Interpreting initial intervention data and planning additional changes: attendance at the July 15, 2014 meeting and involvement in discussion
e. Interpreting post-intervention data and planning for future changes/maintenance: attendance at the 10/21/14 meeting and involvement in discussion

28. How are reflections of individual physicians about the project utilized to improve the overall project?
Suggestions made by individual providers at division meetings were incorporated into the process development. This input facilitated consensus on the areas in which to prioritize process/workflow improvement efforts.

29. How does the project ensure meaningful participation by physicians who subsequently request credit for Part IV MOC participation?
All providers who participated were required to demonstrate active longitudinal participation by collecting and submitting data during each stage of the project and attending the specified meetings to discuss and review the data and process. The project leads monitored the participation of all individuals. Faculty who were unable to attend a division meeting for a legitimate reason (e.g., family emergency, medical leave) were allowed an excused absence; these faculty then reflected on the group data and replied to the project leads with their reflections and plans for further implementation. Only one excused absence was allowed for each faculty member.

30. What are the specialties and subspecialties of the physicians anticipated to participate in the project and the approximate number of physicians in each specialty/subspecialty?
Pediatrics: 31 providers completed the project and are eligible for MOC credit.

K. Project Organizational Role and Structure
31. UMHS QI/Part IV MOC oversight for this project occurs within:
x University of Michigan Health System
Overseen by what UMHS Unit/Group? General Pediatrics
Is the activity part of a larger UMHS institutional or departmental initiative? x No / Yes, the initiative is:
Veterans Administration Ann Arbor Healthcare System
Overseen by what AAVA Unit/Group?
Is the activity part of a larger AAVA institutional or departmental initiative?
No / Yes, the initiative is:
An organization affiliated with UMHS to improve clinical care
The organization is:
The type of affiliation with UMHS is:
Accountable Care Organization type (specify which):
BCBSM-funded, UMHS-led Collaborative Quality Initiative (specify which):
Other (specify):

32. What is the organizational structure of the project? [Include who is involved, their general roles, and reporting/oversight relationships.]
Project oversight: Terry Bravender, MD, Associate Chair for QI, Department of Pediatrics
Project leaders: Two pediatric providers responsible for design, enrollment, ensuring participation of enrollees, data analysis, and running the monthly meetings. The project leaders directly oversaw all of these activities.

Project participants: The other pediatric providers were responsible for enrolling, collecting their own data, attending meetings, and contributing to the iterative process over time. We have also shared our results with the MA staff at our clinics.

33. To what oversight person or group will project-level reports be submitted for review?
Primarily to Terry Bravender, MD, Associate Chair for QI, Department of Pediatrics; secondarily to Valerie Castle, MD, Chair, Department of Pediatrics, and David Spahlinger, MD, FGP Executive Director.

2014 General Pediatrics MOC project on Pharyngitis (compliance with clinical care guideline and SmartSet use)
Preliminary (baseline) data from 35 clinicians on 576 patients:

Diagnoses
  Pharyngitis                          329   57%
  Strep throat/scarlet fever           139   24%
  Mono                                  11    2%
  Other                                 97   17%

SmartSet use
  Yes                                   75   13%
  No                                   495   86%
  Not sure                               5    1%

Test results
  Not indicated                         38    8%
  Negative rapid test                  388   78%
  Positive rapid test                  132   27%

Treatment provided
  Amoxicillin suspension                85   17%
  PCN tablets                           15    3%
  Cephalexin suspension or tablets       8    2%
  PCN IM                                 1    0%
  Azithromycin suspension or tablets    11    2%
  Other antibiotic                      28    5%
  No antibiotic                        429   72%

2014 General Pediatrics MOC project on Pharyngitis (compliance with clinical care guideline and SmartSet use)
First round of data, after further teaching on the SmartSet, from 34 clinicians on 446 patients, April–June 2014:

Diagnoses
  Pharyngitis                          200   46%
  Strep throat/scarlet fever           172   37%
  Mono                                   9    2%
  Other                                 66   15%

SmartSet use
  Yes                                  199   44%
  No                                   245   55%
  Not sure                               3    1%

Test results
  Not indicated                         24    5%
  Negative rapid test                  256   56%
  Positive rapid test                  166   36%

Treatment provided
  Amoxicillin suspension               115   26%
  PCN tablets                           17    4%
  Cephalexin suspension or tablets       7    2%
  PCN IM                                 2    0%
  Azithromycin suspension or tablets    17    4%
  Other antibiotic                      17    4%
  No antibiotic                        271   59%

Did treatment follow the UM clinical care guideline?
  Yes                                  412   92.4%
  No                                    31    7.0%
  Not sure                               3    0.6%

2014 General Pediatrics MOC project on Pharyngitis (compliance with clinical care guideline and SmartSet use)
Final round of data, after discussion on integrating SmartSet use, from 33 clinicians on 383 patients, July–October 2014:

Diagnoses
  Pharyngitis                          225   59%
  Strep throat/scarlet fever            77   20%
  Mono                                   7    2%
  Other                                 74   19%

SmartSet use
  Yes                                  174   45%
  No                                   209   54%
  Not sure                               7    2%

Test results
  Not indicated                         54   14%
  Negative rapid test                  268   69%
  Positive rapid test                   68   17%

Treatment provided
  Amoxicillin suspension                52   14%
  PCN tablets                            5    1%
  Cephalexin suspension or tablets       6    2%
  PCN IM                                 1    0%
  Azithromycin suspension or tablets     3    1%
  Other antibiotic                       8    2%
  No antibiotic                        314   79%

Did treatment follow the UM clinical care guideline?
  Yes                                  372   97%
  No                                    13    2%
  Not sure                               5    1%
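As a verification aid (not part of the original report), the guideline-adherence percentages reported for the two post-intervention rounds can be reproduced from the raw counts above. This minimal sketch assumes the stated patient totals (446 and 383) as denominators:

```python
# Sketch: recompute the reported guideline-adherence rates from the raw
# counts in the two data rounds above. Denominators are the stated
# patient totals for each round, as reported in the tables.
rounds = {
    "Round 1 (Apr-Jun 2014)": (412, 446),  # (guideline followed, patients)
    "Round 2 (Jul-Oct 2014)": (372, 383),
}
for name, (followed, total) in rounds.items():
    rate = 100 * followed / total
    print(f"{name}: {followed}/{total} = {rate:.1f}% followed the guideline")
# Round 1: 412/446 = 92.4%; Round 2: 372/383 = 97.1% (reported as 97%)
```

This confirms the improvement from 92.4% to about 97% adherence between the first and final rounds.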