HAB/NQC HIV Cross-Part Care Continuum Collaborative (H4C) Frequently Asked Questions


A) General

1) What is the H4C Collaborative?
H4C is an initiative undertaken by the HRSA HIV/AIDS Bureau (HAB) and the National Quality Center (NQC) with a focus on increasing access to HIV care and viral load suppression. The initiative is based on the Institute for Healthcare Improvement's Breakthrough Series Collaborative model. Its ultimate aim is to increase the number of HIV-infected individuals with undetectable viral loads, using the HIV Care Continuum as a framework, thus improving the quality of HIV care and related health outcomes. It is a peer learning opportunity for regional HIV providers across Ryan White funding streams, built on the assumption that no single grantee or single Ryan White Part funding stream can achieve this aim alone.

2) What are the Collaborative's major goals?
The three primary goals of the H4C Collaborative are to:
- Build a state's capacity for closing gaps across the HIV Care Continuum to ultimately increase viral load suppression rates for individuals living with HIV
- Align quality management goals across all Ryan White HIV/AIDS Program Parts within a state to jointly meet legislative quality management mandates
- Implement joint quality improvement activities to advance the quality of care for people living with HIV within a region and to coordinate HIV services seamlessly across Parts

3) Who is involved?
The HIV Cross-Part Care Continuum Collaborative (H4C) will focus on states that HAB has identified as having the potential for measurable improvements: AR, MO, MS, NJ, and OH. H4C will engage grantees from all Ryan White Part funding sources in these five states and will benefit regional teams focusing on viral load suppression, while fostering a quality management infrastructure built to sustain their efforts beyond the formal HAB/NQC sponsorship of the Collaborative.

4) What are Learning Sessions?
Participating Response Teams (cross-functional groups of 5-10 local quality leaders representative of Ryan White grantees across the entire state/region, including at least one consumer) will meet with the H4C faculty every four to six months during the Collaborative to learn from each other, share experiences, receive coaching from assigned improvement coaches, and develop new plans for action and tests of change. The final meeting will transition the Collaborative to community leadership and will take stock of progress made, lessons learned, and best practices revealed during the HAB/NQC management phase, to share with other grantees.

5) What are Action Periods?
The time between Learning Sessions is called an Action Period. During Action Periods, grantees and sub-grantees participating in the State Teams work within their agencies to test and implement improvements. Grantees try out multiple changes and collect data to measure the impact of those changes. H4C participants remain in continuous contact with other agencies in their state, their state Response Team, HAB, and NQC to learn from each other's best practices.

6) Where can I ask for help?
Please contact the National Quality Center at 212-417-4730 or Michael@NationalQualityCenter.org if you have more questions about this Collaborative or other services provided by NQC. For detailed information about H4C, please visit our website at NationalQualityCenter.org/Collaboratives.

H4C Learning Session 1, Rockville, MD

B) Measures and Data Collection

1) What are the selected measures for H4C? Who selected them?
Performance measurement plays an important role throughout H4C, evaluating the impact of changes made to improve the quality and systems of HIV care. Please keep in mind that the end point of this Collaborative is improving care for people living with HIV, not measurement alone. In collaboration with quality improvement experts, and with input from the September 24, 2013 Vanguard Meeting in DC, the HIV/AIDS Bureau (HAB) and the National Quality Center (NQC) decided on the following list of required measures, consistent with HAB measure definitions:

1. Percentage of patients, regardless of age, with a diagnosis of HIV with an HIV viral load less than 200 copies/mL at last HIV viral load test during the measurement year (HAB Measure: HIV Viral Load Suppression)
2. Percentage of patients, regardless of age, with a diagnosis of HIV prescribed antiretroviral therapy for the treatment of HIV infection during the measurement year (HAB Measure: Prescription of HIV Antiretroviral Therapy)
3. Percentage of patients, regardless of age, with a diagnosis of HIV who had at least one medical visit in each 6-month period of the 24-month measurement period, with a minimum of 60 days between medical visits (HAB Measure: HIV Medical Visit Frequency)
4. Percentage of patients, regardless of age, with a diagnosis of HIV who did not have a medical visit in the last 6 months of the measurement year (HAB Measure: Gap in HIV Medical Visits)

In addition, these required measures will include an analysis for disparities, stratifying by race/ethnicity, gender, and age. State teams will be given the option to stratify their data in additional ways. For more information about which stratifications your state is analyzing, contact your Response Team. For further measure details, contact your State Response Team, NQC Coach, or NQC Faculty.
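As an illustration, the first measure above (HIV Viral Load Suppression) could be computed from patient-level records roughly as follows. This is a simplified sketch with a hypothetical record layout; the official HAB specification defines the denominator more precisely (e.g., requiring a medical visit during the year), so treat this as an outline, not the authoritative logic.

```python
from datetime import date

def viral_load_suppression(patients, year_start, year_end):
    """Sketch of the HAB Viral Load Suppression measure: percentage of
    patients whose LAST viral load test in the measurement year is below
    200 copies/mL. `patients` is a hypothetical layout: a list of dicts
    like {"vl_tests": [(test_date, copies_per_ml), ...]}."""
    numerator = denominator = 0
    for p in patients:
        tests = [t for t in p["vl_tests"] if year_start <= t[0] <= year_end]
        if not tests:
            continue  # simplification: patients with no test in the year are skipped
        denominator += 1
        last_date, last_vl = max(tests, key=lambda t: t[0])  # latest test in the year
        if last_vl < 200:
            numerator += 1
    return 100.0 * numerator / denominator if denominator else 0.0
```

A suppressed patient counts toward the numerator only on the basis of the last test of the year, so an early high result followed by a suppressed result still counts as suppressed.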

2) What are the due dates for the State Response Team to submit the performance measurement report? What is the reporting timeframe for each reporting cycle?
The following table shows the measurement periods for each report due date. Note that Gap, ARV Prescription, and Viral Suppression are one-year measures, while Medical Visit Frequency is a two-year measure. If you are a CAREWare user, please run the reports in the performance measurement module using the last day of the measurement period as your AS OF date; CAREWare will calculate one year versus two years automatically. You may find the measures to upload into your local CAREWare at this link. Contact Michael Hager with any questions (Michael@NationalQualityCenter.org).

Report Due Date   | One-Year Measurement Period         | Two-Year Measurement Period
April 1, 2014     | January 1, 2013 - December 31, 2013 | January 1, 2012 - December 31, 2013
June 2, 2014      | March 1, 2013 - February 28, 2014   | March 1, 2012 - February 28, 2014
August 1, 2014    | May 1, 2013 - April 30, 2014        | May 1, 2012 - April 30, 2014
October 1, 2014   | July 1, 2013 - June 30, 2014        | July 1, 2012 - June 30, 2014
December 1, 2014  | September 1, 2013 - August 31, 2014 | September 1, 2012 - August 31, 2014
February 2, 2015  | November 1, 2013 - October 31, 2014 | November 1, 2012 - October 31, 2014
April 1, 2015     | January 1, 2014 - December 31, 2014 | January 1, 2013 - December 31, 2014
June 1, 2015      | March 1, 2014 - February 28, 2015   | March 1, 2013 - February 28, 2015
August 3, 2015    | May 1, 2014 - April 30, 2015        | May 1, 2013 - April 30, 2015

3) Do participating agencies have to measure all measures and all disparities?
Grantees that fund or provide medical services need to submit performance data every other month. Grantees that subcontract for medical care need to collect the performance and disparity data from their subgrantees. The entire spreadsheet should be completed, including the required disparity data. See the H4C Measures Info Sheet document for details regarding each measure.
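The AS OF date arithmetic described in question 2 can be sketched in Python: given the last day of a measurement period, the one-year window reaches back 12 whole calendar months and the two-year window 24. This is a hypothetical helper for checking report windows, not part of CAREWare.

```python
from datetime import date

def measurement_periods(as_of):
    """Derive the one-year and two-year measurement periods from an AS OF
    date (the last day of the measurement period)."""
    def month_start(d, months_back):
        # First day of the calendar month `months_back` months before d's month.
        y, m = divmod((d.year * 12 + d.month - 1) - months_back, 12)
        return date(y, m + 1, 1)
    one_year = (month_start(as_of, 11), as_of)   # 12 whole months ending at as_of
    two_year = (month_start(as_of, 23), as_of)   # 24 whole months ending at as_of
    return one_year, two_year
```

For example, an AS OF date of December 31, 2013 yields January 1, 2013 and January 1, 2012 as the respective period start dates, matching the April 1, 2014 row of the table.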
4) In addition to the measure definitions, what do I need to know about these measures?
Note that the HIV Medical Visit Frequency measure uses a 24-month measurement period, while the other measures use a 12-month measurement year. Also, a lower percentage on the Gap in HIV Medical Visits measure represents better performance.

5) Are all Ryan White HIV/AIDS Program-funded grantees and agencies to submit performance measurement data bimonthly?
Yes, all grantees and agencies within the H4C states that provide adult and/or pediatric HIV medical care are to submit performance measurement data. Grantees who subcontract HIV medical care are to collect performance data from each agency (i.e., subgrantees). For example, a Part C network would collect and contribute performance measurement data from each of the agencies in that network. The State Response Team is responsible for collecting and consolidating all data from all agencies to send to NQC.

6) I am a Part C/Part D grantee (direct recipient of Ryan White Program funds), but also a Part A/B agency. What are the expectations for submitting data?

It is the ultimate goal of this Collaborative that all grantees collect and report data on the Collaborative measures. An organization that is a Part C/D grantee as well as a Part A/B agency would have its performance data counted only once for each of the appropriate measures. The State Response Team will need to develop a methodology to assure that each organization's performance data are counted only once per measure, regardless of the number of contracts (either directly from HAB or from another Ryan White grantee).

7) I am a Part A or Part B grantee; do I have to collect data from my subcontractors?
Yes. Part A or B grantees with subcontractors that provide medical care need to collect performance data from those subcontractors. The State Response Team will develop a state-specific data collection methodology to avoid duplicative counting of grantees and agencies receiving multiple Ryan White funding streams. (See the previous FAQ.)

8) I already collect performance data, but the H4C measures are different. Do I need to submit data?
Yes. The aim of this Collaborative is to harmonize performance measures and data collection methodologies across Ryan White grantees within the state. The H4C measures represent HAB's core measures, which all grantees are strongly encouraged to utilize. HAB and NQC recognize the data collection efforts of busy HIV clinics and hope that the Collaborative will lead to streamlined measure definitions, un-duplicated data collection efforts for all Parts, and flexible information systems.

9) I just submitted RSR data. Do I need to submit H4C data to my Response Team?
Yes; these data sets serve different purposes, although some data elements are similar. The Response Team should explore opportunities to reduce data entry redundancies.

10) Should each participating Part A, B, C and D grantee compute their own score based on the Collaborative measures?
Each grantee is encouraged to know their own performance scores. The State Response Team will develop a state-specific approach to data collection that allows for individual grantee and agency data analyses as well as aggregated state-wide scores.

11) How do I know that my performance data were acceptable?
If you submit your performance data to the State Response Team, check with the assigned individual(s) to clarify the accuracy of your submission. Routine feedback from the Response Team is critical to allow for corrections in the data submission process and to suggest improvements to grantees and agencies. Once the Response Team submits the bimonthly data report, the Collaborative Faculty will carefully review each data report and provide constructive feedback.

12) Can I expect negative consequences for low performance scores?
Participation in the Collaborative is not intended to be punitive. Rather, the aim of the Collaborative is to improve HIV care across programs, learn from other grantees and agencies, and improve the data collection process. Lower-than-expected performance scores provide opportunities for learning and will guide individual grantees and the Response Team in their efforts to prioritize improvement activities.

13) Does my agency need IRB approval for submitting data?

The performance data reported during the course of this Collaborative are for the purpose of quality improvement, not research. Current regulations exempt organizations from seeking IRB approval, since aggregate data findings are used for quality improvement and no individual patient data are reported. Check with your local agency if you have concerns.

14) Patients may be seen by multiple medical providers over time within the state. Are we expected to un-duplicate these patients when reporting to H4C?
No. We do not expect the Response Team to un-duplicate individual patients' data, even though patients may have received services at more than one location. It is expected, however, that the Response Team will un-duplicate the agency data: agency data will be counted only once in the bimonthly aggregated data report. In other words, improvement is targeted at the grantee/agency level, not at the patient level.

15) Is a sampling methodology approach allowable?
The goal is for all patients receiving medical care to be included. Please inform your state's coach if you have grantees that cannot submit performance measure and disparity data for all patients.

16) Should fee-for-service medical providers be included in the measurement process?
Performance data should be incorporated from all agencies, including fee-for-service providers, to calculate the performance scores. This Collaborative may provide the impetus to include these additional providers in routine reporting (since the grantee pays for these services).

17) In the reporting template, numerators and denominators are requested for each measure. Are we to sum the numerators and denominators for all subgrantees reporting to grantees?
Instructions for data collection are located in the first tab of the Data Collection Tool. The Response Team will need to un-duplicate the list of agencies to ensure that each agency/grantee is counted only once when reporting.
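The summation asked about in question 17, preceded by the un-duplication step, could be sketched as follows. The field names are hypothetical and do not come from the H4C reporting template; this is only an outline of the logic.

```python
from collections import defaultdict

def statewide_scores(submissions):
    """Un-duplicate agency reports (one per agency and measure, no matter
    how many Ryan White contracts the agency reports under), then sum
    numerators and denominators and compute a percentage per measure."""
    deduped = {}
    for s in submissions:
        # first report wins; later duplicates for the same agency/measure are dropped
        deduped.setdefault((s["agency_id"], s["measure"]), s)
    totals = defaultdict(lambda: [0, 0])  # measure -> [numerator, denominator]
    for s in deduped.values():
        totals[s["measure"]][0] += s["numerator"]
        totals[s["measure"]][1] += s["denominator"]
    return {m: round(100.0 * n / d, 1) for m, (n, d) in totals.items() if d}
```

Stratified (disparity) reports would follow the same pattern, with the stratum added to the grouping key.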
Once the list is reviewed, sum the denominator and numerator data (all individual patients) and compute the scores. Both the aggregated and disaggregated data are then shared with the Faculty.

18) When asked to report these measures stratified by various factors such as race/ethnicity, does the Response Team need to submit the actual stratified reports?
Yes. Each entity reports its stratified data only to the Response Team, and the Response Team reports these data to the H4C Faculty. The entire spreadsheet should be completed by each entity, and the Response Team will aggregate the data for reporting to the H4C Faculty.

19) I have additional questions. Whom should I ask?
Please contact your Response Team for support. The Response Team is supported by an NQC coach.

C) Reporting

1) What are the different kinds of reporting that will occur during H4C?

During H4C, multiple reports will be generated by Response Teams and by coaches, including: (1) performance measures reports, (2) care continuum data, (3) quality improvement strategies, and (4) state team progress reports.

(1) Performance Measures Reports: See the Measures and Data Collection section for detailed information on this reporting. These reports are based on the four HIV/AIDS Bureau core measures and use the data collection tool developed by the National Quality Center. They are reported every other month: agencies submit to their Response Team, and the Response Team aggregates the data to send to the H4C faculty.

(2) Care Continuum Data: With the help of state epidemiologists, each Response Team is expected to complete the state's care continuum. Although no H4C template will be provided, all states should use the CDC model for the care continuum. Care continuum reporting has no set deadlines but should occur twice a year.

(3) Quality Improvement Strategies: Each agency should submit periodic updates on innovations it is using in the field to improve performance on the local care continuum or on its state's quality improvement project. The Response Team is expected to review the items submitted by its state team members. These submissions will occur via the National Quality Center's latest project, an online intervention-sharing platform. While there are no deadlines for this reporting, agencies and state teams are expected to report when new interventions are implemented or other updates occur.

(4) State Team Progress Report: This report will be completed every other month by the coach in collaboration with the Response Team. The form is available on the H4C Participants GlassCubes site and addresses areas such as: quality management infrastructure, performance measurement, care continuum, quality management plan, quality improvement project updates, capacity building, consumer involvement, progress on the state aim statement, and technical assistance needs. These reports are meant to aid the H4C faculty and coaches in identifying needs and providing assistance.

2) What are the deadlines for these items?

Report                         | Timeline
Performance Measures Report    | Every other month (see Measures FAQ)
Care Continuum Data            | Twice a year
Quality Improvement Strategies | As projects are implemented or updated
State Team Progress Report     | Every other month, with the performance measures report
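The care continuum data in item (2) above follows the CDC model, in which each stage of care is expressed as a percentage of all diagnosed persons. A minimal sketch of that calculation, with illustrative stage names rather than an official template:

```python
def care_continuum(stage_counts):
    """Express each care continuum stage as a percentage of diagnosed
    persons. `stage_counts` maps stage name -> person count and must
    include a "Diagnosed" entry; stage names here are illustrative."""
    base = stage_counts["Diagnosed"]
    return {stage: round(100.0 * n / base, 1) for stage, n in stage_counts.items()}
```

State epidemiologists would supply the stage counts; the point of the continuum is to make the drop-off between stages (e.g., from linkage to viral suppression) visible at a glance.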