Physician Clinical Registry Coalition: Comments on CMS-5522-FC, CY 2018 Updates to the Quality Payment Program


Ms. Seema Verma, MPH
Administrator
Centers for Medicare & Medicaid Services
Department of Health and Human Services
Attention: CMS-5522-FC
P.O. Box 8016
Baltimore, MD 21244-8016

Physician Clinical Registry Coalition
[Submitted online at: https://www.regulations.gov/document?d=cms-2017-0082-1300]

Re: CMS-5522-FC; Medicare Program: CY 2018 Updates to the Quality Payment Program

Dear Ms. Verma:

The undersigned members of the Physician Clinical Registry Coalition (the Coalition) appreciate the opportunity to comment on the final rule on the calendar year (CY) 2018 updates to the Quality Payment Program (QPP) established under the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) (Pub. L. 114-10) (the Final Rule).[1] The Coalition is a group of medical societies and other physician-led organizations that sponsor clinical data registries that collect and analyze clinical outcomes data to identify best practices and improve patient care. We are committed to advocating for policies that encourage and enable the development of clinical data registries and enhance their ability to improve quality of care through the analysis and reporting of clinical outcomes. Over 75% of the members of the Coalition have been approved as qualified clinical data registries (QCDRs), and most of the other members are working toward achieving QCDR status.

The Coalition submitted comments to the Centers for Medicare & Medicaid Services (CMS) on the CY 2018 proposed rule,[2] which requested that CMS implement a variety of changes and clarifications for the 2018 performance period to encourage the use of QCDRs and other clinical outcomes registries. Specifically, we asked that CMS make changes to the QCDR measures review process, simplify the QCDR self-nomination process, increase the credit for the use of a clinical outcomes registry in the Advancing Care Information (ACI) category, encourage clinician-led QCDRs and clinical outcomes registries by awarding these registries increased credit under the improvement activities and ACI categories, create separate benchmarks for electronic and manual reporting of QCDR measures, and give QCDRs and other clinical outcomes data registries the option to assist virtual groups with reporting.

In response to our comments, CMS increased the topped-out measure scoring cap to 7 points,[3] awarded 10 percentage points for reporting to a public health agency or clinical data registry regardless of whether an immunization registry is available,[4] and clarified the ability of clinical outcomes registries to assist virtual groups.[5] CMS also finalized its proposals on simplification of the QCDR self-nomination process,[6] improvements to the self-nomination application,[7] and assignment of IDs to record and track ownership of QCDR measures.[8] The Coalition appreciates and applauds these actions.

However, the Coalition still has significant concerns about several issues related to QCDRs and other clinical outcomes data registries. Specifically, we urge CMS to implement the following changes and clarifications to further encourage the use of QCDRs and other clinical outcomes registries:

(1) Create an organized, transparent, and consistent QCDR measures review process;
(2) Keep the QCDR measure approval process separate from the standards used for the Call for Quality Measures process, and do not create more stringent standards for QCDR measures;
(3) Grant measures with high performance 7 points under the scoring cap in CY 2018; and
(4) Define the data inaccuracies or errors that could result in probation or suspension for QCDRs.

1. CMS Should Create an Organized, Transparent, and Consistent QCDR Measure Review Process

The Coalition's most significant concern remains the lack of transparency and consistency inherent in the QCDR measures review process. The Coalition has been raising these concerns with CMS for several years, including during the review of QCDR measures within the Physician Quality Reporting System (PQRS). We previously sent Pierre Yong, MD, MPH, MS, Director of the Quality Measurement and Value-Based Incentives Group, letters dated October 29, 2016 and July 11, 2017 regarding the unstructured and disorganized process that many of our QCDR members faced during the requested consolidation of proposed non-PQRS QCDR measures and the 2017 QCDR measure review process, respectively. The Coalition attended a call with Dr. Yong and his team on August 1, 2017, where we discussed the concerns raised in our July 11, 2017 letter and possible solutions. We also submitted our concerns about the QCDR measure review process as comments on the CY 2018 QPP proposed rule, which specifically detailed the experiences of Coalition members during the 2017 process and requested a more organized, transparent, and consistent QCDR measures review process for 2018.

Despite these efforts, Coalition members experienced many of the same difficulties during the QCDR measure review process for the 2018 performance period. In the Final Rule, CMS stated the following:

"We understand the commenters' concerns, but would like to note we have been working to implement process improvements and develop additional standardization for the 2018 performance period self-nomination and QCDR measure review, in which consistent feedback is communicated to vendors, additional time is given to vendors to respond to requests for information, and more detailed rationales are provided for rejected QCDR measures. Furthermore, through our review, we intend to communicate the timeframe in which a decision reexamination can be requested should we reject QCDR measures."[9]

CMS discussed its intention to assign specific personnel to communicate QCDR decisions and its use of an internal decision tracker to track all decisions made on QCDRs and QCDR measures.[10] CMS also stated that it is working on a standardized review process and timelines.[11] However, Coalition members did not see any evidence of such efforts to improve the QCDR measures review process during the 2018 performance period. Accordingly, the Coalition urges CMS to make broad changes to its QCDR measure review process to incentivize registries to develop new QCDR measures and continue to self-nominate as QCDRs.

Despite the Coalition's explicit descriptions, through numerous avenues, of the opaque, disorganized, and contradictory process its members experienced during the 2017 QCDR measure review process, CMS did not include any specific proposals to fix the process in the Final Rule. While the Coalition acknowledges that CMS has become more responsive to concerns and more willing to discuss QCDR measures, Coalition members experienced many of the same frustrations with CMS during the 2018 QCDR measure review period as they did the previous year. Specifically, as the following examples demonstrate, Coalition members were subject to impractical timelines, rejection of measures without clinical rationale, inconsistent feedback and unclear rejection methodology, and a disjointed measure-by-measure review process.

Impractical Timelines. During the 2018 measures review process, CMS and its contractors frequently set unreasonable deadlines for Coalition members to make changes to measures or replace certain measures. For example, CMS emailed a Coalition member on a Friday at 5:01 pm asking for a modified measure specification by the end of the day on the following Monday. Another Coalition member reported that it received an email from CMS at 9:30 am that requested approval of edits to a measure by 12:00 pm that same day.

CMS also set unrealistic expectations regarding Coalition members' availability to attend calls to discuss measures. For example, after a Coalition member requested a call, CMS emailed the member the next day at 10:45 am requesting a call at 11:00 am. CMS also asked several Coalition members to combine clinically different measures and to merge measures to create a multi-strata measure within a week or less. Registries need time to confer with clinical experts to develop responses to CMS requests or to combine measures. The deadlines CMS and its contractors set during the 2018 QCDR measures review process were extremely challenging, and at times impossible, to meet. The Coalition requests at least a full working week to modify measures, and significantly more than a week to combine or merge measures. A CMS directive to consolidate measures typically requires several months of testing and analysis.

Rejection of Measures without Regard for Clinical Rationale. Multiple Coalition members report receiving standardized explanations for rejections of measures without any consideration of the clinical area. Specifically, CMS rejected measures because they were already standards of care or were not sufficiently robust. However, in most of these cases the registries provided data demonstrating a gap in care that clinician experts in the field had validated. CMS did not appear to understand the clinical justifications behind many of these measures. In addition, CMS did not ask for a detailed clinical rationale on the self-nomination application; yet, in order to appeal rejections, Coalition members had to put together detailed clinical justifications. One Coalition member reported that, after receiving a rejection, the registry spent hours explaining the clinical rationale in writing and held multiple emergency calls with clinical experts. To avoid these kinds of issues in the future, the Coalition requests that the self-nomination application ask for the clinical rationale for a measure in addition to the gap in care. The Coalition also requests that CMS provide a detailed, customized explanation for its rejection of each individual measure.

Inconsistent Feedback and Unclear Rejection Methodology or Consolidation Rationale. CMS does not appear to have a consistent process for reviewing the materials provided to the agency to support the approval of QCDR measures, which leads to inaccurate feedback and decisions. For instance, one Coalition member stated that CMS provided feedback on two measures, and the member attempted to set up a call to discuss all of the measures under review. Before the call was scheduled, however, CMS sent another notification that those same measures would be denied because they did not have a follow-up or plan of care, even though both measures had a baseline plan-of-care component and follow-up. CMS clearly did not analyze these measures carefully or confused them with other measures. Additionally, one Coalition member stated that a CMS contractor provisionally approved two measures, and then a week later a different CMS contractor rejected those same two measures. CMS contractors were also unprepared when discussing measures with Coalition members, often asking basic questions that had already been answered by the measure owner in the initial application or in a response to the initial feedback on a measure.

Coalition members report that the final decisions on appealed measures were based on the opinions of one or two high-level CMS officials who were not involved in the earlier review of those measures. Therefore, final decisions on measures often did not follow the previous feedback and appeared to be independent of the rest of the review process.

CMS also does not fully explain its methodology for rejecting measures. Coalition members reported that some of their measures were rejected based on infrequent use. If the only standard CMS uses to determine whether a measure is valid is the number of times it has been reported, specialty QCDRs, which created measures specifically for their members before the low-volume threshold excluded a large number of Medicare providers from the MIPS program, could be unfairly penalized. Measures should not be rejected based on a mathematical formula that equates the number of times a measure has been reported with its viability for the MIPS program. When rejecting new measures without an explicit reason, CMS sometimes stated that the registries could still use the measures for internal quality improvement purposes. That statement suggests CMS does not understand physicians' motivation for reporting measures: physicians are far less likely to report measures voluntarily if the measures are not approved for QCDR reporting purposes. In addition, QCDRs should not be required to combine measures without a clear rationale from CMS, especially when the QCDR has evidence that the combined measure would be of lesser quality than the individual measures and would run counter to the purposes of quality improvement. We also request the development of a more consistent and standardized process for contesting proposed combinations and demonstrating that merging or consolidating measures is not appropriate.

Disjointed Communication/Review. Several Coalition members experienced a completely disjointed measures review process. Review of a single measure was spread out over multiple CMS contractors, responses to submissions and feedback were intermittent and sporadic, and discussions with CMS were inefficient because CMS refused to discuss more than one measure at a time. For example, twelve hours after a call to discuss a measure, CMS gave feedback to a Coalition member about a different measure, and the member had to schedule another separate call to discuss this new measure. One Coalition member reported that, despite efforts to discuss multiple measures on a scheduled call, CMS would only discuss a single measure. As a result of this slow and inefficient process, some registries were not able to discuss feedback on all of their measures by the end of the review period.

Scheduling calls through the JIRA review site was also disorganized. CMS contractors frequently corresponded through JIRA to set up calls, and it was difficult to align schedules through the system. JIRA also had inconsistent email delivery, so registries had to check the system regularly to ensure they did not miss a message.

The time scheduled for calls about an individual measure was also typically not long enough to have a thorough discussion about the measure. CMS contractors scheduled calls in thirty-minute windows and were often late getting on the calls. In addition, having multiple contractors working on a single QCDR's measure review causes inaccuracies. One Coalition member stated it worked with 9 separate contractors during the review process, and each contractor had a different way of analyzing measures and communicating feedback. Commenting in JIRA is also difficult to track when a QCDR submits a large number of measures. For example, when a Coalition member submitted 12 measures, every measure received a comment from a contractor, which forced the QCDR to review and reply to each response individually.

While we understand that CMS is reviewing more than a thousand measures over a short timeframe, the piecemeal review of measures creates impractical timelines and additional work for both CMS and measure owners. To reduce its burden, CMS could give new measures conditional approval for a two-year test period. A two-year test period would help both CMS and measure owners, because CMS would have fewer measures to review each year and measure owners would have time to collect sufficient data on each measure to make a stronger case for its approval.

The Coalition requests that CMS streamline the measures review process. Specifically, the Coalition would like CMS to provide feedback on all of the measures submitted by a registry within a single comment and to schedule calls to discuss the feedback on all of these measures at once. The Coalition urges CMS to develop an online appointment system to schedule calls, to schedule each call for at least 1.5 hours, and to provide links to the previous history of the measures to ensure all parties have the necessary information during the call. The Coalition also requests that CMS reconsider our earlier proposals to maintain measures for a minimum of two years as long as the measures do not undergo substantive changes. This policy would reduce the number of measures that CMS has to review every year.

2. CMS Should Not Align QCDR Measure Approval with the Call for Quality Measures Process or Create More Stringent Standards for QCDR Measures

In the Final Rule, CMS stated that it is interested in elevating the standards by which QCDR measures are selected and approved for use.[12] Specifically, CMS requested comments on whether the standards used in selecting and approving QCDR measures should align with the standards used in the Call for Quality Measures process.[13] While the Coalition agrees that QCDR measures should aim to be of the highest caliber, we strongly disagree with aligning these standards. Under the Call for Quality Measures process, QCDR measures would need to be used by the QCDR for one to two years before they can be listed on the self-nomination application.

For most medical societies, the appeal of maintaining a QCDR and developing QCDR measures lies in the ability to create specialty-specific measures in a timely manner and to allow clinicians to report meaningful outcomes. Increasing the time required for QCDR measures to be approved removes the incentives for self-nominating as a QCDR.

Second, CMS stated it is under pressure to develop standards for QCDR measures that are more stringent than the standards for MIPS measures. In addition, in the Final Rule, CMS created higher standards for reviewing and determining topped-out QCDR measures than for MIPS measures. As there is no statutory mandate that QCDR measures be held to a higher standard than MIPS measures, these policies only serve to discourage the development of QCDRs. Coalition members reported increasing difficulty obtaining approval of QCDR measures, including measures that were approved in the prior year. When CMS requests changes to a single measure each year, it interferes with the establishment of reliable benchmarks and creates confusion for providers. If QCDR measure standards continue to become more stringent, Coalition members may no longer self-nominate as QCDRs.

The Coalition also opposes any requirement for QCDRs to fully test (i.e., conduct reliability and validity testing of) QCDR measures by the time new measures are submitted during the self-nomination process. QCDRs do pilot test their quality measures prior to submission to CMS for approval, but typically a measure cannot be fully tested until clinicians have collected more extensive data. This often can happen only after the measure has been approved for use in the MIPS process. Requiring extensive testing prior to CMS approval of a measure would only delay the measure development process without significantly improving the quality or validity of the measure. Indeed, in some cases, requiring more advance testing would deter good measures from being developed at all.

3. CMS Should Grant Measures with High Performance 7 Points in the 2018 Performance Period

In the Final Rule, CMS set a scoring cap of 7 points for topped-out measures.[14] However, Coalition members reported that, during a recent call after the Final Rule was issued, Dr. Dan Green indicated that CMS would give measures with high performance only 3 points for the 2018 performance period. Dr. Green's justification for this scoring methodology was that these measures might not be able to be benchmarked in a meaningful way that shows variation. However, a 3-point cap is an abrupt adjustment that does not acknowledge the need for QCDRs to develop a broader set of data for individual measures. Measures initially identified as topped out may later show room for improvement if additional clinicians report the measure. Because a scoring cap of 7 points reflects better-than-average performance on measure achievement, the Coalition strongly requests that CMS award measures with high performance 7 points in the 2018 performance period.

4. CMS Should Define Its Policy for Placing QCDRs on Probation or Suspension for Data Inaccuracies and Errors

In the Final Rule, CMS discussed its process for imposing probation on or disqualifying a third-party intermediary.[15] While the Final Rule does not make any changes or clarifications to that process, CMS stated that it received a number of comments on the topic and that it appreciates the input received.[16] The Coalition requests that CMS define which data inaccuracies or errors could result in a QCDR being placed on probation or suspension. While CMS provides each QCDR with a Data Issue Report on an annual basis, it is unclear which issues are significant enough to result in penalties. It is also unclear whether CMS would penalize a QCDR for inaccuracies or errors that fall outside the scope of the Data Issue Report. Because CMS has issued very little guidance on this topic, QCDRs do not know what checks to put in place to identify and track potential data inaccuracies or errors.

Conclusion

The Coalition appreciates the opportunity to comment on the Final Rule. We strongly support the expansion of the use of QCDRs and other clinical outcomes data registries to help ease clinicians' burdens in submitting data under MIPS. While the Coalition greatly appreciates many of the improvements in the Final Rule, the changes we have proposed would remove burdens from the QCDR measure review process and create further incentives to use third-party submission mechanisms. We urge CMS to adopt the Coalition's suggested changes to facilitate and promote the use of QCDRs and other clinical outcomes data registries. These changes will allow the use of registries to grow and ultimately result in even greater improvements in the quality of patient care.

Thank you for the opportunity to submit these comments. If you have any questions, please contact Rob Portman at Powers Pyles Sutter & Verville PC (rob.portman@powerslaw.com or 202-872-6756).

Respectfully submitted,

AMERICAN ACADEMY OF DERMATOLOGY ASSOCIATION
AMERICAN ACADEMY OF NEUROLOGY
AMERICAN ACADEMY OF OTOLARYNGOLOGY-HEAD AND NECK SURGERY
AMERICAN ACADEMY OF PHYSICAL MEDICINE AND REHABILITATION
AMERICAN ASSOCIATION OF NEUROLOGICAL SURGEONS/NEUROPOINT ALLIANCE
AMERICAN COLLEGE OF EMERGENCY PHYSICIANS
AMERICAN COLLEGE OF GASTROENTEROLOGY/GIQUIC
AMERICAN COLLEGE OF SURGEONS
AMERICAN GASTROENTEROLOGICAL ASSOCIATION
AMERICAN JOINT REPLACEMENT REGISTRY
AMERICAN SOCIETY FOR GASTROINTESTINAL ENDOSCOPY/GIQUIC
AMERICAN SOCIETY FOR RADIATION ONCOLOGY
AMERICAN SOCIETY OF ANESTHESIOLOGISTS/ANESTHESIA QUALITY INSTITUTE
AMERICAN SOCIETY OF CLINICAL ONCOLOGY
AMERICAN SOCIETY OF NUCLEAR CARDIOLOGY
AMERICAN SOCIETY OF PLASTIC SURGEONS
AMERICAN UROLOGICAL ASSOCIATION
COLLEGE OF AMERICAN PATHOLOGISTS
SOCIETY OF INTERVENTIONAL RADIOLOGY
SOCIETY OF NEUROINTERVENTIONAL SURGERY
THE SOCIETY OF THORACIC SURGEONS

Footnotes

[1] 82 Fed. Reg. 53,568 (Nov. 16, 2017) [hereinafter Final Rule].
[2] 82 Fed. Reg. 30,010 (June 30, 2017).
[3] Final Rule at 53,726.
[4] Id. at 53,663.
[5] Id. at 53,610-11.
[6] Id. at 53,811.
[7] Id. at 53,812.
[8] Id. at 53,813-14.
[9] Id. at 53,810.
[10] Id.
[11] Id.
[12] Id. at 53,814.
[13] Id.
[14] Id. at 53,726.
[15] Id. at 53,819.
[16] Id.