Benchmarking Laboratory Quality

Paul Valenstein, MD,1 Frank Schneider, MD2
(1Department of Pathology, St. Joseph Mercy Hospital, Ann Arbor, MI; 2Department of Pathology, Duke University Medical Center, Durham, NC)
Lab Medicine. February 2008;39(2):108-112
DOI: 10.1309/MPNFHFWBRTYUYBEP

Abstract
Proficiency testing is the oldest form of laboratory benchmarking. Today, laboratory managers can compare local laboratory quality to industry averages in many domains. This review discusses practical aspects of benchmarking laboratory quality: when benchmarking is a good idea, common misunderstandings about what benchmarking can tell us, and examples of quality benchmarks that are widely used in the clinical laboratory industry.

When we benchmark a product or service, we compare its performance to an external standard. When we benchmark clinical laboratory quality, we compare local laboratory quality to an external quality standard drawn from studying many laboratories or a few best performers. Clinical laboratories began benchmarking quality more than 6 decades ago, when a group of a dozen laboratories in Philadelphia decided to compare their results for hemoglobin testing.1 Results were so widely discrepant that the Committee on Laboratories of the Pennsylvania Medical Society initiated a program to compare the accuracy of a number of chemistry tests throughout the state. This initiative gave birth to proficiency testing as we know it today.

During the ensuing decades, interest in measuring the quality of laboratory services extended beyond the analytic domain to involve preanalytic and postanalytic processes, such as the accuracy of patient identification or the faithfulness with which critical results are called to caregivers.2-4 Congress passed the Clinical Laboratory Improvement Amendments of 1988 (CLIA '88) in response to public concerns about erroneous cytopathology reports resulting in patient deaths.
Instead of focusing on cytopathology, legislators and regulators scrutinized the entire laboratory medicine sector and developed standards for general laboratory performance. A decade later, the National Academy of Sciences Institute of Medicine claimed that as many as 1 million patients per year suffer injuries due to medical errors.5 The Joint Commission (JC) shortly thereafter issued its Patient Safety and Quality Improvement Goals, a set of declarations designed to address common problems in clinical medicine.

Today, laboratory managers can compare local laboratory performance to industry benchmarks in any of a number of quality and safety domains, spanning preanalytic, analytic, and postanalytic processes in all sections of the clinical laboratory. This review discusses practical aspects of benchmarking laboratory quality: when benchmarking quality is a good idea, common misunderstandings about what benchmarking can tell us, and examples of quality benchmarks that are widely used in clinical laboratories. We will draw heavily on the College of American Pathologists (CAP) Q-Probes6 and Q-Tracks4 programs, because these services have generated the most widely used benchmark data about laboratory quality. We will not discuss benchmarking of individuals who work in the laboratory or benchmarking of laboratory finances, but interested readers can turn to several recent reviews.7-10

When Is Quality Benchmarking Useful?
There are 4 main reasons to benchmark quality in the laboratory setting.

Quality Improvement
Some methods for improving quality do not require external benchmarking; Shewhart's plan-do-check-act (PDCA) cycle is one such method. Yet laboratories that integrate benchmarking into their quality management programs enjoy several distinct benefits. First, benchmarking allows management to identify problem areas that require increased attention.
Leadership at a typical laboratory struggles to decide which aspects of service are most in need of attention, since every part of an operation can benefit from increased focus and resources. Benchmarking helps managers identify areas where local performance lags behind industry averages.

Second, many multi-institutional benchmarking studies identify practice variables that are associated with better performance. Managers who participate in benchmarking programs can learn about and adopt these practices to improve performance locally.

Third, benchmarking results can motivate laboratory staff to take quality improvement efforts seriously. It is often easier to rally support for local quality improvement initiatives when benchmarking suggests that a large number of other facilities already demonstrate superior performance.

Finally, when benchmarking of a single quality metric is performed longitudinally (over time), managers can use benchmarking data to determine whether the pace of quality improvement in the local laboratory is typical of other facilities that have committed themselves to improving performance in the same area.

Satisfying Accreditation Requirements
All United States laboratory accrediting organizations require that accredited laboratories maintain quality management programs. CLIA serves as the underlying legal framework and federal mandate for these requirements.11 The CAP accreditation program has the most developed set of requirements, including a requirement that accredited laboratories monitor key indicators of quality and compare performance against a benchmark.12 Joint Commission requirements are more abstract but also require accredited organizations to collect data to monitor their performance.13 The International Organization for Standardization (ISO) addresses the quality management plan in the management requirements section of its ISO 15189 standard for medical laboratories.14

While it is possible to satisfy the requirements of CAP, JC, and ISO by using internal benchmarks (such as past performance) or by setting standards through customer feedback, many laboratories satisfy regulatory requirements by benchmarking some aspects of their service against external norms. This can be accomplished by subscribing to benchmarking services or by engaging consultants to provide comparative industry data.

Satisfying Payer Requirements
The Centers for Medicare and Medicaid Services (CMS) is the largest single payer of pathology services and has the authority to deny payment to laboratories that do not satisfy CLIA requirements.15 Payers other than CMS (particularly health maintenance organizations) periodically establish performance expectations for clinical laboratories that are permitted to serve plan beneficiaries. Laboratories can use data from benchmarking services to help demonstrate compliance with managed care plan quality requirements, and also to help managed care organizations set realistic goals. One of us (PV) regularly negotiates contracts with managed care organizations on behalf of a consortium of 60 hospital-based laboratories and has used benchmarking data to help plans set achievable targets as well as to motivate individual laboratories to improve performance.

Positioning the Laboratory in a Competitive Marketplace
In all industries, delivery of a high-value product helps to distinguish one provider from others who compete in the same market space. The value of laboratory services is generally measured in terms of cost and quality, and benchmarking can help laboratories make the case that their quality is comparable or superior to that of competitors. Benchmarking data can be used in promotional materials or newsletters to educate laboratory customers about the quality of a local laboratory in comparison to industry averages.
Benchmarking data can also be used to educate customers who have unrealistic expectations about the levels of quality that can be achieved, particularly in contentious areas such as turnaround time. Finally, laboratory involvement in a benchmarking service can signal to customers in a general way that management is engaged in continuous quality improvement and is not afraid of comparing local performance with others in the industry.

Important Considerations and Common Misunderstandings
Despite the many uses of benchmarking laboratory quality, we have found that confusion about the benchmarking process is widespread. In this section, we address some of the more common misunderstandings.

Quality Is Not One-Dimensional: There Is No Composite Quality Score
Perhaps the most common misunderstanding about the benchmarking process is the belief that benchmarking can be used to identify laboratories of overall high and low quality. Individuals who systematically study laboratory quality have come to appreciate that quality is not one-dimensional. A laboratory that excels in one aspect of service (say, turnaround time) does not necessarily excel in other aspects of service (say, the proportion of reported results that must be corrected). Excellent performance on microbiology proficiency testing does not make up for a troponin assay that is out of control.

Busy executives who lead hospitals with many departments understandably desire a bottom-line assessment of laboratory quality, and some laboratories do perform well on many quality measures. Yet trying to compress laboratory quality into a single dimension is like trying to describe an individual's intelligence with a single number: too much information is lost. We believe it is best for laboratory management to report separately on laboratory performance for each aspect of quality that has been benchmarked. A balanced, multi-variable score card provides more useful information than a single composite score.
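The score card recommendation can be sketched in a few lines of code. The metric names and all figures below are illustrative only, not actual benchmarks for any laboratory; the point of the sketch is that each metric is compared against its own benchmark and reported separately, never averaged into a composite:

```python
# Each entry: (metric, local value, industry median, higher_is_better, unit).
# Figures are illustrative, not actual benchmark data.
scorecard = [
    ("Specimen ID errors caught before release", 90.0, 86.1, True,  "%"),
    ("Stat troponin receipt-to-report time",     45.0, 39.0, False, "min"),
    ("Blood culture contamination",               3.4,  2.9, False, "%"),
]

report = []
for metric, local, median, higher_better, unit in scorecard:
    # Compare in the direction appropriate to this metric: for some
    # metrics higher is better, for others (time, contamination) lower is.
    meets = local >= median if higher_better else local <= median
    status = "at/above industry median" if meets else "below industry median"
    report.append((metric, status))
    print(f"{metric}: {local}{unit} local vs {median}{unit} median -> {status}")
```

Note that no composite is computed: a laboratory can be ahead of the industry on one line and behind on the next, and the score card preserves that information.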
Measuring Quality Does Not Necessarily Improve Quality
Several benchmarking studies have shown that laboratories that regularly report on a particular aspect of quality tend to perform better on that quality measure than facilities in which regular monitoring is not taking place.16 Benchmarking quality over time is also associated with improved performance.4 Laboratories measuring analytical quality through proficiency testing, for example, seem to show improved performance over time, even though proficiency testing was undertaken primarily for CLIA compliance.17

The association between monitoring and performance has led many to conclude, incorrectly, that the simple act of monitoring improves laboratory quality. For most aspects of quality, the evidence for a cause-and-effect relationship is weak. Often, management and staff at laboratories that monitor quality are more committed than average to quality improvement, and it is their commitment to process improvement, rather than monitoring per se, that is responsible for good laboratory performance. Research on the effects of monitoring shows that improvement due to monitoring itself tends to be slight and short-lived: after 6 to 12 months, the novelty of monitoring wears off, and performance reverts to baseline levels. For this reason, it is unfortunate that the quality assurance programs of many laboratories consist entirely of a series of monitors, without any concrete programmatic steps to improve or maintain performance. Laboratory professionals like to measure things (measurement is, in a sense, our business), but the act of measuring quality produces only modest and short-lived quality improvement. More is required.

Aspects of Quality That Are Easiest to Benchmark Are Not Necessarily the Most Important
Albert Einstein had a sign on his door at the Institute for Advanced Study at Princeton that read, "Not everything that can be counted counts, and not everything that counts can be counted."
Some aspects of laboratory quality that are difficult to measure, such as the ethics of laboratory staff, may be more important than qualities that are easily measured. Even when performance can be counted (quantified), the easiest way to measure quality is not always the best. For example, measurement of within-laboratory turnaround time usually begins with the time specimens are accessioned into a laboratory information system, because this time is easier to acquire than the time specimens physically arrive in the laboratory. Yet specimens may sit for many minutes in the laboratory before they are accessioned into the computer.18 Similarly, measurement of outpatient waiting time usually begins with the time a patient is registered, rather than the time a patient walks in the door of a phlebotomy service center. It is difficult to measure the actual time a specimen arrives in the laboratory or the time a patient walks into a phlebotomy area, but these starting points offer a more meaningful measure of turnaround and patient wait time than the time of accessioning or patient registration.

Good Quality Does Not Always Mean High Cost
The College of American Pathologists has performed a number of internal studies comparing performance on various quality measures with laboratory staffing levels and costs. Correlations have been weak and often statistically insignificant. While it is true that many quality issues can be addressed with additional staff or more modern equipment, some laboratories appear to meet the challenge without adding expense. Laboratories that are part of private hospitals do, on average, perform better on many quality measures than laboratories in government-run hospitals, where budgets are presumably tighter, but the associations are not strong, and many government-run hospitals perform better than their private counterparts. As a general rule, staffing levels and capital expenditures have not been strong predictors of laboratory quality in benchmarking studies.

Improperly Performed Benchmarking May Produce Misleading Results
Many efforts to benchmark operations suffer from methodological shortcomings that limit the usefulness of results. One of the most common shortcomings is use of a sample size too small to allow local laboratory performance to be estimated accurately. Reliance on dated benchmark data can also present problems, because industry performance is improving for many quality measures; for this reason, published performance norms that are more than 5 years old may be misleading. Finally, laboratories without access to detailed data collection instructions may perform internal measurements that are incompatible with benchmark data, because sampling procedures or definitions differ, or because certain activities are inappropriately included in or excluded from local measurements.
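The sample-size point can be made concrete. The short sketch below (standard library only; the Wilson score interval is our choice of method for illustration, not one prescribed by any benchmarking program) shows how wide the statistical uncertainty around an observed error rate is when the sample is small:

```python
import math

def wilson_ci(errors, n, z=1.96):
    """Wilson score confidence interval (default 95%) for an observed
    error proportion of `errors` events among `n` sampled specimens."""
    p = errors / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

# The same 2% observed error rate, estimated from two sample sizes:
lo_small, hi_small = wilson_ci(2, 100)     # 2 errors in 100 specimens
lo_large, hi_large = wilson_ci(40, 2000)   # 40 errors in 2,000 specimens
print(f"n=100:  2.0% observed, 95% CI {lo_small:.1%} to {hi_small:.1%}")
print(f"n=2000: 2.0% observed, 95% CI {lo_large:.1%} to {hi_large:.1%}")
```

With only 100 audited specimens, an observed 2% error rate is statistically compatible with true rates from roughly 0.5% to 7%, a range that can span the distance between top-quartile and bottom-quartile performers; with 2,000 specimens the interval narrows to roughly 1.5% to 2.7%, tight enough to place the laboratory meaningfully against a benchmark distribution.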
Table 1. Selected Quality Metrics Suitable for Interlaboratory Benchmarking

Domains addressed by the metrics include the preanalytic, analytic, and postanalytic phases; general laboratory systems; anatomic pathology; clinical pathology (chemistry/hematology, microbiology, transfusion medicine); patient safety; customer satisfaction; and turnaround time.

Quality metrics:
- Blood culture contamination
- Blood product wastage
- Completeness of cancer diagnoses/adequacy of cancer reporting
- Corrected results
- Critical result reporting
- Gynecologic cytology-biopsy correlation
- Inpatient test result availability
- Order entry accuracy
- Patient satisfaction with specimen collection
- Patient wristband accuracy
- Physician satisfaction with laboratory services
- Proficiency testing performance
- Red blood cell utilization
- Specimen acceptability
- Specimen identification errors
- Stat test turnaround time

When benchmarking laboratory quality, it is best to be guided by a benchmarking service or reputable consulting firm that has field-tested its data collection procedures and published its methods and performance data.

Not Every Laboratory Can Be a Best Performer
In the fictional town of Lake Wobegon, all children are reported to be above average. Sadly, the laws of statistics hold more firmly in the rest of the United States: only 50% of laboratories can perform above the median in any quality benchmarking study, and fully 25% of laboratories are destined to fall into the bottom quartile. Since there is rarely any direct evidence that patients in below-the-median institutions are being harmed, there is no need for laboratory managers to declare a crisis when benchmarking shows a laboratory to be below average or even in the bottom quartile. What management should recognize, however, is that local performance in the bottom quartile signals an opportunity to improve operations. When staff at most other facilities have figured out how to perform at a higher level, local demands from caregivers for better service should be taken seriously.

Industry Performance
In a review of quality monitors used in 572 United States laboratories, Gayken and colleagues found that the most common monitors were proficiency testing (98%), quality control (96%), personnel competency testing (95%), turnaround time (94%), and patient identification accuracy (90%).19 The 3 most common monitors are all required by US regulations. Preanalytic monitors were less commonly employed, even though most laboratory problems relate to preanalytic issues.20 No external benchmark data were available for many of the monitors in use, which meant that laboratory managers reviewing monitoring results were unlikely to have had a strong sense of how local performance compared with that of the laboratory industry as a whole.
Although monitoring without benchmarks can be useful for tracking quality, we believe it is a good idea to benchmark a variety of laboratory processes, including different analytic testing disciplines and aspects of the preanalytic and postanalytic domains of laboratory service. Benchmarking data can be obtained from proprietary databases held by consultants and from several published surveys conducted by state health departments and the Centers for Disease Control and Prevention. However, most multi-institutional benchmark data have been developed through the CAP Q-Probes and Q-Tracks benchmarking services. Q-Probes provides a one-time snapshot of laboratory performance in a particular area of service, whereas Q-Tracks additionally gives laboratories the opportunity to trend performance over time. More than 100 manuscripts describing industry performance as measured by the Q-Probes and Q-Tracks programs have been published in the peer-reviewed literature, and older manuscripts and data collection tools are available to the public without charge from the CAP Web site (www.cap.org).

Some of the quality measures that we have found well suited for benchmarking are shown in Table 1, along with the domains addressed by each measure. Table 2 shows relative industry performance data for 3 additional metrics and illustrates the format in which the authors of benchmark studies typically present results. Managers who wish to use external benchmarking as part of their quality management program should select measures for which published benchmark data are no more than 5 years old, and should use the same data collection methods used at the facilities that contributed to the benchmark database. Not all laboratory monitoring must have external benchmarks, of course.
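Given quartile data in the format benchmark studies typically publish, a local laboratory's approximate standing can be estimated by linear interpolation between the published percentile points. The sketch below is illustrative only (linear interpolation and clamping outside the published range are our assumptions, not part of the Q-Probes methodology); the troponin quartiles used in the example are the published figures discussed in this review:

```python
def approx_percentile(value, q25, q50, q75):
    """Approximate percentile rank of a local result, given the published
    25th/50th/75th percentile values of a benchmark. Results outside the
    published range are clamped to 25 or 75. Works whether the metric
    rises (e.g., % of errors caught) or falls (e.g., minutes of
    turnaround time) as percentile rank increases."""
    pts = [(25.0, q25), (50.0, q50), (75.0, q75)]
    if q75 < q25:
        # Values fall as percentile rises: negate so they rise instead.
        value = -value
        pts = [(p, -v) for p, v in pts]
    if value <= pts[0][1]:
        return 25.0   # at or beyond the published 25th percentile
    if value >= pts[-1][1]:
        return 75.0   # at or beyond the published 75th percentile
    for (p0, v0), (p1, v1) in zip(pts, pts[1:]):
        if v0 <= value <= v1:
            return p0 + (p1 - p0) * (value - v0) / (v1 - v0)

# Stat troponin receipt-to-report time: published quartiles of 51, 39,
# and 31 minutes at the 25th, 50th, and 75th percentiles. A local
# laboratory averaging 35 minutes lands between the median and the 75th:
print(approx_percentile(35, 51, 39, 31))  # -> 62.5
```

Because a laboratory cannot be placed beyond the published quartiles, the function deliberately reports a clamped value rather than extrapolating; a result of 25.0 or 75.0 should be read as "at or beyond" that percentile.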
External benchmarks are most useful when management is concerned that performance in a particular area may not be up to par, or when management believes that external benchmarking will help motivate individuals to take a known problem seriously and do something about it. Other than proficiency testing for so-called regulated analytes, laboratories are not currently required to benchmark specific aspects of their operations. At some point in the future, however, the federal government or major accreditors may explicitly require laboratories to measure their performance using other specific quality metrics, and to compare their performance with industry norms. LM

Table 2. Examples of Industry Performance

                                                                              Relative Performance of Laboratories*
Benchmark                                     No. of Labs  Year Published  Reference  25th   50th (Median)  75th
Automated complete blood counts with a
manual review, scan, or differential (%)          263          2006           21      18.5%      26.7%      39.1%
Specimen identification errors detected
before release of test result (%)                 120          2006           22      69.7%      86.1%      95.0%
Receipt-to-report turnaround time for stat
troponin ordered in the Emergency
Department (minutes)                              159          2004           23      51         39         31

*Higher percentiles do not necessarily indicate better performance. For example, in the case of manual review of automated differentials (reference 21), staff at the 75th-percentile laboratory performed some sort of manual procedure on 39.1% of automated differentials, while staff at the 25th-percentile laboratory reviewed 18.5%. The authors of that study did not indicate that any particular frequency of review was more desirable. For specimen identification errors and turnaround time, higher percentiles and fewer minutes, respectively, indicate better performance.

References
1. Belk WP, Sunderman FW. A survey of the accuracy of chemical analyses in clinical laboratories. Am J Clin Pathol. 1947;17:853-861.
2. Hollensead SC, Lockwood WB, Elin RJ. Errors in pathology and laboratory medicine: consequences and prevention. J Surg Oncol. 2004;88:161-181.

3. Berte LM. Patient safety: getting there from here. Quality management is the best patient safety program. Clin Leadersh Manag Rev. 2004;18:311-315.
4. Zarbo RJ, Jones BA, Friedberg RC, et al. Q-Tracks: a College of American Pathologists program of continuous laboratory monitoring and longitudinal performance tracking. Arch Pathol Lab Med. 2002;126:1036-1044.
5. Kohn L, Corrigan J, Donaldson M, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000.
6. Bachner P, Howanitz PJ. Using Q-Probes to improve the quality of laboratory medicine: a quality improvement program of the College of American Pathologists. Qual Assur Health Care. 1991;3:167-177.
7. Wilkinson DS, Reynolds D. Using benchmarking to manage your laboratory. Clin Lead Manage Rev. 2003;16:5-8.
8. Valenstein P, Sours R, Wilkinson DF. Staffing benchmarks for clinical laboratories: a College of American Pathologists Q-Probes study of staffing at 151 institutions. Arch Pathol Lab Med. 2005;129:467-473.
9. Valenstein P, Praestgaard A, Lepoff R. Six-year trends in expense, productivity, and utilization of 73 clinical laboratories. Arch Pathol Lab Med. 2001;125:1153-1161.
10. Howanitz PJ, Valenstein P, Fine G. Employee competence and performance-based assessment. Arch Pathol Lab Med. 2000;124:195-202.
11. Clinical Laboratory Improvement Amendments, 42 CFR 493.
12. College of American Pathologists, Laboratory Accreditation Program, Laboratory General Checklist GEN.20316.
13. The Joint Commission, CAMLAB, Standard PI.1.10 EP1.
14. ISO 15189:2003.
15. 42 CFR 493.1808.
16. Howanitz PJ. Quality assurance measurements in departments of pathology and laboratory medicine. Arch Pathol Lab Med. 1990;114:1131-1135.
17. Ehrmeyer SS, Laessig RH. Has compliance with CLIA requirements really improved quality in US clinical laboratories? Clin Chim Acta. 2004;346:37-43.
18. Valenstein P. Preanalytic delays as a component of laboratory turnaround time. Lab Med. 1990;21:448-451.
19. Gayken J, Noble M, Taylor J, et al. IQLM and CLMA take a snapshot of America's hospital laboratory quality management. Clin Leadersh Manag Rev. 2005;19(5):E2.
20. Bonini P, Plebani M, Ceriotti F. Errors in laboratory medicine. Clin Chem. 2002;48:691-698.
21. Novis DA, Walsh M, Wilkinson D, et al. Laboratory productivity and the rate of manual peripheral blood smear review: a College of American Pathologists Q-Probes study of 95,141 complete blood count determinations performed in 263 institutions. Arch Pathol Lab Med. 2006;130:596-601.
22. Valenstein P, Raab SS, Walsh MK. Identification errors involving clinical laboratories: a College of American Pathologists study of patient and specimen identification errors at 120 institutions. Arch Pathol Lab Med. 2006;130:1106-1113.
23. Novis D, Jones B, Dale J, Walsh M. Biochemical markers of myocardial injury test turnaround time: a College of American Pathologists Q-Probes study of 7020 troponin and 4368 creatine kinase-MB determinations in 159 institutions. Arch Pathol Lab Med. 2004;128:158-164.