The World Bank Human Development Network - Education System Assessment and Benchmarking for Education Results (SABER)


67664

SAINT LUCIA
Education Management Information System (EMIS)
COUNTRY REPORT

Emilio Porta, Jennifer Klein, Gustavo Arcia and Harriet Nannyonjo
February 2012

Acknowledgements

This report was prepared by a team led by Emilio Porta, Senior Education Specialist at the Human Development Network/Education at the World Bank, and consisting of Gustavo Arcia, Consultant to the Human Development Network/Education of the World Bank and Senior Economist at Analítica LLC in Miami, Florida; Jennifer Klein, Consultant to the Human Development Network/Education at the World Bank; and Harriet Nannyonjo, Senior Education Specialist, LCSHE, World Bank. The report was prepared under the guidance of Elizabeth King, Robin Horn and Chingboon Lee. The views expressed here are those of the authors and should not be attributed to the World Bank Group. All data contained in this report are the result of collaboration between the authors, the Organization of Eastern Caribbean States (OECS), and participants in the benchmarking exercise. All errors are our own.

This benchmarking study arose from an active partnership between the Education Reform Unit of the Organization of Eastern Caribbean States (OECS) and the World Bank. The benchmarking exercise was done during an OECS workshop conducted in Castries, St. Lucia, from January 23 to January 28, 2011, with the participation of government officials from Antigua & Barbuda, the Commonwealth of Dominica, Grenada, St. Kitts & Nevis, St. Lucia, and St. Vincent & the Grenadines. A delegate from Montserrat also attended as an observer. The workshop and benchmarking exercise were done under the invaluable leadership of Marcellus Albertin, Head of the OECS Education Reform Unit (OERU). His unflagging support, enthusiasm, and institutional supervision were fundamental for the cooperation of all participants and for the success of the workshop. To him we owe a great deal of gratitude. We would like to thank the OERU staff who helped us with workshop logistics, especially Emma Mc Farlane-Jouavel and Beverly Pierre.
We would also like to thank the workshop participants: Doristeen Etinoff, Priscilla Nicholas, and Patricia George from Antigua & Barbuda; Ted Serrant, Robert Guiste, and Weeferly Jules from Dominica; Pauleen Finlay, Michelle Peters, and Imi Chitterman from Grenada; Gregory Julius from Montserrat; Quinton Morton, Ian Gregory, and Laurence Richards from St. Kitts & Nevis; Kendall Khodra, Nathalie Elliott, Sisera Simon, Evariste John, and Valerie Leon from St. Lucia; Dixton Findlay, Keith Thomas, and Junior Jack from St. Vincent & the Grenadines; and Darrel Montrope, Jacqueline Massiah, Sean Mathurin, and Loverly Anthony-Charles from the OECS.

Abbreviations

EMIS    Education Management Information System
MOE     Ministry of Education
OECD    Organization for Economic Cooperation and Development
OECS    Organization of Eastern Caribbean States
SABER   System Assessment and Benchmarking for Education Results
SEAT    SABER EMIS Assessment Tool
UIS     UNESCO Institute for Statistics
UNESCO  United Nations Educational, Scientific and Cultural Organization

SAINT LUCIA: ESTABLISHED

Aspect of Data Quality      Benchmark
Prerequisites of Quality    Established
Assurances of Integrity     Established
Methodological Soundness    Established
Accuracy and Reliability    Emerging
Serviceability              Established
Accessibility               Emerging

BACKGROUND

Education Data in St. Lucia

With the growing demand for timely and accurate data, the Ministry of Education (MOE) in St. Lucia embarked on a project to implement an Education Management Information System (EMIS) for all schools across the island. Due to financial constraints, it was initially implemented in public secondary schools, but it has expanded over time to include primary schools and Sir Arthur Lewis Community College.

EMIS STAFF. At the national level, no additional staff members have been hired to manage the operation of the EMIS, but each secondary school has one additional staff member who was employed and trained to coordinate EMIS activities at the school. Principals and teachers of primary schools were also trained to support the collection of education data through a series of workshops on the Intime Program.

EMIS DATA. St. Lucia collects data on primary and secondary schools annually, including:
- School background information
- Student data: enrolment, repeaters, dropouts, transfers, graduates, and distribution by age
- Staff data: teachers employed, non-teaching staff, teacher training, and teacher movement
- Conditions of buildings, furniture, and equipment
- Revenue and expenditure

Attendance Forms are also collected each month that include:
- Daily student attendance by grade and gender
- Teacher attendance and punctuality, which is measured on an actual and official basis

FACILITIES AND EQUIPMENT. Computers are available in both primary and secondary schools, and additional computers were made available at secondary schools for EMIS use. Network infrastructure at most secondary schools needs repair and upgrading. Electronic EMIS databases and data files reside on the senior statistician's machine, with access privileges granted to other statisticians. Files containing confidential information are password protected. Hard copies of data are stored in filing cabinets under lock and key.

DATA COLLECTION.
Data is collected from all learning institutions in St. Lucia. St. Lucia's EMIS connects all public secondary schools with the Maplewood Software, 40 of the 75 public primary schools with the Intime Program, and the Sir Arthur Lewis Community College with the Sonis System. Schools with EMIS facilities may submit data electronically. Annual Education Census Questionnaires are sent out in October of each year with the specification that the data submitted must reflect what exists on October 31st of that year. Attendance data is also collected on a monthly basis using standard attendance forms.

DATA PROCESSING. Data is submitted to district offices, where it is verified and validated before being forwarded to the MOE. Verification and validation are also done at the MOE, and the MOE statisticians address any errors or inconsistencies they find. After verification and validation, the data are aggregated and analyzed in Excel. Because of limited personnel, attendance tables are generated each term or, in some cases, only each year.

PUBLICATIONS. Two main EMIS documents are published on an annual basis: 1) the Education Digest and 2) the Attendance Report, which is not available to the public. Copies of the Education Digest are distributed to the data providers, including schools, MOE departments, all government ministries, and local, regional, and international organizations. The report can also be downloaded through the Central Statistics Office and MOE websites.
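The verification and validation step described above is not specified in detail in this report. As a minimal sketch of the kind of consistency check a district office or MOE statistician might run on a census return, the following is illustrative only; the function and field names are our own, not the MOE's actual procedure:

```python
# Hypothetical sketch of census-return validation; the actual MOE checks and
# questionnaire field names are not specified in the report.

def validate_census_return(school: dict) -> list[str]:
    """Return a list of error messages for one school's census questionnaire."""
    errors = []
    by_grade = school.get("enrolment_by_grade", {})
    # Enrolment reported by grade should sum to the school's reported total.
    if sum(by_grade.values()) != school.get("total_enrolment"):
        errors.append("enrolment by grade does not sum to total enrolment")
    # No count on the form can be negative.
    for grade, count in by_grade.items():
        if count < 0:
            errors.append(f"negative enrolment in grade {grade}")
    return errors

# Example: a return whose grade counts do not add up is flagged for follow-up.
flagged = validate_census_return(
    {"total_enrolment": 100, "enrolment_by_grade": {"1": 40, "2": 50}}
)
```

A flagged return would then go back to the school for correction before aggregation, matching the "address any errors or inconsistencies" step in the workflow above.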

The EMIS in St. Lucia

ESTABLISHED: In January 2011, St. Lucia's EMIS was assessed using the SABER-EMIS Assessment Tool (SEAT) and, overall, the EMIS was categorized as ESTABLISHED (0.63). Among the six Organisation of Eastern Caribbean States (OECS) countries, St. Lucia's score ranked third behind Dominica (0.65) and St. Kitts and Nevis (0.65).

Figure 1. SABER EMIS Scores in the OECS

St. Lucia had the highest score of the OECS countries on Assurances of Integrity (0.64) and outperformed the OECS average on all of the SEAT's Aspects of Quality except 1) Methodological Soundness (0.67) and 2) Accuracy and Reliability (0.58). St. Lucia's lowest score was on Accessibility (0.56), but that score was still above the OECS average. The next sections of this country report analyze St. Lucia's performance on the subcomponents of each Aspect of Quality in order to present a detailed portrait of the strengths and weaknesses of St. Lucia's EMIS and the concrete actions that the country can take to improve education data quality.

Table 1. SABER EMIS Scores in the OECS Countries (2011)

Aspect of Quality         Dominica  Antigua  Grenada  St. Kitts  St. Vincent  St. Lucia  OECS
Prerequisites of Quality    0.70     0.52     0.68      0.66       0.45         0.64      -
Assurances of Integrity     0.61     0.58     0.53      0.61       0.44         0.64      -
Methodological Soundness    0.55     0.83     0.67      0.67       0.83         0.67     0.69
Accuracy and Reliability    0.70     0.48     0.58      0.53        -           0.58     0.60
Serviceability              0.61     0.29     0.79      0.43        -           0.68     0.55
Accessibility               0.47     0.47     0.69      0.61       0.36         0.56     0.53
Overall                     0.65     0.46     0.62      0.65       0.52         0.63     0.59

Scale: Latent 0-0.30; Emerging 0.31-0.59; Established 0.60-0.79; Mature 0.80-1.00
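SEAT scores are banded into four benchmark levels (Latent up to 0.30, Emerging 0.31 to 0.59, Established 0.60 to 0.79, Mature 0.80 to 1). That banding can be sketched directly; the function name below is our own:

```python
# Map a SEAT score (0-1) to its SABER-EMIS benchmark level, using the
# four score bands from the SEAT scale. The function name is ours.

def seat_benchmark(score: float) -> str:
    if not 0.0 <= score <= 1.0:
        raise ValueError("SEAT scores lie in [0, 1]")
    if score <= 0.30:
        return "Latent"
    if score <= 0.59:
        return "Emerging"
    if score <= 0.79:
        return "Established"
    return "Mature"

# St. Lucia's overall score of 0.63 falls in the Established band.
print(seat_benchmark(0.63))  # Established
```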

PREREQUISITES OF QUALITY

ESTABLISHED: St. Lucia has ESTABLISHED (0.64) the Prerequisites of Quality (Figure 2) necessary to support an EMIS and was the only OECS country to have staff, facilities, computing resources, and financing commensurate with EMIS activities (Table 2, 0.5). Laws exist to protect the confidentiality of individual/personal data, and individuals are informed of their rights (0.3). Laws also exist to establish the collection and dissemination of education data, but responsibilities for these actions are not clearly defined by the laws (0.1). Institutions are legally obligated to share data with the MOE, but no penalties are established if institutions fail to report (0.4). No formal agreements exist to ensure data sharing and coordination among agencies, but agencies informally share data and collaborate (0.2).

Figure 2. Prerequisites of Quality: ESTABLISHED

St. Lucia's only LATENT score resulted from a lack of processes to monitor the quality of data processes (0.9): no formal reviews or external reviews are carried out, and user feedback on quality is not collected. Quality is a main objective, but it is not enforced by management (0.8). St. Lucia could further establish the Prerequisites of Quality by formalizing EMIS responsibilities and informal agreements and by focusing more on quality.

Table 2. Prerequisites of Quality: Subcomponents (St. Lucia)

0.1   Responsibility for collecting and disseminating education data is clearly specified            0.79
0.2   Data sharing and coordination among different agencies are adequate                            0.58
0.3   Individual/personal data are kept confidential and used for statistical purposes only          1.00
0.4   Statistical reporting is ensured through legal mandate and/or measures to encourage response   0.63
0.5   Staff, facilities, computing resources, and financing are commensurate with the activities     1.00
0.6   Processes and procedures are in place to ensure that resources are used efficiently            0.63
0.7   Education statistics meet user needs and those needs are monitored continuously                0.63
0.8   Processes are in place to focus on quality                                                     0.33
0.9   Processes are in place to monitor the quality of data processes                                0.00 (Latent)
0.10  Processes are in place to deal with quality considerations in planning the statistical program 0.58
0.11  Mechanisms exist for addressing new and emerging data requirements                             0.54
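Aspect scores such as the 0.64 above are built up from the subcomponent scores. As a sketch only, the aggregation can be pictured as an average; we assume equal weights here, and the actual SABER-EMIS weighting scheme is not given in this report and may well differ:

```python
# Sketch of aggregating subcomponent scores into an aspect score. We assume a
# simple unweighted mean; the actual SABER-EMIS weighting is not specified here.

def aspect_score(subscores: dict[str, float]) -> float:
    return round(sum(subscores.values()) / len(subscores), 2)

# Three illustrative subcomponents (labels abbreviated from Table 2).
prerequisites = {
    "0.3 confidentiality": 1.00,
    "0.5 resources": 1.00,
    "0.9 quality monitoring": 0.00,
}
print(aspect_score(prerequisites))  # 0.67
```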

ASSURANCES OF INTEGRITY

ESTABLISHED: St. Lucia scored the highest of all the OECS countries on the Assurances of Integrity and was classified as ESTABLISHED (0.64). Choices of sources, statistical techniques, and decisions on dissemination are sound (Table 3, 1.3), and advance notice of major changes in methodology, source data, and statistical techniques is usually given in publications (1.8).

Figure 3. Assurances of Integrity in the OECS: ESTABLISHED

Only informal mechanisms protect the professional independence of the data-producing institution, but EMIS staff are aware of established ethical practices and generally adhere to them (1.1). The terms and conditions under which statistics are collected, processed, and disseminated are difficult to find, but they are available to the public (1.5). The statistical agency also comments publicly on technical errors, provides technical explanations, and comments on major misinterpretations (1.4). The professionalism of EMIS staff is currently MATURE because of established guidelines for staff behavior that are actively enforced (1.9). Also, while staff are recruited and promoted based on professional credentials, professionalism could be further promoted by encouraging staff to publish and by establishing a peer review process (1.2).

Table 3. Assurances of Integrity: Subcomponents (St. Lucia)

1.1  Statistics are produced on an impartial basis
1.2  Professionalism of staff is actively promoted
1.3  Choices of data sources and statistical techniques are made solely by statistical considerations
1.4  Agency is entitled to comment on erroneous interpretation and misuse of statistics
1.5  Terms and conditions are available to the public
1.6  Public is aware of internal governmental access to statistics prior to their release
1.7  Products of the education statistics agency are clearly identified
1.8  Advance notice is given of major changes in methodology, source data, and statistical techniques
1.9  Guidelines for staff behavior are in place and are well known to the staff

METHODOLOGICAL SOUNDNESS

ESTABLISHED: In terms of Methodological Soundness, St. Lucia's EMIS is ESTABLISHED (0.67). St. Lucia scored just below the OECS average (0.69) and had the same score as both Grenada and St. Kitts (Figure 4). St. Lucia's overall structure, concepts, and definitions have proper documentation, but definitions do not conform with the regional and international standards (Table 4, 2.1) established by the UNESCO Institute for Statistics (UIS) and the OECS Education Reform Unit (OERU). St. Lucia also follows the International Standard Classification of Education (ISCED) in all education sector data except expenditure data (2.3), which cannot currently be disaggregated by ISCED classification. Expanding the use of ISCED to expenditure data would ensure complete consistency with ISCED and improve St. Lucia's score on this subcomponent.

Figure 4. Methodological Soundness in the OECS countries: ESTABLISHED

Currently, St. Lucia's EMIS produces between 71 and 90 percent of UIS indicators annually, which results in an EMERGING benchmark on the scope-of-statistics subcomponent (2.2). Expanding the scope of statistics produced to 100 percent of UIS and OECD indicators is ideal and would enable additional domestic, regional, and international education policy analysis.

Table 4. Methodological Soundness: Subcomponents (St. Lucia)

2.1  Overall structure, concepts, and definitions follow regionally and internationally accepted standards, guidelines, and good practices  0.83
2.2  Scope is in accordance with international standards, guidelines, or good practices  0.42
2.3  Classification systems are consistent with international standards, guidelines, or good practices  0.83

ACCURACY AND RELIABILITY

EMERGING: The Accuracy and Reliability of St. Lucia's EMIS data is EMERGING (0.58). St. Lucia scored slightly below the OECS average (0.60) on this Aspect of Quality, but scored higher than both St. Vincent and Antigua (Figure 5). St. Lucia's subcomponent scores all fall into the EMERGING and ESTABLISHED range, with no LATENT or MATURE scores. This indicates that St. Lucia has a foundation for all the subcomponents of Accuracy and Reliability and needs to build upon that foundation to improve. For example, St. Lucia could:
- improve procedures to update, standardize, and properly reference source data (3.2)
- ensure that education data are provided to other source providers within six months after the end of the school year (3.3)
- routinely assess other data sources and train staff to handle these data sources (3.4)
- always validate intermediate results against other information (3.7)
- improve systematic processes to investigate statistical discrepancies (3.8; 3.9)
- conduct studies of revisions (3.10)

Figure 5. Accuracy and Reliability: EMERGING

Table 5. Accuracy and Reliability: Subcomponents (St. Lucia)

3.1   Source data are obtained from comprehensive data collection that takes into account country-specific conditions
3.2   Data are reasonably confined to the definitions, scope, classifications, and time of recording required
3.3   Source data are timely (6 months after event)
3.4   Other data sources, such as censuses, surveys, and administrative records, are routinely assessed
3.5   Data compilation employs sound statistical techniques to deal with data sources
3.6   Other statistical procedures (data editing, transformations, and analysis) employ sound statistical techniques
3.7   Intermediate results are validated against other information where applicable
3.8   Statistical discrepancies in intermediate data are assessed and investigated
3.9   Statistical discrepancies and other potential indicators of problems in statistical outputs are investigated
3.10  Studies and analyses of revisions are carried out routinely and used internally to inform the processes

SERVICEABILITY

ESTABLISHED: The Serviceability of St. Lucia's EMIS data is ESTABLISHED (0.68), far above the OECS average (0.55). St. Lucia's subcomponent scores indicate that a strong foundation for Serviceability is currently in place.

Figure 6. Serviceability in the OECS: ESTABLISHED

St. Lucia met the MATURE benchmark for periodicity by producing an annual census of enrolments, teachers, schools, and financial data (Table 6, 4.1), but the timeliness of releasing the data could be improved. Currently, administrative census data are available six to 12 months after the initiation of the school year (4.2), when ideally these data should be released within two months. Time series are available for five to 10 years (4.4), but procedures for data revisions could be improved (4.6). Cross-checks are done only in an ad hoc fashion (4.3), and comparison checks show that there is roughly an 11 to 20 percent difference between school-reported figures and data from other sources (4.5). St. Lucia could improve its Serviceability by publishing administrative data within two months of the initiation of the school year, increasing the availability of time series data to 10 years, and strengthening systems for revisions and cross-checks.

Table 6. Serviceability: Subcomponents (St. Lucia)

4.1  Periodicity follows dissemination standards  1.00
4.2  Timeliness follows international dissemination standards
4.3  Statistics are consistent within the dataset
4.4  Statistics are consistent or reconcilable over a reasonable period of time
4.5  Statistics are consistent or reconcilable with those obtained through other data sources and/or statistical frameworks
4.6  Revisions follow a regular and transparent schedule
4.7  Preliminary and/or revised data are clearly identified
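The cross-source comparison checks described above can be sketched as a simple percent-difference test. This is illustrative only: the report does not describe the MOE's actual procedure, and the 10 percent tolerance below is an arbitrary choice for the example:

```python
# Illustrative sketch of a cross-source consistency check: compare a
# school-reported figure with the same figure from another source and flag
# large gaps. The 10% tolerance is an arbitrary choice, not an MOE rule.

def percent_difference(reported: float, other_source: float) -> float:
    return abs(reported - other_source) / other_source * 100

def flag_discrepancy(reported: float, other_source: float,
                     tolerance: float = 10.0) -> bool:
    return percent_difference(reported, other_source) > tolerance

# A 15% gap between the EMIS figure and the external figure would be flagged,
# consistent with the 11-20% differences observed in St. Lucia's data.
print(flag_discrepancy(1150, 1000))  # True
```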

ACCESSIBILITY

EMERGING: Accessibility was St. Lucia's lowest score (0.56, EMERGING), but subcomponent scores ranged from MATURE to LATENT. St. Lucia scored highly on data presentation: EMIS statistics are clearly presented, with disaggregation and underlying data for charts (5.1). Also, St. Lucia releases data on a pre-announced schedule (5.3) and typically releases data to all users at the same time (5.4). There are procedures in place for releasing non-published, non-confidential data upon users' request (5.5), and metadata are available but not publicized (5.6).

Figure 7. Accessibility in the OECS: EMERGING

St. Lucia could improve the Accessibility of the EMIS by:
- disseminating data electronically (5.2)
- making metadata available to the public (5.6)
- creating a data catalog so users can request data in a level of detail that meets their needs (5.7)
- creating a catalog of publications and services (5.9)
- identifying contact points to offer assistance to users (5.8)

Accessibility is one of the key missions of an EMIS because it creates and maintains the public image of the EMIS and enables greater accountability. It is imperative for all levels of administration in St. Lucia to focus on developing a more accessible EMIS.

Table 7. Accessibility: Subcomponents (St. Lucia)

5.1  Statistics are presented to facilitate proper interpretation and comparisons (layout, clarity of texts, tables, and charts)  1.00
5.2  Dissemination media and format are adequate  0.25
5.3  Statistics are released on a pre-announced schedule  1.00
5.4  Statistics are made available to all users at the same time
5.5  Statistics not routinely disseminated are made available upon request  1.00
5.6  Documentation on concepts, scope, classifications, basis of recording, data sources, and statistical techniques is available, and differences from internationally accepted standards, guidelines, or good practices are annotated
5.7  Levels of detail are adapted to the needs of the intended users  0.00 (Latent)
5.8  Contact points for each subject field are publicized  0.25
5.9  Catalogs of publications and other services, including information on any charges, are widely available  0.00 (Latent)