What is SDI?
www.worldbank.org/sdi | www.sdindicators.org | @SDI4Africa
Video: http://www.youtube.com/watch?v=qcuzdu3_nmc
Team SDI
GAYLE MARTIN, WALY WANE, CHRISTOPHE ROCKMORE, EZEQUIEL MOLINA, OBERT PIMHIDZAI, ZELALEM DEBEBE, RAIHONA ATAKHODJAYEVA
A bold Africa-wide initiative that tracks the performance and quality of service delivery in primary schools and at frontline health facilities, across countries and over time.
A unique partnership.
An ambitious vision: that SDI will be a highly trusted data source, anticipated by policymakers, NGOs and the media every 2-3 years, and used to inform policies, track performance and hold officials accountable.
Service Delivery Indicators
What are SDI Surveys?
- Facility-level surveys, n = 300-400 facilities (per sector)
- Primary data collection by a national research institute
- Nationally representative sample, also statistically significant for rural/urban areas and by provider type and level of facility
- Consistent methodology for comparable data across countries
- Vision for fast turnaround: survey results within 7 months; datasets within 9 months
SDI Sampling Strategy
Population frame: public and private non-profit facilities [private providers]. Tiers: primary facilities (health posts and health centers) and the first level of hospitals (usually district hospitals). Facility list from the MoH and all other sources (e.g. IEs).
Strategy:
- Stratification: rural/urban
- Plus a 10% allowance for replacement
- Excluding facilities randomly selected for pre-testing
- Survey weights: facilities and providers
- Other country-specific sampling needs
Samples:
- Kenya: 294 facilities; 1,856 health providers
- Nigeria: 1,172 facilities; 6,040 health providers (5,754 patient simulations); [6 additional states completed]
- Uganda: 400 facilities; 1,507 health providers (736 patient simulations)
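The stratified design above (rural/urban strata, a 10% replacement allowance, and inverse-probability survey weights) can be sketched in a few lines. This is an illustrative sketch only, not the SDI sampling code: the function name `draw_sample`, the proportional allocation rule, and the flat 10% over-draw are all assumptions for the example, and the real design adds provider-level sampling and country-specific strata.

```python
import random

def draw_sample(frame, n_target, seed=0):
    """Stratified draw from a facility frame (list of dicts with a
    'stratum' key), keeping a ~10% over-draw as replacements.

    Returns (sample, replacements, weights), where weights[stratum]
    is the inverse selection probability for that stratum.
    """
    rng = random.Random(seed)
    strata = {}
    for fac in frame:
        strata.setdefault(fac["stratum"], []).append(fac)

    sample, replacements, weights = [], [], {}
    for name, members in strata.items():
        # Allocate the target proportionally to stratum size.
        n_s = round(n_target * len(members) / len(frame))
        # Over-draw by ~10% so dropped facilities can be replaced
        # without a fresh draw.
        n_draw = min(len(members), n_s + max(1, n_s // 10))
        drawn = rng.sample(members, n_draw)
        sample.extend(drawn[:n_s])
        replacements.extend(drawn[n_s:])    # used only if a facility drops out
        weights[name] = len(members) / n_s  # inverse probability of selection
    return sample, replacements, weights
```

Because selection probabilities differ by stratum, the weights are what make facility-level estimates nationally representative.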
Service Delivery Indicators
Focus on aspects that are notoriously difficult to measure at scale. Unit of analysis: health facilities and health providers.
- Inputs: measures of structural quality (i.e., a tracer approach)
- Provider ability: measures of process quality
- Provider effort: measures of productivity, effort and absence
Service Delivery Indicators
- Provider ability (process quality): malaria; diarrhoea with severe dehydration; pneumonia; diabetes; tuberculosis; neonatal asphyxia; postpartum haemorrhage
- Provider effort (productivity, effort and absence): time spent with patient; caseload; absence
- Inputs (structural quality):
  - Equipment: scale, thermometer, sphygmomanometer, refrigeration, sterilization equipment
  - Drugs: SARA drugs
  - Infrastructure: electricity, water, sanitation
Instrument Modules (see handout)
- Health facility information
- Staff roster
- Clinical vignettes
- Expenditure tracking
- Exit survey

Indicators (see handout)
- INPUTS: infrastructure availability; medical equipment availability; drug availability; management and supervision
- PROVIDER EFFORT: absence rate; caseload per provider
- PROVIDER ABILITY: diagnostic accuracy; treatment accuracy; adherence to clinical guidelines; management of maternal and neonatal complications
- INPUTS: sources of funding (incl. user fees) and use of resources; governance, accountability and transparency; patient-provider interaction; patient perceptions of time and expense; asset index
New Developments
- Additional patient simulations: chronic diseases (hypertension, diabetes maintenance, etc.); ANC visits; family planning visits
- Provider motivation and burnout
- Community-based services: competence in core responsibilities (diarrhea with severe dehydration, mother within 48 hours postpartum, growth monitoring and malnutrition detection); time use; equipment/materials
Measurement Issues and Trade-offs
Tracer approach vs inventory approach: measures proxies of what really matters. SDI doesn't tell you everything you need to know about service delivery performance:
- E.g. we don't expect providers to know only the conditions used in the patient simulations
- E.g. we don't expect only the 30 drugs we sample to be in a facility
Measurement Issues and Trade-offs
Looks at the functional availability of inputs.
Measurement Issues and Trade-offs
Not aggregated into a single index.
Technical consideration: aggregation requires weighting and agreement on the weights:
- Who will determine the weights?
- What should the weights be?
- Should the weights be fixed (or vary) over time?
Utility and political salience: an index value has no inherent meaning (and may detract from its political salience and use as a mobilizing force). Its greatest utility comes when ranking and relative performance are assessed (cf. the HDI). Ranking not only requires that the universe be agreed upon ex ante, but that it remain relatively fixed over time.
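The weighting concern above is concrete: with the same underlying scores, different weights can reverse a ranking. The numbers and component names below are hypothetical, chosen only to demonstrate the effect.

```python
# Hypothetical scores on two SDI-style components for two countries.
scores = {"A": {"inputs": 0.80, "ability": 0.40},
          "B": {"inputs": 0.50, "ability": 0.75}}

def index(country, w_inputs):
    """Weighted aggregate: w_inputs on inputs, the rest on ability."""
    s = scores[country]
    return w_inputs * s["inputs"] + (1 - w_inputs) * s["ability"]

# Equal weights rank B above A...
assert index("B", 0.5) > index("A", 0.5)
# ...but weighting inputs heavily reverses the ranking.
assert index("A", 0.8) > index("B", 0.8)
```

Since no choice of weights is analytically privileged, whoever sets them effectively sets the league table, which is one reason SDI reports the indicators separately.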
Measurement Issues and Trade-offs
Production function approach to service delivery: inputs must be available in the same place at the same time.

              Public   Private (non-profit)
Clean water   75.4%    97.3%
Toilet        94.8%    97.2%
Electricity   68.4%    90.1%

Only 49% of public facilities had clean water AND sanitation AND electricity (as opposed to the simple average of 79.5%).
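The gap between the 79.5% average and the 49% joint figure is worth making explicit. A small sketch using the public-facility shares from the table: the joint share must be measured facility by facility, but even under the (assumed, not stated in the source) independence of the three inputs, the product of the shares already falls far below the simple average.

```python
# Public-facility input shares from the slide.
water, toilet, electricity = 0.754, 0.948, 0.684

# The simple average overstates combined availability.
simple_average = (water + toilet + electricity) / 3   # ~0.795

# If availability were independent across facilities, the joint share
# would be the product of the three shares (an illustrative assumption;
# the measured joint share of 49% need not equal this, though here the
# two happen to be close).
independent_joint = water * toilet * electricity      # ~0.489
```

This is the production-function point: a facility missing any one input cannot deliver the service, so averaging input indicators flatters performance.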
Measurement Issues and Trade-offs
Equity considerations:
- Typically require household survey data
- Exit surveys suffer from selection bias, but still have some value
- Indirect ways: choose the lowest level of facilities, as this is where the poorest quintiles are most likely to seek care; focus on PHC services that address conditions most prevalent among the poor
Measurement Issues and Trade-offs
Objective versus subjective indicators. In an effort to capture the consumer's service delivery experience, many subjective measures are used: facility-based exit interviews, or consumers interviewed in household surveys. Problems:
- Perceptions are influenced by expectations
- The full quality of the service may only be appreciated several weeks after the health facility visit (by when healing may have occurred)
- Such measures often lack credibility among health professionals
Measurement Issues and Trade-offs: Provider Competence and Actual Performance

Direct Observation
- Technically correct: Pro - good picture of reality; Con - comparability lost due to case mix/selection bias; Con - Hawthorne effect
- Administratively feasible: Con - very time- and labor-intensive
- Notes: ethics clearance - higher bar of clearance

Standardized Patients
- Technically correct: Pro - comparability; Pro - no Hawthorne effect; Pro - actionable data
- Administratively feasible: Con - higher level of ethical clearance (time-consuming); Con - training of skilled actors is time-intensive and costly; Con - some simulations cannot be acted (PPH, neonatal asphyxia); Con - what to do in the case of a suggested invasive investigation or treatment?
- Politically supportable: Con - may damage relations with the medical profession (tricking providers; secretly adding to their workload)
- Notes: ethics clearance - higher bar of clearance

Patient Case Simulations / Vignettes
- Technically correct: Pro - comparability; Con - designed to measure knowledge and competence (know-do gap); Con - Hawthorne effect
- Administratively feasible: Pro - low level of ethical clearance (as no patient is directly involved)
- Notes: Pro - acceptable measure of competence
Measurement Issues and Trade-offs
Other considerations:
- Depth-frequency (of survey) trade-offs
- Indicators with meaning to non-technical audiences
- Etc.
Some Results: Health
(Public health facilities only)

                                            Kenya    Nigeria  Senegal  Tanzania  Uganda
INPUTS
Minimum infrastructure                       39%      18%      39%      19%       47%
Minimum equipment                            77%      25%      53% a    78% a     18%
Drug availability                            52%      45%      78% b    76% b     40%
Drug availability (children)                 69%      47%      --       --        34%
Drug availability (mothers)                  41%      44%      --       --        23%
Vaccine availability                         83%      73%      --       --        58%
EFFORT
Absence rate                                 29%      29%      20%      21%       47%
Caseload per day                             8.7      1.5      --       --        10.0
Time spent with patients                     39 min   29 min
ABILITY (share of providers able to...)
Correctly diagnose common conditions c       74%      36%      34%      57%       58%
Adhere to clinical treatment guidelines c    43%      31%      22%      35%       35%
Correctly manage maternal and neonatal
  complications d                            44%      17%      --       --        20%

a Only 3 equipment items were considered (weighing scale, thermometer and stethoscope), as opposed to 2 additional items (refrigerator and sterilizing equipment) in the other countries.
b Only 15 drugs were considered, as opposed to 10 priority drugs for children and 16 priority drugs for mothers.
c Acute diarrhea with dehydration, malaria with anemia, pneumonia, tuberculosis, and diabetes.
d Post-partum hemorrhage and neonatal asphyxia.
Determinants of Diagnostic Accuracy
Teacher absence, teacher knowledge and student test scores
Dependent variable: student's math score (school averages)

                                          (1)          (2)          (3)          (4)
Urban                                     0.0130       0.0106       0.0118       0.0114
                                         (0.0108)     (0.0106)     (0.0105)     (0.0108)
Private school                            0.104***     0.0993***    0.0968***    0.0966***
                                         (0.0121)     (0.0124)     (0.0122)     (0.0125)
District poverty rate (2005-06)          -0.000519    -0.000460    -0.000457    -0.000463
                                         (0.000447)   (0.000452)   (0.000442)   (0.000438)
Fraction correct, English section         0.0855       0.0909       0.0803       0.0837
                                         (0.0695)     (0.0687)     (0.0673)     (0.0675)
Fraction correct, Maths section          -0.0148      -0.0192      -0.00895     -0.00918
                                         (0.0456)     (0.0450)     (0.0433)     (0.0435)
Fraction correct, Pedagogy section        0.112*       0.113*       0.120*       0.123**
                                         (0.0629)     (0.0630)     (0.0611)     (0.0610)
Absent from school                       -0.0242
                                         (0.0311)
Absent from class                                     -0.0450**
                                                      (0.0225)
Time spent on teaching/learning tasks                               0.00486***
                                                                   (0.00184)
Time spent on other classroom tasks                                              0.00628
                                                                                (0.00967)
Province effects                          Yes          Yes          Yes          Yes
Constant                                  0.654***     0.668***     0.616***     0.615***
                                         (0.0642)     (0.0626)     (0.0646)     (0.0662)
Observations                              301          301          300          300
R-squared                                 0.372        0.384        0.401        0.398

Regressions on school averages. Robust standard errors in parentheses. *** p<0.01, ** p<0.05, * p<0.1.

Takeaways:
- Teachers' pedagogy knowledge seems to matter more for students' math scores than teachers' math knowledge.
- Absence from class, rather than absence from school, appears to be what really matters for students' test scores.
- Students' test scores increase with teachers' time spent on teaching and learning tasks; time spent on other classroom tasks (e.g. maintaining discipline) has no significant effect.
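The specifications above are ordinary least squares on school averages. As a minimal sketch of that setup, the snippet below fits an OLS regression of a score on two regressors; the data are synthetic (generated with coefficients chosen to echo column (3): a pedagogy-knowledge effect and a teaching-time effect), not the SDI dataset, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300  # roughly the number of school averages in the table

# Synthetic stand-ins for the column-(3) regressors.
pedagogy = rng.uniform(0, 1, n)          # fraction correct on pedagogy section
teaching_time = rng.uniform(0, 40, n)    # time on teaching/learning tasks (assumed scale)

# True data-generating process: assumed coefficients plus noise.
score = 0.6 + 0.12 * pedagogy + 0.005 * teaching_time + rng.normal(0, 0.1, n)

# OLS via least squares on [1, pedagogy, teaching_time].
X = np.column_stack([np.ones(n), pedagogy, teaching_time])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
# beta[1] and beta[2] recover roughly 0.12 and 0.005.
```

A fuller replication would add the remaining controls, province fixed effects, and heteroskedasticity-robust standard errors as reported under the table.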
What does it cost per country?
SDI Steering Committee (Expert Consultation, 31 January 2012)
- LEMMA SENBET, African Economic Research Consortium
- NATHALIE DELAPALME, Mo Ibrahim Foundation
- RUTH LEVINE, Hewlett Foundation
- RITVA REINIKKA, World Bank
- ORY OKOLLOH, Omidyar
- MWANGI KIMENYI, Brookings Institution
- JAKOB SVENSSON, Stockholm Univ.
- LEONARD WANTCHEKON, Princeton Univ.
- SHANTAYANAN DEVARAJAN, World Bank
- MTHULI NCUBE, African Development Bank
- AGNES SOUCAT, African Development Bank
SDI Technical Panel
JAKOB SVENSSON, JISHNU DAS, OTTAR MAESTAD, JAMES HABYARIMANA, TESSA BOLD, DEON FILMER
www.worldbank.org/sdi www.sdindicators.org