Preparing for 21st Century Program Evaluation
Yolanda Yugar and Leslie McConnell, Allegheny Intermediate Unit
State Evaluators for PA 21st Century

In this session
- Getting started
- Data collection
- Analysis & reporting
- Using evaluation findings

Why evaluate?
- Improve programs: use findings to identify strengths and weaknesses and develop plans
- Maximize services to meet student needs
- Demonstrate accountability to stakeholders (parents, students, community, partners, etc.)
- Provide information for promotion and future grant applications
- Inform sustainability efforts and increase efficiency
- Use evaluation to work smarter, not harder!

Program Evaluation 1
Evaluations are used to
- Demonstrate accountability
- Fulfill reporting requirements
- Assess needs
- Improve programs
- Determine the relationship between cost and benefit
- Determine feasibility and replicability

Evaluation involves
- Connecting performance with an examination process
- Collecting and analyzing data
- Interpreting data and making comparisons
- Comparing results with expectations
- Considering the implications of findings
- Making recommendations and plans for program improvement

21st CCLC Evaluation Focus
- Evaluation and reporting are primarily focused on the performance measures outlined in the RFA/your application and on the GPRA measures:
  - Student academic gains
  - School attendance & discipline
  - Behavioral & social benefits
  - Implementation and activities
- Grantees may collect/examine additional components
Government Performance and Results Act (GPRA)
http://www2.ed.gov/programs/21stcclc/performance.html

All grantees must have an external evaluator.
- The local evaluator should not only produce the local evaluation report but also collaborate with the grantee throughout the grant period.
- The local evaluator's role may vary depending on grantee needs.
- Be sure that your expectations of the local evaluator reflect what is in the evaluator's agreement with your organization.
- At minimum, the local evaluator should complete an annual, comprehensive, summative local evaluation report. Additionally, they may take on other common evaluator roles.
Common Evaluator Roles
- Develop or help develop the overall plan for evaluating the intervention.
- Help staff understand evaluation and how it can support planning and implementing more effective programming.
- Make recommendations for improvement.
- Help staff use data-gathering methods or instruments in an appropriate and reliable way.
- Support grantees in identifying existing data instruments or developing new ones.
- Collect data from various sources.
- Analyze and interpret data.
- Enter data/information into online state/federal reports, and/or prepare data/information so programs can enter it themselves.
- Conduct site visits, interviews, or focus groups.
- Provide ongoing data- and evaluation-related technical support.

Role of the Program in Evaluation
- Sharing the program's needs and requirements with the evaluator
- Understanding what is in the RFA and the grantee's application
- Collaborating to develop an appropriate and implementable evaluation plan
- Collecting, organizing, and maintaining program-level data (e.g., attendance)
- Collecting student data (especially if the grantee is also the school)
- Disseminating evaluation findings
- Using evaluation findings
- Ensuring that evaluation requirements are met
Role of the AIU State Evaluation Team
- Provide grantees with information and technical assistance for data collection, analysis, and reporting
- Communicate with grantees about evaluation requirements and reporting
- Prepare state-level reports that summarize data from monitoring and grantee reports
- Manage the PA Grantee Report online system and distribute grantee logins
- Provide grantees/evaluators with evaluation resources

What AIU Does with Grantee Reports/Data
- Export raw entries from federal and state systems
- Analyze data overall and by cohort
- Examine grantee completion of reporting components
- Examine grantee outcomes
- Provide recommendations to PDE about PA 21st CCLC
- Produce:
  - A summary report of findings for PDE
  - Grantee results for technical assistance, planning, and training

In this session
- Getting started
- Data collection
- Analysis & reporting
- Using evaluation findings
Data security requirements vary by who you serve.
- Family Educational Rights and Privacy Act (FERPA)
- Data safeguarding plan template
- Student data permission form
- Formal agreement between the program and LEAs outlining data needs and timeline

There's a lot of information to collect about students.
- Student demographics
- PSSA/PASA for Grades 3-8
- PASA/Keystone Exam for Grade 11
- School attendance
- Report card grades
- School discipline/behavior
- PPICS Teacher Survey
- Student feedback
- Feeder school
- Program attendance by student

There's a lot of information to collect about the program.
- Operations: weeks, hours, days
- Program strategies
- Partners and their contributions
- Parent activities
- Staffing
- Student activities
- Stakeholder feedback
Collect data as you go.
- Data often cannot be recreated accurately later.
- Collect, enter, and organize information as you go to save time later and preserve accuracy.

Teacher Survey
- Administer on paper, via email, or online
- Know how to access classroom teachers
- Give teachers advance notice
- Reinforce that it is required
- Follow up with non-responsive teachers

In this session
- Getting started
- Data collection
- Analysis & reporting
- Using evaluation findings
Make sure the analysis method is appropriate to the data source.
- Report card grades: one-half letter grade or 5 percentage points (federal method)
- Year-to-year comparisons of PSSA/PASA must use consecutive years
- Consult assessment guides

There are different ways to cut data.
- By cohort
- By center
- By feeder school
- By grade or grade band
- By program attendance:
  - 1-29, 30-59, 60-89, or 90+ days
  - Regular attendee (30+ days) vs. non-regular attendee

Reporting Requirements
- Federal report (system TBD): due October 31 each year
- State PA Grantee Report: due October 31 each year
- Monitoring (report completed by the program officer): one per three-year grant cycle
- Quarterly Performance Report (due 15 days after the close of each quarter) and Year-End Performance Report
- Local evaluation report: due October 31 each year
- Expenditure reports: due monthly
- Equipment Inventory: due October 31 each year
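The attendance cuts above are simple to automate. A minimal sketch in Python, assuming attendance is tracked as days attended per student (the function and variable names are illustrative, not part of any official 21st CCLC tool):

```python
# Bucket program attendance into the reporting bands listed above
# (1-29, 30-59, 60-89, 90+ days) and flag "regular attendees"
# (students attending 30 or more days).

def attendance_band(days):
    """Return the reporting band for a student's attendance days."""
    if days < 1:
        return "0"          # did not attend
    if days <= 29:
        return "1-29"
    if days <= 59:
        return "30-59"
    if days <= 89:
        return "60-89"
    return "90+"

def is_regular_attendee(days):
    """Regular attendee = attended 30 or more days."""
    return days >= 30

# Example: tally students per band from a {student_id: days} record.
attendance = {"s01": 12, "s02": 45, "s03": 90, "s04": 30}
bands = {}
for days in attendance.values():
    band = attendance_band(days)
    bands[band] = bands.get(band, 0) + 1
# bands -> {"1-29": 1, "30-59": 2, "90+": 1}
```

The same tally loop works for any of the other cuts (by cohort, center, feeder school, or grade band) once the grouping key is swapped in.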
You need to know
- Each cohort is reported separately.
- For federal reporting, centers must be reported separately.
- For the PA Grantee Report, centers must be combined.
- Reports completed in summer/fall 2015 will cover summer 2014 and school year 2014-15.

Federal Reporting (anticipated content)
- Basic grantee and center information
- Feeder schools
- Implementation (operations, activities, staffing)
- Student demographics and program attendance
- Report card grades (fall to spring) for students attending 30+ days during the reporting period (summer & school year, or school year only)
- Teacher surveys for students attending 30+ days during the reporting period (summer & school year, or school year only)
- Reading & math state assessment data (PSSA/PASA Grades 3-8, PASA Grade 11) or Keystone Exam Literature & Algebra I (Grade 11 only)
- Student results reporting in PPICS is by individual student.

Teacher Survey (anticipated)
- One survey per student
- Must be completed by the student's school-day classroom teacher (reading/language arts or math)
- Suggested: select the teacher based on student need
- Must use the required questions/format
- Methods of collection:
  - Paper form
  - Word form
  - Web-based survey: www.aiu3.net/teachersurvey21c
  - Combination of paper/web
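The fall-to-spring report card comparison can be sketched the same way. This is an illustrative check using only the 5-percentage-point threshold mentioned earlier for the federal method; the half-letter-grade variant depends on each district's grading scale, so it is not modeled here:

```python
# Count students whose spring percentage grade improved by at least
# 5 points over fall. The records and field layout are hypothetical.

def improved(fall_pct, spring_pct, threshold=5.0):
    """True if the spring grade is at least `threshold` points above fall."""
    return (spring_pct - fall_pct) >= threshold

students = [
    ("s01", 72.0, 78.0),   # +6 points -> improved
    ("s02", 85.0, 87.0),   # +2 points -> not improved
]
improved_count = sum(1 for _, fall, spring in students if improved(fall, spring))
# improved_count -> 1
```

Whatever method you use, apply it consistently across centers so the counts you report are comparable.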
PA Grantee Report Content
- Evaluation implementation
- Local assessment results (e.g., 4Sight, DIBELS)
- School attendance
- School discipline
- Parent, student, school administrator, partner, and staff feedback
- What grantees include in the PA Grantee Report is based on what they identified in their applications.
- Student results reporting in the PA Grantee Report is aggregate student results.

Local Evaluation Report Content
- Analysis, interpretation, explanation, and/or graphs/tables of implementation and outcomes data (including what may be included in federal and state reports)
- Examination of results compared to local performance indicators (see your application)
- Optional: case studies, interview or focus group results, site visit findings, other feedback, other areas of interest, etc.
- Never include individual student data in local report(s); students should not be identifiable in reports.

Your local evaluation report should
- Have an introduction, overview, and/or provide some context about the program
- Identify the individual or group that produced the report
- Provide operational and implementation details in addition to outcomes
- Include evaluator recommendations for improvement
- Be comprehensive, covering all aspects of implementation and results
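Because the PA Grantee Report takes aggregate results and local reports must never identify individual students, it helps to drop identifiers before tabulating. A minimal sketch, with hypothetical field names:

```python
# Turn individual student records into aggregate counts for reporting.
# Identifiers are discarded; only category labels are counted, so no
# individual student appears in the output.

from collections import Counter

records = [
    {"id": "s01", "grade_band": "K-5", "regular": True},
    {"id": "s02", "grade_band": "K-5", "regular": False},
    {"id": "s03", "grade_band": "6-8", "regular": True},
]

# Keep only (grade_band, regular-attendee flag), then count occurrences.
aggregate = Counter((r["grade_band"], r["regular"]) for r in records)
# e.g. aggregate[("K-5", True)] == 1
```

When a cell count is very small, consider suppressing or combining it, since tiny cells can make individual students identifiable even in aggregate tables.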
Your local report is both a product and a tool.
- Overview
- Narrative and visuals
- Operations & implementation
- Interpretation of results
- Local performance measure progress/achievement
- Recommendations

In this session
- Getting started
- Data collection
- Analysis & reporting
- Using evaluation findings

Findings must be shared with stakeholders.
- Advisory board meetings
- Newsletters
- Parent meetings
- Website
Evaluation findings are used to improve programs.
- Evaluator recommendations
- Strengths: keep doing them
- Areas of decline or deficiency
- Stakeholder feedback
- New grant applications

Other tips
- Have one complete data source that everyone uses for reporting.
- Reporting should be as accurate and complete as possible.
- Don't wait until September to start thinking about the reports due in October.
- Evaluation and reporting are not just about compliance.

Coming up next
- Quarterly Performance Reports (QPR)
- Evaluation resources
- Data tools
Contact the AIU evaluators with any questions.
- Leslie McConnell, Leslie.McConnell@aiu3.net, 412-394-5821
- Yolanda Yugar, Yolanda.Yugar@aiu3.net, 412-394-5939