
HUMAN INTERACTION RESEARCH INSTITUTE
Founded 1961
Using Behavioral Sciences to Help Nonprofit Organizations Handle Innovation and Change

LESSONS LEARNED FROM EVALUATIONS OF PCBR PROGRAMS: PILOT STUDY

Thomas E. Backer, PhD, Jane Ellen Bleeg & Kate Groves
June 2007

Introduction

The idea for the Lessons Learned From Evaluations of PCBR Programs Project came from a study completed in 2006 for the Carnegie Corporation (Backer, Bleeg & Groves, 2006). Among other findings, the study indicated that about two-thirds of all foundation capacity-building programs have conducted or are conducting some sort of evaluation, and increasingly report that some results are available. This finding suggests that a new feature could be added to the online Philanthropic Capacity Building Resources (PCBR) database, providing summaries of "lessons learned" from these evaluations as part of PCBR's program profiles. Funders seeking to improve their current work or to create new capacity-building initiatives may find this added information valuable. Also, the lessons learned for individual programs can be synthesized into an overview of capacity-building evaluation, a topic that to date has been addressed by only one major study (Linnell, 2003). Such a synthesis could be used in a number of ways to stimulate dialogue and improve philanthropic strategy and practice in the capacity-building field.

The Bruner Foundation awarded a 2007 grant supplement for a small pilot study to begin exploring how this Project might best be undertaken. This brief report presents the methods used to gather and analyze an initial set of data on evaluation lessons learned for PCBR programs, along with the results obtained. The report ends with a set of suggested next steps to implement the full Project - expanding and refining these data, and thereby making them more valuable to the nonprofit sector and to grantmakers investing in capacity building.

Study Method

The PCBR database was analyzed to provide summary data on four items of information routinely collected for all PCBR programs in the system as of Spring 2006 (the most recent system update): (1) whether evaluation of the program is conducted; (2) whether results from evaluation are available; (3) how often evaluation is conducted; and (4) what evaluation methods are used.

New data were gathered via a brief e-mail survey in January 2007, asking funders of programs that reported doing some type of evaluation (either completed or in process) whether they could provide a copy of an evaluation report, along with slightly more detailed information on the evaluation methods used and input about the impact of completed evaluations.

Follow-up phone calls were made to those receiving the e-mail survey who did not respond initially, increasing the response rate. The last survey responses were received in March 2007. Physical files were set up for printed evaluation reports, and an electronic file was established for electronic reports.

Results

PCBR Database Analysis

PCBR currently contains 368 programs. Of these, 223 program profiles indicate that some type of evaluation is in process (86) or has been completed (137). For these 223 programs, respondents report the following (none of these responses have been independently verified):

115 provide evaluation results on request
108 do not provide evaluation results

70 conduct ongoing evaluation of the program
43 evaluate annually
63 evaluate periodically (on a varying schedule)
47 do not state the frequency of evaluation

63 conduct an external evaluation
75 conduct an internal evaluation
24 combine both internal and external approaches
61 do not state whether evaluation is internal or external

20 use surveys as a primary evaluation method
12 use interviews
3 use records/materials review
3 use a participatory evaluation approach
2 collect reports from grantees
111 use multiple/mixed data-gathering methods for evaluation
20 use some other method not fitting any of the above categories
52 do not state what method is used

138 of the foundations whose programs are evaluated list only one such program
21 list two programs that are evaluated
11 list three programs that are evaluated
2 list five programs that are evaluated

PCBR contains profiles both of active programs and of programs that have completed their period of grantmaking or direct service. Of the 47 completed programs that did an evaluation (foundations supporting these programs had indicated they would no longer be active as of the first quarter of 2007), 27 had already completed the evaluation, while evaluation was still in process for the other 20.
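Tallies like those above are straightforward to reproduce from a structured export of the program profiles. The following is a minimal Python sketch, assuming profiles are exported as a list of records; the field names and values shown are illustrative assumptions, not the actual PCBR schema.

    from collections import Counter

    # Hypothetical export of PCBR program profiles; field names are
    # assumptions for illustration, not the real PCBR data model.
    profiles = [
        {"evaluation_status": "completed", "results_available": True,
         "frequency": "ongoing", "eval_type": "external", "methods": ["surveys"]},
        {"evaluation_status": "in process", "results_available": False,
         "frequency": "annually", "eval_type": "internal",
         "methods": ["interviews", "surveys"]},
        # ... remaining records ...
    ]

    # Programs reporting any evaluation activity (in process or completed).
    evaluated = [p for p in profiles
                 if p["evaluation_status"] in ("in process", "completed")]
    print(f"{len(evaluated)} of {len(profiles)} programs report an evaluation")

    # One tally per item of information; missing fields count as "not stated".
    for label, key in [("status", "evaluation_status"),
                       ("frequency", "frequency"),
                       ("type", "eval_type")]:
        print(label, dict(Counter(p.get(key, "not stated") for p in evaluated)))

    # Methods is multi-valued, so flatten before counting.
    print("methods", dict(Counter(m for p in evaluated
                                  for m in p.get("methods", ["not stated"]))))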

E-Mail Survey Analysis

Sponsoring foundations provided responses to the January 2007 e-mail survey for a total of 99 programs (the survey was sent to foundations operating the 223 programs reporting an evaluation activity). Of those respondents:

28 programs provided an electronic copy of an evaluation report
1 program provided a printed copy
26 respondents indicated they compile only internal reports, and did not provide further information
44 respondents indicated they did not create an evaluation report as such

Aside from those providing substantive input, 44 respondents replied to the survey but indicated that they do not have any information available on their evaluation activities, while 80 did not respond at all. Thus the total response rate to the e-mail survey was 64.1%.

Of the 29 programs that sent copies of evaluation reports:

8 came from independent foundations
7 from community foundations
6 from funder collaboratives
5 from family foundations
3 from public foundations

For comparison, the PCBR database currently contains 86 community foundation programs, 64 independent, 40 family, 27 funder collaborative, 28 public, 6 corporate, and 12 other. Seven of the 29 reports were from completed programs; the rest were for programs still active as of the beginning of 2007.

While no formal content analysis of the 29 evaluation reports was conducted, it is apparent that the programs evaluated in these reports share some common themes. Many included some sort of leadership development activity as part of nonprofit capacity building, and a number of others featured opportunities for strengthening nonprofits' use of technology. Several of the programs for which evaluation reports were received emphasized community development and community building as a major aspect. Other common areas of focus were enhancing nonprofits' communications abilities and increasing their capacity to network with each other. These programs included both very large, well-funded multi-year foundation initiatives and smaller, more targeted efforts.

Among respondents to the e-mail survey, use of grantee reports as an evaluation method was more common than in the larger sample of all PCBR programs having evaluations. Use of other evaluation methods was similar to that reported in the larger sample.

A total of 47 respondents to the e-mail survey provided input about the impact of evaluation activities for their capacity-building programs. Responses clustered in three major categories: (1) evaluation led to adjustments in the structure and operation of the capacity-building program itself; (2) changes were made in administrative procedures, such as application for funding, level of funding available, and use of needs assessments; and (3) evaluation results were used to justify investment in a program with trustees or other stakeholders, and to reinforce for foundation staff that they are moving in the right direction. In several cases, evaluation revealed that a program had been replicated elsewhere. Further efforts to gather evidence about the impact of evaluations may help increase the utility of these data.
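The 64.1% response rate reported above follows from the counts in this section: the 99 programs with substantive responses plus the 44 "no information available" replies, measured against the 223 programs surveyed. A short calculation makes the arithmetic explicit (all figures are taken directly from the text):

    # Response-rate arithmetic for the January 2007 e-mail survey.
    surveyed = 223        # programs whose funders received the survey
    substantive = 99      # substantive responses (28 + 1 + 26 + 44 above)
    no_information = 44   # replied, but had no evaluation information
    no_response = 80      # never responded

    responded = substantive + no_information   # 143
    assert responded + no_response == surveyed # 143 + 80 = 223
    print(f"Response rate: {responded / surveyed:.1%}")  # -> 64.1%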

Next Steps - Implementing the Lessons Learned Project

Pilot study results show that there is a large body of evaluative evidence available for the capacity-building grantmaking and direct service programs of American foundations, both for active programs and for those whose work has been completed. The pilot study also makes clear that most of these findings are not easily accessible, and that they have not been analyzed comparatively or synthesized. Moreover, there is a fair amount of input about the impact these evaluative activities have had on philanthropic strategy and practice, but this too has not been analyzed or synthesized. Thus, full implementation of the Lessons Learned Project design described at the beginning of this report appears to have potential value. It is estimated that about six months will be required to complete the following activities, once funding has been secured.

1 - New PCBR Evaluation Section Format

First, a new format for the PCBR program profiles will be created for review by PCBR's advisors and the funder(s) of the Lessons Learned Project. Based upon pilot study results, the revised profile may include an evaluation section like the following (starred items are additions or changes to current components):

Program Evaluated: Yes, No, Underway
Evaluation Results Available: Yes, No
* Frequency of Program Evaluation: Ongoing, Annually (Calendar or Fiscal Year), Periodically, No Information
* Type of Program Evaluation: External Evaluation, Internal Evaluation, No Information
* Evaluation Methods Used: Surveys, Focus Groups, Interviews, Document Reviews, Site Visits, Grantee Self-Reports, Participatory Evaluation, Other, No Information (check as many as apply)
* Summary of Evaluation Lessons Learned and Impact on Foundation Capacity-Building Work: (brief narrative covering both topics - a preliminary draft sample is attached)

Edited evaluation sections in the new format will be prepared for each program in PCBR as of the time this work begins. These will be e-mailed for editorial review to the foundation staff who submitted the raw material. In many cases, the program profile already contains the information needed for every item except the summary at the end. Following review of the draft revised profiles, they will be finalized and uploaded to the PCBR database. The PCBR data-gathering procedure also will be revised so that this information is requested from the beginning for new programs added to the system.

2 - Synthesis Report

Next, the collection of lessons learned and impact statements from all PCBR programs will be analyzed to yield a synthesis in the form of a short report, which when finalized will be published electronically on the PCBR website. The report will be sent to the Lessons Learned funders, PCBR's advisors, and all those whose programs are in the system. Once it is completed, PCBR will publicize the availability of both the synthesis report and the updated database via press releases to some 20 media outlets in the philanthropic and nonprofit sectors, and will send an e-mail announcement to all those whose programs are listed in the PCBR database. These results then will be used to address the larger issues outlined in the next section.
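The revised evaluation section outlined under step 1 above maps naturally onto a small structured record, which could simplify both data entry and later synthesis. The following is a minimal sketch; the class, field names, and allowed values mirror the draft profile outline but are illustrative assumptions, not an actual PCBR schema.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Allowed values drawn from the draft profile outline above (assumed, not
    # the real PCBR data model).
    EVALUATED = {"Yes", "No", "Underway"}
    FREQUENCY = {"Ongoing", "Annually", "Periodically", "No Information"}
    EVAL_TYPE = {"External Evaluation", "Internal Evaluation", "No Information"}
    METHODS = {"Surveys", "Focus Groups", "Interviews", "Document Reviews",
               "Site Visits", "Grantee Self-Reports", "Participatory Evaluation",
               "Other", "No Information"}

    @dataclass
    class EvaluationSection:
        program_evaluated: str                  # one of EVALUATED
        results_available: bool
        frequency: str = "No Information"       # one of FREQUENCY
        eval_type: str = "No Information"       # one of EVAL_TYPE
        # "Check as many as apply" - a subset of METHODS.
        methods: List[str] = field(default_factory=lambda: ["No Information"])
        lessons_and_impact: Optional[str] = None  # brief narrative summary

        def __post_init__(self):
            # Validate entries at creation time so bad values surface early.
            assert self.program_evaluated in EVALUATED
            assert self.frequency in FREQUENCY
            assert self.eval_type in EVAL_TYPE
            assert set(self.methods) <= METHODS

    # Example record for a hypothetical program profile:
    section = EvaluationSection("Yes", True, "Annually", "External Evaluation",
                                ["Surveys", "Interviews"],
                                "Findings guided mid-course corrections.")

A record of this kind would also let the "No Information" categories in the database analysis above be counted mechanically rather than inferred from free text.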

Addressing Larger Issues

Results from this pilot study also indicate that the larger Lessons Learned Project could address a number of important issues about nonprofit capacity building, from both the nonprofit and philanthropic perspectives:

* How to address the limits of evaluation where capacity building is concerned - resources are rarely available to undertake evaluation designs involving control groups, longitudinal data, or data gathered from large populations (Linnell, 2003).

* How to identify and include innovations in evaluation, such as the evaluation dashboard concept (Backer, Bleeg & Groves, 2004).

* How to package results from evaluations for capacity-building practitioners and decision-makers (including funders), and how evaluation methods can be tailored to the practical needs of small, community-based programs, where the resources for doing and paying for evaluation are likely to be especially limited (Backer & Barbell, 2004).

* How to translate lessons learned from evaluations of programs supported by large foundations or funder collaboratives (of which there were quite a few in this research) to individual foundations, many of which are small.

* What can be learned collectively from evaluations about how to design a capacity-building grantmaking or direct service program - setting objectives, undertaking due diligence, assessing capacity-building needs (by nonprofits and by funders), using intermediaries for funding and/or providing capacity-building services, using funder collaboratives, designing evaluation and using evaluation results, and shaping philanthropic strategy regarding capacity building.

* How capacity building complements alternative philanthropic strategies - e.g., use of loan guarantees, partnerships with businesses that support capacity building through discounts or free merchandise, and efforts to promote sustainability beyond the initial investing foundations (Backer & Barbell, 2006).

* How nonprofit organizational capacity building fits with community capacity building (which is how the term is more likely to be used in the international domain).

* How to conduct overall assessments of the field and its impact, which hasn't yet been done (Backer, Bleeg & Groves (2006) found that while a large number of foundations evaluated their programs, most invest only modestly in this activity, with a median cost of $15,000 and about 1% of total philanthropic investment).

* What roles national associations like Grantmakers for Effective Organizations and the Alliance for Nonprofit Management might take in further developing evaluation strategies for foundation capacity-building programs, and in responding to the challenges raised here (e.g., through conference presentations or sharing of materials on their websites).

* What roles might be taken on the same issues by large intermediaries, major management support organizations, regional capacity-building initiatives, and other parts of the national capacity-building infrastructure (Backer & Barbell, 2004).

* How input from the evaluation lessons learned may bear on the content and format of PCBR itself - and on the future of this program, now in its sixth year of operation.

Initial discussion of these issues can be part of the synthesis document described above. Additional progress on these and other topics of importance in the evaluation of foundation capacity-building grantmaking and direct service programs might be made through three follow-up activities, each of which would require additional resources: (1) a small convening of foundations with an interest in funding capacity building and its relationship to the nonprofit sector (such as the Kellogg, Kresge and Mott Foundations); (2) publication of the synthesis as a learning guide through Fieldstone, one of the most important publishers in the capacity-building field - this would require additional resources both for editorial preparation and for a publication subsidy; and (3) presentations about these issues at conferences of GEO, ANM and other national organizations.

This pilot study was funded by a grant from the Bruner Foundation. PCBR has been funded by a consortium of foundations, currently the Bruner and Surdna Foundations.

References

Backer, T.E., Bleeg, J.E. & Groves, K. (2006). Exploring foundation financial investments in nonprofit capacity building. Encino, CA: Human Interaction Research Institute.

Backer, T.E., Bleeg, J.E. & Groves, K. (2004). The expanding universe: New directions in nonprofit capacity building. Washington, DC: Alliance for Nonprofit Management.

Backer, T.E. & Barbell, I. (2006). Keeping a good program going: Five steps to engage local philanthropy. Encino, CA: Human Interaction Research Institute.

Backer, T.E. & Barbell, I. (2004). Models for local infrastructure. The Nonprofit Quarterly, 12 (Special Issue), 50-56.

Linnell, D. (2003). Evaluation of capacity building: Lessons from the field. Washington, DC: Alliance for Nonprofit Management.

EVALUATION LESSONS LEARNED SAMPLE NARRATIVE - DRAFT

Castle Colleagues Program - Samuel N. and Mary Castle Foundation

The Castle Colleagues program, funded by the Castle Foundation, provides leadership and management development training for directors of early childhood education programs in Hawaii, thus increasing those programs' capacity as educational institutions. To plan next steps effectively, Castle wanted data about the impact of the program. Working with one of the first Castle Colleagues, as part of that Colleague's research study for a graduate degree, the foundation conducted an expanded assessment of the program's capacity-building impact. The Colleagues were surveyed by written questionnaire, both to provide five-year follow-up data to an earlier (2000) program evaluation and to ask how they perceive current needs for leadership development and professional support. Results from the evaluation are being used for program enhancement, and to support possible replication of the program in other regions.

Lessons Learned

Before becoming Castle Colleagues, half the respondents had had no prior leadership, management or administrative training in their roles as early childhood education center directors. Since becoming Castle Colleagues, survey respondents reported:

60% have gone back to school or taken college courses
75% have attended workshops, community-based training and/or conferences
75% have given professional presentations
85% say they are more likely to call a peer for advice, support or resources
85% say they are more likely to call other resource people
70% say they are more likely to take on a leadership role
65% say they are more likely to speak up at a meeting
65% say they are more likely to write a grant proposal

The resulting impact on the early childhood education centers these individuals run is clear:

75% now have more programs for children
75% have more activities for parent involvement
70% have increased staff salaries
65% report the quality of their facilities is better
60% have more community partners
55% have reduced staff turnover
50% have submitted more grant proposals than before the program
45% have received more grants than before
40% report an increase in appropriate Board involvement

Impact on Foundation

The evaluation results (both from the first study in 2000 and the more recent one) deepened the Castle Foundation's interest in leadership development for executive directors of early childhood education centers - a principal focus of Castle's philanthropic efforts. The foundation also has encouraged program recipients to submit a joint proposal to local funders for support that can ease the path to continuing education for early childhood site directors and staff. Evaluation findings also enabled the Castle Foundation to make mid-course corrections in its capacity-building intervention after every session, which has helped to increase the overall impact of the program.