User-Friendly Ideas for Project Evaluation. Broader Impacts Evaluation Workshop November 28, 2012


User-Friendly Ideas for Project Evaluation. SP@ISU Broader Impacts Evaluation Workshop, November 28, 2012. Mack Shelley, University Professor, Departments of Statistics and Political Science; Mari Kemis, Assistant Director, Research Institute for Studies in Education (RISE).

Today's Objective. The objective for today is to focus on key aspects of: what NSF expects to see in a project proposal, how to evaluate broader impacts, help with your grant writing, and how to evaluate already-funded projects.

Why Does This Matter? All NSF proposals are required to provide very specific information about two official merit-review criteria of the proposed effort: Intellectual Merit and Broader Impacts. Let's take a closer look at what these criteria really require. We'll focus today on broader impacts, but keep in mind that the intellectual merit must be articulated clearly and strongly, too, and that the broader impacts components of your proposal should be aligned with the intellectual merit components.

Intellectual Merit: advance knowledge and understanding; a qualified PI; explore creative, original, or potentially transformative concepts; a well-organized and well-conceived proposal; sufficient access to resources.

Broader Impacts How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, disability, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?

Which Means What, Exactly? Your one-page Project Summary must specifically address the project's intellectual merit and broader impacts. If the Summary does not specifically address both review criteria in separate statements, the proposal will be returned without review. The Data Management Plan will be reviewed as part of the intellectual merit or broader impacts of the proposal, or both, as appropriate. Mentoring activities provided to postdoctoral researchers supported on the project, as described in a one-page supplementary document, will be evaluated under the Broader Impacts criterion.

Broader Impact Themes and Categories. Workforce Development: REU, Undergraduate, Post-doc, Peer Mentoring, Course Integration. Society: Policy/Management, Societal Impact, Environment. Public Dissemination: Public Accessibility, Public Outreach, COSEE, Informal Science Education. K-12 Education: K-12 Casual Talks, K-12 Curricula, K-12 Formal Activities, K-12 Student Mentoring, K-12 Teachers, RET, GK-12. Instrumentation and Facilities: includes Technological Innovation, Infrastructure, Multi-user Facilities, CREST. Collaboration: includes Interdisciplinary, Industrial, and International Collaborations. NSF-Sponsored Programs: AGEP, CREST, DLESE, GK-12, IGERT, LSAMP, OEDG, RET, REU. Underrepresented Minorities: OEDG, LSAMP, CREST, AGEP, Outreach to Underrepresented Minorities, and Underrepresented Minorities.

Promote Teaching, Training, and Learning: engaging with the broadest possible spectrum of the K-12 community, including K-12 teachers, K-12 students, K-12 administrators, and K-12 guidance counselors.

Think about Broader Impacts through the Lens of Intellectual Merit Metrics: advance knowledge and understanding; a qualified PI; explore creative, original, or potentially transformative concepts; a well-organized and well-conceived proposal; sufficient access to resources.

Advance Knowledge and Understanding. What are the stumbling blocks for student success and engagement in STEM fields? What ethnic/gender groups are underrepresented in the target science field, among both teachers and students? What resources are traditionally unavailable in the K-12 environment that could realistically enhance students' learning? What are the common fundamental conceptual gaps? The review panel will look closely to make sure the PI has identified these points.

PI Qualifications: teaching experience; understanding of standards/testing requirements; familiarity with the K-12 environment; experience mentoring. The review panel will need to be convinced that the PI has identified his/her qualifications that are relevant to the success of the proposal.

Explore Creative, Original, or Potentially Transformative Concepts. Partner with existing successful programs (RETs for in-service educators, REUs for pre-service educators); reach out to the educational research community within the home institution; leverage existing NSF investments in education. It is important to note that the PI does not need to reinvent the wheel for the sake of Broader Impacts.

Well Organized and Conceived. Broader Impacts flow from the PI's science proposal; demographics are considered; the educational arena is addressed; standards are researched and linked to the BI; specific teachers and the administration of participating schools are identified and have submitted letters of commitment; links are made to existing programs/broader impacts; teachers are engaged in the development process. Can the review panel find clear evidence of these efforts?

Sufficient Access to Resources: evidence of connection with individuals and the target community; evidence of affiliation with established programs; an articulated plan for making resources available (funding, transportation, scheduling); a clear plan detailing the logistics of sending teachers to professional conferences to share their work with the PI. The review panel will check carefully to make sure that these resource issues have been addressed satisfactorily.

Stick with Best Practices. Teacher programs: multiple days and summer opportunities; monetary compensation/graduate credit; follow-up meetings; built-in opportunities and time for teacher collaboration; host a teacher cohort; focus on lab technique and process; relevant content; STEM career mapping; involve pre-service teachers; include teachers in the workshop design process. Student programs: PI classroom visits with attention to grade level, covering reasonable content and level of detail/sophistication; provide hands-on experiences; go with classes on field trips; contextualize the PI's science research in society; give meaningful and relevant presentations; career mapping; try to relay excitement about science; share new/emerging careers.

Yeah, but What do Broader Impacts Look Like in Practice? Examples of activities to demonstrate broader impacts are available at: http://www.nsf.gov/pubs/gpg/broaderimpacts.pdf These are among the specifics that are looked for by reviewers and by NSF program staff in their funding decisions. This information is not exhaustive, and not all examples need to be present in any given proposal. Proposal authors should be creative in demonstrating the broader impacts of their projects. Try to link similar kinds of activities you already may have underway to the research and education projects you are proposing for funding. Proposers also should consider what types of activities best suit their interests, while enhancing the broader impacts of the project being proposed.

Examples of Activities to Advance Discovery and Understanding While Promoting Teaching, Training, and Learning Integrate research activities into the teaching of science, math, and engineering at all educational levels (e.g., K-12, undergraduate science majors, non-science majors, and graduate students). Include students (e.g., K-12, undergraduate science majors, non-science majors, and/or graduate students) as participants in the proposed activities as appropriate. Participate in the recruitment, training, and/or professional development of K-12 science and math teachers. Develop research-based educational materials or contribute to databases useful in teaching (e.g., K-16 digital library). Partner with researchers and educators to develop effective means of incorporating research into learning and education. Encourage student participation at meetings and activities of professional societies. Establish special mentoring programs for high school students, undergraduates, graduate students, and technicians conducting research. Involve graduate and post-doctoral researchers in undergraduate teaching activities. Develop, adopt, adapt, or disseminate effective models and pedagogic approaches to science, mathematics, and engineering teaching.

Examples of Activities to Broaden Participation of Underrepresented Groups Establish research and education collaborations with students and/or faculty who are members of underrepresented groups. Include students from underrepresented groups as participants in the proposed research and education activities. Establish research and education collaborations with students and faculty from non-Ph.D.-granting institutions and those serving underrepresented groups. Make campus visits and presentations at institutions that serve underrepresented groups. Establish research and education collaborations with faculty and students at community colleges, colleges for women, undergraduate institutions, and EPSCoR (http://www.nsf.gov/od/oia/programs/epscor/about.jsp) institutions. Mentor early-career scientists and engineers from underrepresented groups who are submitting NSF proposals. Participate in developing new approaches (e.g., use of information technology and connectivity) to engage underserved individuals, groups, and communities in science and engineering. Participate in conferences, workshops, and field activities where diversity is a priority.

Examples of Activities to Enhance Infrastructure for Research and Education Identify and establish collaborations between disciplines and institutions, among U.S. academic institutions, industry, and government, and with international partners. Stimulate and support the development and dissemination of next-generation instrumentation, multi-user facilities, and other shared research and education platforms. Maintain, operate, and modernize shared research and education infrastructure, including facilities and science and technology centers and engineering research centers. Upgrade the computation and computing infrastructure, including advanced computing resources and new types of information tools (e.g., large databases, networks, and digital libraries). Develop activities that ensure multi-user facilities are sites of research and mentoring for large numbers of science and engineering students.

Examples of Activities for Broad Dissemination to Enhance Scientific and Technological Understanding Partner with museums, nature centers, science centers, and similar institutions to develop exhibits in science, math, and engineering. Involve the public or industry, where possible, in research and education activities. Give science and engineering presentations to the broader community (e.g., at museums and libraries, on radio shows, and in other such venues). Make data available in a timely manner by means of databases, digital libraries, or other venues such as CD-ROMs. Publish in diverse media (e.g., non-technical literature, websites, CD-ROMs, press kits) to reach broad audiences. Present research and education results in formats useful to policy-makers, members of Congress, industry, and broad audiences. Participate in multi- and interdisciplinary conferences, workshops, and research activities.

Examples of Activities to Demonstrate Benefits to Society Demonstrate the linkage between discovery and societal benefit by providing specific examples and explanations regarding the potential application of research and education results. Partner with academic scientists, staff at federal agencies and with the private sector on both technological and scientific projects to integrate research into broader programs and activities of national interest. Analyze, interpret, and synthesize research and education results in formats understandable and useful for nonscientists. Provide information for policy formulation by Federal, State, or local agencies.

Use of the Term "Broader Impacts" in ISU Abstracts: 1985-2011. [Chart: percent of abstracts with BI statements, by year.]
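Percentages like those in the chart were presumably produced by checking each year's award abstracts for the phrase "broader impacts." A minimal sketch of that kind of count follows; it assumes the abstracts are available as simple (year, text) pairs, which is an assumed input format for illustration rather than a description of the actual ISU analysis.

```python
from collections import defaultdict

def percent_bi_by_year(abstracts):
    """Percent of abstracts per year that contain a broader-impacts statement.

    `abstracts` is an iterable of (year, text) pairs -- an assumed input
    format for illustration, not the actual ISU abstract data.
    """
    totals = defaultdict(int)
    with_bi = defaultdict(int)
    for year, text in abstracts:
        totals[year] += 1
        if "broader impact" in text.lower():
            with_bi[year] += 1
    return {year: 100.0 * with_bi[year] / totals[year] for year in sorted(totals)}

# Example with made-up abstracts:
# print(percent_bi_by_year([(2011, "The broader impacts include..."), (2011, "We study...")]))
```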

Recent Articles About BI. Kamenetzky, J.R. (2012). Opportunities for Impact: Statistical Analysis of the National Science Foundation's Broader Impacts Criterion. Science and Public Policy, 2012, pp. 1-13. Roberts, M. (2009). Realizing Societal Benefit from Academic Research: Analysis of the National Science Foundation's Broader Impacts Criterion. Social Epistemology: A Journal of Knowledge, Culture, and Policy, 23, pp. 199-219. Boardman, P.C., & Ponomariov, B.L. (2007). Reward Systems and NSF University Research Centers: The Impact of Tenure on University Scientists' Valuation of Applied and Commercially Relevant Research. The Journal of Higher Education, 78(1), pp. 51-70.

What's New at NSF? Grant Proposal Guide (NSF 13-1, effective January 14, 2013): changes in merit review. The project must be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge. NSF projects should contribute broadly to achieving societal goals. Broader impacts can be the result of the research itself, research-related activities, or activities that are supported by and complementary to the research project. Outcomes related to both major review criteria should be assessed/evaluated at either the individual- or aggregate-project level.

So, What Else is New? BI evaluation is not strictly focused on the evaluation done by a PI (or by the evaluator working with the PI team) to judge the success of the BI activities. While that is essential, equally important is the need for the PI to use relevant research, evaluation results, and resources in developing the plan for the BI activities, improving on prior results where appropriate, and writing about the results. Two handy webpages on the revised NSF review criteria: http://www.nsf.gov/bfa/dias/policy/merit_review/resources.jsp and http://www.nsf.gov/bfa/dias/policy/merit_review/factsheets/proposers.pdf. One of the unknowns is how the revisions will be implemented over time and the extent to which they will be enforced in the review process.

Changes to the Proposal Process: Anything Else? The Project Summary will contain the following required separate statements: Overview of the Project, Statement on Intellectual Merit, and Statement on Broader Impacts. The Project Description must contain a separate section with a discussion of the Broader Impacts. Proposing organizations must certify that organizational support will be made available, as described in the proposal, to address the Broader Impacts and Intellectual Merit activities. Annual and Final Reports must address activities related to the Broader Impacts criterion that are not intrinsic to the research.

NSF Resources for Grant Preparation
GPG Chapter 2, Proposal Preparation Instructions: http://www.nsf.gov/pubs/policydocs/pappguide/nsf11001/gpg_2.jsp#iic2j
GPG Chapter 3, NSF Proposal Processing and Review: http://www.nsf.gov/pubs/policydocs/papp/gpg_3.jsp
Sample postdoctoral mentoring plan: www.nsf.gov/eng/iip/sbir/sample_postdoc_mentoring_plan.doc
Data management plan template: http://dataservices.gmu.edu/data-management/nsf-dmptemplate/

The User-Friendly Handbook. Developed to provide NSF project directors and principal investigators with a basic guide for evaluating NSF's educational projects. Aimed at people who need to learn more about both the value of evaluation and how to design and carry out an evaluation, rather than at those who already have a solid base of experience in the field. Blends technical knowledge and common sense to meet the special needs of NSF and its stakeholders. http://www.westat.com/westat/pdf/news/ufhb.pdf

The User-Friendly Handbook. Why should NSF grantees conduct evaluations? Evaluation produces information that can be used to make continuous improvements in the project. An evaluation can document what has been achieved: the extent to which goals are reached and desired impacts are attained. Evaluation frequently provides new insights or new, unanticipated information. There is an inherent interrelationship between evaluation (formative and summative) and program implementation. Evaluation also provides information for communicating the worth of the project to stakeholders, to the public, and up the line to senior decisionmakers and funders, including for the Government Performance and Results Act (GPRA), the Office of Management and Budget's Program Assessment Rating Tool (PART), and NSF assessment and reporting requirements.

The User-Friendly Handbook. Different types of evaluation: formative evaluation, implementation evaluation, progress evaluation, and summative evaluation.

The User-Friendly Handbook. Six phases of project evaluation: (1) develop a conceptual model of the program (focused on the broader impacts), such as a logic model, and identify key evaluation points; (2) develop evaluation questions and define measurable outcomes; (3) develop an evaluation design; (4) collect data; (5) analyze data; (6) provide information to interested audiences (i.e., tell your story).

The User-Friendly Handbook. How should I go about getting evaluation data? Surveys, interviews, social network analysis, focus groups, observational methods, tests, document analysis, key informants, and case studies. Consider the cultural context to make meaning of the data that have been collected. Attend to procedural ethics (e.g., Institutional Review Boards) and relational ethics (the connection between the evaluator and the evaluated).
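As one illustration of the social network analysis item above, here is a brief sketch using the networkx library; the collaboration pairs are invented for illustration only, not data from any actual project.

```python
import networkx as nx

# Hypothetical collaboration pairs observed during a project
# (invented for illustration only).
collaborations = [
    ("mentor_A", "student_1"), ("mentor_A", "student_2"),
    ("mentor_B", "student_2"), ("student_1", "student_3"),
]

G = nx.Graph()
G.add_edges_from(collaborations)

print("Density:", nx.density(G))                      # overall connectedness of the group
print("Degree centrality:", nx.degree_centrality(G))  # most-connected participants
```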

A Sample Logic Model. Resources: NSF funding; in-kind financial contributions from the University of Iowa; Principal Investigator; co-Principal Investigators; participating faculty mentors; regular faculty advisors; peer mentors; support staff. Activities: recruitment; orientation; meetings between GEEMaP students and faculty advisors and peer mentors; social events; quantitative data collection; qualitative data collection via surveys, interviews, and focus groups. Outputs: students recruited into the project; website; each cohort of GEEMaP students continues in the program; publications; presentations. Outcomes: student retention; enhanced student satisfaction; GEEMaP students are retained; early cohorts of GEEMaP students graduate. Impact: a more diverse quantitative science workforce; career paths pursued by GEEMaP doctoral earners; scientific contributions arising from the project.
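If it helps to keep the logic model alongside the evaluation plan, one simple and entirely optional way to encode it is a plain dictionary keyed by column. The entries below paraphrase the GEEMaP example; the encoding itself is an illustration, not part of the project's actual evaluation tooling.

```python
# Illustrative encoding of the sample logic model; the column names come from
# the slide above, but storing it this way is an assumption for demonstration only.
logic_model = {
    "Resources": ["NSF funding", "University of Iowa in-kind contributions",
                  "PI and co-PIs", "faculty mentors and advisors", "peer mentors", "support staff"],
    "Activities": ["recruitment", "orientation", "student-mentor meetings", "social events",
                   "quantitative and qualitative data collection"],
    "Outputs": ["students recruited", "project website", "cohorts continuing in the program",
                "publications", "presentations"],
    "Outcomes": ["student retention", "enhanced student satisfaction", "graduation of early cohorts"],
    "Impact": ["more diverse quantitative science workforce", "career paths of doctoral earners",
               "scientific contributions"],
}

# Quick check that no column of the model has been left empty.
for column, items in logic_model.items():
    assert items, f"Logic model column '{column}' is empty"
```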

AEIOU: An Evaluation Approach. A(ccountability), E(ffectiveness), I(mpact), O(rganizational factors), U(nanticipated outcomes).

Evaluation Design. "Compared to what?" is a common evaluation question. So, your REU produces totally awesome results, but is that effect due to the REU? To RCT or not to RCT? Randomized controlled trials require the ability to randomize REU treatment to some students (probably groups of students) and withhold it from others. Quasi-experimental methods: if you can't randomize students to REU treatment/control, you should do your best to control for differences between REU and non-REU students. This is where you need to work closely with your friendly local project evaluator, and maybe with a research methodologist and/or statistician.
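A minimal sketch of the quasi-experimental idea, using an ordinary least squares regression to adjust the REU/non-REU comparison for observed pre-existing differences. The column names and values are invented for illustration; a real analysis would be designed with the evaluator and statistician mentioned above.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student records (invented values for illustration only).
df = pd.DataFrame({
    "stem_persistence": [1, 1, 0, 1, 0, 1, 1, 0],  # outcome: stayed in a STEM major
    "reu":              [1, 1, 1, 1, 0, 0, 0, 0],  # 1 = participated in the REU
    "prior_gpa":        [3.6, 3.1, 2.8, 3.9, 3.5, 3.0, 2.9, 3.7],
    "first_gen":        [0, 1, 1, 0, 0, 1, 0, 1],  # first-generation college student
})

# Adjust the REU comparison for observed pre-existing differences.
# (A linear probability model for simplicity; logistic regression, matching,
# or propensity scores are common alternatives.)
model = smf.ols("stem_persistence ~ reu + prior_gpa + first_gen", data=df).fit()
print(model.params["reu"])  # adjusted difference associated with REU participation
```

The point is only that differences between participants and non-participants are modeled explicitly rather than ignored; the choice of method belongs to the evaluation team.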

Specifics for REUs. Formative evaluation helps project staff improve the local REU project by identifying better ways of recruiting eligible students, discovering obstacles that keep potential research mentors from participating, and examining the apparent strengths and weaknesses of the way this REU project is implemented. This could be done by the evaluator working with project staff to develop a logic model (to reveal any apparent gaps in the project's rationale and lead to the creation of a better project plan), observing the project in operation, and interviewing individuals from several groups: project staff, research mentors, undergraduate student participants, and potential mentors and eligible students who did not participate.