National Science Foundation. Breakout 3: Evaluators: Business Meeting and Update. NSF Program Briefing


Transcription:

National Science Foundation (Where Discoveries Begin). Breakout 3: Evaluators: Business Meeting and Update. NSF Program Briefing, 2015 NSF I/UCRC Annual Meeting

Agenda Items:
1) Thank you
2) NSF expectations
3) FY15 snapshot
4) Evaluator Program Business

At a bare minimum, the evaluator is expected to:
1) Attend semi-annual evaluators' meetings (typically in January and June);
2) Attend semi-annual Industrial Advisory Board meetings (if unable to attend an IAB meeting, please find a substitute evaluator), and lead and/or assist in the implementation of the LIFE feedback process. (Note: Phase III evaluators may elect to attend only one IAB meeting per year.)
3) Prepare an "Evaluator's Report" with cover sheet when a new Center is born, and provide an annual narrative summary of significant Center developments for submission to NCSU and your Center Director; this must include an attempt to document a Center success case study and/or economic impact assessment (see "Identifying and Documenting IUCRC Center Success Stories and Economic Impacts");
4) Complete the Semi-Annual Meeting Best Practice Checklist (Word and PDF versions) at each IAB meeting and attach it to the Annual Evaluator's Report for each I/UCRC;
5) Administer "process/outcome" questionnaires to faculty and Industrial Advisory Board members annually;
6) Prepare an annual report based on the process/outcome questionnaire data for your Center and submit it to your Center Director;
7) Forward process/outcome questionnaire data to the evaluation team at NCSU;
8) Provide information and feedback to NSF; and
9) Provide information and feedback to your Center Director.

As a reminder...

Membership Requirements
A single-university center must have a minimum of $400,000 annually in membership fees with a minimum of eight full members.
A multi-university Phase I center must have a minimum of $300,000 annually in membership fees AND a minimum of six full members, AND each site in the center must have a minimum of $150,000 annually in membership fees AND a minimum of three full members.
For the first year of operation, the $150k must be all IN CASH. However, in-kind contributions (with NSF's approval) may be allowed, totaling no more than one full membership fee.

IUCRC Memberships
In Cash: check to the center/sites, or via MIPR or IAA.
In Kind: In the first year of Phase I, totaling no more than one full membership fee (with NSF's approval). In subsequent years/phases, all in-kind contributions must be approved by the IAB.
A member can buy more than two memberships but has voting power equivalent to only two memberships. A member, or multiple members, can invest additional funding to accelerate the center's projects.

IUCRC Eligible Members
Private or public sector organizations:
- Private companies of any size
- Local, state, and federal agencies
- Trade organizations or associations (must sign the addendum for Associations and Institutes)
Distinct entities/organizations within the same company/agency count as individual members and buy individual memberships (examples: GM Chrysler and GM Buick; Army Research Office and Army Weapons and Materials).

Quiz: right or wrong? (A sketch of the underlying minimums follows below.)
- You can only count a maximum of two memberships toward your minimum requirement.
- With a $50k membership and 2 universities, it is not necessary to have 6 different members; 3 members with 2 memberships each would be acceptable.
- Company A buys $50k in membership fees at each site of a center; this counts as ONE full member toward the NSF minimum membership requirement, but the company has TWO votes.
- Company A Division X buys $50k in membership fees at one site, and Company A Division Y buys $50k in membership fees at the same or at another center site; this counts as TWO full members toward the NSF minimum membership requirement.
Always delegate these questions to NSF, as well as questions related to NSF policies, submissions, supplements, etc.
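To make the membership arithmetic concrete, here is a minimal sketch encoding only the Phase I minimums stated on the Membership Requirements slide above. It is an illustration, not an official NSF compliance tool; the function name and input format are invented for this example.

```python
# Minimal sketch of the Phase I membership minimums described in this
# briefing (illustrative only; not an official NSF tool).

def meets_phase1_minimums(sites):
    """sites: one (annual_membership_fees, full_member_count) pair per site."""
    total_fees = sum(fees for fees, _ in sites)
    total_members = sum(count for _, count in sites)
    if len(sites) == 1:
        # Single-university center: $400k/year and at least eight full members.
        return total_fees >= 400_000 and total_members >= 8
    # Multi-university center: $300k/year and six full members overall,
    # plus $150k/year and three full members at every site.
    center_ok = total_fees >= 300_000 and total_members >= 6
    sites_ok = all(fees >= 150_000 and count >= 3 for fees, count in sites)
    return center_ok and sites_ok

# Example: a two-site center with $150k and 3 full members at each site.
print(meets_phase1_minimums([(150_000, 3), (150_000, 3)]))  # True
```

The quiz's subtleties (how divisions of one company count, the two-vote cap) are deliberately not encoded here; as the slide says, delegate those questions to NSF.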

SRO Membership Certification: New Form
- Annual certification
- Optional
- The form is editable: you can add columns if the center is in NCE, or if you are certifying for all sites and need to indicate which site, and you can remove the instructions, but you must provide the information requested.

Need your help with the following issues:
1. Along with your Semi-Annual Meeting Best Practice Checklist, email us the updated cover sheet of the Evaluator's Report after each IAB meeting.
2. Ensure that each center has an active website and that it is updated periodically.
3. Check with the center admin and director that all presentation material is posted in the secure section at least 3 days before the IAB meeting. Inform us if there is resistance from the center in doing so.
4. A good website and brochure must be developed before the planning meeting.

Critical Mass at Meetings
Make sure the center secures the critical number of members needed to make the meeting meaningful. For a multi-institution center with only 3-5 potential members at the planning meeting (below the minimum membership requirement for the Phase), or with one site absent, the meeting is not worth anyone's time (and money) and might be counterproductive. Stay on top of the PIs, and if you sense this will happen, inform NSF immediately. The travel budget comes out of the Program Budget!

Scheduling Your I/UCRC Meetings
1. Access the I/UCRC Public Calendar to determine if your preferred dates are available.
2. Contact Kevin Simmons to confirm NSF availability, and contact your evaluator.
3. If your dates are confirmed, schedule your meeting.
Please try to schedule your meeting immediately before or after another meeting in the same town or state. NSF is going virtual! The travel budget comes out of the Program Budget!

Where We Are and FY15 Snapshot
FY14 active and in NCE: 52 ENG centers, 25 CISE centers, 17 ENG planning, 2 CISE planning.
Ending (as of now, a handful are in trouble):
- ENG: 5 ending and in NCE (CNDE Shelly, CDADIC Scott, CAVE Davis, PSERC Doering, CBERD Gray), plus at least two in big trouble.
FY15, first round:
- ENG: received 60 proposals (29 projects); anticipates funding about 25%, which will require just two evaluators.
- CISE: anticipates the need for three evaluators.

Evaluator Program Business
1) Numbers and recruiting strategy
Other items

IUCRC PROGRAM: FINDINGS FROM THE CENTER STRUCTURE DATABASE
Draft results, FY 2013-2014 (data collected from 65 of 66 total active centers)
Trends from the Center Structure Database: 1980-2014
Gray, D.O., Leonchuk, O., & McGowen, L.C., North Carolina State University

Overview (full slide deck on the IUCRC Evaluator website: www.ncsu.edu/iucrc)
- Number of Centers and Funding
- Membership
- Outcomes: IP Events and Students
- ENG vs. CISE comparison
- Comparison by Phase

Understanding Center Count and Activity Statistics, FY 2014
66 NSF-funded centers > 1 yr. (completed a project year) + 11 NSF-funded centers < 1 yr. (no project year completed) = 77 funded centers.
Statistics cover total and average activity.

[Timeline diagram: NSF and Center Budgeting Life Cycle, showing how NSF fiscal years (Oct FY1 through Oct FY2) overlap center project years and when the IUCRC report is completed.]

[Chart: Center Life Cycle, plotting active centers in the current year against phased-out centers (cumulative record). Data current for NSF FY2014. Source: NSF-I/UCRC Center Structure Database.]

[Chart: Single-Site vs. Multi-Site Centers by year. Data current for NSF FY2014. Source: NSF-I/UCRC Center Structure Database.]

[Chart: Active Centers and Sites by year, 1998-2014, plus 5 international sites; roughly 3 sites per center. By FY2014, 77 centers and 214 sites. Data current for NSF FY2014. Source: NSF-I/UCRC Center Structure Database.]

[Chart: Total Program Funding by year, in millions of dollars. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Chart: NSF Budget by Year, 1980-2014, in millions, broken out by ENG, CISE, and the FRP Program. Other programs co-fund too! Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Chart: Total Funding by Source in dollars, 1980-2013, in millions. Sources: University; Other (federal, non-federal, and other cash); State; Other Industry; Industrial Membership Fees; Other NSF; IUCRC. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Charts: Industrial Memberships, 1985-2014: total number of memberships and average number of memberships per center. The average membership figure is volatile.]
Complex changes behind the average:
- 2011: Phase 3 introduced = 4 mature centers added, increasing the average.
- 2012: BSAC re-enters via Phase 3, increasing the average.
- 2011-2014: Massive growth in newly established centers (N = 32), decreasing the average.
Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.

[Chart: Average Membership Turnover, 1989-2014: members added vs. members who left in each FY. Newly funded centers' members are not counted as members added. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Chart: Member Turnover Rate, 1992-2014 (0-35%). Turnover is near its lowest since 1998. Turnover % = members terminated in year X+1 / total members in year X. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]
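The turnover formula quoted on this slide amounts to a one-line calculation; the sketch below simply restates it, with member counts invented for illustration.

```python
# Turnover % = members terminated in year X+1 / total members in year X
# (formula from the slide; the counts below are invented for illustration).

def turnover_rate(terminated_next_year, total_this_year):
    return terminated_next_year / total_this_year

print(f"{turnover_rate(90, 1000):.0%}")  # 90 of 1,000 members leaving -> 9%
```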

[Chart: Member Composition, 2005-2014: shares of large business, small business, federal, state, and others (non-profit, non-US government, and other organizations). The small-business share is reaching a plateau. Advanced Forestry is excluded as a small-business outlier for 2008-2014 ('08 Small = 36, '09 = 49, '10 = 57, '11 = 66, '12 = 71, '13 = 77, '14 = 86). Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Chart: Intellectual Property & Commercialization Events, 2004-2014 totals: invention disclosures, patent applications, patents granted, software copyrights, licensing agreements, royalties realized, and spinoff companies formed. 54% of disclosures come from 4 centers. 2014 invention-disclosure outliers: Visual Decision Informatics = 40, Berkeley Sensor & Actuator Center = 20, E-Design = 12, Identification Technology Research = 17. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Chart: Average Students Graduated per center, 2004-2014, by degree (Masters, PhD, BS). Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Chart: Total Students Hired by Members, 2004-2014: Masters (2014 mean = 0.98), PhD (2014 mean = 1.35), BS (2014 mean = 0.51). Hiring of PhDs has rebounded. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

CISE & ENG Partnership: Some Comparisons and Recent Trends

[Chart: Growth in the number of CISE and ENG centers, 2005-2014. ENG centers grew from 26 to 52 and CISE centers from 7 to 25 over the period. Data current for NSF FY2014. Source: IUCRC Evaluation Project.]

2014 Phase-Based Comparison

[Chart: Number of Centers by Phase, 2012-2014. Over half of all centers are in Phase 1! Data current for NSF FY2014. Source: IUCRC Evaluation Project.]

[Chart: Total Program Funding by phase (Phase 1, Phase 2, Phase 3), roughly $39M-$50M. Source: IUCRC Evaluation Project.]

[Chart: Average and Median Total Center Funding by phase (up to roughly $3.5M). Centers get larger over time. Source: IUCRC Evaluation Project.]

[Chart: Average NSF IUCRC Awards & Supplements by phase (roughly $238K-$256K). Phase 2 centers are getting rewarded. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

[Charts: Industrial Memberships by phase: total number of memberships and average number of memberships per center. Source: FY 2013-2014 NSF-I/UCRC Center Structure Database.]

QUESTIONS?

Evaluator Responsibilities and Updates. Denis O. Gray, Ph.D., IUCRC Evaluation Team, NC State University

Evaluator Resources: https://www.ncsu.edu/iucrc/
Dear Evaluator Memo

Dear Evaluators:

Greetings from the evaluation team at NCSU! We hope your summer is going well and that you are ready for another stimulating year supporting the NSF IUCRC Program. The purpose of this message is to provide you with some updates, alert you to some new developments related to the IUCRC Evaluation Project, and remind you of your responsibilities going into the fall. Here are some highlights of what is covered in more detail below (please read I.A-I.C now!). As always, all project resources are available on our project website (www.ncsu.edu/iucrc).

I.A. Highlights (see details below in Section II)
- For 2013-2014 data collection, we have not made any changes to the industry questionnaire. We remind you to help us maintain the integrity of our database by using the current questionnaire available on our website, not the version saved on your computer. (See II.A below.)
- In an effort to streamline submission of Process/Outcome data, we also remind you to use the PO Excel workbook, complete with codebooks, data entry shells, and a Research Cost Avoidance calculator. This will shorten the time it takes to enter, transmit, and analyze your data. (See the detailed description in Section II.)
- We know that some of you add your own questions to the Process/Outcome questionnaire. If so, please do not change the existing questions; we need all the usable data we can get!
- We have made a few enhancements to the web-based LIFE forms (see details in Section II). We have done our best to make sure they are glitch-free. However, if you encounter a problem, please let us know ASAP so that we may fix it.

I.B. Reminders about Recent but Not Brand New Changes
- NSF has requested that the members listed on your Evaluator Report cover sheets match the membership certification; otherwise the annual report will be rejected. Please be sure you get a copy of the membership certification before you complete the cover sheet.
- You will keep receiving a reminder from us 60 days prior to your evaluator report due date that you should start collecting economic impact assessment data, if applicable. Please read the guidelines below carefully, along with the detailed instructions on our website. Contact us if you have any questions. (See II.B below.)
- When NSF cannot attend the IAB meeting, please remember to send NSF a short meeting summary report using this form. (See II.C below.)

www.ncsu.edu/iucrc

New LIFE Features. Lindsey McGowen, PhD, NCSU I/UCRC Evaluation Project, January 8, 2015

Overview
Problems:
- IAB members are being asked to rate their level of interest in projects they have already approved.
- New proposals are sometimes presented at the same meeting as project updates, causing confusion over which is which.
- Evaluators would like to collect faculty PO data at the meeting.
- PIs lose their work when they respond to multiple IAB comments at once.
Solutions:
- Project-phase-specific response categories for new proposals vs. project updates (optional).
- Project phase labels (required).
- Faculty PO survey link on the PI response page (optional).
- PI response pages now have one submit button per page, with rating-based navigation.

New Meeting Options
- PI Faculty Survey: allows you to select a version of the evaluator's web-based faculty P/O questionnaire to be linked at the bottom of the project list presented to PI users.
- Project Phase Ratings: provides project-phase-specific response options.
For New Proposals, response categories reflect level-of-interest ratings: Very Interested; Interested; Interested w/change; Not Interested; Abstain.
For Project Updates, response categories reflect an assessment of progress: Great Progress; On Course; Needs Change; Off Course; Abstain.
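The two response-category sets lend themselves to a simple lookup keyed by project phase. The sketch below only illustrates that structure; the dictionary layout and function name are invented, and only the category labels come from the slide.

```python
# Phase-specific LIFE response categories, keyed by project phase
# (labels from the slide; the data structure itself is illustrative,
# not the LIFE system's actual implementation).

RESPONSE_CATEGORIES = {
    "New Proposal": [  # level-of-interest ratings
        "Very Interested", "Interested", "Interested w/change",
        "Not Interested", "Abstain",
    ],
    "Project Update": [  # assessment-of-progress ratings
        "Great Progress", "On Course", "Needs Change",
        "Off Course", "Abstain",
    ],
}

def categories_for(project_phase):
    """Return the rating options shown to IAB members for a project."""
    return RESPONSE_CATEGORIES[project_phase]

print(categories_for("Project Update"))
```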

New Project Phase Labeling
Use the drop-down menu to select the appropriate project phase: New Proposal or Project Update. The project phase label is required. If you have selected the option to use project-phase-specific response categories, they will be based on the project phase you indicate here.

[Screenshots: Admin View and User View]

Review Meeting with project-phase-specific response categories
- Projects are grouped by phase.
- Click on any project title to see comments.
- Reminder: the Review Meeting function on the Admin side identifies the names of IAB respondents. Use the Review Meeting function on the User side to access summaries with anonymous IAB comments. PI responses are always identified by name.

Faculty PO Link
If you selected the option to include a link to the Faculty PO survey, it will appear after the project list on the PI page. Reminder: if you are using the web surveys, contact NCSU to get your data. [Screenshot: optional link to the Process/Outcome Questionnaire]

Tweak to PI Response Navigation
IAB comments on a project are grouped by rating, 5 comments per page. For example, one page displays the first 5 comments for Great Progress. You can advance to the next set of comments by entering responses to the comments displayed and then clicking Submit Response, or by clicking on the response category whose comments you would like to see and respond to.
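The five-per-page grouping described here is ordinary pagination; a minimal sketch follows. The function and variable names are invented, and this is not the LIFE system's actual code.

```python
# Split one rating group's comments into pages of five, as described above
# (illustrative only; not the LIFE system's implementation).

def paginate(comments, per_page=5):
    return [comments[i:i + per_page] for i in range(0, len(comments), per_page)]

great_progress_comments = [f"comment {n}" for n in range(1, 13)]
pages = paginate(great_progress_comments)
print(len(pages), pages[0])  # 3 pages; the first page holds comments 1-5
```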

Questions?

IUCRC Logic Model Update, January 2015. Teri Behrens, Lindsey McGowen, Denis Gray

Analysis of Program Goals and Objectives
- The program has evolved over 30-plus years.
- Goals and objectives have tended to be emergent and/or high-level impacts.
- Evaluation has also been ongoing and emergent.
- The program solicitation emphasizes means and ends with little in between: little guidance on the mechanism(s) that help translate means into ends, and little guidance on the intermediate steps between them.

NSF 13-594 Means and Ends
Means:
- Leverage NSF funds with industry to support graduate students performing industrially relevant research
- Integrate research and education, and facilitate technology transfer
- Promote research programs of mutual interest
- Active engagement with academic and industrial leaders throughout the world
- Develop long-term partnerships among industry, academe, and government
Ends:
- Contribute to the nation's research infrastructure base
- Enhance the intellectual capacity of the engineering or science workforce
- Expand the innovation capacity of our nation's competitive workforce
- Encourage the nation's research enterprise to remain competitive
- An enhanced innovation ecosystem

Project Goals
- Shine a light on the program mechanism(s).
- Program logic model: activities, outputs, outcomes, and impacts for various stakeholder groups, with indicators for each.
- Identify data that have been collected to support program effectiveness, and identify opportunities for additional assessment.

Project Team: Teri Behrens, Alexandra Medina-Borja, Craig Boardman, Connie Chang, Denis Gray, Shannon Griswold, Larry A. Hornak, Lindsey McGowen, Craig Scott, Eric Sundstrom; with additional feedback from the Evaluator Group.

Process
- An in-person workshop generated many ideas.
- Drafted the LM; reviewed it over 4 conference calls.
- Presented to the Evaluator Group in June 2014; feedback incorporated.
- Revised (final?) LM review in January 2015.
- Logic modeling is an iterative process; the model should be a living document.

[Diagram: IUCRCs: Building an Enhanced Research and Innovation Ecosystem. IUCRCs are a SYSTEM-level intervention: targeted support creates a self-reinforcing network of relationships.
Inputs: university resources and facilities; NSF funding, prestige, and technical assistance (best practices in center management); industry intellectual and financial support.
System interventions, trust-based partnership cycle: development of a shared research agenda; mutual understanding between industry and university; collaborative execution of the agenda; quality and relevance of research; mutual benefit.
Students: increased opportunities for internships/employment; ideas and funding for thesis/dissertation research; industry network; skills in bridging industry and academy.
Faculty, short-term: increased scholarly productivity and reputation; advances in knowledge; skills in collaborative research; consulting/contract opportunities; ability to attract/support students; understanding of industry needs and opportunities; industry network.
Faculty, long-term: funding from diversified sources; new/enhanced relationships with industry (social capital); opportunities for scientific leadership.
Industry, short-term: access to potential employees; amplified R&D; broader scientific network; access to IP.
Industry, long-term: more efficient research; better-prepared employees; ability to capitalize on university research; new/improved products, processes, know-how, and/or services; broader scientific network (social capital).]

[Logic model overview, by column:]
Inputs: university resources and facilities; NSF funding, prestige, and technical assistance; industry intellectual and financial support; NSF ongoing technical assistance.
Activities (center operations): leadership; manage the center according to best practices; conduct industrially relevant research; partnering/boundary spanning.
Outputs (immediate results of activities, first year): research results; human capital (faculty and students with skills relevant to industry); social capital (trust among university and industry; partnership cycle); center growth (award of supplemental funds/contracts, new members added, new research sites added).
Short-term outcomes (what is different after 1-2 years?):
- Faculty: increased scholarly productivity and reputation; advances in knowledge; skills in collaborative research; consulting/contract opportunities; ability to attract/support students; understanding of industry needs and opportunities; industry network.
- Students: increased opportunities for internships/employment; research achievements; ideas and funding for thesis/dissertation research; industry network.
- Industry: access to potential employees; amplified R&D; broader scientific network; access to IP.
Intermediate to long-term outcomes (3-10 years):
- Center/university: self-sustained partnership with industry; increased ability to attract faculty, students, and external research support; deeper and increased interactions with industry; enhanced reputation; more entrepreneurial culture.
- Faculty: funding from diversified sources; new/enhanced relationships with industry (social capital); opportunities for scientific leadership.
- Students: skills in bridging university and industry contexts; enhanced social capital.
- Industry: more efficient research; better-prepared employees; ability to capitalize on university research; new/improved products, processes, know-how, and/or services; broader scientific network (social capital).
Impacts/externalities: an enhanced research and innovation ecosystem.

[Detailed logic model, by column:]
Inputs:
- University: human resources (faculty, researchers, students); equipment and facilities; research accomplishments; financial support (reduced indirect costs, support for admin and students); social capital (existing collaborations, networks); organizational capital (policies, mission, culture).
- NSF: funding and prestige; evaluation; best practices and technical assistance; program requirements; organizational capital; ongoing TA.
- Industry: financial support; technical insight and direction; research accomplishments; specialized equipment and materials; time; human capital; social capital.
Activities (center):
- Lead and manage the center: leadership; implement the center model (best practices); recruit members; coordinate the research agenda; evaluate.
- Plan and select research; conduct industrially relevant research; collaborate with other researchers (industry and university); teams execute research activities; manage projects to meet industry standards.
- Partnering/boundary spanning: champion center research internally; interact with and among members; share knowledge and ideas; participate in center meetings.
Outputs (immediate results of activities, first year):
- Research results: reports, publications, presentations; intellectual property.
- Human capital: faculty and students with skills relevant to industry; graduates; industry members with a deeper understanding of university resources, skills, and talent; research management skills (director).
- Social capital: increased trust among and between university and industry members.
- Center structure, growth, and recruitment: award of supplemental funds/contracts; new members added; new research sites added.
Short-term outcomes (what is different after 1-2 years?):
- Faculty: increased scholarly productivity and reputation; advances in knowledge; skills in collaborative research; consulting/contract opportunities; ability to attract/support students; understanding of industry needs and opportunities; industry network.
- Students [alumni]: increased opportunities for internships/employment; research achievements; industry network (social capital).
- Industry: access to potential employees; amplified and efficient R&D; broader scientific network; access to IP.
Intermediate to long-term outcomes (3-10 years):
- Center/department/university: self-sustained partnership with industry; increased ability to attract faculty, students, and external research support; deeper and increased interactions with industry; enhanced reputation; more entrepreneurial culture.
- Faculty: funding from diversified sources; new/enhanced relationships with industry; opportunities for scientific leadership.
- Students [alumni]: skills in bridging university and industry contexts.
- Industry: more efficient research; better-prepared employees; ability to capitalize on university research; new/improved products, processes, know-how, and/or services; broader scientific network.
Impacts/externalities: an enhanced research and innovation ecosystem; strengthened connections and feedback among parts of the system; enhanced economic competitiveness; expanded innovation capacity; increased/enhanced scientific and technical human and social capital.
Key: red text = no research to date; underline = previous research, may need to be updated; black text = sufficient current research.

Next Steps
- Feedback from the team and NSF.
- Build the link between the measurement LM and data sources.
- Identify and fill gaps in evaluation/research.