Priorities & Metrics Workgroup Meeting No. 5 and Debrief with Project Selection Workgroup

December 12, 2012, 9:00-11:30 am
San Diego County Water Authority Board Room
4677 Overland Avenue, San Diego, CA 92123

Draft Notes
Action items and responses to comments are presented in italics.

Attendees:
Mark Stadler, SDCWA
Sheri McPherson, County of SD
Jeff Pasek, City of San Diego
Cathy Pieroni, City of San Diego
Travis Pritchard, San Diego CoastKeeper
Dave Harvey, RCAC
Joey Randall, OMWD
Cari Dale, City of Oceanside
Terrell Breaux, City of San Diego
Rob Hutsel, San Diego River Park Foundation
Robin Bier, Padre Dam Municipal Water District
Linda Flournoy, SDSU Center for Regional Sustainability
Peter Famolaro, Sweetwater Authority
Rosalyn Prickett, RMC
Crystal Mohr, RMC
Lewis Michaelson, Katz & Associates

1. Welcome and Introductions
Lewis Michaelson welcomed the group, and attendees introduced themselves.

2. Recap of Previous Meeting and Review of Notes
Rosalyn Prickett provided an overview of the meeting purpose, noting that this meeting had three types of attendees: members of the Priorities and Metrics Workgroup, members of the Proposition 84-Round 2 Project Selection Workgroup, and local project sponsors that submitted projects for Round 2 funding. Ms. Prickett noted that the primary purpose of this meeting was to debrief with the Project Selection Workgroup to help guide the Priorities and Metrics Workgroup in making recommendations for future project selection processes.

3. Meeting No. 5 Objectives
Lewis Michaelson provided an overview of the current meeting objectives, including:
• Debrief with the Project Selection Workgroup regarding the Proposition 84-Round 2 project selection process
• Brainstorm improvements to the project prioritization process

4. Discuss Project Review Process
Lewis Michaelson opened the conversation, asking the Project Selection Workgroup members to provide input on the recent selection process and general feedback to the workgroup. Below is a summary of that discussion.
• The interview process went really well; we would definitely like to see this in future project selection processes. The interviews provided a first-hand opportunity to discuss projects with proponents in an even-handed manner, as the information came directly from proponents.
• The workgroup members were well-balanced between different topical areas, which gave a good balance of knowledge.
• The time required to be on the workgroup is not insubstantial; thank you to all workgroup members for taking the time to work on this.
• The process was much more organized than during Proposition 50, both the interview process and the Strategic Integration Workshop. The process seems to be improving each round.
• Would there potentially be value in completing the "speed dating" exercise undertaken during the Strategic Integration Workshop without available grant funding? Do you think people would participate?
  o Yes, there would be value to this, although it is also difficult to encourage integration without some sort of incentive. Integration is challenging and takes a lot of work.
• It would be helpful to do the integration sooner, more than a month before the call for projects.
• There needs to be an emphasis on meaningful integration; we need to ensure that the integration process produces better projects. It seems as if the NGOs get somewhat used in this process, lumped into projects in an un-meaningful way to superficially increase benefits.
• There is always a concern about the grant application package recommendations, as the workgroup has historically always reduced grant funding for all projects. We need to be careful to trim project budgets without losing project purpose.
• We have improved upon the project database, but there is still a need to provide meaningful, useful information to the project selection workgroup.
• I think that the process itself needs to be improved. The first part of the process, where projects are scored and sorted into Tier 1 vs. Tier 2, is very structured, with clear criteria and clear scoring. The second part of the process, during which the selection workgroup reviewed projects, was much less structured, with unclear criteria.
• I disagree with the above point; the second part is less structured on purpose. There are so many potential project benefits that cannot be quantified, and there are many projects that we would want to include in the package for specific and unique benefits that will not be thought of ahead of time. It would be a shame to overly structure the process to the point that we would potentially exclude highly beneficial and important projects.
• We need more time for the interviews. The 10-minute presentations were not long enough. It is worth taking extra time to conduct interviews.

Lewis Michaelson asked the group: what kind of information would be useful for the project selection workgroup that is not currently captured by the project database? Below is a summary of that discussion.
• Specific scope of work details need to be included to get down to what the project will actually accomplish. Specific budget details are also necessary.
• We need to balance additional details with the fact that the project database submittal is already lengthy and complex; is there anything that we can remove?
• Perhaps we could add an intermediate step through which those projects included in the Tier 1 list would submit additional scope of work and budget details that are not included in the database. This way, projects that will not realistically be considered for funding will not have to submit so much information to the project database.
• We could develop a template for the additional information that proponents could choose to fill out and include in the database, or wait to fill out if their project is included in the Tier 1 list.
• The reality is that many projects are in the range of 10-30% design at the time of submittal to the project database, and realistically may not have additional details.
• The work plan and budget portions of the project database were structured in the format that DWR requires for the grant application. Perhaps we restructure these pieces to better fit our needs, which would be better suited for workgroup evaluation. For example, the monitoring task in DWR's opinion refers to post-project monitoring, which DWR requires. This is not necessarily clear in the project database.
• In general, there just needs to be more room in the work plan section for additional text.
• Maybe it would be helpful to have a FAQ page or an instruction page to explain what is meant by each section of the project database for those who are unfamiliar with the process.

• How do we quantifiably analyze the information provided in the work plan sections of the project database? Is this just something that we evaluate as a yes or a no (included or not included)? Or is there some way to reasonably assess the quality of these items during initial project review?
• Round 2 was improved compared to Round 1. I like the idea of having a template to enter information into and to explain the database questions.
• In general, the database was geared towards construction projects; non-construction projects did not seem to fit.
• It is very important to make sure that our process does not negate the benefits of research and development (R&D) projects. These projects are important to the Region, and we have lobbied DWR hard to increase the benefits given to these projects. Is there a way to pull in other IRWM regions to support and/or fund R&D projects that would benefit the state?

Lewis Michaelson asked those in the room who submitted projects to the online database: what were the challenges and benefits of the process, and what would you change? Below is a summary of that discussion.
• The project database submittal was fairly straightforward. The character limits forced us to concisely summarize information, which is not necessarily bad; it just takes additional work.
• It was easy to get questions answered when there were questions. Although some of the terminology was difficult to understand, it was helpful knowing that support was available.
• The process was time-consuming, but efficiently executed.

Lewis Michaelson asked those in the room who were on the project selection workgroup: what were the challenges and benefits of the process, and what would you change? Below is a summary of that discussion.
• The facilitation was very beneficial and helped the workgroup meetings move quickly and efficiently.
• The public voting aspect was somewhat difficult. I would like to have seen more private voting.
• I did not like the way project budgets were reduced. There needs to be a better way to quantify the minimum grant funding a project can realistically receive and still be beneficial. Perhaps add the following language to the database: if you were to get less grant funding than requested, what would you do? There needs to be enough space to fully explain this.
• Perhaps we choose to only fully fund projects and fund fewer projects. Whittling down project budgets could reduce potential benefits to the region.

• We need to determine: is it better to get more projects in the package to increase benefits, or to keep projects whole for the sake of funding better and more complete projects?
• Perhaps we should put a cap on the amount of grant funding that is available per project.
• It would be difficult to put a funding cap on projects; projects are so different, with such different budgetary needs, that this seems unfair.
• Putting a maximum value on grant requests would increase transparency and save time.
• Perhaps choose a total number of projects that should be included in the database, and allow each project to apply for only a certain percentage of the total available funding.
• The DAC issue of directly vs. indirectly benefitting DACs needs to be clarified.
• Although it takes more time, it is better to funnel all project questions through the consultant team.
• This process was more transparent than the previous project selection process.
• There needs to be a text field associated with the attachments to explain why certain attachments were included and what they mean.
• During this round, some project sponsors that were interviewed were also on the project selection workgroup. This should not be repeated in the next round; project selection workgroup members should not be allowed to be on the interview team.
• It would be highly beneficial to have project abstracts: a one-page abstract that fully describes the project.
• Also agree that a FAQ sheet would be helpful on the project database to explain what is meant by each tab and field.

Lewis Michaelson asked the group to look over the handouts provided during the meeting regarding the project selection criteria, project scoring, and project weighting. Below is an overview of the feedback provided by the group:
• The proposal-level criteria are difficult to apply (from a workgroup perspective) because they are so subjective.
• The proposal-level criteria are purposefully subjective, to give the workgroup flexibility in choosing which projects are most beneficial to the Region.
• Should we include R&D projects in the criteria? Perhaps designate a percentage of grant funding that should be spent on R&D?
• The project-level and proposal-level criteria are already very extensive. I would not agree with adding more criteria.
• We also want to encourage newcomers to the project selection workgroup. If we substantially change the criteria and add more, we may discourage newcomers from being part of the process.
• There really needs to be a balance of R&D projects and on-the-ground implementation projects; agree with designating a percentage to R&D.

• We used sticky notes in the Prop 50 project selection process to move projects around and create different proposal packages. This really helped to apply the proposal-level criteria; I would recommend doing this during the next round.
• Consider evaluating projects without looking at the budget/grant request, then evaluating the monetary values. This helps to clearly evaluate the projects based on merit.

Lewis Michaelson then asked the group to discuss inclusion of the project-level and proposal-level criteria in the IRWM Plan Update. How specific should the IRWM Plan Update be on this topic? Below is an overview of the feedback provided by the group:
• Some form of project scoring should be in the IRWM Plan Update.
• Consider only including the first three columns of the project scoring and weighting table. This allows the Region flexibility to adjust the criteria based on the weighted percentage.
• Add a criterion regarding water management research; allow the RAC to decide the applicable weighted percentage during each funding round.
• We should drop the percentages completely and indicate that they will be determined by the RAC during each funding round.
• Need to capture the process by which the workgroup can nominate projects from Tier 1 to Tier 2.
• Need to leave some flexibility regarding the way this is described in the IRWM Plan Update; once it goes into the Plan, it becomes hard-wired. Need to leave adequate flexibility.
• It would be good to add something about greenhouse gas emissions and climate change.
• Is there any way to create a worksheet that could be used to do a quick economic analysis?
  o Perhaps we could ask those projects being interviewed to complete a worksheet that ties project metrics to economic benefits.
• Explain that the objectives get scored 0.5 points for indirectly applying and 1 point for directly applying (see the sketch after this list for a simple illustration). Fully explain the whole ground-truthing process where 0.5 points are applied.
• Clarify how benefits to DACs are applied.
• Add, in the proposal-level criteria, that the workgroup strives to balance funding between agencies and NGOs.
• Suggest adding that the project database will only capture the information absolutely necessary to assess Tier 1 vs. Tier 2; a secondary process will then take place to fully vet and score Tier 1 projects.
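
For context, the scoring rule discussed above (0.5 points when a project indirectly addresses an objective, 1 point when it directly addresses it, combined with weighted criteria and a Tier 1/Tier 2 cutoff) can be illustrated with a minimal sketch. The function names, criterion weights, and tier threshold below are hypothetical placeholders for illustration only, not the Region's adopted scoring methodology.

```python
# Minimal sketch of the objective-scoring rule discussed above.
# The weights, cutoff, and example data are hypothetical placeholders,
# not the Region's adopted criteria.

# Points awarded per objective: 1.0 if the project directly applies,
# 0.5 if it indirectly applies, 0 otherwise.
APPLICABILITY_POINTS = {"direct": 1.0, "indirect": 0.5, "none": 0.0}

# Hypothetical criterion weights (sum to 1.0 in this sketch).
CRITERION_WEIGHTS = {"objectives": 0.6, "readiness": 0.25, "dac_benefit": 0.15}

TIER_1_CUTOFF = 0.7  # hypothetical normalized-score threshold


def objective_score(applicability_by_objective):
    """Average the 0.5 / 1.0 applicability points across all objectives."""
    points = [APPLICABILITY_POINTS[a] for a in applicability_by_objective.values()]
    return sum(points) / len(points) if points else 0.0


def total_score(objective_applicability, other_scores):
    """Combine the objective score with other (0-1) criterion scores using the weights."""
    scores = {"objectives": objective_score(objective_applicability), **other_scores}
    return sum(CRITERION_WEIGHTS[c] * scores[c] for c in CRITERION_WEIGHTS)


def tier(score):
    """Assign Tier 1 or Tier 2 based on the hypothetical cutoff."""
    return "Tier 1" if score >= TIER_1_CUTOFF else "Tier 2"


if __name__ == "__main__":
    # Example project: directly addresses two objectives, indirectly addresses one.
    applicability = {"water_supply": "direct", "water_quality": "direct", "habitat": "indirect"}
    score = total_score(applicability, {"readiness": 0.8, "dac_benefit": 1.0})
    print(f"score = {score:.2f}, {tier(score)}")  # score = 0.85, Tier 1
```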

Further discussion points included the following:
• How do we get folks to continue to participate on the Project Selection Workgroup? We need more visibility. Perhaps present at a conference, or write a white paper?
• Send the IRWM Strategic Plan link to stakeholders so that they know about DWR's process.

5. Summary and Action Items
The consultant team will make note of all project selection suggestions and bring them to the RAC in February, at which point the RAC will discuss them further.