UNIVERSITY TECHNOLOGY ACCELERATORS: DESIGN CONSIDERATIONS AND EMERGING BEST PRACTICES

Technology and Innovation, Vol. 19, pp. 349-362, 2017. Printed in the USA. All rights reserved. Copyright 2017 National Academy of Inventors. ISSN 1949-8241, E-ISSN 1949-825X. http://dx.doi.org/10.21300/19.1.2017.349 www.technologyandinnovation.org

Julia Byrd (1), Orin Herskowitz (1,2,3), Jim Aloise (1,2), Andrea Nye (4,5), Satish Rao (2,3), and Katherine Reuther (4,5)

(1) PowerBridgeNY, New York, NY, USA
(2) Columbia Technology Ventures, Columbia University, New York, NY, USA
(3) NYC Media Lab Combine, New York, NY, USA
(4) Department of Biomedical Engineering, Columbia University, New York, NY, USA
(5) Columbia-Coulter Translational Research Partnership, New York, NY, USA

Accepted April 15, 2017. Address correspondence to Orin Herskowitz, Executive Director, Columbia Technology Ventures, 80 Claremont Ave., 4th Floor, New York, NY 10027, USA. Tel: +1 (212) 854-1242; E-mail: orin-herskowitz@columbia.edu

This article reviews some of the lessons learned by Columbia University in five years of managing or co-managing proof-of-concept center accelerators for grant-funded technologies in three industries: medical devices, diagnostics, and imaging; clean energy; and media. Each of these accelerators is described in terms of objectives, strategies, tactics, and organizational structure, with the commonalities and differences across the accelerators discussed in some detail. Based on these commonalities, the article identifies some common key decision points to be addressed and best practices to be employed by other universities looking to launch accelerators of their own. Given the increasing proliferation of such accelerators at other institutions, the authors propose establishing a forum for ongoing discussion and best practice sharing in the future.

Key words: Accelerators; Entrepreneurship; Technology commercialization; Proof-of-concept; Valley of Death

INTRODUCTION: WHY WRITE THIS ARTICLE?

Over the past five years, Columbia University has launched accelerator and proof-of-concept center programs, which bridge the gap between discovery and technology development, in diverse industries: medical technologies (Columbia-Coulter Translational Research Partnership), clean technology & energy (PowerBridgeNY), digital media (NYC Media Lab Combine), and now in therapeutics as well. Each has built on the lessons and successes of the other programs, with the end goal of accelerating the commercialization of university-originated intellectual property (IP). By applying similar strategies and tactics to different fields, each fledgling program was able to modify previous models to meet the needs of another industry, with ensuing lessons flowing back to the other programs in a positive feedback loop. As the suite of Columbia programs grew, we began to recognize that there are both common elements of technology accelerators as well as elements that need to be adapted to best fit the specific dynamics within different industries.

For example, NYC Media Lab Combine (Combine) works with media technologies that tend to be leaner and faster moving, and hence instituted an abbreviated application and awardee process compared with the energy or medical device programs. In the case of Combine, a new format was molded using many but not all of the fundamental elements in PowerBridgeNY and the Columbia-Coulter Translational Research Partnership. We are now in the process of taking the lessons learned from these first three accelerators to launch a new program specific to therapeutics and are finding that building off an existing base is much easier than starting from scratch. We are also combining many of the shared administrative and infrastructural elements from each of these programs into a centrally-staffed virtual core facility, called the Columbia Accelerator Network, to better leverage these best practices and increase efficiency and effectiveness across the programs.

The authors hope that by sharing lessons learned from these accelerators, as well as collecting and disseminating similar lessons from accelerators at other research institutions, best practices will continue to evolve to benefit everyone involved in moving early-stage technologies out of labs and into the market for the good of society. To that end, Columbia University has begun to keep a public repository of observations and materials from our existing accelerator programs, including lessons learned, application and review materials, award terms and conditions, and public outreach materials. Other universities will be able to access these resources while also uploading their own, thus initiating a conversation across the national research community. If your institution would like to participate, please email the authors at techventures@columbia.edu with "Accelerators" in the subject line. Until the repository is fully established, our hope is that this article will start the dialogue and begin sharing what we have learned about both the key elements of an accelerator program and how the surrounding components can be tweaked in order to meet industry needs. Ultimately, the authors hope readers will be empowered by this information and take steps toward starting their own programs or, if they already have programs, share their experiences with others.

BACKGROUND: WHY ARE THESE PROGRAMS NEEDED IN THE FIRST PLACE?

It is well established that high-potential early-stage scientific innovations often fall into what is commonly known as the Valley of Death (Figure 1). This valley exists when fundamental basic research that indicates potential opportunities for commercialization has been completed in the academic lab but stalls without the expertise, knowledge, and resources to bring these technologies to market. In many cases, this results in a tremendous net loss to society: fewer new products or services, fewer new jobs, loss of exports and taxes, and lower chances for fundamental breakthroughs.

Why does this valley exist? Federal research grants primarily fund basic academic research, but the resulting projects are often still too risky for industry to simply in-license or for traditional venture capital to invest in. Academic researchers often lack the business skills, experience, and network to navigate the early stages of company formation. These projects do not benefit from early and frequent industry and investor input reflecting the real-time needs of the marketplace, or they are developed in ways that cannot scale effectively to serve those needs.
Even those products that make it to market may not have been tested, proven, and deployed enough to appeal to clients or consumers, particularly enterprise customers who may require demonstrated reliability and cost-effectiveness. As a result, early-stage start-ups leveraging grant or angel capital have become increasingly critical for getting high-promise but not-yet-validated university IP to a more mature stage, after which large industry players can obtain access to the technology through an IP license, a company acquisition, or the purchase of the start-up's product or service. Technology commercialization in general, but specifically via start-ups, also fulfills other objectives for universities, including local and regional economic development, supporting entrepreneurial students and faculty, increasing connections to local communities, and employing postdocs and graduate students. However, commercializing a technology through a license or a start-up presents problems beyond solely the dearth of financial resources.

Figure 1. Valley of Death for innovations.

It is fairly rare for faculty inventors in an academic setting to have previous entrepreneurial experience. Likewise, students who invent a technology during their studies and want to commercialize it often lack the entrepreneurial skills and market knowledge needed to succeed. Thus, accelerators can provide participants with needed skills and the means to obtain market knowledge while also supplying some of the validation and prototype funding required. When it works, the result is a system capable of getting more and better technologies out of the lab at a faster rate.

CONTEXT: HOW DID THESE SPECIFIC PROGRAMS COME ABOUT, AND WHAT HAVE THEY ACCOMPLISHED?

Columbia's biomedical accelerator program, the Columbia-Coulter Translational Research Partnership (Columbia-Coulter), was established in 2011 through a generous five-year, $5 million grant from the Wallace H. Coulter Foundation, with the ultimate goal of developing health care solutions that address unmet or underserved clinical needs and lead to improvements in patient care. This is accomplished by supporting interdisciplinary, cross-departmental teams as they work to bridge the Valley of Death and move promising medical technologies from the lab to patient use. The Coulter Foundation was established in 1998 and named for Wallace H. Coulter, founder of the Coulter Corporation, a leading global diagnostics company, and a prolific inventor and entrepreneur whose inventions led to significant breakthroughs in science and medicine. In addition to the program at Columbia, the Coulter Foundation has funded over 20 similar programs and initiatives at universities across the country, helping to support education, mentoring, project management, and funding for promising translational projects. By fostering collaboration between biomedical engineers and clinicians while focusing specifically on the commercialization of medical devices, diagnostics, and health care information technology, Coulter Foundation programs have served as an effective catalyst in the development and validation of biomedical technologies.

Now entering its sixth year, the Columbia-Coulter program has provided education and in-kind resources to over 95 clinician-engineer-led teams at Columbia and direct funding of over $4 million to 35 projects. Of these funded projects, six have spun out of the university into start-ups, raising $9 million to date, and five have been licensed to established companies in industry, with one already FDA approved and in use for both clinical practice and research. In addition, funded projects have secured an additional $49 million in government and foundation grants to further support translational research efforts on these projects within the University.

Having seen the success of Columbia's biomedical accelerator program, in 2013 Columbia Technology Ventures (CTV), Columbia's technology transfer office, applied for a grant from the New York State Energy Research and Development Authority to establish a proof-of-concept center for clean energy technologies using a comparable model. Columbia partnered with Brookhaven National Lab, Cornell NYC Tech, and Stony Brook University and won one of the three awards. Concurrently, New York University (NYU) partnered with the City University of New York and won another of the awards. Since Columbia, NYU, and the other institutions are all co-located in downstate New York and have a history of successful collaborations, the institutions formed one joint center called PowerBridgeNY (PBNY). Together, the institutions are able to share resources and responsibilities rather than duplicate or compete for applicants, mentors, judges, sponsors, investors, events, and so forth. As a result, PBNY has $10 million in funding to spend across the six partner institutions over the five-year period.

The goals of PBNY are similar to those of the Columbia-Coulter program. First, the program seeks to move clean energy technologies developed at the universities out of the lab and into the market, ideally as start-ups based in New York State. The program provides translational funding for prototypes as well as education in entrepreneurship, mentorship, marketing, and other support mechanisms. Another goal is to enhance the clean technology ecosystem in the New York City (NYC) area, primarily by hosting events, aggregating resources, and engaging external organizations and individuals in the program as advisors, mentors, and judges. Over the last three years, PowerBridgeNY has received nearly a hundred applications from across the six partner schools, with twenty-two teams ultimately selected as awardees. As a result of the PBNY funding, there have already been sixteen new inventions disclosed to the universities, eleven prototypes completed, eight new start-ups incorporated, and four license agreements signed. Teams have also raised nearly $3 million in additional grant funding, including five Small Business Innovation Research (SBIR) grants, with several more applications in the pipeline. Additionally, one team from the first cycle recently closed a $9 million Series A investment. To assist the teams, the program has a pool of over a hundred advisors and mentors, as well as a panel of fifteen industry and venture capital judges, who provide feedback during the application process.

The successes of Columbia-Coulter and PBNY in pushing university technologies to the market inspired the concept for Combine, a program launched by the NYC Media Lab. The program offers design and operation support from NYU and CTV and is funded by the New York City Economic Development Corporation and the NYC Mayor's Office for Media and Entertainment.
Combine accelerates NYC university teams working on digital media-related technologies through a program similar to that offered by PBNY and Columbia-Coulter while leveraging the Media Lab's well-established consortium of 20+ media corporations and nine NYC universities. Combine pushes teams to find product-market fit, pivot the prototype based on intense customer discovery, regularly interact with the media industry, and construct a story for the final pitch. The ultimate goal is to have teams exit the program with a clearly identified business, a skeleton prototype, and significant leads with industry partners and investors. Combine has completed its first cycle, for which over 60 applications were received from nine NYC universities, with nine applications accepted. During the course of the four-month program, teams conducted more than 1,000 customer interviews combined and regularly interacted with mentors from several media corporations and investment firms. Since the first cohort graduated this past spring, seven teams have been incorporated as start-ups, three were accepted into later-stage privately funded start-up accelerators, and two have already executed their first commercial agreements with industry partners.

BEST PRACTICES: WHAT MAKES A GOOD PROGRAM?

The Columbia programs have a combined nine years of experience running accelerator programs. Below, we share what we believe to be successful approaches to core elements of the accelerator programs: program oversight, outreach and application development, selection process, educational elements, and awardee process. Naturally, these are not the only ways to approach these steps, nor are they necessarily the best. However, we share them here in the hopes of starting the conversation. Columbia has developed many of the template materials for the activities below, which we can share with interested institutions upon request.

Program Oversight

Because an accelerator involves much more than simply soliciting applications and awarding funds, when setting up a new initiative it is essential that dedicated staff be available to oversee the many activities in order to ensure that a quality program is created and nurtured. This includes a focus on the programmatic efforts necessary to attract project applications, the creation or honing of educational programs capable of supporting teams, and the creation of vital external collaborations and partnerships. Because of the many stakeholders involved (applicant teams, awardee teams, mentors, judges, teaching teams, and others), as well as the various phases taking place concurrently (e.g., project applications from new teams, project management of existing teams, educational programs), we found that setting up an accelerator requires at least one full-time equivalent (FTE), or more if resources allow. Our Columbia-Coulter medical technology accelerator is run with 1.5 FTEs, while a multi-institutional program such as PowerBridgeNY requires at least 2 FTEs but ideally more. Once a program is fully established and systems are in place, and/or there are multiple accelerators that can share core resources, it may be possible to scale back the number of staff dedicated to any one program, but the initiation of a successful program will require a focused and dedicated staff.

The skill set of the administrative team can vary depending on the goals of the program. In most cases, knowledge of the technological area is a plus but not required; it is most important for the administrators to be familiar with technology commercialization and entrepreneurship overall. The full-time administrator needs to be self-organized and comfortable working on a small team where resources are limited. Many programs may also be able to rely on available resources from within the technology transfer office or elsewhere at the university to fill gaps as needed.

Outreach and Application Development

During the initial years of a new program, it can be challenging to spread the word to potential applicants. Accelerators can increase their outreach and application development through the following:

- Engage with technology transfer offices and campus entrepreneurship organizations to advertise the program via newsletters, events, referrals, and direct contact with researchers via presentations at departmental meetings and new faculty orientations.

- Target researchers with relevant grant awards or submitted grant applications by coordinating with the sponsored projects office or sourcing them from publicly available grant agency awardee lists.

- Leverage accelerator alumni teams for referrals to their scientific colleagues.
- Circulate print advertising material, such as flyers and handouts, to relevant departments.

- Host multiple information sessions on campus: invite researchers as well as technology transfer officers and entrepreneurship directors to discuss the specifics of the program, and present a simple, accessible document containing eligibility requirements, important dates, requirements for awardees, and the benefits of participating, including specific examples of success if possible.

- Assemble a steering committee of key stakeholders within the university community (research faculty, entrepreneurship educators, technology transfer officers, department chairs, and/or deans) to champion the program by helping to shape policies, provide feedback, and spread the word across campus.

In addition to soliciting team and project applications, it is also important to put thought into who will review applications and serve in mentorship roles to guide and assist teams. Depending on the strength of the local ecosystem, recruiting an initial set of mentors and judges can be challenging. Some ideas for maximizing the chances of finding top-tier advisors include:

- Direct outreach to personal contacts and alumni. The steering committee originally put together to mold the structure of the program will likely have an impressive combined contact list. Accelerator administrators can help jog memories by looking through the LinkedIn contacts of the steering committee to suggest candidates.

- Solicit referrals from existing reviewers and mentors. Even people who decline to serve due to time constraints may be willing to suggest other candidates.

- Solicit referrals from local business plan competitions, incubators, business organizations, chambers of commerce, and entrepreneurship organizations.

- Attend and recruit at local events relevant to the industry.

- Create a ready-made guidebook for mentors and judges that includes what you expect of them, forms required for signature, and commonly asked questions.

Selection Process

Applicants move through selection processes that are unique to each program but share common key elements:

- External judges are selected from a mix of industry representatives, venture investors, entrepreneurs, and technical experts, who collectively lend credibility and real-world experience to the program. Judges can be subdivided into approximately equal groups based on expertise, with certain applications assigned to each group. We recommend having enough reviewers so that each judging group gets no more than ten applications assigned to it, to limit fatigue. It is also to be expected that some judges will drop out or become unresponsive, so adding a few more judges than you expect to need may be useful.

- A phased application process is used for applicant teams, with increasing commitment required at each incremental step. A multi-stage process provides the chance for teams to get additional feedback and allows judges to get a sense of each team's coachability. For more rigorous programs, stages might include:

Idea Grants: After hearing about the concept through another institution, PBNY instituted an easy but optional element of the application process to encourage more initial submissions, with nominal prizes attached. The so-called Idea Grant submission form is short and informal, asking respondents to simply name their team members (no biographies needed), describe their technology in one paragraph, and describe why they want to apply to the program in another paragraph. If the program is industry-specific, a question about how the technology fits into the industry can be added. Applicants also select a 30-minute time slot to interview with the program managers, which is a time both for the program managers to assess the team and technology and for the team to ask questions to determine whether the program is a good fit. After the interview, the program managers send a follow-up email to each team indicating whether or not they are encouraged to submit a pre-proposal. Those that ultimately do submit a pre-proposal receive a nominally-valued Visa gift card for their lab's use. Applicants can only receive one Idea Grant per submission, so reapplications are not permitted.
Pre-Proposal: A short (one- to two-page) written proposal and/or self-made video is required to provide a basic overview of the proposed project. Judges can use this material to assess the inherent feasibility and applicability of a given project, primarily in order to screen out teams early on before requiring too much effort from them. Keeping this step short can encourage more researchers to apply and minimizes wasted effort in case they do not proceed.

Videos are also excellent recruiting tools for mentors who might not want to wade through long written applications to get a sense of which teams are of interest. A sample format might include the following sections: the problem or unmet need, market size, team introductions, and the envisioned solution.

Full Proposal: A full proposal expands on the aspects first introduced in the pre-proposal and allows teams to update their responses based on feedback from reviewers during the pre-proposal phase. In addition to providing a more in-depth market analysis, teams are asked to discuss competitors, intellectual property position, project budget, and detailed technical milestones. To help teams dive deeper into their initial markets, a business mentor (see below) is assigned to each team. For our energy and biomedical accelerators, given the centrality of intellectual property protection, an external law firm is enlisted to perform cursory IP reviews, which have proved extremely helpful. For a few thousand dollars per team, the IP review can provide judges with an independent, objective lay summary of the core technology, including the prior art landscape.

Live Pitch: Well in advance of pitch day, teams are provided with a structured format for their materials, including guidelines on content to be covered, the length of each section, the time and location of their pitch, and a list of the judges. Teams continue to work with their mentors to finalize their pitches and practice delivery in order to give the best impression to the judges. Sample pitch guidelines might include a general overview of how to draft a compelling story, specific slides to include (e.g., IP, competitive landscape), a list of questions that should be answered, and/or a list of common mistakes to avoid. In addition to these guidelines, program administrators run a pitch practice session (see below) to assist teams in refining their stories and sharpening their presentation skills.

- Communication of scores and comments to the teams. Allow reviewers two to three weeks to review written proposals. Pre-scores and comments are due from the judges at least two business days in advance of the review meeting. An online review and scoring platform is helpful for efficiently collecting and managing judges' feedback in advance of and during the proposal rounds; FluidReview, for example, is a commercially available tool that we have found useful for this task. During review sessions, the scores of all the teams are pre-loaded and teams are ranked from lowest to highest based on that scoring. Review discussions are focused on the teams in the middle rather than discussing the consensus winners and losers in depth. An introduction at the beginning of the meeting is important to establish the goals of the program with the judges. Show both the average scores and the scores of individual judges. To keep vocal judges from monopolizing the conversation, use the individual scores to guide the discussion and draw out quieter judges. Judges can change their individual scores (and thus the overall average score of the team) at any time. At the end, display the final ranking of teams and estimate where the cutoff will be. Ask the reviewers if they agree with which teams will be moving forward or if they would like to change their scores. Have a dedicated notetaker capture anonymized verbal comments to be given directly to teams along with the de-identified written comments from the reviewers themselves. (A minimal illustrative sketch of this scoring and ranking flow appears after this list.)
- Provide teams with the unfiltered (but aggregated and anonymized) feedback from the reviewers regardless of whether or not they move forward in the competition. This allows teams to see the judges' perception of their material so that they can improve for future rounds or subsequent submissions.

- Once awards are made, keep judges updated on the awardees' progress. The structure of the updates can vary from formal periodic reviews to informal invitations to pitch or demo days that highlight how awardees are progressing.
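The scoring and ranking steps described above lend themselves to a small amount of tooling if a commercial platform such as FluidReview is not available. The sketch below is a hypothetical illustration under assumed inputs (invented judge and team names, a 1-10 scale, and an awards_available budget, none of which come from the article): it averages pre-scores per team, ranks teams from lowest to highest, pulls out the middle band for discussion, and marks a provisional cutoff.

```python
# Hypothetical illustration only: a minimal sketch of the review-meeting flow described
# above (pre-loaded judge scores, per-team averages, ranking, and a "middle band" to
# focus discussion on). All names and numbers are invented for illustration.
from statistics import mean

# judge -> {team: score}; scores are assumed to be on a 1-10 scale
prescores = {
    "Judge A": {"Team 1": 8, "Team 2": 5, "Team 3": 6, "Team 4": 3},
    "Judge B": {"Team 1": 7, "Team 2": 6, "Team 3": 5, "Team 4": 4},
    "Judge C": {"Team 1": 9, "Team 2": 4, "Team 3": 6, "Team 4": 2},
}

def team_averages(scores_by_judge):
    """Average each team's score across all judges who scored it."""
    teams = {}
    for judge_scores in scores_by_judge.values():
        for team, score in judge_scores.items():
            teams.setdefault(team, []).append(score)
    return {team: mean(vals) for team, vals in teams.items()}

averages = team_averages(prescores)

# Rank teams from lowest to highest average, as done during the review session.
ranking = sorted(averages, key=averages.get)

# Focus discussion on the middle of the pack rather than clear winners and losers.
n = len(ranking)
middle_band = ranking[n // 4 : n - n // 4]

awards_available = 2  # assumed award budget for this cycle
cutoff = ranking[-awards_available]  # lowest-ranked team currently above the line

print("Averages:", averages)
print("Ranking (low to high):", ranking)
print("Discuss first:", middle_band)
print("Provisional cutoff at:", cutoff)
```

Because judges can revise their scores at any time during the meeting, such a script (or spreadsheet) would simply be re-run after each change to refresh the ranking and the provisional cutoff.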

The above list enumerates the common elements that most if not all accelerators and proof-of-concept centers should consider. However, the Columbia programs have built upon the common core elements to suit the needs of each specific program. For example, the processes for the energy and medical technology accelerators are largely similar, as both industries are capital intensive with long development timelines, while the media program was adapted to fit the faster-moving media industry in particular. Start-ups in the media space often go through faster development cycles, require less money to launch, and, with typically software-based products, depend more upon speed-to-market than bulletproof patents. Accordingly, the Combine program adopted a single-step application process and solicited brief IP reviews from technology transfer offices rather than full IP reviews from outside counsel.

Educational Elements

While many teams initially apply for the promise of funding, they often acknowledge that the educational experiences were actually equally or more valuable. Educational elements, which are extremely customizable in timing and execution, could include any combination of the following:

Methodology: The existing Columbia accelerators offer educational boot camps grounded in characterizing an unmet need and assessing how well a proposed solution fits that need. While one uses a variation on the Bio-Design program and the other two utilize the Lean LaunchPad approach, the most critical aspect is to select a methodology (ideally an existing approach with a successful track record), get trained in how to teach it, customize a curriculum for one's specific needs, and then modify it for future cohorts as needed. Assigning readings and videos (e.g., customer discovery interview examples) can reinforce teachings. Both Bio-Design and Lean LaunchPad have extensive and modular resources available online for teachers and students, the use of which can avoid huge amounts of work and expense when launching an accelerator.

Group Learning and Feedback: As much as possible, there is an emphasis on and effort towards in-person sessions for peer-to-peer learning. Teams should be required to speak with potential customers in order to assess the market problem and determine how effective their technology is as a solution, which allows teams to pivot their solutions in response to market feedback. Presenting live each session forces teams to dedicate the time required to complete assignments and learn the methodology. In-person sessions also allow teams to meet each other, fostering a sense of community among participants. Even for teams whose proposals do not move forward, this kind of educational experience can be leveraged for future entrepreneurial pursuits.

Mentors: Assigning business mentors to work closely with teams for an extended period of time can fill basic business knowledge gaps, inject domain expertise, provide an impartial observer, and potentially open doors to target industries. A dating period during the pairing process allows teams and mentors to gain familiarity with each other before committing to work together throughout the program. Program administrators ensure there is an onboarding and check-in process for the mentors. Mentors are given an overview of the methodology and the tasks teams are working on and are invited to educational events.
Administrators periodically meet with both teams and mentors to ensure the relationship is working and that they are collaborating on a regular basis. Feedback can be collected formally or informally from the teams to ensure that the best mentors are invited back for future rounds.

Pitch Prep Event: Presenting a business pitch (as opposed to giving a scientific presentation) will most likely be new for most teams, so an event to acclimate them to the format and to prepare them for potential questions can dramatically improve their final products. Some programs choose to assemble a volunteer panel of mock judges consisting of mentors, investors, and other local experts in the field. Other programs hire professional pitch and presentation coaches for a few days to work with the teams on content as well as presentation style. The Columbia programs generally allocate a significant amount of time for teams to practice their pitches, including receiving iterative feedback from instructors, mentors, and judges.

Other Educational and Networking Events: The chosen educational methodology may not include important elements of start-up creation, especially targeted content relevant to an industry vertical. Accordingly, the program managers host a significant number of one-off lectures or office hours with experts on specific topics relevant to the field. These include lectures on intellectual property, company formation, SBIR/Small Business Technology Transfer (STTR) funding, selling into industry-specific channels, understanding relevant regulatory issues, and immigration and visa challenges. To help teams network with industry professionals, periodic showcases and demo days are also encouraged. These activities are often shared across the accelerator programs and/or the university ecosystem since they are applicable to many kinds of ventures.

The requirements for commercializing a technology within a given industry drive the timing and structure of the curriculum. As an example of how the above elements can be customized, all three of Columbia's accelerators host educational sessions, but each takes a different approach to how these sessions are implemented. PowerBridgeNY's program requires a two-day session for applicants, a one-day program for awardees, and monthly assignments during the award period. In contrast, the Columbia-Coulter medical technology accelerator hosts an optional, but highly recommended, 12-week course for applicants, and Combine hosts a required 10-week boot camp for awardees. This diversity of approaches derives from the goals and operating conditions of each program. For example, clean technology projects are typically technically involved, have long development timelines, and require more advanced prototypes prior to in-depth customer outreach. Hence, for PBNY, most of the customer discovery efforts that would normally be front-loaded are completed after the award, in parallel with teams' prototype development. The abbreviated two-day boot camp during the application process largely serves as a teaser to engage the teams in thinking about the marketplace and to test how receptive they are to coaching and feedback. In medical technology, in addition to customer discovery, there are many details of the health care system, as well as regulatory and reimbursement processes, that teams need to understand before they move forward. This means that more time is spent on these issues, hence a longer course. On the other hand, given Combine's focus on media technologies, the customer discovery process is in many cases far more critical than further technical development. Accordingly, all of the instruction and training around customer discovery needs to happen up front, before the bulk of the award is given.

Post-Award Process

The final format of the post-award process will depend heavily on the particular industry, the amount of funding, the availability of the program managers, and other factors. Accelerators could include any combination of the following components:

Award Setup: Since teams may receive feedback from reviewers at each stage, a team may need to change the milestones and/or budget from those presented in its initial proposal. Reviewers may require changes that teams must agree to in order to receive funding.
Scheduling a post-award notification meeting to discuss the general terms and conditions of the award, the specific details of the milestones, the associated deliverables, and the approved budget items allows program managers to set expectations and ensure everyone is on the same page.

Tranches: In most cases, funding is released in tranches based on both business and technical milestones, not solely the progression of time or the incurrence of cost. Doing so allows program administrators to retain enough influence to keep teams on track and ensure that award funding is used to advance the technology towards the marketplace. The balance of business and technical milestones helps to keep teams thinking about their end goal of exiting the university rather than solely building a prototype.

Business milestones can include customer discovery, IP review, market assessment, competitive landscape analysis, business plan creation, incorporating a company, creating a cap table, executing an IP agreement, attending an industry conference, and creating marketing materials. While technical milestones and deliverables may change as the project progresses, we recommend that the customer discovery and business milestones be firmly maintained.

Check-Ins: Having teams report back regularly on progress allows program managers to intervene early if there are problems. During these in-person meetings, which can happen weekly, monthly, or quarterly, teams are asked to get out of their comfort zone and discuss their business progress in addition to their technical progress. The administrators reserve time at the end of the meeting for a quick overview of the technical progress or to schedule a separate meeting for a lab tour. Administrators send out a written summary of what was discussed, along with the clearly articulated, agreed-upon business and technical next steps to be completed prior to the next meeting. These check-ins are meant to be an opportunity for two-way communication, so inquiring about what teams are struggling with and how the program could be helpful can lead to unexpected requests and the opportunity for the program to be even more impactful. Always schedule the next meeting before adjourning, and always begin each meeting with a review of the action items from the prior ones.

Reports: As a precondition to receiving further tranches, awardees are required to submit quarterly reports based on a provided template to collect important metrics, both during the awardee process and up to two years after graduation (if feasible). Having metrics reported regularly and in a common format allows program administrators to easily aggregate the numbers and update marketing materials and/or reports to funders as appropriate. The chosen metrics may change depending on the program, but some key metrics include additional grant funding received, number of faculty and students involved in the project, number of commercial prototypes developed, number of in-field tests completed, number of start-ups incorporated, number of FTEs, number of license agreements signed, and total number of awardees and graduates. Traditional economic development metrics such as number of jobs created and revenue earned can be tracked as well, but, depending on the industry vertical, these metrics are likely to take quite a few years to become significant.

Subsidized Resources: If a program has sufficient funding, there are supplementary resources that teams have found particularly useful. For instance, we have found that external SBIR/STTR consultants have been very helpful to teams, as SBIR grants are often the next funding source for emerging start-ups. Teams might also need specialized consultants with experience in clinical trials or insurance reimbursement. The accelerator may determine that a few thousand dollars spent on early intervention can mean the difference between success and failure. For resources that are not team-specific, consider hosting group workshops to lower per-team costs; for instance, we provide joint group sessions on pitch training for all of our accelerators. It may be possible to secure free lectures or sponsorship funding from local law firms or other service providers to subsidize expenses. Furthermore, there may be other organizations in the entrepreneurship ecosystem that offer valuable training classes.

The question of project funding is an important one. How much validation funding is required, and is it really necessary?
While funding attracts teams, we have found that awards do not need to be huge, nor do full awards have to be given to every team. For example, Combine found that a single lump-sum $25,000 award, to be spent primarily on customer discovery and minimum viable product development, is enough to attract quality applicants in the media space, given that the majority of the envisioned products are software-based. On the other hand, given Columbia-Coulter's focus on medical devices, funding is tranched, with amounts ranging from $5,000 to $180,000 per team based on need. In some cases, teams are awarded less than their initial ask if it is determined that they can progress with less than their full proposed budget.

CURRENT CHALLENGES

While the Columbia programs have done well thus far, we face challenges that others are likely facing.

In this section, we outline a few of these challenges as well as some potential solutions we have tried. We look forward to hearing about how other programs have addressed these challenges in the ongoing conversation that we hope this paper and the creation of the public repository of resources will initiate.

Metrics

Funders of translational accelerator programs may have varying objectives and metrics. For example, our medical technology program, funded by a private foundation, is most interested in patient impact through successful product commercialization. The PowerBridgeNY and Combine programs, which are funded in whole or in part by governmental agencies, have significant economic development objectives, with metrics that include job creation, company revenue, product sales, etc. However, given the early-stage nature of the technology, the typical outcome metrics for any program will likely be negligible for several years regardless of how well the program is set up. There will also be questions regarding whether teams would have become successful even without the intervention. Conversely, even technologies that do not advance in a particular program can lead to future successes outside the program, as a team may learn valuable lessons that will help make their next venture successful. In fact, an underlying mentality of the educational curricula is to accelerate teams to a potential failure or pivot point so that they can instead allocate their time, energy, and resources to future projects. How can metrics capture this?

The Columbia programs have all wrestled with the above problems in deciding what metrics to track and report. Once accelerator graduates have been operating their companies for three or more years, the quantitative business metrics may become more substantial, thus allowing accelerators to better demonstrate their value to potential sponsors, applicants, mentors, and other interested parties. However, quantitative metrics still cannot adequately capture the true impact of the programs. While moving current technologies out of the lab is an immediate goal, we also seek to effect cultural change within the university to encourage entrepreneurial efforts and increase long-term commercialization figures. The Columbia programs provide connections and training to individual student and faculty entrepreneurs, which may end up helping them on their next ventures. Culture change is difficult to measure, but anecdotal evidence from participating teams about how the program has changed their grant application approaches, how they work with their advisees, and how they behave in job interviews can help demonstrate a program's impact. Some programs make before-and-after videos of each team, for instance, to emphasize the team's growth during the program. However, programs would also be wise not to start believing solely their own storytelling at the expense of continuous and rigorous self-analysis and improvement.

Sustainability

Our current accelerator programs have been awarded multi-year grants, with the expectation that the programs will secure additional funding to continue beyond the contract period. This situation is fairly common, as the governmental or philanthropic sources that typically fund such programs often have finite funding timelines and view their resources as seed corn for larger third-party investments.
The hypothesis is that, after several years, these programs will demonstrate their effectiveness and attract investment externally and/or from resources within the universities. Unfortunately, securing follow-on funding is often extremely challenging regardless of the industry area or the program's success rate. The universities in which these programs are housed have many competing demands for each funding dollar, with commercial translational accelerators often ranking lower on the list than basic research, student financial aid, and classrooms. While the participating industry and venture partners may benefit from the increased and improved deal flow, their own financial structures may limit their ability to significantly fund not-for-profit programs. As the initial funding source wanes, programs may consider limiting the number of awards made each year or reducing the amount of each award in order to stretch the funding. Another option is for the program to continue to offer the educational elements (boot camps, mentoring, etc.) and eliminate the large proof-of-concept awards altogether.

As mentioned earlier, graduates of the program often report that, while the funding is helpful, the education is far more valuable, as the skills and knowledge they learn allow them to succeed on future projects as well. With this model, the university would still be able to generate positive press for supporting entrepreneurship and foster cultural change in the academic community toward a focus on commercialization. Nonetheless, removing the project funding would likely have a severe impact on application volume and on engaged participation by busy faculty and students.

Awardee Team Leadership

Even with educational offerings and mentorship, accelerator teams often struggle due to a lack of full-time business leadership with relevant industry experience. While graduate students can theoretically make the transition into the CEO role, they also have STEM degrees from top universities, leading them to frequently be recruited by the very companies that they meet through the accelerator. While this is a good outcome from an ecosystem perspective, it can leave the technology stranded without a path to market. This is not evidence that the programs in their current incarnations are ineffective. There have been many technologies that have exited the university and are being sold on the market today, including some by start-ups led by former graduate students. However, with additional team-building support, there would likely be even more success stories. To address this, the Columbia programs try to pair serial entrepreneurs and/or MBA students with participating teams. MBA students can be a great resource while they are in school, but, with their busy schedules and lucrative internships and job offers, it can be difficult to secure enough of their time. Experienced serial entrepreneurs are clearly ideal but can be hard to find depending on your region. So far, our programs have not found a perfect solution to the CEO challenge.

Leveraging Scale

As mentioned, one of the benefits of running concurrent accelerator programs in multiple industries is that each program can learn from the experiences of the others. Now that our programs are relatively stable, we are exploring ways to gain further efficiencies of scale across the programs. For instance, the program managers spend a significant amount of time on fairly routine, repetitive administrative duties, such as tracking and communicating with mentors, maintaining a web and social media presence, hiring external service providers to support the teams, managing sponsored project paperwork, and collating metrics for sponsors (Figure 2); a minimal illustrative sketch of such metrics collation appears below. At Columbia, we are exploring the creation of a shared core facility (the Columbia Accelerator Network) to provide many of these administrative functions across the multiple accelerators while retaining industry-experienced program managers within each separate program. Our hope is that the accelerator core will allow for greater efficiency and effectiveness; coordinated scheduling to leverage the physical presence of judges, advisors, and vendors; and increased branding for the university. We also hope that the core facility will allow us to more quickly and effectively launch new accelerators in more industries (such as Columbia's new therapeutics accelerator) when such opportunities arise. Any input from our peers would be appreciated.
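The "collating metrics for sponsors" duty mentioned above pairs naturally with the common-format quarterly reports described under the Reports element. The sketch below is a hypothetical illustration only, not a description of the Columbia Accelerator Network's actual systems: it assumes an invented QuarterlyReport record whose fields mirror some of the key metrics listed earlier, and it simply sums those fields per program for a sponsor-facing summary.

```python
# Hypothetical illustration only: a minimal sketch of collating common-format quarterly
# metrics across several accelerator programs into a single summary for sponsors.
# The field names mirror some of the key metrics listed under "Reports" above; all
# program names and numbers are invented, and real programs may track different fields.
from dataclasses import dataclass, fields

@dataclass
class QuarterlyReport:
    program: str
    team: str
    additional_grant_funding: float = 0.0   # dollars raised in follow-on grants
    prototypes_developed: int = 0
    in_field_tests: int = 0
    startups_incorporated: int = 0
    license_agreements: int = 0
    ftes: int = 0

reports = [
    QuarterlyReport("MedTech", "Team A", additional_grant_funding=150_000, prototypes_developed=1),
    QuarterlyReport("CleanEnergy", "Team B", startups_incorporated=1, ftes=2),
    QuarterlyReport("Media", "Team C", license_agreements=1, in_field_tests=3),
]

def collate(reports):
    """Sum every numeric metric by program, e.g. for a sponsor-facing summary table."""
    skip = {"program", "team"}
    metric_names = [f.name for f in fields(QuarterlyReport) if f.name not in skip]
    totals = {}
    for r in reports:
        prog = totals.setdefault(r.program, {name: 0 for name in metric_names})
        for name in metric_names:
            prog[name] += getattr(r, name)
    return totals

for program, metrics in collate(reports).items():
    print(program, metrics)
```

A shared core facility could maintain one such template and aggregation step for all programs, so that each individual accelerator only supplies its teams' quarterly numbers.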
CONCLUSIONS: ADVICE FOR THOSE SEEKING TO START THEIR OWN PROGRAMS

Collectively, our medical technology, clean technology, and media technology accelerator programs have discovered a core model and evolved it to fit their needs, resulting in more university-based technology being developed and ultimately commercialized. Over the past five years, participants in the three programs have earned $51.2 million in grant funding, including SBIR/STTR awards, and $18.2 million in venture investment, with two of those companies already generating revenue. In addition to the ten IP agreements signed with the university spin-outs, five more technologies have been licensed to industry. We conclude that, by employing a few key lessons, a viable commercialization program that includes entrepreneurship training in tandem with project support has great potential to accelerate inventions to market and can be established for various technology sectors.

1) Use other programs as templates and customize where needed to reduce start-up time and cost. Universities looking to create a program need not copy the exact models presented in this article.

Figure 2. Common elements, synergies, and support.

Core pieces, such as mentorship and educational elements focused on market validation, can and should be adapted as needed. Physical documentation, such as descriptions of the benefits, applications, review forms, terms and conditions, and even outreach emails, can be borrowed from similar organizations and tweaked as well. To that end, Columbia is happy to share our forms and templates upon request. Starting with the backbone of a model and templates for many of the materials can dramatically shorten the timeline to launch. For example, by following the core elements of PowerBridgeNY and Columbia-Coulter and adapting boot camp materials from classes at a local university, Combine was able to go from ideation to launch in six months. The Columbia programs recognize that access to high-caliber researchers, flexibility of funding sources, and access to industry representatives and mentors throughout the NYC area could be considered unique advantages that have helped the Columbia accelerators launch and grow quickly. However, even in smaller cities, tapping into the local ecosystem is a key component in building any successful commercialization program.

2) Results will not be immediately or objectively measurable; team testimonials can help. The timeline for early-stage start-ups to scale up and thus create jobs is long, which hinders the ability of an accelerator or proof-of-concept center to convey its importance in terms of near-term economic development. Many of the immediate successes are intangibles, such as a shift in perspective or even learning from a failed project. Quantifiable metrics will improve with time, while intangible metrics may forever remain unquantifiable. However, testimonials from participating teams can elucidate the true impact of the program as well as opportunities for improvement. Checking the pulse of teams