How to Succeed with Your Bug Bounty Program


The world's leading Vulnerability Coordination and Bug Bounty Platform

Foreword

Thank you for downloading this ebook about how your organization can learn from the world's best bug bounty programs. HackerOne is on a mission to empower the world's hackers and security teams to fix software vulnerabilities together. Since our founding in 2012, we have had the privilege of working with security teams at the world's leading organizations, including the U.S. Department of Defense, Dropbox, Yahoo, Twitter, Uber, Slack, GitHub, New Relic, and CERT/CC. They are among 600+ customers who have chosen HackerOne for their bug bounty program, closing 30,000+ vulnerabilities and awarding $10,000,000+ along the way. We have closely observed what makes their bug bounty programs successful, leading to the HackerOne Success Index described in this ebook.

Our special thanks go to the hackers, the security teams, and the engineers who have found and fixed so many vulnerabilities, protecting us all. I invite you to join us and do the same for your brand, your organization, and your customers.

Mårten Mickos
CEO
September 2016

HackerOne-Powered Bug Bounty Success

Introduction

Who is this for? Security teams have launched over 500 bug bounty programs with HackerOne, and each has found a unique path to success based on its individual needs. Yet measurable patterns emerge when we dive into HackerOne's bug bounty data across these hundreds of programs. For companies new to bug bounty programs, we have collected these insights to help demonstrate the factors that successful programs share. You can expect to find key benchmarks, and a better understanding of the levers that will drive improvement in each dimension of a successful program.

Not all bug bounty programs are successful in the same way. Different organizational needs and capabilities will shape the specific path to a strong, sustainable bug bounty program, yet our data show that each success is built on some common pillars. This ebook is an exploration of those strategies, so that companies can find patterns of success and use them to help improve their own programs.

Measuring Success

At HackerOne, we're deeply interested in the success of vulnerability disclosure programs, and are constantly striving to better understand just what drives their success. To shed light on what contributes to a successful program, we've been analyzing our unique set of data from hundreds of organizations.* Based on this, we're excited to share the HackerOne Success Index (HSI), a method to measure the effectiveness of HackerOne-powered vulnerability disclosure programs. The index calculates six dimensions, each scored from 1 to 10, by which programs can benchmark their success each month. We briefly discuss each dimension below, and we'll explore them in more depth over the course of this series.

Success is Multidimensional

Our investigation shows that success doesn't come from doing well on a single dimension, but rather across a combination of them. Successful HackerOne programs, those that consistently receive valid, security-enhancing reports, excel across the six distinct but interconnected dimensions below:

Vulnerabilities Fixed: Simply put, to be a thriving program, you need to receive and resolve vulnerability reports. The most successful programs also receive a wide array of vulnerability types across different security aspects. Performance in the other indexes will affect the volume and quality of vulnerabilities fixed.

Reward Competitiveness: Higher bounties tend to attract higher-reputation researchers who find more severe vulnerabilities, though there isn't a simple linear relationship between reward level and activity. In fact, as our index quantifies, there are successful programs that offer no financial rewards at all.

Response Efficiency: Researchers appreciate clear, timely communication. The data show that programs that respond quickly to new reports, and keep open communication channels during the triage and resolution process, tend to get more reports and more repeat researchers, leading to a virtuous, security-enhancing cycle. In addition, the timely resolution of vulnerabilities reduces the risk of potential exploitation, leading to greater security.

Hacker Depth: Researchers who repeatedly investigate your products are going to find more severe vulnerabilities as they learn your code. It's (data) science. Not to mention that repeat researchers tend to produce better reports and have smoother communication with your team as you work together over time. This metric also takes into account the Reputation of contributors, since the data show that high-reputation researchers are more capable of finding critical issues.

Hacker Breadth: This is where Linus's Law, "given enough eyeballs, all bugs are shallow," really kicks in. With a large enough testing group, problems in your code will be found quickly and fixes identified more efficiently. This is one of the reasons successful HackerOne programs continually add new researchers until ultimately opening up publicly, at which time they leverage the greatest potential testing pool on the planet: the entire population of the Internet.

Signal Ratio: The measure of valid reports against the total number of issues received is a primary indicator of the value gained from a program. A high signal ratio means more actual vulnerabilities identified, and ultimately fixed, for the same amount of time spent triaging and responding. While we've made great strides in improving signal across the platform, it remains our top area of focus, and we have additional enhancements coming soon.

The result of putting these dimensions together is an advanced framework for quantifying impact and assessing the performance of these programs.

Dimension                Input Factors*
Vulnerabilities Fixed    number of vulnerability reports resolved, breadth of vulnerabilities resolved
Reward Competitiveness   average bounty, number of bounties, bounty award structure, maximum bounty
Response Efficiency      report close time, first response time, bounty time, triage time
Hacker Depth             sum of contributor reputation, number of repeat contributors
Hacker Breadth           number of new and existing contributors, public program
Signal Ratio             percent clear signal, percent nominal signal

* Factors are ordered by their weights.

Successful programs neither display a single HSI profile, nor necessarily have high marks in every single dimension. These indices will reflect a variety of circumstances, notably the program's goals and organizational characteristics like security maturity, size, and attack surface.
[Spider charts: HSI profiles for Program 1 and Program 2 across the six dimensions (Vulnerabilities Fixed, Reward Competitiveness, Response Efficiency, Signal Ratio, Researcher Depth, Researcher Breadth), each plotted against the enterprise average.]

Take, for example, these spider chart visualizations of the HSI for two successful large enterprise programs: one that offers bounties, and one that does not. Program 1, on the left, is one of the most successful programs in our dataset, topping the charts for Vulnerabilities Fixed and Hacker Breadth and Depth (advantages for public programs) and getting high marks in Reward Competitiveness as well. Program 2, on the right, also does very well in most dimensions, despite offering no monetary bounty at all.

These examples suggest two things. First, you can clearly have a successful disclosure program without offering bounties, though at a slight cost to Hacker Breadth and Depth. Second, you should ignore dogma and use data to determine which incentives produce the ideal outcome for your organization and its unique circumstances.

In this ebook, we'll further explore these dimensions, describing what goes into each one, showing data on why that facet of the program is important, and making recommendations for how programs can improve their performance. As we operationalize the HSI, we are exploring ways to make it accessible to all HackerOne programs on an ongoing basis.

* Note: The Success Index is based entirely on transaction data, with no access to teams' vulnerability information.
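The weighting-and-combination idea behind a dimension score can be sketched in a few lines. The code below is purely illustrative: the weights, bounds, and min-max normalization are our assumptions for this example, not HackerOne's published methodology.

```python
# Hypothetical sketch: combine weighted input factors into a 1-10 score.
# Weights, bounds, and the normalization scheme are illustrative assumptions.

def dimension_score(factors, weights, bounds):
    """factors: {name: raw value}; weights: {name: weight};
    bounds: {name: (min, max)} observed across all programs."""
    total = 0.0
    for name, weight in weights.items():
        lo, hi = bounds[name]
        # Clamp and min-max normalize each factor to [0, 1].
        norm = (min(max(factors[name], lo), hi) - lo) / (hi - lo)
        total += weight * norm
    # Scale the weighted sum onto the 1-10 index range.
    return 1 + 9 * total / sum(weights.values())

# Example: a hypothetical "Vulnerabilities Fixed" dimension.
score = dimension_score(
    factors={"reports_resolved": 20, "vuln_types": 13},
    weights={"reports_resolved": 0.7, "vuln_types": 0.3},
    bounds={"reports_resolved": (0, 25), "vuln_types": (0, 14)},
)
print(round(score, 1))
```

Whatever the real formula, the key property is the same: each input is normalized so that no single raw count dominates, and the weights encode which factors matter most.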

Vulnerabilities Fixed

HackerOne introduced the HackerOne Success Index, a method to measure the effectiveness of HackerOne-powered vulnerability disclosure programs. The index calculates values from 1 to 10 across six dimensions by which programs can benchmark their success each month. This chapter dives into the "Vulnerabilities Fixed" dimension, which describes the quality and frequency of security improvements from a vulnerability disclosure program over time.

Vulnerabilities Fixed is a strong indicator of both the maturity of the overall program and the security of the application, since all other index measurements will affect it to varying degrees. The number of vulnerability reports and the breadth of vulnerability types fixed make up this dimension, and are weighted for recency, giving newer reports a higher impact on the index. We take a deeper look at these two factors below.

Number of Vulnerabilities Fixed

[Chart: average number of vulnerabilities fixed per month, May through October 2015, for programs in the VF 7-9 and VF 4-6 index bands.]

The chart above shows the average number of resolved reports over the last six months for HackerOne programs within two Vulnerabilities Fixed index bands: high performers between 7 and 9, and mid-level performers between 4 and 6. Companies constantly ship new products, features, and updates, which can include new vulnerabilities; both groupings contain large and small companies from a variety of industries that incentivize persistent examination of continuously changing code. The upper group averages a little over 20 vulnerabilities fixed each month, while the middle tier resolves about 6 reports per month. A long-term commitment to your program encourages researchers to stay involved and surface harder-to-find vulnerabilities. The Vulnerabilities Fixed dimension of this index favors a steady and continuous volume of high-quality reports. A program's month-to-month count of resolved vulnerability reports is the most heavily weighted input because it most directly translates to enhanced security as issues are surfaced and fixed.
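Recency weighting of this kind can be sketched with a simple decay scheme. Exponential decay with a three-month half-life is our assumption for illustration; the actual HSI weighting is not published.

```python
from datetime import date

# Illustrative sketch: recency-weight monthly resolved-report counts so newer
# months count more. The exponential-decay scheme is an assumption, not
# HackerOne's actual method.

def recency_weighted_count(monthly_counts, as_of, half_life_months=3):
    """monthly_counts: {date(year, month, 1): reports resolved that month}."""
    weighted = 0.0
    for month, count in monthly_counts.items():
        # Age of the month in whole months relative to the evaluation date.
        age = (as_of.year - month.year) * 12 + (as_of.month - month.month)
        weighted += count * 0.5 ** (age / half_life_months)
    return weighted

counts = {date(2015, 8, 1): 18, date(2015, 9, 1): 22, date(2015, 10, 1): 24}
print(round(recency_weighted_count(counts, as_of=date(2015, 10, 1)), 1))
```

With this scheme, last month's reports count at full weight while reports from three months ago count at half weight, so a program that stalls sees its score decay rather than coast on past activity.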

How do other dimensions affect Vulnerabilities Fixed?

                       Researcher Breadth  Researcher Depth  Reward Competitiveness  Response Efficiency  Signal Ratio
Vulnerabilities Fixed  +++                 +++               ++                      +                    +

* Pearson correlation table representing the level of positive correlation between Vulnerabilities Fixed and the other dimensions. A correlation does not imply causation, only that some positive relationship exists between dimensions.

As we mentioned earlier, the Vulnerabilities Fixed dimension is directly affected by other dimensions. We don't have causal proof in the data yet, but we can point to very strong positive correlations in the table above. You can see that improving your performance in any of the other HackerOne Success Index dimensions, but especially Hacker Breadth and Depth and Reward Competitiveness, is generally associated with increases in your Vulnerabilities Fixed dimension. Some common tactics include: inviting more researchers periodically (if your program is invitation-only); broadening your program's scope so that researchers have new challenges to focus on; and increasing your rewards over time to match researchers' greater time investments.

What variety of vulnerabilities are being found and fixed?

[Chart: average number of unique vulnerability types resolved per month, May through October 2015, for programs in the VF 7-9 and VF 4-6 index bands.]

The chart above shows the average number of unique vulnerability types resolved each month by HackerOne programs in the same two Vulnerabilities Fixed index bands, 7-9 and 4-6. Teams that fix a greater variety of vulnerabilities at volume will also improve their performance in the Vulnerabilities Fixed dimension, reflecting enhanced security for their products and properties. The HackerOne platform currently offers 14 vulnerability types (as well as a Non Applicable catch-all that we won't be examining here) for reporters to choose from. Nearly 10% of all fixed vulnerabilities represent rare but severe issues like Remote Code Execution, SQL Injection, or Privilege Escalation, along with a large number of more common bug types. Our data show that our most successful programs address on average about 13 different types of vulnerabilities each month, while mid-range programs average 8 per month.
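The Pearson correlations behind a table like the one above can be computed directly from two series of monthly dimension scores. The scores below are invented for illustration.

```python
import math

# Sketch of the Pearson correlation used to relate two HSI dimensions.
# The monthly index values are made up for illustration.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly scores for two dimensions of one program.
hacker_breadth = [3, 4, 5, 6, 7, 8]
vulns_fixed = [2, 4, 4, 6, 7, 9]
r = pearson(hacker_breadth, vulns_fixed)
print(round(r, 2))  # close to +1: strongly positively correlated
```

A value near +1 would correspond to the "+++" entries in the table; as the text notes, correlation alone says nothing about which dimension drives which.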

Reward Competitiveness

There are clear bounty patterns within HackerOne-powered programs, and this third chapter on the HackerOne Success Index (HSI) digs into data across hundreds of customers and nearly 15,000 rewards. A program's average bounty is the highest-weighted factor in the Reward Competitiveness dimension, followed by equal weighting for the overall number of rewards, the bounty range, and the maximum award. While success in vulnerability disclosure does not require paying bounties, strong patterns have emerged from those programs that do offer monetary awards.

[Chart: 90-day moving average of the mean reward amount, January through October 2015, for top performers versus the platform average.]

The graph above shows a 90-day moving average of the mean reward amount on HackerOne over the last twelve months, for both top performers in this dimension and the platform average. The platform average hovers just below $500 with a slight upward trend, while the top performers started below $750 and are nearing a $1,000 average with a clear increasing trend. Our data suggest a few lessons:

1. Programs usually start with lower awards, or even no bounty, as researchers find and address minor vulnerabilities.
2. Mature programs should target at least the platform bounty average, reflecting the fact that vulnerabilities become more difficult and time-consuming to discover.
3. To attract and retain the best researchers, programs need to target a higher bounty average and steadily increase rewards over time to maintain competitiveness with top performers.
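The 90-day moving average behind that graph is simple to reproduce for your own program. The bounty records below are invented for illustration.

```python
from datetime import date, timedelta

# Sketch of a 90-day moving average of the mean bounty amount.
# The award records are invented for illustration.

def moving_average(bounties, as_of, window_days=90):
    """bounties: list of (award_date, amount); mean amount in the window."""
    cutoff = as_of - timedelta(days=window_days)
    in_window = [amt for day, amt in bounties if cutoff < day <= as_of]
    return sum(in_window) / len(in_window) if in_window else None

bounties = [
    (date(2015, 7, 15), 250),
    (date(2015, 8, 20), 500),
    (date(2015, 9, 10), 750),
    (date(2015, 11, 1), 1000),  # falls outside the window evaluated below
]
print(moving_average(bounties, as_of=date(2015, 10, 1)))
```

Evaluating this at a series of dates and plotting the results produces a smoothed trend line like the one in the chart, rather than a noisy month-by-month average.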

Reward Distribution

[Chart: long-tail distribution of bounty amounts, $0 to $5K+, across the percentage of all bounties on the platform.]

The chart above shows the long-tail distribution of monetary rewards across the entire HackerOne platform. This reflects a power law (in particular, the Pareto principle, or 80-20 rule): just over 20% of bounties are at or above the HackerOne average of $500, and nearly 80% of bounty amounts are below it. Such a distribution is both expected and desirable, as it closely tracks the distribution of vulnerability severity. In this chart, we broke the Y-axis to focus in on the distribution; about 1% of bounties are $5,000 or above, up to our current highest single bounty of $30,000. Our data suggest the following lessons:

1. Researchers appreciate when bounties are paid in proportion to risk ($100 for a small bug, $5,000+ for an RCE).
2. Minimum bounties should be set well below your target average, providing the flexibility to match reward to severity.
3. Set and communicate a maximum that is orders of magnitude above your minimum to attract deeper engagement.
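The 80-20 shape described above is easy to check on any set of bounty amounts. A minimal sketch, using invented amounts:

```python
# Sketch of the long-tail check described above: what share of bounties
# sits at or above a threshold such as the platform average? The amounts
# below are invented for illustration.

def share_at_or_above(amounts, threshold):
    return sum(1 for a in amounts if a >= threshold) / len(amounts)

# A skewed, Pareto-like sample: many small bounties, a few large ones.
amounts = [100] * 6 + [250] * 2 + [500, 5000]
print(share_at_or_above(amounts, threshold=500))
```

A result around 0.2 mirrors the platform-wide pattern of roughly 20% of bounties at or above the $500 average; if your own distribution is much flatter, your rewards may not be tracking severity.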

Response Efficiency

Quickly acknowledging, validating, and resolving submitted issues while recognizing the researcher's effort is vital for successful vulnerability coordination. This fourth chapter in our series on the HackerOne Success Index (HSI) explores response data across nearly 100,000 reports. We found that report resolution time, the elapsed time between submission and closure as resolved, is the main factor impacting the Response Efficiency dimension. Smaller weight is given to first response time, and to the times to bounty and to triage. The data offer insights into response best practices, and into when to award at resolution versus at validation.

                 Top Performers               Platform
                 25th   50th   75th   Std.    25th   50th   75th   Std.
First Response   0.1    0.4    2.0    6.9     0.1    0.8    4.1    35.6
Triage           0.2    1.0    4.4    7.7     0.4    1.8    7.1    26.1
Bounty           1.6    5.3    12.9   19.8    4.6    16.7   52.7   69.3
Resolved         1.1    5.2    17.2   26.6    4.7    20.9   66.6   82.5

* Response times in days after submission.

The table above gives the 25th, 50th, and 75th percentiles for the four Response Efficiency inputs across all resolved HackerOne reports, as well as the standard deviation, for both top performers (Response Efficiency index of 7-10) and the entire platform.

[Chart: 50th percentile response times in days, top programs versus all programs: first response 0.4 vs. 0.8, triage 1.0 vs. 1.8, bounty 5.3 vs. 16.7, resolved 5.2 vs. 20.9.]
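A table like the one above is straightforward to compute for your own report data. The sketch below uses Python's statistics module on invented first-response times and checks the median against a hypothetical 24-hour internal SLA.

```python
import statistics

# Sketch of tracking your own response-time percentiles against an internal
# SLA. The first-response times (in days) below are invented.

first_response_days = [0.1, 0.2, 0.3, 0.4, 0.5, 0.8, 1.5, 2.0, 3.0]

# Quartile cut points: 25th, 50th, and 75th percentiles.
q25, q50, q75 = statistics.quantiles(first_response_days, n=4, method="inclusive")
print(q25, q50, q75)

# Flag if the median drifts past a hypothetical 24-hour (1-day) SLA target.
sla_ok = q50 <= 1.0
print(sla_ok)
```

Tracking the 75th percentile alongside the median also surfaces the variance issue the text highlights: two programs with the same median can feel very different to researchers if one has a long slow tail.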

Key findings from the data:

To be competitive with the top performers, you should target a first response within 12 hours; responding within 24 hours aligns with the platform's 50th percentile. For triage, top performers should target within 24 hours, while those aiming for platform parity should be within 48 hours.

Consistency is more important than any particular absolute value, especially given the differences in applications and vulnerabilities across HackerOne. The data show that top programs have less variance in their response times. The best way to stay consistent is to establish internal SLAs and communicate transparently with researchers to help manage expectations.

Response, triage, resolution, and bounty times should be proportional to the severity of the issue. The greater the risk to your security from an issue, the more quickly it should be addressed. There are natural variations in time to resolve due to dependencies, complexity, and urgency. When you do have to deviate from your normal time range, communicate this to the researcher. Researchers are familiar with reasonable deviations and will understand if you're transparent.

Resolution Time vs. Time to Bounty

Our data show that some programs prefer to pay their bounties when a vulnerability is validated, and some pay when it is resolved.

[Chart: when HackerOne programs award bounties: 48% at resolution, 18% at validation, 34% mixed.]

The data show that about half of HackerOne programs award at resolution, 18% at validation, and 34% choose when to award on a case-by-case basis. We are seeing an increasing trend of awarding at validation, and it is emerging as an industry best practice. While we generally recommend consistency, there are a couple of scenarios in which it might make sense to be flexible on when you award bounties:

Rewarding quickly for a severe vulnerability can be a reflection of its priority and a signal to the researcher of its importance to you. An initial bounty can be supplemented later if the issue turns out to be even more severe than originally thought.

Outlier vulnerabilities with long-term resolutions shouldn't delay a bounty. Consider awarding upon validation for these.

Awarding at time of resolution can help ensure accurate bounties by giving you time to correct any validation mistakes.

Hacker Breadth and Depth

Linus's Law states, "given enough eyeballs, all bugs are shallow," meaning broader and deeper testing populations find issues faster. This fifth chapter of this ebook explores Hacker Breadth and Depth data from over 2,500 active hackers participating in hundreds of programs.

Our data show the best-performing programs on HackerOne attract not only more overall hackers but more repeat hackers. Repeat hackers are responsible for the majority of resolved reports and bounties on the HackerOne platform. There is a clear upward trend in bounty amounts as a hacker submits more unique, valid reports to the same program. The more time a hacker spends looking at your software, the more valuable their reports are likely to be. For programs, there's significant value in building hacker loyalty. For hackers, diving deep into the same code and building great relationships with security teams pays off.

[Chart: average monthly hacker participation, total vs. repeat: all programs 16/6, top programs 56/19, public programs 22/7, private programs 8/4.]

The graph above shows the average monthly hacker participation across different groups of HackerOne programs: all programs, top performers in this dimension, public programs, and private programs. The average HackerOne program gets 16 participating hackers per month, 6 of whom are repeat hackers. The top programs attract an average of 56 hackers per month, with 19 repeating. The data also indicate that public programs have better Breadth and Depth on average than private programs.

What does it all mean?

The most competitive programs attract about 50 new hackers per month, and retain around 20 month-to-month.

Want to attract more hackers? More hackers make you more secure. Don't believe us? Recent research demonstrates a very strong linear correlation between the number of participating hackers and the number of vulnerabilities fixed.* Hackers have diverse skills and approaches, and tend to scan different parts of in-scope properties.

Go public! Public programs get more unique hackers and more repeat hackers. You have more control over a public program than ever with the new Signal Requirements feature.

Repeat Hackers are Immensely Valuable

[Chart: share of resolved reports and bounties paid: one-time contributors 22% / 12%, repeat contributors 78% / 88%; low-reputation hackers 38% / 37%, high-reputation hackers 62% / 63%.]

The chart above demonstrates that repeat, or loyal, contributors account for the vast majority of the resolved reports on HackerOne, and an even greater portion of the bounties paid. This shows that repeat hackers are more valuable to programs, because that's where the great majority of the bounty money goes. While Reputation is a good indicator of report quality and validity, the data show that multiple submissions to the same program are an even better predictor of validity and impact.

[Chart: average bounty amount (y-axis, $0 to $1,600) versus the number of bounties a hacker has received from the same program (x-axis, up to 50), trending upward.]

Bounty payments for hackers tend to go up on average after the first submission to a program, indicating that the reports are more valuable. Let's look at the data. The chart above is generated by first selecting all HackerOne programs with more than 100 total bounties paid. Then, each hacker's bounties are sequenced (1st, 2nd, 3rd, etc., submission) for each program. For each rank, we calculate the average amount and plot it. The results show that, at an aggregated level, the average payout increases as a hacker finds more valid bugs in a bounty program.**
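The sequencing described above (order each hacker's bounties within a program, then average amounts by rank) can be sketched as follows; the award records are invented for illustration.

```python
from collections import defaultdict

# Sketch of bounty sequencing: rank each hacker's awards within a program
# chronologically (1st, 2nd, 3rd, ...), then average amounts at each rank
# across all hackers. The records below are invented.

def average_by_rank(awards):
    """awards: list of (program, hacker, award_date, amount)."""
    per_pair = defaultdict(list)
    for program, hacker, day, amount in awards:
        per_pair[(program, hacker)].append((day, amount))
    by_rank = defaultdict(list)
    for history in per_pair.values():
        history.sort()  # chronological order within one program/hacker pair
        for rank, (_, amount) in enumerate(history, start=1):
            by_rank[rank].append(amount)
    return {rank: sum(a) / len(a) for rank, a in sorted(by_rank.items())}

awards = [
    ("acme", "alice", "2015-01", 100), ("acme", "alice", "2015-03", 400),
    ("acme", "bob",   "2015-02", 200), ("acme", "bob",   "2015-04", 600),
]
print(average_by_rank(awards))
```

An upward-sloping result (here, a higher average at rank 2 than rank 1) is the aggregated pattern the chart shows: later bounties from the same hacker in the same program tend to be larger.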

Key Takeaways:

For programs, retaining hackers increases the likelihood of finding high-severity issues. Treat your hackers well, reward them for their loyalty, and you can expect them to reciprocate with significant issues as they get to know your code.

For hackers, it pays to develop a good professional relationship with programs. Most programs will steadily grow rewards as report quality and impact increase. Keep building great relationships with security teams, and it will generally prove worth it.

To learn more about launching a successful bug bounty program with HackerOne, please email us at sales@.

* Mingyi Zhao, Jens Grossklags, and Peng Liu. "An Empirical Study of Web Vulnerability Discovery Ecosystems." Proceedings of the 22nd ACM SIGSAC Conference on Computer and Communications Security (CCS). ACM, 2015. <http://sites.psu.edu/mingyi/wp-content/uploads/sites/11890/2014/04/an-empirical-study-of-web-vulnerability-Discovery-Ecosystems.pdf>

** Thomas Maillart, Mingyi Zhao, Jens Grossklags, and John Chuang. "Given Enough Eyeballs, All Bugs Are Shallow? Revisiting Eric Raymond with Bug Bounty Programs." The 15th Annual Workshop on the Economics of Information Security (WEIS), 2016.