Opinion
Security and Privacy

Balancing Secrecy and Transparency in Bug Bounty Programs

Seeking more transparent and secure software ecosystems.


Bug bounty programs (BBPs) crowdsource vulnerability discovery—enabling ethical hackers to identify and report flaws for timely vendor fixes. However, vendors can also withhold information about these software vulnerabilities (SVs), creating an information asymmetry that hinders users’ ability to evaluate software security. In this column, we argue that mandatory disclosure requirements can help bridge this information gap and foster a more transparent and secure software ecosystem.

Software vulnerabilities are security flaws in software that attackers can exploit. Finding and fixing SVs through software testing is expensive and time-consuming, conflicting with the pressure for rapid software releases. This dilemma forces managers to release software once they have mitigated residual security risks to tolerable levels. Thus, many vendors release software with residual vulnerabilities,3 choosing to patch them later—a trade-off that can lead to uncoordinated disclosures.

Vulnerability disclosure is a classic example of a negative externality, as a third party bears the adverse effects of disclosure. In the software context, vulnerability disclosure by any entity can impose costs on other software users because malicious hackers can learn about the bugs and exploit them. Vendors incur significant patching costs to develop, test, and release patches quickly, and software users incur costs to apply these patches to protect themselves. Due to these trade-offs and externalities, dealing with SVs has always been challenging. The incentives to release software early and minimize testing costs conflict with reducing SVs. Vendors prefer that any SVs discovered post-release remain hidden, especially from malicious hackers. At the very least, vendors want to delay disclosure until they can develop and distribute patches. In contrast, users prefer that the vendor fixes the issue quickly.

Ethical hackers (EHs) are a special group of software users who strive to find SVs without intending to exploit them. When EHs discover SVs, they want vendors to fix them promptly and be transparent. Due to the high cost of developing quick patches, vendors would rather move slowly and disclose few details. Thus, EHs and vendors sometimes pursue different goals. These misaligned incentives have led to tussles between vendors and EHs. Slow responses by vendors led EHs to resort to full disclosure, which provides transparency to other users and exerts pressure on vendors to release patches quickly. For instance, the BugTraq mailing list historically served as the conduit through which EHs would publicize SVs. Vendors pushed for limited disclosure because these uncoordinated public disclosures created chaos. Entities like the CERT Coordination Center (CERT/CC) facilitate limited disclosures, which offer a middle ground and allow vendors a grace period to address SVs before publicly announcing them.

Dissatisfied with previous vulnerability disclosure arrangements, software vendors began offering bounties to users—especially EHs—for discovering and reporting vulnerabilities while requiring that these vulnerabilities not be publicly disclosed. Netscape pioneered this approach, which later gained broad traction, with firms like Google and Apple among those adopting bug bounty programs. However, many smaller vendors struggled to attract EHs. To address this, bug bounty platforms such as Bugcrowd (https://www.bugcrowd.com) and HackerOne (https://www.hackerone.com) emerged, matching vendors with EHs and offering services like defining reporting rules, processing payments, and mediating disputes. These platforms have transformed software testing into globalized gig work, expanding the pool of vulnerability hunters. Vendors are drawn to crowdsourcing vulnerability discovery because it gives them access to a broad set of ethical bug hunters who exercise their software in diverse real-world scenarios that in-house testing cannot replicate. This approach has the potential to improve overall software quality.

Given their growing acceptance, researchers in multiple disciplines have studied different aspects of BBPs. In a survey, Akgul et al.2 found that EHs are motivated to participate in BBPs by rewards, learning opportunities, and reputational gains. The EHs’ main concerns about BBPs were poor program responses, unclear scope, and poor platform support. Zrahia et al.6 found that the supply of EHs increased during the pandemic, reducing rewards for valid SV reports. Ahmed et al.1 have suggested that BBPs have allowed software vendors to manage disclosure timelines more effectively. Gal-Or et al.4 studied the impact of BBPs on software release timing and found that a firm with a BBP always releases software earlier and with more bugs. Despite this growing body of research, several key issues remain unresolved, including EH incentives, program design, platform design, and the overall impact on software security.

For example, one major unresolved issue is that BBPs give vendors excessive control over vulnerability disclosure. Stakeholders often overlook the benefits of public vulnerability disclosure. Public disclosure creates market pressure for vendors to expedite patching while increasing awareness among software users and enhancing their preparedness. It can inform other vendors about bugs that apply broadly to a particular class of software. It allows EHs to learn from disclosed SVs2 and discover similar SVs in different software or chain SVs into more complex combinations. It enables investors to assess the risk to their investments arising from a firm’s cybersecurity practices.a Finally, it generates market awareness about the quality of a firm’s products, increasing accountability for vendors’ security practices.

At their core, BBPs buy silence from EHs, either temporarily or indefinitely.b The latter occurs mainly with private BBPs, in which vendors select EHs to find vulnerabilities under strict confidentiality agreements to ensure findings remain undisclosed. Private BBPs allow firms to carefully choose participating EHs, which can reduce frivolous or low-quality bug reports and lower BBP processing costs. However, the secrecy of private programs creates significant information asymmetries in the market. Private, invitation-only BBPs obscure both their existence and the vulnerabilities they uncover. Non-disclosure can remove vendors’ incentives for accelerated patch development, reduce security learning, and, crucially, remove information about firm quality from the market. This lack of transparency prevents consumers from making informed decisions about product security. Consequently, BBPs can incentivize vendors to release products early,4 potentially with less testing and more residual bugs.

On the other hand, BBPs offer clear benefits. For vendors, they reduce uncoordinated vulnerability disclosures—mitigating bad press and emergency patch releases. For users, they lower exploitation risks and speed up the launch of innovative products. Additionally, by providing a legal avenue for profiting from vulnerability discovery, BBPs help convert malicious hackers into ethical ones, strengthening overall cybersecurity.

However, BBPs enable vendors to obfuscate their products’ quality and potentially delay patches. Because consumers lack product security information, comparing products becomes challenging.

The Need for Transparency: Bridging the Information Gap

Delays in patching and non-disclosure of vulnerabilities undermine consumer protection and erode trust. Before BBPs became popular, end users and ethical hackers controlled vulnerability disclosure—which vendors dismissed as “information anarchy.”c With the rise of BBPs, especially private ones, the pendulum has swung in the other direction: vendors now exercise excessive secrecy. Yet transparency strengthens the software industry by fostering trust,5 improving security practices, and enhancing overall resilience.

To address these issues, we propose a multipronged approach that promotes transparency without undermining the benefits of BBPs. Although BBP platforms are well positioned to set disclosure policies given their close interactions with vendors and ethical hackers, they lack sufficient enforcement power because vendors can readily switch BBPs if terms become unfavorable. Since these delays and non-disclosures create negative externalities—where users bear the costs of vendor security decisions—market forces alone will not incentivize sufficient transparency, making government intervention necessary. Such intervention must harness the benefits of BBPs while minimizing their social costs, and do so without driving vendors away. We suggest actions for multiple government agencies based on their mandates: NIST (the National Institute of Standards and Technology) sets guidelines without enforcement power; CISA (the Cybersecurity and Infrastructure Security Agency) influences practices through Binding Operational Directives for federal executive branch agencies; and the SEC/FTC (Securities and Exchange Commission/Federal Trade Commission) have direct enforcement authority.

Recommendations for More Transparent and Better Security

Standardization and guidance for bug bounty programs.  NIST should develop and maintain standardized definitions and metrics for bug bounty programs. These should cover key aspects such as vulnerability severity, disclosure timelines, reporting formats, and patch effectiveness. These standards should be developed in collaboration with industry experts and iteratively updated to reflect evolving threats and best practices. NIST should also create standardized reporting frameworks for vulnerability and patch data obtained through bug bounty programs. Such a framework would ensure consistency in reporting requirements, so that the SEC and FTC can build on it and enable cross-industry comparisons of vendor security performance. NIST’s role is to provide guidance; these standards become impactful when adopted voluntarily or referenced by other agencies.
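As a purely illustrative sketch of what one record under such a reporting framework might look like, consider the following. The field names, severity scale, and layout are assumptions made here for illustration; they are not part of any existing NIST specification, which would more likely build on established scales such as CVSS.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical severity scale; a real framework would likely reference CVSS.
SEVERITY_LEVELS = ("low", "medium", "high", "critical")

@dataclass
class VulnerabilityReport:
    """One standardized record for a vulnerability found via a bug bounty program."""
    report_id: str
    severity: str                 # one of SEVERITY_LEVELS
    reported_on: date             # when the ethical hacker submitted the report
    patched_on: Optional[date]    # None if the vendor has not yet shipped a fix
    disclosed_on: Optional[date]  # None if not yet publicly disclosed
    bounty_usd: float             # payout to the reporter

    def days_to_patch(self) -> Optional[int]:
        """Time-to-patch in days, or None if the vulnerability is still open."""
        if self.patched_on is None:
            return None
        return (self.patched_on - self.reported_on).days
```

The value of a shared record format of this kind is less in the particular fields chosen than in making time-to-patch and disclosure status comparable across vendors.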

Vendor incentives and federal purchasing.  CISA should establish a voluntary vendor accreditation program. This program would reward vendors who commit to timely remediation, vulnerability disclosure, and reporting. Key criteria for accreditation should include compliance with NIST’s vulnerability disclosure and reporting standards, adherence to reasonable remediation timelines, and timely disclosure of vulnerabilities after a patch is released. Accredited vendors would receive preferential treatment in federal procurement processes, providing a strong market incentive for participation. To further strengthen this incentive, CISA should issue a binding operational directive (BOD) requiring federal executive branch agencies to incorporate a vendor’s accreditation status as a requirement in procurement decisions for software products and services unless a designated authority grants a waiver. CISA should also continue coordinating vulnerability disclosure processes across the public and private sectors, providing operational support and expertise to the SEC and FTC. While CISA can encourage timely remediation through the accreditation program and the BOD, direct enforcement of timelines on private-sector entities falls outside its primary authority.

Reporting by public and private software vendors.  The SEC should mandate that publicly traded software companies include specific bug bounty program metrics in their 10-K filings. This data should encompass: the number and severity of vulnerabilities discovered through their bug bounty programs; the average time-to-patch for vulnerabilities of each severity level; and the total monetary value of bounty payouts. The FTC, leveraging its consumer protection authority, should establish parallel reporting requirements for private software companies that exceed specific revenue or user thresholds. These requirements should utilize the standardized reporting framework developed by NIST, ensuring consistency and comparability across the industry. This dual approach ensures comprehensive coverage of the software market and provides valuable information to investors and consumers alike.
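To illustrate how such filing metrics could be derived from standardized report data, the sketch below aggregates vulnerability counts by severity, average time-to-patch per severity level, and total bounty payouts. The record fields and severity labels are illustrative assumptions, not a prescribed SEC or FTC format.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class VulnerabilityReport:
    severity: str                 # e.g., "low", "medium", "high", "critical"
    reported_on: date
    patched_on: Optional[date]    # None if still unpatched
    bounty_usd: float

def filing_metrics(reports: list[VulnerabilityReport]) -> dict:
    """Aggregate the disclosure metrics discussed above from a list of reports."""
    counts: dict[str, int] = defaultdict(int)
    patch_days: dict[str, list[int]] = defaultdict(list)
    for r in reports:
        counts[r.severity] += 1
        if r.patched_on is not None:
            patch_days[r.severity].append((r.patched_on - r.reported_on).days)
    return {
        "vulnerabilities_by_severity": dict(counts),
        "avg_days_to_patch_by_severity": {
            sev: sum(days) / len(days) for sev, days in patch_days.items()
        },
        "total_bounty_payouts_usd": sum(r.bounty_usd for r in reports),
    }

# Example: two hypothetical reports from a vendor's program.
reports = [
    VulnerabilityReport("critical", date(2024, 1, 10), date(2024, 1, 25), 10000.0),
    VulnerabilityReport("medium", date(2024, 2, 1), None, 1500.0),
]
print(filing_metrics(reports))
```

Reporting aggregates of this kind, rather than raw vulnerability details, would let investors and consumers compare vendors without exposing unpatched flaws.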

Without stronger transparency mandates, BBPs cannot meaningfully influence vendors’ product security decisions. While conscientious vendors will diligently patch SVs discovered through BBPs, others might deprioritize patching based on short-term business incentives when they can control disclosure timing. Mandatory disclosure requirements would restore market pressure on vendors to patch promptly. They could also drive improvements in secure development practices as vendors’ security track records become visible to customers, competitors, and investors.

SVs stem from perverse economic incentives, not just technical or legal problems. Thus, while implementing the recommendations outlined in this article can foster a more transparent and secure software ecosystem, we must carefully study their economic and policy implications to avoid unintended consequences.d

Conclusion

BBPs are valuable tools for identifying vulnerabilities, but the current system’s emphasis on secrecy over transparency leaves users and investors in the dark about software security. There is an urgent need for action—embracing transparency while balancing temporary secrecy during remediation can lead to a more resilient and accountable software ecosystem.

References

1. Ahmed, A. Vulnerability disclosure mechanisms: A synthesis and framework for market-based and non-market-based disclosures. Decision Support Systems 148 (2021).
2. Akgul, O. et al. Bug hunters’ perspectives on the challenges and benefits of the bug bounty ecosystem. In Proceedings of the 32nd USENIX Security Symp. (USENIX Security 23) (2023).
3. Arora, A., Caulkins, J.P., and Telang, R. Sell first, fix later: Impact of patching on software quality. Management Science 52, 3 (2006); https://bit.ly/3HTs2M8
4. Gal-Or, E., Zia Hydari, M., and Telang, R. Merchants of vulnerabilities: How bug bounty programs benefit software vendors. SSRN 4808742 (2024).
5. Kwan, D., Cysneiros, L.M., and do Prado Leite, J.C.S. Towards achieving trust through transparency and ethics. In Proceedings of the 2021 IEEE 29th Intern. Requirements Engineering Conf. (RE 2021) (2021).
6. Zrahia, A. et al. The simple economics of an external shock to a bug bounty platform. J. Cybersecurity 10, 1 (2024).
