The Center for Cybersecurity Policy and Law, an infosec think tank, teamed with several technology companies Thursday to launch the Hacking Policy Council, which aims to improve security research and vulnerability disclosure policy.
According to a Center for Cybersecurity Policy and Law (CCPL) webpage dedicated to the new initiative, the council "aims to make technology safer and more transparent by facilitating best practices for vulnerability disclosure and management, as well as empowering good faith security research, penetration testing, and independent repair for security."
The council's website lists four key goals: Establishing a "more favorable" legal environment for vulnerability disclosure, bug bounties, good faith security research and more; improving collaboration among the security community, businesses and policymakers; preventing new restrictions on ethical security research; and strengthening organizations' resilience via "effective adoption of vulnerability disclosure policies and security researcher engagement."
Founding members include HackerOne, Bugcrowd, Intel, Luta Security, Google and Intigriti. The CCPL is a nonprofit established by law firm Venable LLP in 2017 to develop practices and policies for improving cybersecurity worldwide.
HackerOne chief legal and policy officer Ilona Cohen told TechTarget Editorial she is excited about the council because it calls attention to the critical role ethical hackers play within the security ecosystem.
"We also hope to further nurture the ethical hacking community," she wrote in an email. "Legacy laws can have a chilling effect on good faith security research due to the uncertainty they can create for ethical hackers. Explicit safe harbors and exceptions that protect good faith security research will also help foster an already promising talent pipeline for the cyber workforce as we all try to address the current skills gap."
Bugcrowd CEO Dave Gerry told TechTarget Editorial in an email that Bugcrowd joined "to help foster a more favorable legal, policy and business environment for security researchers, vulnerability disclosure and vulnerability management."
Gerry said protecting consumers through strong security practices within enterprises is Bugcrowd's primary goal.
"To achieve this, we would like to see improvements to recently passed and proposed policies in the EU and United States," he said. "For example, the EU's Cyber Resilience Act and the SEC's recently proposed incident reporting rule both create additional risk as they could require the public exposure of vulnerabilities. even if they are yet to be remediated."
He added that some legacy laws restrict beneficial security activity "due to a lack of clarity on various rules that don't distinguish between ethical security researchers and bad actors."
Harley Geiger, coordinator for the Hacking Policy Council as well as a cybersecurity attorney at Venable, told TechTarget Editorial that many of the council's members have been working on security issues like vulnerability disclosures for years.
He added that although the Hacking Policy Council launched Thursday, its members have been meeting for several weeks and the council "has already begun engaging governments on policy issues related to vulnerability disclosure."
Researchers and vulnerability disclosure
Vulnerability reporting practices face frequent criticism from researchers, whether over the way vendor or third-party bug bounty programs handle vulnerability submissions or over inconsistent communication with researchers.
Researchers also face hurdles involving vulnerability disclosure; many bug bounty programs do not allow researchers to publicly share research relating to the flaws they submit to rewards programs. This is considered controversial because nondisclosure agreements prevent researchers from being credited for their submissions, and they prevent the public from becoming aware of serious issues that pose risk to customers and users.
Artificial intelligence research company OpenAI launched a bug bounty program Tuesday that forbids researchers from publicly disclosing vulnerability submissions to the program. Katie Moussouris, founder and CEO of Luta Security, told TechTarget Editorial Wednesday that the decision was "shortsighted" and served neither OpenAI nor the public.
HackerOne's press release noted that "misinformed and outdated notions about vulnerability disclosure persist, and some organizations still struggle to effectively adopt best practices like vulnerability disclosure programs." Both HackerOne's and Bugcrowd's platforms enable organizations to publish private bug bounty programs with nondisclosure.
Cohen said HackerOne recognizes how useful public disclosure can be for the security ecosystem, and that hackers are proud of and want recognition for their vulnerability research.
"That's why we always advise our customers to be as transparent as possible around disclosure, and to disclose vulnerabilities that the security of the broader internet will benefit from being aware of," she said. "However, building toward security maturity is a journey."
She added that some organizations and industries are more comfortable with public disclosure than others. Cohen explained that at this point, HackerOne wants to encourage policy "that will both educate organizations about the benefits of public disclosure and create a positive regulatory environment that encourages organizations to take the step toward publicly disclosing vulnerabilities according to best practice and existing vulnerability disclosure standards."
Asked about what kind of improvements he would like to see with disclosure policy long-term, Bugcrowd's Gerry said he hopes the council can set security standards to "encourage beneficial cybersecurity activities."
He added, "Governments and businesses around the world recognize the need for good faith security research and responsible vulnerability disclosure. As a security testing industry, now is the time to set aside competition to work in a tightly coordinated way to promote widespread adoption of best practices that enhance transparency to protect people."
Geiger said that long term, he hopes to see greater adoption of vulnerability disclosure and management best practices.
"That includes greater integration of vulnerability disclosure policies into organizational security programs, as well as fewer regulations that deviate from standards and best practices -- such as regulations that would require businesses to disclose unpatched vulnerabilities to government agencies, or laws that fail to distinguish between malicious criminal activity and good faith security research," he said.
Bugcrowd founder and CTO Casey Ellis wrote in an email that creating a more conducive environment for good-faith security research "has been an obvious core component of Bugcrowd's mission since its inception."
"My personal hope is an environment where the chilling effect around security research conducted in good faith goes away, and benevolent hackers can operate freely and without fear," Ellis told TechTarget Editorial. "Secrecy is brittle, and transparency is anti-fragile -- but transparency isn't yet the norm. I'm encouraged to see continued progress in the direction of vulnerability disclosure being seen as an act of maturity, as opposed to one of 'airing dirty laundry,' but there's a long way to go."
UPDATE 4/14: Google published a blog post Thursday on additional new security research initiatives in which the tech giant is participating. Besides joining the Hacking Policy Council, Google provided seed funding to the newly launched Security Research Legal Defense Fund and updated its internal policy to commit to public disclosure when a vulnerability in any of its products is exploited.
Google head of security policy and post author Charley Snyder wrote that the company joined the council because governments have passed and proposed laws that, in certain cases, require flaws to be disclosed privately to those governments. "It is important that we get these laws right," Snyder wrote. The defense fund, for which Google is providing the seed funding, was stood up to protect "good faith security research."
"In many cases, individuals act independently and in good faith to find and report vulnerabilities -- giving vendors a chance to address them before attackers can develop exploits," Snyder said. "Unfortunately, these individuals often face legal threats that can cause setbacks to security research and vulnerability disclosure, especially for individuals without access to legal counsel. The Security Research Legal Defense Fund aims to help fund legal representation for individuals performing good-faith research in cases that would advance cybersecurity for the public interest."
Google also published a full-length report dedicated to the initiatives.
Alexander Culafi is a writer, journalist and podcaster based in Boston.