
OpenSSF GM talks funding, legal software supply chain issues

The OpenSSF leader lays out plans to fund open source software supply chain security in a slowing economy and to speak out against the EU's Cyber Resilience Act.

Fears are mounting in the tech industry about economic headwinds but also about ever-worsening cybersecurity attacks. The general manager of the Open Source Security Foundation finds himself at the nexus of both.

So far, the foundation, known as the OpenSSF, has laid out ambitious funding and mobilization goals to improve open source software supply chain security in the roughly 18 months since its founding. These efforts have the backing of the Biden administration and large companies among its membership, such as Amazon, Google and Microsoft. However, it has not yet met last year's initial funding goal of $150 million.

Brian Behlendorf, general manager of OpenSSF, now confronts year two of the campaign to spur collective action to improve open source security in the wake of Log4j as well as pending cybersecurity legislation in the EU that has been weighing on the minds of open source advocates around the globe. TechTarget Editorial caught up with Behlendorf this month to discuss these trends and more.

TechTarget Editorial: [Linux Foundation Executive Director] Jim Zemlin said at KubeCon that the OpenSSF hadn't yet reached its $150 million funding goal. What does that funding picture look like now that we're in 2023?

Brian Behlendorf, general manager, OpenSSF

Brian Behlendorf: The mobilization plan, [and] the $150 million number that was there, was intended to describe true north -- to say, 'Hey, if we did decide we could pull together some resources to go and tackle a few big things, here's what's possible.' It's kind of like the first business plan an entrepreneur comes up with, developed over the course of about a three-week sprint by some really sharp people, but it represents only a first step. There's been further evolution -- things like the OpenSSF Incident Response Team proposal [and] a proposal to invest more heavily in the education side, to get the best practices and the training we've developed out to developers and students in college. I expect this year we'll do an update to that plan that reflects a further year of research.

Meanwhile, we raised $7.5 million for Alpha-Omega last year, and our hope is that we can [raise] that same amount this year. Frankly, with the economic headwinds, what we're looking at is, 'How do we ensure the resources we have now continue?'

What is Alpha-Omega?

Behlendorf: There's two halves to it. The first is about funding security teams at major open source foundations and upgrading their security processes. This Alpha side of Alpha-Omega made grants last year totaling about $2 million to [groups] like [the] Python [Software Foundation], the Node.js Foundation, and the Eclipse Foundation, to go and buff out their security teams. If we can help them see the value of resourcing [security] teams not just as a defensive measure but to proactively put better processes in place, then those communities will fund themselves in the long term. On the Omega side, you could think of it like an open source equivalent to Google's Project Zero. How do we set up both a team and an infrastructure to systematically scan the top 10,000 open source projects for new vulnerabilities and attempt to close them at scale? Could we systematically go and see if anyone else is vulnerable to the same thing -- systematically open pull requests to go and close 100 bugs at once? [We could] manage that the same way you would do a coordinated vulnerability disclosure process, which is so essential to getting this stuff fixed in a way that is the least disruptive possible.
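To make the Omega idea concrete, here is a minimal sketch -- not an OpenSSF tool -- of what checking packages against known advisories at scale can look like, using the public OSV (Open Source Vulnerabilities) API. The package list is illustrative, not the real Omega target set.

import requests

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

# Hypothetical sample of packages to audit: (ecosystem, name, version).
PACKAGES = [
    ("Maven", "org.apache.logging.log4j:log4j-core", "2.14.1"),
    ("PyPI", "requests", "2.19.0"),
]

def known_vulns(ecosystem, name, version):
    """Return known OSV advisories for one package version."""
    payload = {"package": {"ecosystem": ecosystem, "name": name}, "version": version}
    resp = requests.post(OSV_QUERY_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json().get("vulns", [])

for ecosystem, name, version in PACKAGES:
    vulns = known_vulns(ecosystem, name, version)
    ids = ", ".join(v["id"] for v in vulns) or "none found"
    print(f"{name} {version} ({ecosystem}): {ids}")

A real Omega-style pipeline would pair a known-vulnerability check like this with fuzzing and static analysis to find new flaws, and with coordinated disclosure before fixes are published.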

When the mobilization plan was published last year, several companies made substantial investments toward the $150 million goal. Were you surprised that you couldn't get to that goal last year, with the White House involved and so many big companies participating?

Behlendorf: What we did get were pledges of $30 million from the existing OpenSSF members. On that day in May when we released the report, it wasn't, 'Here's the cash and we're gonna go and run.' It was, 'Come up with things and prove them out.' And we intentionally decided to take the time to substantiate many of those projects with further research.

I had hoped that with government stating this is a priority, there'd perhaps be new kinds of actors, like insurance companies that are starting to write cyber risk policies and sources of other funding there. But those sales cycles and those opportunities are long. In Washington, there's still talk about policies that go in the right direction, and funding that might be helpful as well. I don't want to count any chickens before they hatch.

Then we see the European Union going in a direction with the Cyber Resilience Act that we think might be actively harmful to efforts across the software industry, not just open source. We haven't published any comments on it yet, but the Eclipse Foundation did recently put out a blog on this. We'll likely put something out over the next week on this as well.

What's harmful about the Cyber Resilience Act?

Behlendorf: The Cyber Resilience Act is a proposed policy that would place obligations on the publishers of open source software used in critical infrastructure, as the act defines it. Those obligations are expensive to meet and are triggered merely by the publication of code, not just by its use. What they're proposing is that even to publish open source code, you have to follow a whole set of rigorous steps and be audited on your process and that sort of thing. I think that's not the way to get there, with the open source community or with technology in general.

Contrast that with the U.S. government's approach around a more specific thing, like SBOM [software bill of materials]. They've been working with industry to talk about what the right standards and the right nudges are. And then eventually, they'll require SBOMs for government procurement, possibly even for things like medical devices. But they haven't yet said, 'In the United States, to publish open source code, you'd have to have an SBOM.' The CRA goes even further than that in specifying a lot of additional things.

At the same time, there's a growing sense of crisis about cybersecurity -- about how the attacks keep mounting, the breaches keep getting bigger and more frequent. Do you see that sense of frustration, and what do you think the answer is?

Behlendorf: If Log4Shell was the last major supply chain breach, that'd be very nice. But that's not likely to happen. There's a constant escalation between defensive techniques and offensive techniques. And just as quickly as we find ways to tighten the ship around a whole domain -- typosquatting, for example -- the bad actors will move on to the next level. What you hope is that it doesn't devolve into a mere war of attrition -- that we do things that help seal off whole categories of vulnerabilities at once.
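As a toy illustration of the typosquatting example -- not any registry's actual defense -- the sketch below flags newly published package names that sit suspiciously close to popular, established names. The name lists are made up; real registries combine signals well beyond string similarity.

import difflib

POPULAR = ["requests", "urllib3", "numpy", "cryptography"]  # illustrative list
NEW_UPLOADS = ["requets", "urlllib3", "numpy", "crpytography"]  # hypothetical uploads

def possible_typosquat(name, popular=POPULAR, cutoff=0.85):
    """Return the closest popular name if it is similar but not identical."""
    matches = difflib.get_close_matches(name, popular, n=1, cutoff=cutoff)
    return matches[0] if matches and matches[0] != name else None

for name in NEW_UPLOADS:
    target = possible_typosquat(name)
    if target:
        print(f"'{name}' looks like a possible typosquat of '{target}'")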


In the early days of the internet, we didn't encrypt communications, because we thought we could trust the people running the networks not to read our email or snoop on our web traffic. Now we know better, and you do everything over TLS. In the same respect, I think you're going to see a lot of moves toward memory-safe languages, like Rust and Go. You'll see folks start to demand not only an SBOM but signatures using Sigstore or some other tool, and raise the bar for the kinds of components they pull into the packages and platforms, like Kubernetes, that enterprises decide to consume.
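Here is a minimal sketch of the kind of intake gate that demand implies: require an SBOM for a component and reject anything on a deny list or without a pinned version. It assumes a CycloneDX-style JSON SBOM; the file name and deny list are hypothetical, and a real gate would also verify signatures with a tool such as Sigstore's cosign.

import json

# Hypothetical deny list of (name, version) pairs known to be bad.
DENY_LIST = {("org.apache.logging.log4j:log4j-core", "2.14.1")}

def check_sbom(path):
    """Return a list of problems found in a CycloneDX-style JSON SBOM."""
    with open(path) as f:
        sbom = json.load(f)
    problems = []
    for comp in sbom.get("components", []):
        name, version = comp.get("name"), comp.get("version")
        if (name, version) in DENY_LIST:
            problems.append(f"denied component: {name} {version}")
        if not version:
            problems.append(f"unpinned component: {name}")
    return problems

issues = check_sbom("artifact.cdx.json")  # hypothetical SBOM file
print("\n".join(issues) if issues else "SBOM checks passed")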

This is a space of constant diligence, and that's somewhat the price of being on the cutting edge and making choices about the use of innovative technologies. There are going to be some sharp edges, but if you use the right tools, you can set the right defaults in place. That's the key thing. Then we can trend toward a safer internet and look for ways to measure success other than just the absence of the next major crisis. Things like Scorecard start to give us that. We can objectively look at a million scanned repos and ask, 'Over the course of a year, did the average score come up? Were we able to move the masses and set not just a high bar but a high floor for what's acceptable in terms of software quality and security?'
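As a rough illustration of measuring that kind of floor, the sketch below pulls Scorecard results for a handful of repositories from the public Scorecard REST API and averages the aggregate score. It assumes the api.securityscorecards.dev endpoint and its top-level "score" field; the repository list is just an example.

import requests

API = "https://api.securityscorecards.dev/projects/github.com/{repo}"
REPOS = ["ossf/scorecard", "kubernetes/kubernetes"]  # example repositories

def scorecard_score(repo):
    """Fetch the aggregate Scorecard score for one GitHub repository."""
    resp = requests.get(API.format(repo=repo), timeout=30)
    resp.raise_for_status()
    return resp.json()["score"]

scores = {repo: scorecard_score(repo) for repo in REPOS}
for repo, score in scores.items():
    print(f"{repo}: {score}")
print(f"average score: {sum(scores.values()) / len(scores):.2f}")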

But making that a requirement just to create software, you're saying, is going too far.

Behlendorf: The CRA -- the proposed policy triggers that on publication. When an open source project does a release, it has to certify that it's not [vulnerable to] X, Y and Z. And for a subset of them, the most critical ones, [they must] have an independent third-party audit attest to that. That would be expensive and, from a process point of view, pretty onerous, and it would put the brakes on a lot of the European Union's use of open source code, at least. For their sake, it wouldn't be great. But given that so much open source code comes from Europe these days, it would affect the rest of us as well.

There's another kind of funding issue here -- other people talk about how open source developers need to be paid. What's your take on that?

Behlendorf: I've never been paid directly for working on open source code, and most people I know have not. But they work on open source code not out of charity but because their jobs demand it indirectly. The vast majority of open source development has always been done by people doing it for a commercial purpose -- to incorporate into the website they're launching or the service they're building. The crisis isn't so much in the raw funding of developers of open source code. It's in funding the kinds of services and proactivity that lead to more secure software. It's really about providing value to third parties, and sometimes [incentivizing] people to do that when the overriding motive is that everybody is there to scratch their own itch, so to speak. Getting that sense of collective action has been a challenge for open source for 25 years. Doing it specifically around security is both our opportunity and our challenge.

Beth Pariseau, senior news writer at TechTarget, is an award-winning veteran of IT journalism. She can be reached at [email protected] or on Twitter @PariseauTT.
