As secure software development captures enterprise attention, AI security -- and its potential risks -- are also being thrown into the spotlight.
Amid a complex and fast-evolving threat landscape in 2023, which will include an increase in open source vulnerabilities, AI-assisted developer security workflows can help dam the deluge. To remain competitive, enterprises will need tools that improve developer efficiency and keep them engaged without burning them out, said Matt Carbonara, managing director at Citi Ventures, the investing arm of Citigroup, based in Palo Alto, Calif.
"Amid a widening developer skills gap, these tools will also be central to retaining top talent," he said. "If software is the battlefield in 2023, developer talent is the cavalry."
But it's also only a matter of time before bad actors wrangle AI for more nefarious purposes, according to some experts.
In addition to lending developers a helping hand, AI can also create deepfakes of text and images, Carbonara said. It could even generate fake code repositories; as such, developers and security teams will soon need to be on the lookout for AI-generated attacks.
"That is something we ought to be thinking about, and how you defend against that," he said.
The use of tools powered by machine learning, such as GitHub Copilot, will result in vulnerabilities similar to ones from copying and pasting code, said David Strauss, CTO at WebOps vendor Pantheon.
"They may be more subtle and pervasive in work that's been assisted that way," he said.
Although snippets copied from trustworthy sources tend to account for known risks or warn developers to consider them, sanitizing data requires a systematic approach that likely exceeds today's machine learning capabilities, Strauss said.
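The kind of subtle, pervasive flaw Strauss describes can be as simple as an unsanitized database query carried over from a copied or AI-suggested snippet. A hypothetical sketch in Python, contrasting an injectable query with its parameterized equivalent:

```python
import sqlite3

# Hypothetical example of a flaw a copied or AI-suggested snippet can
# introduce: interpolating user input into SQL permits injection, while
# a bound parameter lets the driver sanitize the input systematically.

def find_user_unsafe(conn, name):
    # Vulnerable: user input is formatted directly into the query string.
    return conn.execute(f"SELECT id FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn, name):
    # Safe: the input is passed as a bound parameter, treated purely as data.
    return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "x' OR '1'='1"
print(len(find_user_unsafe(conn, payload)))  # injection matches every row
print(len(find_user_safe(conn, payload)))    # no match: input stays literal
```

Both functions look plausible in a code review, which is exactly why such flaws spread when snippets are reused without a systematic sanitization check.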
Open source adds to software development security strain
One of the top threat vectors for future attacks is open source software, which enterprises are using more and more for many reasons -- recent austerity measures in a tight economy among them. This rise in open source usage means a commensurate rise in bugs, Carbonara said.
The onus is on developers to ensure that updates to the open source code they download are secure, but developers aren't always aware of that responsibility.
"People will pick it up and use it and don't realize that there is some type of vulnerability," Carbonara said.
"Developer teams are once again finding themselves in this situation of 'Did we download it? Do I have it? Where do I have it? Who made the choice?'" said Ilkka Turunen, field CTO at Sonatype, a software supply chain management platform.
The compromise of PyTorch's nightly builds through a malicious dependency in late 2022 is just the tip of the iceberg, because tens of thousands of similar incidents occur throughout the year. As such, software development security will see an increased emphasis in the coming year.
For example, developers can expect to see tighter controls on third-party repositories soon, said Liav Caspi, co-founder and CTO at Legit Security, a software supply chain security SaaS provider headquartered in Palo Alto, Calif. Maintainers may need to sign code for authenticity, and downloads would come with metrics about their reputation. For developers, this will mean more tooling to support decision making when choosing third-party code, he said.
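The article doesn't specify a mechanism, but the authenticity checks Caspi describes typically start with verifying a downloaded artifact against a record published by the maintainer. A minimal, hypothetical sketch using a SHA-256 checksum (real systems such as package signing go much further):

```python
import hashlib
import hmac

# Hypothetical sketch of a basic authenticity check: compare a downloaded
# artifact's digest against a checksum published by the maintainer before
# trusting the bytes. Production systems layer cryptographic signatures and
# provenance metadata on top of this same principle.

def verify_artifact(data: bytes, published_sha256: str) -> bool:
    digest = hashlib.sha256(data).hexdigest()
    # Constant-time comparison avoids leaking match position via timing.
    return hmac.compare_digest(digest, published_sha256)

artifact = b"example package contents"
published = hashlib.sha256(artifact).hexdigest()

print(verify_artifact(artifact, published))              # matches
print(verify_artifact(b"tampered contents", published))  # rejected
```

A checksum alone only proves the download matches what was published, not that the publisher is trustworthy -- which is why signed code and reputation metrics, as Caspi suggests, are the likely next step.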
2023 could be an AI security tipping point
The continuation of economic austerity into 2023 will put even more pressure on developers, while DevOps has gone too far in making many software developers solely responsible for software development security, Turunen said.
The industry doesn't yet have the tooling to react quickly to situations such as escalating open source vulnerabilities, but that could change this year, he said.
"This year will be all about adjusting the balance," he said. "AI and automation will free software developers from constantly putting out fires. It might not be mass adoption this year, but the genie is out of the bottle."
AI-assisted workflows, which include code reviews, will feature prominently in the coming year, said David DeSanto, chief product officer at GitLab. GitLab's 2022 DevSecOps survey found that 31% of respondents now use AI/ML as part of code review.
If AI can help developers understand the structure of an application or generate code, it should also be able to generate unit tests to make sure that code works as expected, DeSanto said.
"Artificial intelligence and machine learning will further enable rapid development, security remediation, improved test automation and better observability," he said.