If 2023 was the year of generative AI, 2024 will be the year it goes mainstream, with pervasive impacts on IT professionals' day-to-day work, according to DevSecOps experts.
Generative AI (GenAI) exploded onto the scene with the general availability of OpenAI's ChatGPT in November 2022. By November 2023, Microsoft and GitHub had rebuilt their product strategies around Copilot, the generative AI tool they developed in partnership with OpenAI. AWS and Google quickly followed with generative AI additions to every part of their product lines, from cloud computing to DevOps and enterprise productivity tools. Although many of these updates have yet to reach general availability, and few enterprise IT organizations have deployed generative AI in production, experts said the technology has already passed a point of no return.
For developers, this means potentially major changes to the fundamental work of software engineering, from increased developer productivity to non-technical colleagues who can create their own applications. For security, the risks of AI -- from data leakage to regulatory compliance -- loom as major challenges. And for IT ops, the proliferation of AI-enabled SaaS tools will raise the specter of shadow IT.
"For the citizen developer, in a new, truly ubiquitous way, the path between idea and compiled code has just shrunk by several orders of magnitude," said Mike Bechtel, chief futurist at Deloitte Consulting, headquartered in New York. "IT departments cannot necessarily put that genie back in the bottle."
While only 4% of 670 IT professionals surveyed in 2023 by TechTarget's Enterprise Strategy Group (ESG) had deployed generative AI tools in production, the majority were in the early stages of implementing it or willing to consider it. Just 15% of respondents to the overall survey said they had no plans to adopt generative AI and would not consider it. An even smaller group -- 2% of 324 respondents to questions about application development -- said they had no plans to invest in generative AI.
"If you look at generative AI as … just the next page in the book of increasingly intelligent machines that we've been writing at least since 1956, it starts to feel less like an alien life form," Bechtel said. "It's really just a tool."
The potential uses for large language models (LLMs) and products based on them have only just begun to be explored, said Andy Thurai, an analyst at Constellation Research.
"There are already many GenAI-related use cases in the advanced piloting stage waiting for board approval, budgeting, etc.," he said. "A lot of enterprises/vendors have implemented enhancements to existing products that will help them search … which will include some form of natural language query. I expect their adoption to be much quicker than straight [generative AI model] adoption."
Meanwhile, LLMs themselves are continuing to develop, Thurai said.
"We are still scratching the surface when it comes to accuracy, predictions, multi-modality, languages other than English and real-time decision making," he said. "Innovation will continue. But use it with caution."
Devs: Will robots take our jobs?
Generative AI has already arrived for application developers, both as a hot topic of conversation and in some production use in enterprise code generation and testing systems. Between those trends and dramatic headlines made by generative AI companies throughout the year, interest in the search term "AI taking jobs" in the U.S. hit an all-time peak in October, according to Google Trends.
So far, senior developers seem to get more of a productivity boost from generative AI than junior developers, according to David Strauss, co-founder and CTO at WebOps company Pantheon. Strauss attributed that difference to senior devs' experience with how systems work and what they ask the software to do -- otherwise known as prompt engineering. Much of the grunt work that had been delegated to interns and junior devs can now be done by generative AI, letting senior managers who transitioned away from day-to-day coding -- including Strauss -- get back into developing apps again, he said.
Strauss took an optimistic view of how this will affect training the next generation of developers, saying AI assistants could help junior engineers get up to speed.
"AI is not just capable of performing an intern role [under] a senior engineer, but it's actually possible for it to be a coach in a lot of ways," Strauss said. "An intern could provide a chunk of code to AI and have it say, 'The way this is written here will be a problem based on this assumption elsewhere in the [language] framework.' I'm hoping it can help cultivate that channel of talent even as it makes the entry-level tasks less necessary."
Thurai took a different view on how generative AI could affect developers at work. He said he believes it will help junior developers function at the level of senior developers and make it more difficult for senior developers to command higher pay.
"Senior-level developers could face issues in differentiating themselves," he said. "I don't expect co-pilots to replace developers anytime soon; they will assist and augment."
ESG's research points to generative AI helping DevSecOps teams keep up with accelerating cloud-native app deployment velocity. Of 378 respondents to a 2022 ESG survey about cloud-native applications, 30% reported weekly releases of code to production and 19% reported several code releases per day. In the March 2023 edition of the survey, 20% reported weekly releases, while 33% said they were releasing several times per day.
"The landscape is changing at such a rapid pace that it is going to outperform what humans can do. You can't throw enough bodies at the problem," said Paul Nashawaty, an analyst and the author of the ESG reports, in an interview this month. "So what is going to have to happen is the use of automation and generative AI to address those challenges."
Meanwhile, the August 2023 ESG survey also indicated a significant number of enterprises -- 36% of 324 respondents to application development questions -- plan to use generative AI to refocus skilled workers on more strategic projects rather than replace them.
Still, it's inevitable that generative AI will change knowledge work over the long term, including application development, said Donnie Berkholz, founder and chief analyst at Platify Insights, a tech industry analysis firm.
"People who thought about themselves as creators will find themselves moving more into the role of creative director and editor," he said. "They'll supply basically the creative brief for what they want the machine to do up front in whatever language is easiest for them, and then afterwards, they have to play the role of editor. … But the central role in that loop of creation is going to shift more and more toward the machine."
At least some companies will try to use generative AI to reduce headcount, Bechtel said, but he believes that will turn out to be a mistake.
"There's a subset of folks out there who are looking at generative AI as a crash diet … but these things have a way of coming back to bite you," he said. "Some of our more pioneering clients, they're saying, 'This isn't a weight loss pill. It's rocket fuel for that backlog of strategic ambitions.'"
Sec: Risk and governance concerns escalate
Generative AI has already shown benefits for security operations, but industry watchers predict that AI risks to data privacy and intellectual property, along with AI bias, will continue to be major issues for enterprises in 2024.
"This happens to every emerging tech, but with AI it is a little tricky as AI is moving at lightning speed and is affecting every nook and corner of any and all enterprises in every industry," Thurai said. "Other technologies are more specific to some industries or just certain areas of the company."
Another potential governance issue that looms for enterprises is copyright litigation that's widely expected to set an important legal precedent for generative AI. Pantheon's Strauss said he expects initial precedents to be set by courts in 2024 but likely not fully settled -- perhaps by the U.S. Supreme Court -- until 2025.
"Most people assume that the courts will come out in favor of the idea that things produced by AI are not generally derivative works of the things used to train the AI," Strauss said. "But I wouldn't be surprised if we get some uncertainty first. There's going to be some district court that's going to rule that AI is producing derivative works of the training set … before we start having higher-court rulings."
Overall, the outlook on generative AI for DevSecOps is ambivalent at best, with plenty of qualms and caveats. Among 260 respondents to security questions in ESG's August 2023 survey, the most popular response, at 40%, was that they were still doing due diligence and research on AI-driven tools. More respondents said they were using machine learning tools for security (38%) than generative AI tools (33%). Developing a corporate governance structure for generative AI followed closely, at 32%.
Ops: Shadow IT on steroids
As generative AI tools proliferate, IT organizations will be challenged to maintain corporate control over their use, particularly as non-technical workers become citizen developers.
"What we've thought of historically as shadow IT, we'll look back and think we hadn't seen anything yet," Bechtel predicted. "IT departments can either play the governance role, where none of it is allowed at work … others might decide, 'We're going to have to make sure that people will choose the tools we're using and be less inclined to go rogue.'"
One tech CEO encouraged companies to consider and plan for the latter approach.
"Much like SaaS, IT can be on the sidelines and watch it happening or they can push the company forward with education about how to use AI, the ethics around it," said Uri Haramati, founder and CEO at SaaS management vendor Torii. "[The first option] isn't going to stop it … but there's huge potential to be part of building the productivity of the company."
Beth Pariseau, senior news writer at TechTarget, is an award-winning veteran of IT journalism. She can be reached at [email protected] or on Twitter @PariseauTT.