
The promises and risks of AI in software development

Incorporating AI into software development could be transformative, but it raises ethical and practical concerns. What does AI-generated code mean for the tech sector's future?

Companies are increasingly turning to artificial intelligence to accelerate software delivery. But the growing popularity of AI-based development tools raises concerns about security, accuracy and the implications for tech workers.

Large language models (LLMs) such as GPT-4 can suggest code snippets, answer technical questions and even write portions of simple applications. Mike Gualtieri, vice president and principal analyst at Forrester Research, expects AI tools to have a "huge impact" on software development: "I think it's conservative to say that this will make a developer twice as productive, if not more," he said.

Those productivity gains could mean large-scale changes in the tech sector. As AI becomes more widespread, however, tech leaders question how to safely integrate it into software development -- and workers are concerned about the quality of AI-generated code, as well as what AI means for their jobs.

AI in software development is an evolution, not a revolution

Despite widespread hype around ChatGPT and other highly publicized models, AI in software development isn't as groundbreaking as it might initially seem. AI has existed in development and IT workflows for some time, albeit mostly in the form of simple code completion and automations.

Although AI can now generate full code snippets, even that capability isn't so different from familiar manual processes. Instead of wading through Stack Overflow answers for relevant code, developers can now get targeted suggestions directly in their integrated development environments (IDEs). "The key change is now it's universal and more accessible," Gualtieri said.

Recent AI advancements are best understood as enhancing an existing development paradigm. "It's suddenly becoming possible for some of these tools to start influencing the way programmers work," said B.C. Holmes, chief technologist at software development company Intelliware. "It's more a change in our awareness than it is about the actual underlying technology."

Holmes, for example, worked on AI systems in the finance industry more than 20 years ago, although the term AI wasn't used at the time. Similarly, Frank Huerta, CEO of security software company Curtail, noted that some security tools hyped as AI rely on pattern matching techniques long used in threat monitoring and attack detection.

Still, experts agree that recent advances in AI and its rapidly rising adoption mark a turning point. Even if the underlying technology itself isn't entirely new, AI's capabilities and scale of use are growing at a transformative rate. "Things are going to change in a very, very large way over the next decade," Holmes said.

The benefits and risks of AI-generated code

The most promising areas for AI in software development include planning and exploratory work, accelerating automation of routine tasks, and writing boilerplate code.

Although LLMs' ability to generate functional code is limited, they're powerful tools for answering high-level but specific technical questions. For example, ChatGPT could explain how to connect to an AWS service using Java or Python and provide sample code, Holmes said.
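For instance, the kind of snippet such a tool might return for a question like "How do I list the objects in an S3 bucket from Python?" could look like the following sketch. It assumes the boto3 SDK is installed and AWS credentials are already configured; the bucket name is hypothetical.

    # A minimal sketch of the sort of answer an LLM might provide for
    # connecting to an AWS service from Python. Assumes boto3 is installed
    # and credentials are configured; the bucket name is hypothetical.
    import boto3

    def list_bucket_objects(bucket_name: str) -> list[str]:
        """Return the keys of all objects in the given S3 bucket."""
        s3 = boto3.client("s3")
        response = s3.list_objects_v2(Bucket=bucket_name)
        return [obj["Key"] for obj in response.get("Contents", [])]

    if __name__ == "__main__":
        for key in list_bucket_objects("example-reports-bucket"):
            print(key)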

Much of day-to-day IT and development work involves tasks such as making API calls or moving data between objects. Automating those actions with AI frees up tech professionals to focus on more intricate, creative projects that require human insight. Combined with the emerging discipline of platform engineering, this could help relieve overloaded DevOps teams that are managing sprawling, complex IT environments.
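A simple illustration of that kind of routine glue code -- fetching records from an API and reshaping them into another object format -- might look like the sketch below. It assumes the requests library, and the endpoint and field names are placeholders.

    # Illustrative glue code of the sort AI assistants can generate:
    # call an API, then move the data into a different object shape.
    # The endpoint URL and field names are hypothetical.
    import requests

    def fetch_users(api_url: str) -> list[dict]:
        """Fetch raw user records from a REST endpoint."""
        response = requests.get(api_url, timeout=10)
        response.raise_for_status()
        return response.json()

    def to_contact_records(users: list[dict]) -> list[dict]:
        """Reshape API user records into CRM-style contact records."""
        return [
            {"full_name": f"{u['first_name']} {u['last_name']}", "email": u["email"]}
            for u in users
        ]

    if __name__ == "__main__":
        users = fetch_users("https://api.example.com/v1/users")
        print(to_contact_records(users))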

Ravi Parikh, co-founder and CEO of developer platform startup Airplane, sees AI playing an important role in advancing developer tools. Integrating a tool such as GitHub Copilot into an existing developer platform's automation workflow, for example, could further increase productivity.

But although AI development tools are increasingly capable, they're not perfect. When experimenting with AI-assisted software development, "one risk is that it generates bad code, just like a person generates bad code," Gualtieri said.

This includes security risks. Code generation and recommendation tools reflect the security vulnerabilities and misconfigurations of the code used as training data. Likewise, security and privacy concerns have arisen around the sources and sensitivity of the data used to train and refine AI systems.

To mitigate these risks, enterprises need to treat AI-generated code the same way they would -- or should -- treat its human-written counterpart. That means applying the same security and governance policies across the board, whether code comes from a human being or an AI model.
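In practice, that can be as simple as a pre-merge gate that scans every changed file with the same tooling, whatever its origin. The sketch below is one possible illustration, assuming the open source Bandit security scanner is installed; the base branch name is a placeholder.

    # A minimal sketch of a pre-merge gate that applies the same security
    # scan to every changed Python file, whether a human or an AI
    # assistant wrote it. Assumes the Bandit scanner is installed.
    import subprocess
    import sys

    def changed_python_files(base_branch: str = "main") -> list[str]:
        """List Python files modified relative to the base branch."""
        out = subprocess.run(
            ["git", "diff", "--name-only", base_branch],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()
        return [f for f in out if f.endswith(".py")]

    if __name__ == "__main__":
        files = changed_python_files()
        if files:
            # Bandit exits nonzero when it finds issues, failing the gate.
            sys.exit(subprocess.run(["bandit", "-q", *files]).returncode)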

Code that doesn't contain security vulnerabilities can be flawed in other ways. Models don't always generate code that does what the user requested. And even if the code produced technically works, it might not be particularly concise or efficient.

Using AI-generated code is especially risky if users can't validate it, whether because they don't have sufficient technical knowledge or because a tool discourages users from checking its output. "Even if it gets it right nine times out of 10," Parikh said, "if one time out of 10 you're shipping bugs in your code, that's pretty bad."
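One lightweight way to catch that one-in-ten failure is to wrap any accepted suggestion in tests before it ships. The sketch below uses pytest against a hypothetical AI-suggested helper; the tests are the human-written check.

    # A minimal sketch of validating an AI-suggested helper before shipping it.
    # The function stands in for accepted assistant output; run with pytest.

    def median(values: list[float]) -> float:
        """Hypothetical AI-suggested implementation of a median."""
        ordered = sorted(values)
        mid = len(ordered) // 2
        if len(ordered) % 2:
            return ordered[mid]
        return (ordered[mid - 1] + ordered[mid]) / 2

    def test_median_odd_length():
        assert median([3, 1, 2]) == 2

    def test_median_even_length():
        assert median([4, 1, 3, 2]) == 2.5

    def test_median_single_value():
        assert median([7]) == 7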

AI's implications for low code and no code

Combining generative AI with low code and no code could let nondevelopers build entire applications.

Similar to creating graphics and UIs with design tools such as Canva and Figma, future low-code/no-code platforms could take in visual inputs and then generate application code behind the scenes using an LLM. "Right now, [AI] is assisting developers," Gualtieri said. "But we think in the future, it will assist entire development teams by generating not just snippets, but full applications that can be used within the enterprise."

Others are more skeptical of such tools, especially in the hands of users without adequate technical knowledge. "Already, I think low-code [and] no-code tools, when used incorrectly, can result in some really poorly written software that creates more damage than actually solving problems for people," Parikh said. "I think that AI being injected into that makes that even worse."

Another risk involves copyright and intellectual property. Although some tools, such as the AI coding assistant Tabnine, take steps to ensure code used in training data has a permissive license, the same isn't true for all AI, including popular LLMs trained on enormous data sets.

The models underlying ChatGPT, for example, were trained on a massive corpus of data scraped from the internet, including text from books and articles. If that data contains proprietary algorithms that are reproduced in ChatGPT's output, it could raise intellectual property issues, Huerta said: "What if you generate something you just think is your own, and all of a sudden now you're treading on somebody else's patent?"

Typically, patent disputes between people or enterprises are resolved through legal mechanisms, he said. But it's not clear what the resolution process would be for disputes arising from code generated by tools such as GitHub Copilot and ChatGPT.

Customizability is key to the success of future AI tools

AI intended specifically for software development looks more promising than LLMs when it comes to actually writing code.

GitHub Copilot has emerged as an early front-runner thanks to its access to GitHub's massive data set and integration with the popular IDE Visual Studio Code. Parikh attributed Copilot's popularity in part to its transparency and flexibility, which leave users in control of whether to actually implement suggestions.

Other popular AI development tools have a similar structure: They provide suggestions, but users still ultimately decide what to implement. Similar to GitHub Copilot, Tabnine provides optional inline code suggestions as developers type. Likewise, IBM's Project Wisdom automatically constructs Ansible playbooks based on natural language requests, but developers can review and modify the results.

Such customizability will be essential for future AI tools, experts said. In particular, AI development tools need to work with existing enterprise systems -- an area where current capabilities fall short.

"Most organizations have systems that are decades old, and they're all different, and they're weird, and they're unique to each organization," Holmes said. "And to build systems, you need to actually figure out what's there and how to connect to it and attach things together."

Gualtieri sees a category of AI system known as a derived model as particularly promising in this area. Foundation models such as OpenAI's GPT-3 and Google's PaLM are large, flexible systems trained on broad data sets that users can adapt to specialized use cases. Future AI-assisted developer tools could let an enterprise feed its codebase to a foundation model to build a customized derived model.

"That model is going to be more focused on their enterprise and therefore generate code that's more akin to their existing architecture," Gualtieri said.

Parikh foresees a somewhat similar application in the developer tool space: reducing the learning curve for new platforms and tools by creating custom assistants. In internal prototypes at Airplane, he said, the company has experimented with training AI models on its documentation to create a "semi-intelligent agent" fine-tuned to Airplane's software. This type of specialized chatbot could help users quickly pick up new tools without having to trawl through documentation or reach out for support.
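A common way to prototype that kind of documentation-aware assistant is retrieval: index the docs, find the passages closest to a user's question and hand them to a model as grounding context. The sketch below -- an illustration, not Airplane's actual implementation -- uses scikit-learn's TF-IDF vectorizer with made-up documentation snippets.

    # A minimal sketch of a documentation-aware assistant: retrieve the
    # passage most relevant to a question using TF-IDF similarity.
    # The documentation snippets here are invented for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "To schedule a task, open the dashboard and choose Schedules.",
        "API keys are created under Settings > API and scoped per environment.",
        "Tasks can be written in Python, JavaScript or SQL.",
    ]

    vectorizer = TfidfVectorizer()
    doc_vectors = vectorizer.fit_transform(docs)

    def most_relevant_doc(question: str) -> str:
        """Return the documentation passage closest to the question."""
        q_vec = vectorizer.transform([question])
        scores = cosine_similarity(q_vec, doc_vectors)[0]
        return docs[scores.argmax()]

    # The retrieved passage would then be passed to an LLM as context.
    print(most_relevant_doc("How do I create an API key?"))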

Will AI replace software developers?

AI won't replace software developers anytime soon, due in part to its risks and technical limitations. But humans also simply have different capabilities from AI -- and that's something to view as an asset.


As an example, Huerta described his difficulty replicating the human process of object recognition while working on an autonomous vehicle for an early Defense Advanced Research Projects Agency Grand Challenge. "It's similar with the AI approach to writing things," he said. "Our brains still do things a little bit differently than how computers do it. That needs to be an advantage and not get displaced."

Human empathy and communication are also essential to IT and development. "There's a whole people element to the IT job, which probably isn't going to change dramatically," Holmes said. "There's a lot of listening to people, understanding what it is that they need, making sense of the complex environment that we have today."

Rather than replace developers and IT staff, AI will more likely assist them. "I don't yet think that AI is good enough where it's going to replace a software developer if you yourself don't know how to write code," Parikh said. "I think the tool is at its best when it's not substituting for your intelligence ... but substituting for your lack of knowledge."

AI could help mitigate the skills gap in tech

Demand for IT professionals has historically been high, and despite recent layoffs at high-profile companies, there's still a skills gap in the tech sector. Thus, it's unlikely that AI will completely displace human workers -- in fact, AI-assisted software development could be necessary just to address the existing talent shortage.

Figure: Bar chart of the top 10 IT skills in North American employers' job postings. Demand for IT skills -- including software development expertise -- remains high, according to 2022 data from analyst firm Gartner.

"There aren't enough people out there to handle the jobs," Huerta said. "There aren't enough developers, there aren't enough IT people, there really aren't enough security people. ... So, AI can help as a useful tool to help bridge some of those gaps."

At many organizations, this could mean unlocking greater productivity with existing teams rather than downsizing. Enterprises with large IT budgets "often have an internal user base that feels like they're not getting what they want for that budget," Holmes said. At the other end of the spectrum, she said, AI could be a boon for small businesses for which advanced IT has been financially out of reach.

In such scenarios, AI could speed up internal IT delivery and assist with projects that haven't been feasible due to staffing shortages or budget limitations. "It doesn't have to be, 'I can do what I'm currently doing today with half the people,'" Holmes said. "I think you should say, 'I could keep this same number of people and get more accomplished.' I think that's the better -- and actually probably more business-savvy -- approach."

AI coding tools could also lower the barriers to entry for software development. Just as accountants no longer need to know how to do long division by hand, certain programming skills that have so far been fundamental to coding jobs might become less important as tooling advances.

But combined with the recent downturn in the tech sector, this could mean a drop in average wages. Tech companies have historically paid notoriously high salaries, even for entry-level developer jobs. Especially in a tougher economy, widespread AI adoption could change that.

"It's probably going to actually help raise people up who haven't had the same opportunity or experience, and it's going to deflate a lot of salaries," Gualtieri said.

Holmes also pointed out that, historically, automation in one industry has triggered economic changes that reach well beyond that industry. Consequently, greater automation in software development needs to be viewed in a broader context than the tech sector alone.

"Trying to address that is a much broader social, society-wide conversation," she said. "What does it mean? What is our perception of the nature of work in a future of extensive automation? I don't think I could solve that problem. But ... you have to solve it at that level, I think."

