
Five generative AI trends to look for in 2024

The boom will persist as enterprises become acclimated to the technology. More enterprises will start using generative AI systems, and organizations will incorporate governance measures.

As a year of rapid growth comes to a close, 2024 will be the year of diving deeper into what generative AI and AI technology can do.

When OpenAI released ChatGPT just over a year ago, few could have predicted that the new generative AI chatbot would attract a million users within five days. By January, 100 million users had experimented with it.

ChatGPT led to "AI" becoming a household word. It highlighted some challenges with the new technology even as it swiftly spread around the world. It sparked an intense AI race between Google and Microsoft and helped inform the public about large language models (LLMs), open source LLMs and now multimodal models capable of generating text, images, video and audio.

If 2023 was the year of generative AI, 2024 will see that growth trajectory zoom upward. Here are a few predictions of how generative AI will shape 2024.

AI technology at work and 'bring your own AI'

At the start of the generative AI boom, many organizations were cautious and restricted employees from using ChatGPT. Those companies included Samsung, JPMorgan Chase, Apple and, temporarily, even Microsoft.

They were skeptical about ChatGPT's training data and feared using the generative AI tool could lead to breaches of internal data.

While many enterprises might still be careful about generative AI and classic AI technology, 2024 will see enterprises give their employees the ability to use more generative AI, Forrester Research analyst Michele Goetz said.

"We know that nearly two-thirds of employees are already playing around and experimenting with AI in their job or in their personal life," she said. "We really expect that this is going to be more normal in our everyday business practices in 2024."

That scenario might look like more employees using generative AI to be more effective and productive.

Many employees will also likely use a "bring your own AI" (BYOAI) system, Goetz said. BYOAI means that employees will use any form of mainstream or experimental AI service to accomplish business tasks whether the business approves of it or not. These tools could either be generative AI systems like ChatGPT and Dall-E or software with other embedded AI technology that's not sanctioned by the business.

Enterprises will have little choice but to invest more in AI and encourage employees to use it safely.

British multinational oil and gas giant BP is one organization that's already infusing generative and classic AI technology into its culture.

"One of the things that we're taking very intentionally is an idea about AI for everyone," Justin Lewis, vice president for incubation and engineering at BP, said during a panel discussion at The AI Summit New York conference on Dec. 7.

BP's idea of AI for everyone means more than having all employees get their hands on AI tools and technology, Lewis said.

The oil and gas company's goal is for every employee, both with and without technical experience, to be able to build their own AI tools, and publish, share and reuse them.

"The lower barrier to entry that we're seeing with LLMs and generative AI in general makes that more possible today than it ever has been," Lewis said.

"If we can remove the bottleneck and get to a point where citizen developers, or citizens, with no experience building any technical tools are able to build their own AI tools and leverage and scale them and share them, then we'll be able to advance AI innovation much faster," he continued.

That kind of innovation also comes from using AI to help workers be more productive.

One way generative AI will do this in 2024 is with "shadow AI," Goetz said.

Shadow AI is the use of AI technology to complement or boost employees' productivity without hiring more employees. BP is already using one form of shadow AI, Lewis said.

"What we're seeing most impactful are in the places where you're helping humans perform 10 times better, 10 times faster," he said.

An example is software engineering. BP has a team that provides code reviews as a service using AI, Lewis continued. The team is structured so that one engineer can review code for 175 other engineers, he added.

"It has a radical impact on the way you think about shaping the organization," he said.

A prompt to Google's Bard generative AI chatbot produces several predicted AI trends for 2024.

Enterprises will invest in more AI governance

As more enterprises and organizations allow generative AI technology internally, the need for AI governance grows.

"There are a lot of risks that can come from personal AI use or bring your own AI," Goetz said.

Many enterprises will proactively invest in governance and AI compliance to get ahead of this, she said. Governance will take several shapes.

For organizations equipping their employees with generative AI and allowing personal AI use in the workplace, governance means controlling how employees use the technology by flagging when people compose inappropriate or low-quality prompts, Lewis said.

Governance also means looking externally at government regulations that have been proposed or enacted and trying to get ahead on compliance, Goetz said.

Existing regulations that touch on AI include the New York City AI hiring law and California's data privacy law.

Preparing for upcoming regulation not only benefits organizations from a profit perspective but also protects them, she added.

Tech companies that create capabilities meeting existing or potential regulations will help their customers, which in turn should lead to more revenue, she said.

Compliance also reduces the risk of lawsuits, because generative AI models are built on intellectual property and could put organizations at risk of illegally appropriating IP.

Staying on top of governance and regulation also gives organizations a chance to participate in potential regulation, Goetz added.

"It's also the ability to influence and know how much teeth are in these regulations," she said.

Moreover, as enterprises and organizations start to think deeply about AI governance, insurers may start to offer policies to protect against hallucinations -- when AI systems produce false or distorted information -- Goetz said.

"Insurance companies are also recognizing that their policies may not actually be covering all of the risk permutations that newer AI capabilities are going to introduce -- hallucinations being one of those," she continued.

More autonomous agents

While 2023 was marked by rapid advancement of LLMs and generative AI chatbots such as ChatGPT, the promise of AI systems acting as personal assistants has not yet been fulfilled.

That's according to Trevor Back, chief product officer for AI vendor Speechmatics, who spoke during a panel discussion at the New York conference. Speechmatics develops speech intelligence and speech recognition software.

"With large language models, we've got that ability to move closer to [a] personalized system," Back said.

However, moving to more personalization will require less prompting, Gartner analyst Arun Chandrasekaran said.

Currently, systems like ChatGPT require a lot of prompting and user intervention.

"In the future, we might see more agents that can take higher-level actions [and] higher-level intent from you, and they can translate that into a series of sub actions that they can execute on your behalf," Chandrasekaran said.

In other words, the LLMs that power these generative AI systems will better understand the intent behind the prompt.

For example, instead of just asking generative AI systems to fulfill a task, such as creating a background for an image, users might ask an AI system to create a marketing logo or check if a domain name is available and help create a website.

"You give it a higher-level action and then it just orchestrates a series of steps," Chandrasekaran said. "It enables human beings to do more with less prompting and less intervention."

More multimodal and open models

More personalized AI models will also likely lead to more multimodal models, Chandrasekaran said.

Multimodal models combine different modes, or types of data. The models can convert a user's text input into an image, audio or video output, for example.

Current iterations of this are image-generating models like Dall-E that convert text to images.
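As a rough illustration of that text-to-image conversion, the sketch below calls a DALL-E model through the OpenAI Python SDK. The model name, prompt and the assumption that an API key is configured in the environment are illustrative only, not details drawn from the article.

```python
# A minimal text-to-image sketch using the OpenAI Python SDK (pip install openai).
# Assumes the OPENAI_API_KEY environment variable is set; model name and prompt are
# illustrative only.
from openai import OpenAI

client = OpenAI()

response = client.images.generate(
    model="dall-e-3",  # text-to-image model
    prompt="A minimalist logo for a coffee shop called Bean There",
    n=1,
    size="1024x1024",
)

# The API returns a URL pointing to the generated image.
print(response.data[0].url)
```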

However, Google's recent introduction of Gemini, a model that can train on and generate text, images, audio and video, shows what multimodal models can look like in the future.


Multimodal models will also lead to more advanced applications, Chandrasekaran said.

For example, combining speech with text and images could lead to better disease diagnosis in healthcare.

"The potential for multimodality in terms of enabling more advanced use cases is certainly immense," Chandrasekaran added.

Beyond multimodal models, open source models will proliferate, Goetz said.

"What you're going to see is almost this capitalist type of effect, where all of these models are going to come to market, and then it's going to get Darwinian" in terms of the stronger models beating out the less successful ones, she said.

Enterprise adoption of the models will also evolve, she added.

More AI startups and more sophisticated offerings

Generative AI opened the floodgates for many AI startups to step onto the scene.

While many more startups will emerge in 2024, they will bring more sophisticated offerings than what is on the market now, Forrester Research analyst Rowan Curran said.

Instead of producing offerings centered around AI chatbots like ChatGPT, new startups will create more specific application-centric offerings, Curran said.

"That's going to be driven and supported by the increasing array of those open source and proprietary tools to build on top of some of these core models themselves," he said.

These could be LLMs, diffusion models (models that can generate data like the data they are trained on), traditional machine learning models or computer vision models, he added.

Moreover, some new startups will emerge in response to the development of domain-specific models or smaller language models, Curran said.

2024 will be a year of focusing on just how generative AI will shape the larger enterprise IT ecosystem.

"We really have to remember that this was just a first year of getting acclimatized to these things," Curran said. "Maybe into next year and even a year beyond is where we start to see a type of clarity come into what type of services are built with these things."

Esther Ajao is a TechTarget Editorial news writer covering artificial intelligence software and systems.

