Agents, semantic layers among top data, analytics trends

The top 10 predictions for the next few years are all influenced by the increasing deployment of AI to help make business decisions and take action.

Agents are the dominant trend in data and analytics, and agentic AI will become so pervasive within the next 18 months that half of all business decisions will be assisted or executed by agents.

That's according to research and advisory firm Gartner, and is one of 10 data and analytics trends its analysts predict will be among the most important over the next few years.

Other significant trends include the rising importance of semantic modeling and prioritizing task-specific models over general-purpose large language models (LLMs).

Meanwhile, all the data and analytics trends relate to AI, according to Gartner analyst Rita Sallam, who unveiled the firm's predictions during a webinar recorded earlier this year. The webinar was made widely available in conjunction with the recent Gartner Data & Analytics Summit in Sydney, Australia.

AI has been the all-encompassing trend in data and analytics since OpenAI's November 2022 launch of ChatGPT marked a significant improvement in generative AI (GenAI) technology. With GenAI potentially making workers better informed and more efficient, many enterprises have substantially increased their investments in GenAI development. Data management and analytics vendors have responded by building environments that make it easier for customers to use proprietary data to inform GenAI tools, as well as by adding GenAI tools of their own to make their platforms easier to use.

"The theme of our predictions is AI, since data and analytics are a foundational accelerant for AI success," Sallam said. "AI seems to have really taken over the world."

With AI and its deployment within enterprises quickly expanding, most of Gartner's 10 data and analytics trends are expected to become commonplace within the next two years, including one that the firm expects to take hold by the end of 2025.

"Today's AI innovation is tomorrow's commodity," Sallam said. "The pace of innovation, the pace of change is like dog years every day, [so] how to think about technical debt becomes something that has to be considered more than ever before."

Agents become ubiquitous

Agentic AI is now the vanguard.

During the first year after ChatGPT was launched, applications that enable users to interact with their systems using natural language were the dominant trend in data and analytics. Such copilots and assistants allowed users to query and analyze data without having to write code, enabling more nontechnical workers to make data-informed decisions. In addition, by reducing the coding requirements on data engineers and other trained experts, GenAI tools made them more efficient.

In 2024, enterprise AI evolved beyond bots that require prompts from humans to act.

Agents are AI applications that, unlike bots, are capable of reasoning and have contextual awareness. Given those attributes, agents can act autonomously, taking proactive measures based on their understanding of circumstances, executing next steps, and even fixing themselves when they are incorrect.

Within enterprises, agents can serve as virtual personal assistants to workers by performing certain tasks independently and helping with others. In addition, as agents become more interoperable, they can consult with one another, just as someone in one department might confer with someone in another to gain perspective before making a key business decision.
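
To make the distinction concrete, here is a minimal sketch in Python of the observe-reason-act loop that separates an agent from a prompt-driven bot. The invoice scenario, function names and decision rules are hypothetical stand-ins; in a real agent, the reasoning step would call an LLM and the action step would call external tools.

    # Minimal illustration of an agentic loop (not any vendor's API).
    # The reasoning step is a stub; a real agent would call an LLM here.

    def reason(context, goal):
        # Hypothetical reasoning step: decide the next action from the goal
        # and the current context instead of waiting for a human prompt.
        if context["invoice_total"] > context["budget"] and not context.get("reviewed"):
            return "escalate_for_review"
        return "approve_invoice"

    def act(action, context):
        # Tool execution: here just a print and a state update.
        print(f"agent action: {action}")
        if action == "escalate_for_review":
            context["reviewed"] = True  # a human (or another agent) signs off
        return action == "approve_invoice"

    def run_agent(goal, context, max_steps=5):
        # Observe -> reason -> act loop; the agent decides when it is done.
        for _ in range(max_steps):
            action = reason(context, goal)
            if act(action, context):
                return "goal reached"
        return "stopped without reaching goal"

    print(run_agent("process invoice", {"invoice_total": 1200, "budget": 1000}))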

Given their potential, agents have replaced bots as the dominant trend in data and analytics.

However, agents are complex and costly to develop, and the reasoning capabilities of the models that power them have not yet reached the point where agents can be trusted to act fully autonomously. As a result, agentic AI within enterprises remains relatively nascent.

That will change within the next 18 months, according to Gartner, which predicts that by 2027, half of all business decisions will be assisted by agents or made by agents on their own.

Gartner analyst Rita Sallam recently unveiled the research and advisory firm's 10 data and analytics trend predictions.

"Decision-making is getting redefined," Sallam said. "Self-service analytics, which we've all been working toward for the past 25 years, is likely going to be redefined and redesigned."

The implication, she continued, is that analytics tools will for the first time become perceptive, no longer requiring users to build dashboards and reports that must be pored over before decisions are made and actions taken.

However, when relying on agents -- sometimes acting without human involvement -- there is an elevated risk for an enterprise. AI governance becomes not merely important but critical, according to Sallam.

"Agents are not a cure-all," she said. "They have to go hand in hand with effective governance."

Emphasizing semantics

AI, whether bots or agents, needs context to understand an individual organization's operations. Proprietary data provides that context.

But if the proprietary data used to inform an AI application is low quality or irrelevant, it won't provide the proper context. It could also increase the likelihood of AI hallucinations, incorrect and often misleading outputs that could lead to poor decisions and organizational embarrassment.

Data governance frameworks put standards and practices in place to help ensure the proper use of an organization's data. As part of that proper use, measures such as data lineage tracking and data observability address quality, making sure the information used to inform decisions can be trusted. Meanwhile, the implementation of tools such as data catalogs helps make data more easily discoverable, so relevant data can be found when needed.

However, neither data governance frameworks nor data catalogs are infallible.

Semantic modeling adds another layer of governance and discoverability that can further improve the accuracy of data-informed applications and reduce the occurrence of hallucinations.

Semantic models are metadata management tools that enable organizations to define key metrics and standardize the terms that describe their data. By doing so, they help make data consistent across an entire organization, aiding its quality and discoverability and reducing the likelihood of data duplication.
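
As a rough illustration, the Python sketch below shows what a semantic model reduces to: one governed, machine-readable definition of each entity, metric and synonym that every query tool or AI application resolves the same way. The tables, metrics and synonyms here are hypothetical, and real semantic layers (LookML, dbt's semantic layer and others) use their own syntax.

    # Illustrative semantic model: a single agreed-upon definition of each
    # business term and metric. Table and column names are hypothetical.

    SEMANTIC_MODEL = {
        "entities": {
            "customer": {"table": "crm.customers", "key": "customer_id"},
            "order":    {"table": "sales.orders",  "key": "order_id"},
        },
        "metrics": {
            "net_revenue": {
                "expression": "SUM(amount - discount)",
                "source": "sales.orders",
                "description": "Order amount minus discounts, before tax.",
            },
            "active_customers": {
                "expression": "COUNT(DISTINCT customer_id)",
                "source": "sales.orders",
                "description": "Customers with at least one order in the period.",
            },
        },
        "synonyms": {"turnover": "net_revenue", "sales": "net_revenue"},
    }

    def resolve_metric(term):
        # Map a business term (or synonym) to its one governed SQL expression,
        # so every tool and AI agent answers the question the same way.
        name = SEMANTIC_MODEL["synonyms"].get(term, term)
        metric = SEMANTIC_MODEL["metrics"][name]
        return f"SELECT {metric['expression']} AS {name} FROM {metric['source']}"

    print(resolve_metric("turnover"))  # identical result for "turnover", "sales" or "net_revenue"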

As evidence that semantic modeling is a growing trend in data and analytics, analytics vendors such as Tableau and ThoughtSpot have recently added semantic layers; others, such as Google's Looker and Strategy (formerly MicroStrategy), have long provided semantic modeling capabilities.

As the trend grows, organizations that prioritize semantic modeling over the next 18 months will increase the accuracy of their AI tools by 80% and reduce the cost associated with developing and maintaining those tools by 60%, according to Gartner.

"All roads point to metadata," Sallam said. "With AI, the criticality of metadata gets even higher. GenAI without context means more hallucinations, less accuracy, so [the result is] having to use more tokens to get to the accuracy you need, which leads to more cost."

Organizations that have an active metadata strategy are more ready for AI, she continued, citing a recent Gartner survey on data management. In addition, among organizations satisfied with the value they are getting from AI, most have AI-ready data fueled by data management strategies.

"You can't orchestrate AI agents that access different data sources without metadata, so it's increasingly important," Sallam said. "Now, metadata gets its new day in the sun, a new prominence, as a key to success."

Prioritizing precision

No matter how well organizations prepare their data for AI development, models and applications informed by that data still sometimes hallucinate.

Semantic modeling and other practices that address data quality and the relevance of data being fed into applications help from the data perspective. Selecting -- or building -- the right AI model can help from a technological perspective.

More than two years after ChatGPT's launch, numerous competing LLMs have been developed and released, including well-regarded LLMs from Anthropic, Google, Meta and Mistral.

However, most LLMs are general-purpose AI models, designed to perform across a wide array of tasks but not optimized for a specific business purpose. To increase the accuracy of AI outputs, smaller task-specific AI models will prove to be a better option, according to Gartner.

Task-specific AI models are designed and optimized for specific applications such as code generation, bug detection, identifying complex patterns in data, and supply chain management, among many other potential uses.

Because they are pretrained for a specific task, they require less fine-tuning than general-purpose LLMs when combined with an enterprise's proprietary data. In addition, because they are smaller, they require less compute power to run and are, therefore, less expensive to use when developing and deploying AI tools.
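
A minimal sketch of the idea, assuming the open source Hugging Face transformers library is installed: a small classifier pretrained for a single task can be applied directly to enterprise text, with no prompt engineering and a fraction of the compute a general-purpose LLM would need. The ticket text and the choice of model are illustrative.

    # Task-specific model example: a small pretrained sentiment classifier
    # applied to support tickets. Requires the transformers library; the model
    # named below is one public example, not a recommendation.

    from transformers import pipeline

    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    tickets = [
        "The shipment arrived two weeks late and nobody answered my emails.",
        "Setup took five minutes and support was excellent.",
    ]

    for ticket, result in zip(tickets, classifier(tickets)):
        print(f"{result['label']:>8}  ({result['score']:.2f})  {ticket}")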

As a result, the deployment of task-specific AI models is a rising trend in data and analytics. By 2027, enterprises will use such models three times more frequently than they use general-purpose LLMs when developing AI tools, according to Gartner.

"We are already seeing this start to happen," Sallam said.

Many enterprises start with LLMs when developing pilot programs but fail to get sufficiently accurate results at a price they can afford, she continued. Others, in highly regulated industries, find that LLMs don't provide enough IT protection and control.

"Increasingly, there is a lower break-even point for moving on from [retrieval-augmented generation] techniques on large general-purpose models and leveraging those models' APIs to viable small language models, assuming you have the skills," Sallam said.

More trends

Beyond agentic AI, prioritizing task-specific models and the rising importance of semantic models, Gartner predicts the following data and analytics trends:

  • By 2027, organizations that emphasize AI literacy for executives will achieve 20% higher financial performance than those that do not.
  • By 2030, AI agents will replace 30% of SaaS application user interfaces, relegating SaaS applications to semantically enriched domain data sources.
  • Natural language will become the dominant method to query and interface with existing data ecosystems within the next year, leading to 10 times better data consumption.
  • By 2028, the fragmented data management market will converge into a single market focused on data ecosystems enabled by data fabric and GenAI, which will lower costs related to buying technology and integrating systems.
  • By 2029, the number of legal claims involving deaths caused by automated decisions and insufficient AI guardrails will have doubled compared with the previous decade.
  • By 2027, 60% of data and analytics leaders will face AI governance, model accuracy and compliance failures due to a reliance on synthetic data.
  • By 2027, data governance teams in 60% of all enterprises will be tasked with prioritizing governance of semistructured and unstructured data to get value from previously unused resources and improve the quality of GenAI-driven decisions.

"Everything on the planet is directly or indirectly influenced by AI today, and AI is powered by data," Sallam said.

Therefore, enterprises need to invest in their data ecosystems to keep pace with competitors and avoid pitfalls such as causing accidental harm via automation and running afoul of regulations due to poor synthetic data.

"Focus on your superpowers, your source of differentiation, which is data and metadata," Sallam said.

Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than 25 years of experience. He covers analytics and data management.
