
4 trends that will shape data management and AI in 2026

The trends that will influence the next 12 months build on what began last year, including rising mergers and acquisitions and the need to simplify the development and deployment of agents.

The contextual awareness of agents and consolidation among vendors will be among the biggest data management and AI development trends in 2026. So will rising adoption of protocols such as Agent2Agent (A2A), which address communication between agents, and agent-fueled process automation.

To make it past the pilot stage, agents need the proper context to be trusted. Semantic layers help provide that context, which will lead to more widespread use in the coming year. Once projects are past the pilot stage and into production, agents will automate previously manual tasks. And as enterprises build multi-agent systems, they will need A2A or other similar protocols to assist with orchestrating agentic networks.

"2025 was about building agents," said Michael Ni, an analyst at Constellation Research. "2026 is about trusting them."

Meanwhile, specialized data management and AI vendors could be casualties of the singular focus on agentic AI development -- which is more complex and far more expensive than traditional data science and analytics -- making vendor consolidation a significant trend in 2026.

Context is key

The Model Context Protocol (MCP) provided a standard method for connecting data with AI agents. But connecting agents with data sources is only one part of agentic AI development. Agents need to be connected to the data that provides the context for carrying out their intended task.
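
As a rough sketch of the pattern a protocol like MCP standardizes, the hypothetical Python example below exposes a fake orders table as a named, described tool that an agent can discover and call. The tool name, data and registry are invented for illustration; a real implementation would use an MCP SDK and an actual data source rather than in-process function calls.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical illustration of the pattern MCP standardizes:
# a data source exposed as a named, described tool that an agent can call.

@dataclass
class Tool:
    name: str
    description: str
    handler: Callable[[str], dict]

# Fake "data source" standing in for a warehouse table or API.
ORDERS = {"A-1001": {"status": "shipped", "region": "EMEA"}}

def get_order(order_id: str) -> dict:
    """Return order details, or an empty dict if the order is unknown."""
    return ORDERS.get(order_id, {})

# A registry the agent can inspect to discover what data it may request.
TOOLS = {
    "get_order": Tool(
        name="get_order",
        description="Look up an order by ID and return its status and region.",
        handler=get_order,
    )
}

def agent_call(tool_name: str, argument: str) -> dict:
    """Minimal stand-in for an agent invoking a tool over a protocol."""
    tool = TOOLS[tool_name]
    return tool.handler(argument)

if __name__ == "__main__":
    print(agent_call("get_order", "A-1001"))  # {'status': 'shipped', 'region': 'EMEA'}
```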

However, discovering the appropriate data for a given agent can be difficult. Without proper data preparation, it can be nearly impossible.

As a result, semantic modeling gained momentum in 2025 and is expected to become an even more significant trend in 2026, according to David Menninger, an analyst at ISG Software Research.

"As the use of AI has expanded, and the importance of data to those AI efforts has become more apparent, it's also become apparent that accessing data without a semantic model is like trying to drive to your destination without a roadmap," he said. "The semantic model is the roadmap to the data and its meaning. It is just as important as access to the data."

Semantic models are data governance tools that enable organizations to standardize their approach to defining and categorizing data.

Vendors such as AtScale, DBT Labs, Google's Looker, Strategy and ThoughtSpot have provided semantic layers for years. But semantic models were additive instead of critical when organizations were building static reports and dashboards rather than autonomous agents.
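
As a minimal sketch of what a semantic layer encodes, the hypothetical Python example below maps a business term to its physical table, plain-language definition and agreed-upon filter, so that any agent consulting it resolves "active customer" the same way. All names here are invented for illustration; commercial semantic layers express this in their own modeling languages.

```python
from dataclasses import dataclass

# Hypothetical sketch of a semantic-layer entry: a shared business definition
# that maps a term to its physical data and its agreed-upon meaning.

@dataclass
class SemanticEntity:
    term: str            # business term agents and analysts share
    table: str           # physical location of the data
    definition: str      # plain-language meaning, used for grounding agents
    filter_sql: str      # the agreed-upon way to select qualifying rows

ACTIVE_CUSTOMER = SemanticEntity(
    term="active customer",
    table="warehouse.crm.customers",
    definition="A customer with at least one paid order in the last 90 days.",
    filter_sql="last_paid_order_date >= CURRENT_DATE - INTERVAL '90 days'",
)

def ground_prompt(entity: SemanticEntity, question: str) -> str:
    """Attach the shared definition to an agent prompt so every agent
    resolves the business term the same way."""
    return (
        f"Business definition of '{entity.term}': {entity.definition}\n"
        f"Source table: {entity.table}\n"
        f"Question: {question}"
    )

print(ground_prompt(ACTIVE_CUSTOMER, "How many active customers are in EMEA?"))
```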

Now, AI development is making semantic models crucial, according to Baris Gultekin, vice president of AI at data platform vendor Snowflake.

"As companies begin moving more AI projects into production, they quickly realize that their AI initiatives struggle not because of a lack of intelligence, but because they lack business context," he said. "This is what will push semantic modeling from a background discipline to a strategic priority in 2026."

Beyond properly informing individual agents, semantic models provide agents enterprise-wide with a shared meaning and context, allowing them to work together and across departments, Gultekin continued.

"Agents need shared meaning and context, not just shared data," he said. "If an AI agent doesn't understand the business definitions behind the data powering it, its reasoning breaks down quickly. Semantic layers give agents the structure and grounding required to deliver trustworthy answers, ensuring that AI outputs remain consistent across teams and departments."

However, while valuable, semantic modeling still needs improvement before it becomes as ubiquitous as MCP, according to Menninger.

A group of vendors, including Salesforce and Snowflake, formed the Open Semantic Interchange in September to improve and standardize semantic modeling. But beyond the work of the consortium, which may or may not result in a standard being adopted, vendors need to improve their semantic layers.

"I still see a hole in most of the semantic models … that has to do with metrics," Menninger said. "The majority of semantic models are limited to capturing and expressing metrics in the form of SQL definitions. SQL is not rich enough to capture all the logic that goes into a business model. Unfortunately, I don't see much movement to expand the richness of the semantic modeling capabilities yet."

Setting a standard for agentic interoperability

While many agents are task-specific and operate in isolation, others are part of multi-agent systems that require them to communicate and collaborate. For example, to optimize a supply chain, an enterprise might deploy separate agents for inventory management, warehouse operations and plotting delivery routes, which all need to work together.
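
The sketch below illustrates that coordination problem with three hypothetical agents exchanging structured messages through a simple in-process router; a real deployment would use A2A or a similar protocol over the network rather than direct function calls, and the agents and data here are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of the coordination problem A2A-style protocols address:
# task-specific agents exchanging structured messages instead of sharing code.

@dataclass
class Message:
    sender: str
    recipient: str
    task: str
    payload: dict

class InventoryAgent:
    name = "inventory"
    def handle(self, msg: Message) -> Message:
        # Pretend to check stock levels for the requested item.
        stock = {"sku-42": 7}.get(msg.payload["sku"], 0)
        return Message(self.name, "warehouse", "reserve",
                       {"sku": msg.payload["sku"], "qty": stock})

class WarehouseAgent:
    name = "warehouse"
    def handle(self, msg: Message) -> Message:
        # Pretend to reserve the stock and hand off to routing.
        return Message(self.name, "routing", "plan_delivery", dict(msg.payload))

class RoutingAgent:
    name = "routing"
    def handle(self, msg: Message) -> Message:
        # Pretend to plot a delivery route for the reserved stock.
        return Message(self.name, "done", "route_planned",
                       {"route": "DC-3 -> Store-17", **msg.payload})

AGENTS = {a.name: a for a in (InventoryAgent(), WarehouseAgent(), RoutingAgent())}

def run(msg: Message) -> Message:
    """Route messages between agents until no agent claims the recipient."""
    while msg.recipient in AGENTS:
        msg = AGENTS[msg.recipient].handle(msg)
    return msg

print(run(Message("planner", "inventory", "check_stock", {"sku": "sku-42"})))
```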

A framework that standardizes and simplifies agent interaction is beneficial, and the adoption of such frameworks is expected to be a trend among data management and AI vendors in the coming year.

Developed by Google Cloud and launched in April, A2A does just that, and dozens of vendors -- including AWS, Microsoft, Oracle, Databricks and Snowflake, among others -- provide support for the protocol. In addition, the A2A project merged with IBM's Agent Communication Protocol in September 2025, unifying two competing frameworks.

However, there remain alternative frameworks, so whether A2A support becomes as big a trend among data management and AI vendors as MCP support did in 2025 remains to be seen, according to Donald Farmer, founder and principal of TreeHive Strategy.

Unlike MCP, which can be used to aid in developing all agents, A2A and other frameworks that simplify and standardize agent interactions are only needed when organizations deploy multi-agent systems.

"A2A has a tougher road ahead," Farmer said. "The protocol is only needed when organizations actually run agent swarms that need to work together. … The momentum will come. But it will follow enterprise demand for managing multiple agents rather than vendor excitement for a new standard."

Meanwhile, Chris Aberger, a vice president at data catalog specialist Alation, noted that a framework for agent-to-agent interactions is needed but predicted that A2A won't ultimately be the standard. Although A2A addresses a problem many AI adopters face, Aberger expects MCP to evolve to add the capabilities A2A provides.

"This market doesn't reward 'best in theory,'" he said. "It rewards the protocol that becomes the default. And MCP is rapidly becoming that default. … The A2A pitch is appealing. Agent communication feels like the next frontier, and [A2A] addresses areas that MCP was not originally designed for. But it is strategically lagging in adoption."

The core use case for A2A and similar protocols won't disappear, Aberger continued. But rather than A2A adoption being a significant trend among data management and AI vendors in 2026, MCP will evolve.

"MCP won't coexist with A2A," Aberger said. "It will absorb the useful parts [of A2A] because the ecosystem will demand one standard, and it won't be the one that arrived second."

A graphic displays the different characteristics of agentic AI, non-agent chatbots and generative AI.

Automation on the rise

With greater contextual awareness plus protocols in place for connecting agents with data sources and orchestrating networks of agents, automation will be a rising trend in data management and AI in 2026, according to Menninger.

In particular, he predicted that the automation of complex data management tasks, which currently slow development and analysis, will increase over the next 12 months.

For example, data observability specialist Monte Carlo provides an agent that monitors data and implements rules to ensure data quality. Informatica's Claire Agents handle tasks such as data quality monitoring, data exploration and building data pipelines. ThoughtSpot, meanwhile, recently unveiled agents due for general availability early next year that will automate its entire platform, including agents for dashboard development, building semantic models and embedding BI.
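
As a simplified sketch of the kind of rule such agents automate, the hypothetical check below scans a batch of records for null and out-of-range values and reports which rules failed. Production tools such as Monte Carlo or Informatica's Claire Agents do far more than this, and nothing here reflects their actual APIs.

```python
# Hypothetical sketch of an automated data quality check of the kind
# monitoring agents run continuously; not based on any vendor's actual API.

ROWS = [
    {"order_id": "A-1001", "amount": 120.0, "region": "EMEA"},
    {"order_id": "A-1002", "amount": None,  "region": "AMER"},
    {"order_id": "A-1003", "amount": -5.0,  "region": None},
]

RULES = {
    "amount_not_null": lambda r: r["amount"] is not None,
    "amount_non_negative": lambda r: r["amount"] is not None and r["amount"] >= 0,
    "region_not_null": lambda r: r["region"] is not None,
}

def run_checks(rows, rules):
    """Return a failure count per rule so an agent can alert or quarantine rows."""
    failures = {name: 0 for name in rules}
    for row in rows:
        for name, rule in rules.items():
            if not rule(row):
                failures[name] += 1
    return failures

print(run_checks(ROWS, RULES))
# {'amount_not_null': 1, 'amount_non_negative': 2, 'region_not_null': 1}
```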

"The agentic AI trend is all about automation," Menninger said. "Over the next 12 months, we'll see more and more automation of data management-related tasks -- everything from collecting and processing data to protecting data to analyzing data. … We've really just scratched the surface."

Beyond automating data management and AI development work, the trend will also include more enterprises launching agents that take on business operations such as customer support and service, menial finance and accounting tasks like invoice processing and payment approvals, and supply chain optimization.

"In 2026, the mandate shifts from operationalizing data to consolidating around orchestrating intelligence across the enterprise, automating decisions into workflows and governing execution at runtime," Ni said. "Together, they define the shift from data stewardship to decision leadership."

Farmer, meanwhile, predicted that a byproduct of increasing automation throughout 2026 will be that some organizations will suffer identity crises. As agents perform more tasks previously performed by people, enterprises will need to determine how to effectively integrate agents and humans.

"I expect the 2026 conversation to center less on a single new capability and more on integration," Farmer said. "Where does the AI end and the organization begin? Businesses like to say things like, 'We are our people,' to signal their identity, culture and integrity. In 2026, our AIs may become as distinctive to our business differentiation as our people."

Cost breeds consolidation

Driven by the high cost of AI development and the need to simplify complex AI pipelines, signs of a consolidation wave started in 2025.

In May, longtime independent data integration specialist Informatica was acquired by Salesforce in a deal that ultimately closed in November. In October, data integration vendor Fivetran and data transformation specialist DBT Labs agreed to merge. And in early December, IBM acquired streaming data vendor Confluent.

That may be the start, according to Farmer, who predicted that consolidation will become a major trend among data management and AI vendors in 2026.

Cost is one reason for consolidation. Developing agents and other AI applications is more complex and time-consuming than building analytics reports and dashboards. In addition, AI tools require bigger data workloads than traditional data products to be accurate and effective. That combination is leading to significantly more spending on insight-generating applications.

One way that enterprises are mitigating some of the higher costs is by forgoing tools from independent specialists and instead using a single provider with pre-integrated capabilities. That, in turn, makes it more difficult for the specialists to compete with hyperscalers.

Simplification

In addition to cost, simplifying AI development systems is driving organizations to single providers. Integrating and maintaining specialized platforms for data ingestion, integration, modeling, vector search and storage, governance, discovery and other aspects of AI development is difficult, time-consuming and expensive.

"The pattern around Informatica, Confluent, and Fivetran/DBT Labs suggests that data infrastructure is ready for simplification," Farmer said. "The trend is driven by AI-focused platform strategies and the need to streamline the crowded landscape of enterprise tools. IT teams are eager for fewer vendors to manage and are seeking consolidation. Additionally, IT teams recognize the need for integrated data."

Snowflake's Gultekin similarly suggested that a desire for simplification is driving the consolidation.

"Gone are the days of using a patchwork of point solutions," he said. "Enterprises are now looking for unified platforms that allow them to move faster, eliminate complexity and make AI work on top of all their data. The industry consolidation we're seeing is a reflection of that shift."Meanwhile, acquisition targets could be data catalog vendors, data observability specialists and extract, transform and load (ETL) vendors, according to Farmer.

"These are related and often sold to the same buyers, so [they are] natural targets for consolidation," Farmer said. "Furthermore, [private equity] has historically consolidated fragmented middleware."

Room remains for some survivors

Amid consolidation, certain specialists will still find room to stay independent, according to Farmer.

"Advanced data teams still prefer specialized tools where they perceive a clear difference," he said. "Consequently, not every independent vendor is a takeover target."Alation CEO Satyen Sangani similarly foresees space for independent vendors even as consolidation increases, noting that some enterprises still favor integrated ecosystems from various vendors and fear vendor lock-in.

"The trend will continue. … [But] the counterweight is true vendor neutrality -- cloud-neutral, compute-agnostic, model-flexible and API-first, so architectures stay portable as models and infrastructures evolve," he said.

Ultimately, it remains to be seen whether the enterprises still favoring integrated systems from various independent vendors will continue to do so, according to Ni. He noted that enterprises are increasingly moving AI initiatives from experimentation to execution. As they do so, simplification could soon surpass all other considerations.

"A consolidation wave is not just a prediction, but has already started," Ni said. "Data and AI platforms are maturing, which means that trust, observability and integrated governance increasingly outweigh the flexibility of using best-of-breed tools. At the same time, hyperscalers have announced data pipeline and management capabilities that bundle features, putting pressure on standalone vendors."

Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than 25 years of experience. He covers analytics and data management.
