
Tableau repositions for AI, unveils new knowledge layer

The Agentic Analytics Platform is designed to help users operationalize contextually relevant data for agents and demonstrates the vendor's ongoing evolution.

Longtime analytics provider Tableau, once a driver of human knowledge through data, on Tuesday unveiled new capabilities designed to feed agents and other AI applications the contextual knowledge they need to be trusted by enterprises to perform autonomously in production.

The Agentic Analytics Platform was introduced during Tableau Conference, the vendor's annual user conference in San Diego.

Designed as a knowledge layer that automatically feeds agents and other AI tools the relevant data they require to operate as intended, the Agentic Analytics Platform unifies proprietary data, the metadata that describes data to make it identifiable, and business logic through semantic modeling to prepare data for discovery.

Key features of Tableau's knowledge layer for AI include a data knowledge engine based on 20 years of semantic modeling, conversational analytics delivered through a natural language interface, and security and governance.

Business intelligence platforms such as Tableau were historically passive, enabling users to derive insights from visualizations that led to strategic decisions. The Agentic Analytics Platform is a significant addition because it marks Tableau's evolution from passive analytics to an AI-driven knowledge engine that fuels insights and actions, according to Matt Aslett, an analyst at ISG Software Research.

"The Agentic Analytics Platform builds on Tableau's established functionality for existing users but evolves Tableau into a knowledge engine that can provide trusted context to enable human and agentic decisions and actions with advanced recommendations, summarization and automated actions," he said.

In addition, the new suite will help Tableau, a subsidiary of CRM giant Salesforce, compete as other vendors similarly add capabilities aimed at fueling trusted business decisions and actions, Aslett continued.

"Tableau … is already ahead of many of its rivals in terms of the delivery of AI-driven analytics," he said. "Its lead will be maintained by the new Agentic Analytics Platform capabilities even as all analytics software providers are in the process of adding AI-powered functionality to their products to support conversational and agentic analysis."

Intelligence for AI

While enterprises continue to invest in developing agents and other AI tools aimed at transforming their business by making employees better informed and processes more efficient, many are struggling to build tools that can be trusted enough to put into production.

The problems preventing AI initiatives from moving past the pilot stage vary, but one of the main hindrances to date has been discovering and delivering the high-quality, contextually relevant data AI tools need to operate as intended.


Throughout 2026, numerous data and analytics vendors have introduced capabilities designed to improve the discovery and delivery of data that will provide chatbots, agents and other AI applications with proper context.

For example, Databricks, GoodData, MongoDB and Teradata have all added capabilities that aim to improve data retrieval for AI. Now, Tableau is doing the same with its knowledge layer for AI.

Mark Recher, who was appointed general manager of Tableau in March after Ryan Aytay departed following three years as the vendor's CEO, noted that the Agentic Analytics Platform represents Tableau's growth beyond self-service and augmented analytics to agentic analytics.

"It's taking actions -- pairing insights with actions and the ability, in your organization, to surface information someone needs to know before they even know they need to know it," he said.

Key is the knowledge engine, he continued. Connecting AI to data is not enough. AI requires context to be effective, and Tableau's new capabilities are designed to provide that needed context.

"We've had a knowledge layer -- a semantic layer -- inside Tableau for decades," Recher said. "What we're announcing is a knowledge graph. You cannot provide agentic analytics without trusted knowledge which actually understands the context of your business."

Tableau's Agentic Analytics Platform includes the following:

  • An engine that delivers trusted knowledge based on more than 20 years of providing semantic modeling tools.
  • A natural language interface that enables users to query and analyze data within any dashboard.
  • A decision engine that turns insights into action by surfacing insights and triggering workflows.
  • Agentic analytics anywhere through an open architecture that enables organizations to deliver contextually relevant data, through Model Context Protocol (MCP) servers, to custom-built external AI tools, as well as public large language models such as Anthropic's Claude and OpenAI's ChatGPT.
  • A command center for agentic analytics that combats agent sprawl by serving as the primary interface for an organization's agentic analytics strategy.
  • The combined governance and security of Salesforce and Tableau.

Conversational analytics capabilities and some MCP servers are now generally available. The knowledge engine will be GA in June and the command center will be GA in the fall.

Like Aslett, William McKnight, president of McKnight Consulting, noted that the Agentic Analytics Platform is an important addition for Tableau because it uses context to move beyond rear-facing analysis to AI-powered insight generation and action.

"Tableau's Agentic Analytics Platform introduces new capabilities that transform the software from a passive visualization builder into an autonomous system capable of taking trusted, proactive actions," he said. "By leveraging a company's existing business logic, it empowers … while maintaining centralized governance."

In addition, the Agentic Analytics Platform features capabilities that could help distinguish Tableau as data and analytics vendors reposition themselves for AI-powered analysis and process automation, McKnight continued.

"Most competitors treat AI as a feature inside their own walled garden," he said. "You have to use their chatbot to access their data. Tableau is taking a different path by positioning itself as an authoritative data service. Through MCP, it enables external agents to reach in and retrieve trusted numbers."
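McKnight's point about external agents reaching in through MCP can be pictured concretely. MCP is built on JSON-RPC 2.0, so an external agent's request to a data tool takes roughly the shape below. This is an illustrative sketch only: the tool name, arguments and metric are hypothetical, not Tableau's actual interface.

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP 'tools/call' request as a JSON-RPC 2.0 message.

    The tool name and arguments are whatever the MCP server advertises
    via 'tools/list'; the names used here are hypothetical examples.
    """
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": tool_name,       # hypothetical tool exposed by the server
            "arguments": arguments,  # hypothetical query parameters
        },
    }
    return json.dumps(request)

# Example: an agent asking a (hypothetical) governed analytics tool for a metric.
payload = build_tool_call(
    "query_metric",
    {"metric": "quarterly_revenue", "region": "EMEA"},
)
print(payload)
```

Because the request format is an open standard rather than a vendor-specific chatbot API, any MCP-capable agent or LLM client can send it, which is the "authoritative data service" positioning McKnight describes.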

However, there are more capabilities that Tableau could include with the platform to enable its users to build and manage agents, McKnight continued.

"The new platform appears to provide a robust governance framework, [but] the platform is missing resolution of multiple agent logic overlap and the incorporation of unstructured data context into the agent processing," he said.

Aslett, meanwhile, noted that by focusing on context, Tableau's Agentic Analytics Platform is in line with the capabilities that competing data and analytics vendors are adding to their own platforms.

"The leading providers are already looking beyond conversational interfaces and guided analytics for existing reports and dashboards," he said. "They're providing a context layer that captures established enterprise knowledge and semantic understanding and enables analytics via agents as well as external AI tools and applications."

Looking ahead

While Tableau's Agentic Analytics Platform shows that the vendor is evolving to keep up with competitors and provide what enterprises require to operationalize AI, engagement with Tableau users provided part of the impetus for the context layer, according to Recher.

Tableau has a large user community, and its feedback is what drives the vendor's product development.

"We use them for information on where we're taking product strategy," Recher said. "It wasn't just hearing it from our customers, it was hearing it from the community, the people who are working in Tableau and having a good understanding of what they need capability-wise to be as productive as possible in the AI era."

Future product development will focus on improving the capabilities of Tableau's knowledge graph, adding more decision intelligence to Tableau and enabling users, including agents, to act based on their insights, he continued.

"Those are probably the three spaces … that you will see more announcements," he said.

McKnight, meanwhile, suggested that Tableau not only continue to invest in semantic modeling capabilities, but also add features that ease customers' transition from using the vendor's platform as a front-facing tool for analysis to an underlying layer for AI.

Its embrace of the Open Semantic Interchange, an open standard that provides a universal structure for defining data so that semantics are the same across systems, will help. But changing Tableau's purpose within the data and AI workflow could still be challenging for users.

"Tableau needs to shift from a visual destination to a governed semantic engine that grounds AI agents in trusted, consistent logic," McKnight said. "By adopting open standards … the platform ensures its business rules remain the 'single source of truth' across a headless data stack, but the transition may be a brutal pivot from its roots as a beloved visual interface to a background infrastructure and trust engine."

Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than three decades of experience. He covers analytics and data management.
