
Tableau in transition as AI forces BI vendors to evolve

With its new context layer for AI, the vendor is attempting a needed evolution as agents and other cutting-edge tools reduce enterprise reliance on traditional analytics.

Tableau, which has long been one of the most respected platforms for business intelligence, is in a time of transition as its customers' data needs evolve from traditional BI to AI.

The vendor, a subsidiary of Salesforce, was part of a small group last decade -- along with Qlik and Microsoft with its Power BI platform -- that enabled users to develop vibrant data visualizations, making data accessible to self-service users as well as trained analysts and data scientists.

The advent of the cloud and limited AI capabilities such as natural language processing (NLP) and decision intelligence brought new competition, including ThoughtSpot and Domo, but Tableau's platform was still consistently recognized as one of the best for BI.

AI has dramatically altered the paradigm for traditional analytics vendors.

True NLP that allows anyone to query and analyze data, and autonomous agents that surface insights and execute business workflows, have lessened enterprises' emphasis on traditional data products such as reports and dashboards. Consequently, BI vendors such as Tableau are in flux, trying to serve their customers by finding a new role in what is no longer an analytics workflow, but instead an AI workflow.

Tableau on Tuesday unveiled the Agentic Analytics Platform, a new set of features including a knowledge engine designed to feed agents and other AI tools the contextually relevant data they require.

With the platform, Tableau is positioning itself not as an endpoint for analysis, but as an underlying layer for AI-fueled actions. And by doing so, Tableau is demonstrating its attempt to remain viable by providing a different kind of value to its customers than in the past, according to Michael Ni, an analyst at Constellation Research.

"Tableau hasn't lost relevance, but it has shifted from setting the pace to trying to reassert its role in a market that's moved from being defined by insights to being defined by AI-first interaction models," he said.

However, Tableau is no longer part of a small group of BI vendors significantly more advanced than the competition, Ni continued.

"Tableau has adapted by adding Tableau Next, Model Context Protocol servers, agentic analytics [and pushing its] semantic layer, but in response to the market rather than as the innovation pacesetter."

Demetri Salvaggio, vice president of customer experience and operations at Engine, a travel platform provider based in Denver, and a Salesforce user for about eight years, similarly noted that Tableau is evolving, and so far doing so in step with its customers with features such as Tableau Pulse and the new knowledge layer for AI.

"Tableau Pulse, natural language querying and the Agentic Analytics Platform have all landed in roughly the same window we've been scaling," he said. "The platform is moving in the same direction we are, and that alignment matters more than any single feature."

Addressing customer needs

Founded in 2018, Engine provides a travel platform built for small and medium-sized businesses, enabling its customers to book and manage business trips.

Almost from its inception, Engine used Salesforce to capture customer relationship management (CRM) data. More recently, it became a Tableau customer as well, driven by an increasing investment in the Salesforce ecosystem that also includes the CRM giant's Data Cloud and Agentforce, according to Salvaggio.


Engine was using BI tools before adopting Tableau, and it looked at vendors in addition to Tableau when re-evaluating whether its analytics layer was set up to support its growth.

"Tableau won because of the ecosystem fit, not because of a feature checklist," Salvaggio said. "Native Salesforce integration, Data Cloud as a shared foundation, and a roadmap that was clearly heading toward agentic analytics -- that combination meant we could consolidate rather than stitch."

Now, Tableau's addition of capabilities that help deliver context to agents is keeping Engine a customer, he continued.

Tableau is not abandoning its BI platform. In fact, the vendor recently launched a new premium version of Tableau Desktop. In addition, enabling the user community that the vendor calls its DataFam to see and understand data remains at the core of its mission, according to Mark Recher, who was named Tableau's new executive vice president and general manager in March.

To remain relevant to its customers as AI becomes more ubiquitous, Tableau needs to do more than merely provide analytics capabilities, he noted. Toward that end, it is expanding its mission of helping customers see and understand data to also include enabling users to take action, Recher said.

That addition -- or transition -- is key, according to William McKnight, president of McKnight Consulting.

"Tableau is in the middle of a high-stakes transformation," he said. "It still holds its reputation as the gold standard for visual storytelling and deep data exploration, but the AI era has challenged its role as the center of the analytics universe [and it is] fighting to become more of the 'brain' of the enterprise, as opposed to the great dashboard builder."

Similarly, Salvaggio said that evolving beyond traditional BI to become an enabler of AI is critical for Tableau to retain customers and serve their growing AI needs.

Engine is building an agentic enterprise, he noted. To date, it has developed EVA, a virtual support assistant that helps customers book flights, hotels and rental cars. EVA currently handles half of all chat support cases without human intervention, and Engine plans to expand its use of AI.

As a result, it needs tools that enable its agents to act appropriately.

"The biggest unlock test is for Tableau to keep closing the loop between insight and action," he said. "That's where we want it to go, and so far, the trajectory is right."

Part of the pack

Although Tableau is meeting the changing needs of customers such as Engine, it is not the only BI provider to evolve beyond its roots as AI has advanced rapidly over the past few years.

Microsoft and Google are each taking a similar approach to Tableau, according to McKnight. As Microsoft builds out Fabric, an AI-fueled platform for data management and analytics that includes Power BI, and Google adds new functionality to Looker, both are aiming to position their tools as APIs that push intelligence to the rest of users' AI ecosystems.

"Tableau is positioning itself as the authoritative API that feeds the rest of your AI ecosystem, which is a forward-thinking move [but] not innovative," McKnight said. "It is the entry stakes for staying relevant in a workplace where people no longer want to browse for insights."

In addition to hyperscalers Microsoft and Google, traditional BI specialists GoodData and ThoughtSpot are adding capabilities that are designed to discover contextually relevant data for agents and other AI applications.

GoodData in March launched Context Management, a layer like Tableau's Agentic Analytics Platform that is built on semantic modeling to discover and deliver the data that enables AI to produce accurate outputs. ThoughtSpot has similarly emphasized its semantic layer in recent product development initiatives, and in March unveiled agents with industry-specific contextual awareness to engender trust in AI.

Still other traditional BI vendors such as Qlik -- perhaps Tableau's closest competitor in years past -- and Domo have evolved to become more full-featured data platform providers.

Qlik added a data integration platform prior to the dawn of the AI era and has since created an environment for customers to build AI tools. Domo similarly now provides a development environment that enables users to build agents and other AI applications.

"Tableau's Agentic Analytics Platform is credible and competitive, but it reflects convergence with the market more than clear separation from it," Ni said. "Tableau's platform is strong in intelligence, providing depth in semantics and governance to serve as a trusted decision input, while letting competitors focus on areas like AI-native user experiences and building analytic applications."

Matt Aslett, an analyst at ISG Software Research, similarly noted that Tableau appears to be making an effective transition from providing BI capabilities to playing a role in AI. Key to Tableau's evolution -- and the evolution of all former BI specialists -- will be facilitating the understanding of relationships between data and enabling federated querying across sources such as data lakehouses without forcing users to move data into a single system.

"The analytics providers that are first to deliver this combination of functionality to market will be in pole-position to lead the race towards agentic analytics, and also fend off growing competition from data platform providers attempting to disintermediate analytics providers with conversational and analytics functionality of their own," Aslett said.

For Engine, the key to remaining with Tableau will be how it evolves to fit into Engine's growing data and AI architecture, according to Salvaggio. A keen observer of not only Tableau but also its competitors, he noted that Power BI and ThoughtSpot each have added impressive NLP capabilities and agentic AI tools.

"The question isn't, 'What's the best standalone analytics product?' It's, 'What fits the architecture we've already committed to?'" Salvaggio said. "We run on Salesforce, Data Cloud and Agentforce. Putting a non-native analytics layer on top of that stack would create exactly the integration and governance overhead we deliberately moved away from. The Agentic Analytics Platform widens that gap rather than closes it."

Remaining relevant

While the Agentic Analytics Platform represents evolution for Tableau, it's only part of what the vendor must do to serve the needs of its customers as they transition away from traditional BI reports and dashboards to AI-powered insight generation and actions.

For example, Engine has a list of features it would like to see Tableau add as it makes AI enablement a focus in addition to BI.

In the near future, Salvaggio said, Engine would like tighter integration between Tableau Next -- an agentic platform that integrates AI into workflows -- and Agentforce so that analytics findings can directly trigger agents to take action or review workflows without human intervention. In addition, it is hoping that Tableau adds agent observability capabilities that proactively detect and surface anomalies.

Longer-term, Engine wants Tableau Next to evolve into a decision layer for agents, Salvaggio continued.

"Tableau Next [should be] the unified decision layer across the agentic enterprise -- service, sales, supply, finance, all feeding into the same governed substrate," he said. "We're already moving that direction architecturally. We'd love the analytics layer to meet us there."

McKnight noted that Tableau is wisely taking advantage of its existing semantic modeling capabilities to develop tools that feed AI applications with contextual awareness.

"Tableau needs to shift from a visual destination to a governed semantic engine that grounds AI agents in trusted, consistent logic," he said.

However, Tableau's transition from an interface for BI to an infrastructure layer for AI won't be easy, he continued.

Aslett similarly pointed out that transitioning to a context layer for AI is a logical evolution for BI providers such as Tableau and its peers.

Ni, meanwhile, suggested that Tableau focus on providing vital information that enables customers to understand why things are happening within their business so they can act on that understanding.

"If Tableau wants to win the next phase by evolving from 'trusted knowledge' to a 'trusted decision' system, it needs to help operators answer which decisions actually moved the business, and which didn't," he said. "That means shifting the question from 'margin declined in region X' to 'this pricing change improved margin by Y%.'"

Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than three decades of experience. He covers analytics and data management.
