Confluent joins agentic AI fray with Streaming Agents
The vendor's new environment for developing agents includes tool calling via the Model Context Protocol and connections with key sources such as LLMs and vector databases.
Confluent on Tuesday introduced Streaming Agents, a set of features designed to simplify developing and embedding agentic AI applications informed by real-time data.
Unlike chatbots that enable users to interact with data using natural language but require prompts before providing information, agentic AI tools have reasoning capabilities and contextual awareness that enable them to act autonomously.
For example, they can continuously search massive amounts of data to surface insights that humans might not otherwise discover, recommend actions based on the information they gather, and take on repetitive tasks to relieve humans of time-consuming work.
Streaming Agents, now in preview, lets users embed agents within stream processing pipelines. Key capabilities include tool calling via the Model Context Protocol (MCP), which enables agents to draw information from the most relevant sources, and connections to AI models and vector databases that foster secure integrations with those data sources.
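To make the pattern concrete, here is a minimal, hypothetical sketch of what embedding an agent step inside a stream processing pipeline looks like conceptually. None of the names below (`Event`, `ToolRegistry`, `agent_step`, `risk_score`) are Confluent or Flink APIs; they are stand-ins illustrating the flow of events from a source, through agent logic that can call out to an MCP-style tool catalog, and on to a sink.

```python
# Illustrative sketch only: an "agent" step embedded in a stream-processing
# loop, with an MCP-style tool registry the agent can call. All names here
# are hypothetical, not Confluent Streaming Agents APIs.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Event:
    key: str
    payload: dict


class ToolRegistry:
    """Stand-in for an MCP-style tool catalog the agent can invoke."""

    def __init__(self) -> None:
        self._tools: dict[str, Callable[[dict], dict]] = {}

    def register(self, name: str, fn: Callable[[dict], dict]) -> None:
        self._tools[name] = fn

    def call(self, name: str, args: dict) -> dict:
        return self._tools[name](args)


def agent_step(event: Event, tools: ToolRegistry) -> Event:
    """Agent logic: call an enrichment tool only when the event warrants it."""
    if event.payload.get("amount", 0) > 100:
        result = tools.call("risk_score", {"key": event.key})
        event.payload["risk"] = result["score"]
    return event


# Wire up a toy pipeline: source -> agent -> sink.
tools = ToolRegistry()
tools.register("risk_score", lambda args: {"score": len(args["key"]) % 10})

source = [Event("order-1", {"amount": 250}), Event("order-2", {"amount": 40})]
sink = [agent_step(event, tools) for event in source]
print([event.payload for event in sink])
```

The point of the sketch is the placement: the agent runs inline on each event as it flows through the pipeline, rather than being prompted after the fact, which is what lets it act on data the moment it arrives.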
Many enterprises are attempting to build agents but face challenges associated with developing this new technology. Confluent's unveiling of Streaming Agents is valuable for the vendor's users, according to David Menninger, an analyst at ISG Software Research.
"Agentic AI is the hot trend, but it's still nascent," he said. "There are still challenges associated with AI agents, including governance, integration and access to the right data at the right time. Confluent seeks [to make] the most current data available to AI agents. By incorporating agent capabilities into the Confluent framework, it also helps address governance and integration."
Based in Mountain View, Calif., Confluent is a streaming data specialist whose tools are built on Apache Kafka, an open source platform that enables users to process high volumes of data in near real time.
New capabilities
With data providing the intelligence in AI, data management vendors are working to create environments that simplify AI development by helping customers combine proprietary data with large language models (LLMs).
Databricks and Snowflake have been among the more aggressive data management vendors in terms of creating environments for customers to build AI tools, including agents, that understand their business. Qlik, Informatica and Domo are among the many vendors that have built AI development suites.
Confluent formed a partnership with Databricks in February that enables the companies' customers to develop AI applications using streaming data. Now, with Streaming Agents, Confluent provides an environment for agentic AI development of its own in a move that BARC U.S. analyst Kevin Petrie, like Menninger, said is compelling.
"This capability is significant for AI adopters," he said, noting that BARC research shows that half of organizations already feed real-time data into AI models. "Most AI use cases require real-time user interactions, and streaming data pipelines are the ideal way to meet this requirement for tabular or vectorized inputs."
In addition to tool calling via MCP and connections that enable secure integrations with models, Streaming Agents provides access to non-Kafka data sources such as relational databases and REST APIs, giving AI agents a more comprehensive set of relevant data. It also features capabilities that let developers evaluate agents on real data without risk.
The framework is based on Apache Flink subproject FLIP-531 and is essentially Confluent's fully managed version of the subproject, Menninger noted. As a result, Confluent's Streaming Agents is limited to the scope of FLIP-531, which addresses only embedding agents in stream processing flows.
While more limited than the agentic AI development capabilities being offered by some vendors, Confluent's new capabilities are nevertheless valuable, according to Menninger.
"The fact that you can incorporate them into streams is helpful to ensure you are acting on the most recent data," he said.
Petrie, meanwhile, noted that Confluent's approach to applying agentic AI capabilities is a logical one that takes advantage of components that organizations with streaming data pipelines already use, including Kafka brokers, feature stores and stream processors.
"I like Confluent's approach to applying agentic capabilities to streaming data pipelines that most organizations already have," he said.
The impetus for Confluent's creation of Streaming Agents came from customer feedback, according to Andrew Sellers, the vendor's head of technology strategy.
"We're seeing more enterprise customers adopt agentic AI to act as the eyes and ears for their business," he said. "However, data infrastructure challenges are holding them back from making this a reality. … Our customers found that they weren't being successful for those use cases where their agents needed to respond to events as they happened."
Next steps
Confluent's product development plans will continue to focus on agentic AI over the remainder of 2025, according to Sellers.
"This is just the beginning of their agent investments," Menninger said. "Broadening the scope of the agents and providing tools for creating the agents would help make them accessible to a larger audience."
One particular area of focus within the broader spectrum of agentic AI where Confluent could add capabilities is governance, according to Petrie. Because agents are trusted to act more autonomously than the AI and analytics applications that preceded them, data quality and intellectual property protection are of utmost importance.
"I'll be interested to see how Confluent addresses the governance requirements of agentic AI," Petrie said. "Agents and models exacerbate existing data risks … and they create new risks such as bad decisions and misguided actions. Confluent users need help addressing these governance risks in their real-time workflows."
Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than 25 years of experience. He covers analytics and data management.