
Confluent platform update targets performance, simplicity

The vendor's latest release replaces its coordinating technology to make its tools easier to use and updates its Control Center for more efficient streaming data workloads.

The latest Confluent data streaming platform update aims to upgrade performance, remove operational complexity and improve data protection.

Released on June 24, Confluent Platform 8.0 is built on the open source Apache Kafka 4.0 platform that was introduced in March.

Confluent's platform update addresses operational complexity by replacing Apache ZooKeeper with Apache KRaft as the platform's coordinating technology, improves efficiency by introducing a new version of the Confluent Control Center and adds to its data protection capabilities with a new level of encryption.

Among other new features, the update also includes an open preview of Flink SQL for processing and analyzing streaming data with SQL code, a new release cycle for the Community Version of Confluent's platform that follows the Kafka community support schedule, and new features in both Confluent for Kubernetes and Confluent Ansible.
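As a rough illustration of the kind of streaming query Flink SQL enables, the sketch below uses the PyFlink Table API to declare a Kafka topic as a table and filter it continuously. The broker address, topic name and schema are assumptions for illustration, and the Flink Kafka SQL connector must be available to the runtime; this is not a documented Confluent Platform workflow.

```python
# Minimal sketch: filtering a Kafka topic with Flink SQL via PyFlink.
# Assumptions: a local broker at localhost:9092, a topic named "orders",
# and the flink-sql-connector-kafka JAR available to the Flink runtime.
from pyflink.table import EnvironmentSettings, TableEnvironment

# Create a streaming table environment.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Declare the Kafka topic as a dynamic table.
t_env.execute_sql("""
    CREATE TABLE orders (
        order_id STRING,
        amount   DOUBLE,
        ts       TIMESTAMP(3)
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'orders',
        'properties.bootstrap.servers' = 'localhost:9092',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
""")

# Run a continuous query: only orders above a threshold.
result = t_env.execute_sql(
    "SELECT order_id, amount FROM orders WHERE amount > 100"
)
result.print()  # streams matching rows to stdout as they arrive
```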

Collectively, the new features are important additions for Confluent users, according to Stephen Catanzano, an analyst at Enterprise Strategy Group, now part of Omdia.

"This update strikes me as a significant one because it delivers some strong architectural changes, such as replacing ZooKeeper with KRaft, major performance improvements, enhanced security capabilities, all of which address core operational challenges for enterprise streaming platforms."

Based in Mountain View, Calif., Confluent is a data streaming specialist whose platform is built on Apache Kafka to enable users to process high volumes of data in near real time. Competitors, in addition to Kafka itself, include independent vendors such as Aiven and Redpanda as well as tech giants including AWS, Google Cloud and Microsoft that offer streaming data capabilities.

New capabilities

Data streaming was once a specialized technology, used in industries such as financial services to enable electronic trading and fraud detection, and in telecommunications to improve service. It is now a fundamental aspect of modern data architecture, according to Catanzano.

Business applications for data streaming now include personalized customer experience in retail, predictive maintenance and quality control in manufacturing, supply chain optimization, patient monitoring and resource allocation in healthcare, and many more.

"Streaming data platforms are something enterprises truly need rather than just nice-to-have luxuries," Catanzano said. "They enable real-time insights and informed decision-making that are increasingly critical in industries where immediate data processing can directly impact business outcomes, customer experiences and competitive advantage."

In addition, as enterprises develop more AI applications such as agents, real-time data will keep those tools current, he continued.

As the business applications for data streaming grow, so too does the volume and complexity of the data being processed to inform those applications.

To meet the increasing operational demands being placed on data streaming systems, the Kafka community, including participation from Confluent, developed KRaft as a replacement for ZooKeeper, according to Rohit Bakhshi, Confluent's director of product management.

ZooKeeper is an independent service that acts as a system coordinator, performing tasks such as tracking which servers are active, knowing where data is located and synchronizing aspects of the system when workloads are running. ZooKeeper, however, requires its own setup and management, which adds operational complexity.

KRaft eliminates the operational complexity of ZooKeeper by integrating metadata management directly into Kafka, which both simplifies deployment and improves performance and scalability, according to Bakhshi. KRaft, which automatically replaces ZooKeeper in Confluent's platform, uses the Raft consensus algorithm to manage metadata, eliminating the need for a third-party metadata management system and supporting significantly larger data clusters.
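For application developers, the change is largely invisible: Kafka clients talk only to brokers, never to ZooKeeper or to the KRaft controllers. A minimal sketch using the confluent-kafka Python client shows that client-side view; the local broker address is an assumption for illustration.

```python
# Minimal sketch: clients interact only with Kafka brokers, so the move from
# ZooKeeper to KRaft for metadata management is invisible to application code.
# The broker address localhost:9092 is an assumption for illustration.
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# Fetch cluster metadata (brokers, topics, partitions) directly from a broker;
# no ZooKeeper connection string is needed in either ZooKeeper or KRaft mode.
metadata = admin.list_topics(timeout=10)

print(f"Cluster id: {metadata.cluster_id}")
for broker in metadata.brokers.values():
    print(f"Broker {broker.id} at {broker.host}:{broker.port}")
for topic in metadata.topics.values():
    print(f"Topic {topic.topic} has {len(topic.partitions)} partition(s)")
```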

"KRaft … was developed to unlock better scalability and increase the operational efficiency of Apache Kafka to meet the demands of modern and increasingly massive data streaming architectures," Bakhshi said.

A new version of the Confluent Control Center is designed to further improve performance.

Now powered by a Prometheus-based architecture, the Control Center better supports high-throughput Kafka environments by integrating with the OpenTelemetry protocol and removing the need to run multiple Kafka clusters to execute workloads.
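Confluent has not detailed the new Control Center's interfaces here, but the Prometheus pull model it builds on works roughly as follows: an exporter exposes plain-text metrics over HTTP and a scraper periodically fetches and parses them. The sketch below illustrates only that general model; the endpoint URL and metric prefix are assumptions, not documented Control Center endpoints.

```python
# Minimal sketch of the Prometheus pull model: a scraper fetches plain-text
# metrics from an HTTP endpoint and parses them line by line.
# The endpoint URL and "kafka_" metric prefix are assumptions for illustration,
# not documented Control Center interfaces.
import urllib.request

METRICS_URL = "http://localhost:9090/metrics"  # hypothetical exporter endpoint

with urllib.request.urlopen(METRICS_URL, timeout=5) as resp:
    text = resp.read().decode("utf-8")

# Prometheus exposition format: one "metric_name{labels} value" per line,
# with comment lines starting with '#'.
for line in text.splitlines():
    if line.startswith("#") or not line.strip():
        continue
    name_and_labels, _, value = line.rpartition(" ")
    if name_and_labels.startswith("kafka_"):  # hypothetical metric prefix
        print(name_and_labels, "=", value)
```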

Beyond improving performance and removing operational complexity, Confluent's platform update features a new layer of data protection with client-side field-level encryption. This feature complements existing data protection measures such as role-based access control, enabling organizations in regulated industries such as financial services and the public sector to keep sensitive information safe.
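Confluent does not spell out here how its client-side field-level encryption is configured, so the following is a generic illustration of the concept only: a producer encrypts a single sensitive field with a symmetric key before the record leaves the application. The field name, topic, key handling and use of the cryptography package are assumptions for illustration, not Confluent's implementation.

```python
# Generic illustration of field-level encryption on the client side: the
# sensitive field is encrypted before the record ever leaves the producer.
# This is NOT Confluent's implementation; key management, the field name
# "ssn", and the topic name are assumptions for illustration.
import json
from cryptography.fernet import Fernet
from confluent_kafka import Producer

key = Fernet.generate_key()   # in practice the key would come from a KMS
cipher = Fernet(key)

record = {"customer_id": "c-123", "ssn": "000-00-0000", "balance": 42.5}

# Encrypt only the sensitive field; the rest of the record stays readable.
record["ssn"] = cipher.encrypt(record["ssn"].encode()).decode()

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("customers", key=record["customer_id"],
                 value=json.dumps(record).encode("utf-8"))
producer.flush()
```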

While the added security is valuable, the addition of KRaft and the updated Control Center are the highlights of Confluent's platform update, according to Catanzano.

"The KRaft implementation stands out as most valuable because it completely eliminates ZooKeeper dependency, simplifying Kafka's architecture while … enhancing scalability and reliability," he said. "The next-gen Control Center is also interesting [by] making large-scale Kafka deployments significantly more manageable."

Next steps

With the latest Confluent platform update now available, the vendor's focus over the second half of 2025 will be on enabling users to develop and update AI applications such as agents using real-time data, according to Bakhshi.

That focus is wise, according to Catanzano, who suggested that Confluent add more integrations with AI and machine learning platforms, such as its partnership with Databricks, to foster the development of AI applications fed by streaming data.

"Looking ahead, Confluent might … expand its AI/ML integration capabilities to help enterprises leverage streaming data for predictive analytics," he said.

In addition, Confluent can serve the needs of its users and perhaps attract new ones by continuing to focus on performance improvement and removing operational complexity, Catanzano added.

"[Confluent should] continue enhancing its hybrid cloud capabilities to serve organizations with complex multi-environment deployments, all while maintaining its performance edge in high-volume, mission-critical streaming scenarios," he said.

Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than 25 years of experience. He covers analytics and data management.
