Swim Continuum advances event streaming data intelligence
The vendor is continuing to build out its technology aimed at enabling organizations to build applications that use real-time data to improve operations.
Data intelligence vendor Swim released its Continuum 4.1 platform update and said it also raised new funding.
Founded in 2015 and based in Campbell, Calif., the startup has been building out technology to enable organizations to use event streaming data sources. Swim has steadily evolved its technology in recent years; in 2019 it launched Swim DataFabric, a platform that helps enterprises organize streaming data for data intelligence and operational insights.
The Swim Continuum technology is a superset of capabilities on top of DataFabric, with tools to help organizations more easily connect and use streaming data sources such as Apache Kafka.
Alongside the Swim Continuum 4.1 update, the vendor said it raised new money in a funding round led by Verizon Ventures. Swim had previously raised $25.4 million, according to Crunchbase.
Swim's technology is aimed at enabling organizations to respond to data faster, said Kevin Petrie, an analyst at Eckerson Group.
The data intelligence software is ideal for applications that require low-latency processing of real-time event streams from hundreds, thousands or even millions of distributed elements, Petrie said. For example, logistics companies need to optimize how they dispatch their vehicle fleets, and cities need to optimize how they route traffic during rush hour, he noted.
"All kinds of distributed systems need to get faster, more efficient and more reliable," Petrie said. "Swim seeks to address these needs with an agent-based approach to aggregating, modeling and responding to those real-time event streams in context."
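The agent-based pattern Petrie describes -- a small stateful agent per distributed element, each aggregating its own event stream -- can be sketched as follows. This is an illustrative sketch only, not Swim's actual API; the `VehicleAgent` class and field names are assumptions for the fleet-dispatch example.

```python
class VehicleAgent:
    """Hypothetical per-entity agent: holds running state for one vehicle."""

    def __init__(self, vehicle_id):
        self.vehicle_id = vehicle_id
        self.event_count = 0
        self.avg_speed = 0.0

    def on_event(self, event):
        # Update a running average in constant time per event,
        # rather than recomputing over a stored batch.
        self.event_count += 1
        self.avg_speed += (event["speed"] - self.avg_speed) / self.event_count


def route(events):
    """Dispatch each event to the agent for its entity, creating agents lazily."""
    agents = {}
    for event in events:
        agent = agents.setdefault(
            event["vehicle_id"], VehicleAgent(event["vehicle_id"])
        )
        agent.on_event(event)
    return agents


events = [
    {"vehicle_id": "bus-7", "speed": 30.0},
    {"vehicle_id": "bus-7", "speed": 50.0},
    {"vehicle_id": "bus-9", "speed": 20.0},
]
agents = route(events)
print(agents["bus-7"].avg_speed)  # 40.0
```

The point of the pattern is that state stays local to each agent and is updated as events arrive, which is what keeps latency low as the number of elements grows.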
How Swim uses event streaming for data intelligence
Swim CEO Ramana Jonnala said the vendor's central focus is to enable organizations to effectively use streaming data.
"There are all kinds of assets in an organization's environment that are streaming data continuously, and you want to be able to analyze that at the rate at which it's being generated," Jonnala said.
Event streaming technology use has grown in recent years, in large part due to the success of Apache Kafka and of Confluent, whose rise Jonnala said has in turn helped raise awareness and adoption of Kafka. While Kafka enables event streaming, organizations still need to operationalize and make sense of the data -- outcomes that Swim Continuum enables, he said.
"The adoption of Kafka actually makes the need for technologies like Swim more acute," Jonnala said.
Swim Continuum provides tools that help organizations build data applications from event streaming data. The applications could be data analytics and business intelligence, or operational dashboards that provide real-time insight into activity.
New event data streaming features in Swim Continuum 4.1
Among the enhanced features in Swim Continuum 4.1 is the ability for users to more easily connect raw streaming data to applications.
Also among the enhancements are updated user interface toolkits to map correlations between data sources. Swim can ingest data from multiple sources, including streaming as well as static data sources such as relational databases and data lakes.
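Combining streaming with static sources typically means enriching each incoming event with reference data looked up by key. The sketch below illustrates that join pattern in plain Python; the `devices` table and field names are hypothetical stand-ins for data that would come from a relational database or data lake, and this is not Swim's actual API.

```python
# Static reference data, e.g. loaded once from a relational database
# (illustrative values, not from the article).
devices = {
    "sensor-1": {"site": "Campbell"},
    "sensor-2": {"site": "Palo Alto"},
}


def enrich(event, reference):
    """Join a streaming event with static reference data by key."""
    meta = reference.get(event["device_id"], {})
    return {**event, **meta}


print(enrich({"device_id": "sensor-1", "temp": 21.5}, devices))
# {'device_id': 'sensor-1', 'temp': 21.5, 'site': 'Campbell'}
```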
Swim helps organizations correlate data in near real time. For example, when an event appears in one data stream, a Swim user can correlate it with other streams to see whether there is a potential anomaly that could indicate a service or availability problem, according to Jonnala.
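The cross-stream correlation Jonnala describes can be sketched as a time-windowed lookup: when one stream shows an unusual event, check whether a second stream shows related events around the same moment. The streams, thresholds, and field names below are illustrative assumptions, not Swim's implementation.

```python
def correlate(event, other_stream, window=5.0):
    """Return events from another stream within `window` seconds of `event`."""
    return [e for e in other_stream if abs(e["ts"] - event["ts"]) <= window]


def is_anomaly(latency_event, error_stream, latency_limit=200, window=5.0):
    """Flag a potential service problem when a latency spike coincides
    with error events in a correlated stream (thresholds are illustrative)."""
    if latency_event["latency_ms"] <= latency_limit:
        return False
    return len(correlate(latency_event, error_stream, window)) > 0


error_stream = [{"ts": 101.0, "code": 503}, {"ts": 130.0, "code": 500}]
spike = {"ts": 100.0, "latency_ms": 450}
print(is_anomaly(spike, error_stream))  # True
```

A latency spike alone may be noise; the same spike coinciding with errors in a second stream is a stronger signal, which is the value of correlating streams rather than inspecting each in isolation.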
"We just make it very simple for users to be able to stand up their own streaming applications," Jonnala said.