
Cribl aims to ease data observability with LogStream update

LogStream 3.0 brings new configuration capabilities to Cribl's pipeline technology that can help organizations optimize log and metrics data.

Data observability vendor Cribl has updated its LogStream platform with a new release focused on usability improvements.

The vendor, based in San Francisco, develops the LogStream data pipeline technology that enables organizations to clean and organize data so that it can be sent to other data platforms for observability and analytics.

With the new LogStream 3.0 version, released in general availability on June 2, Cribl said it is making it easier for users to define and share observability pipeline configurations with a feature called LogStream Packs.

Until now, LogStream had largely been deployed on premises, with its cloud service in beta. Alongside the LogStream 3.0 update, Cribl is making the LogStream Cloud generally available as a managed service for data observability.

Among the use cases for Cribl's data observability technology is improving data flow into the Splunk platform.

The MIAX Exchange Group, which handles 15% of the world's financial options volume daily, is one of Cribl's customers. The group uses the LogStream platform to create an observability data pipeline into Splunk.

Gov Gopal, principal technologist and site reliability engineer at MIAX, said that as part of the initial rollout of LogStream, the company is handling approximately 100 GB per day of data and expects that volume to keep growing.

MIAX uses LogStream to shape and enrich production system events before they are indexed in Splunk.

"Moving forward, the focus will be to use Cribl to increase data services while controlling the growth of data in Splunk," Gopal said.

Gopal noted that his team is interested in the Packs feature in LogStream 3.0. He said he anticipates that after reviewing all the available Packs, MIAX will have some ideas about how to enhance its existing data flow and will consider other events and metrics -- that are currently not indexed in Splunk -- that could be onboarded into the platform.

Advancing the data observability pipeline

The concept of the data observability pipeline that Cribl has enabled with LogStream should resonate in the data management market, said Bob Laliberte, senior analyst at Enterprise Strategy Group.

Laliberte said most organizations have an assortment of tools to monitor specific areas or functions within the data center and cloud environments. Unfortunately, most of the data collected by these tools remains siloed, and storage space is often constrained.

The space constraint means there is a limited amount of storage, so organizations are forced to decide how much data to keep, typically measured in days, weeks or months, Laliberte explained.

The vendor's answer to the data collection challenge is not to replace existing tools and agents, but rather to collect all the data from them while deduplicating and normalizing it.

Cribl enables a layer of abstraction between the agents and tools. This enables every analytics and observability tool to access all the data an organization collects, Laliberte said.
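
Laliberte's description amounts to a consolidation step sitting in front of the analytics tools. The sketch below is a minimal, hypothetical illustration in Python, not Cribl's implementation or API; it shows roughly what deduplicating and normalizing events collected from several agents into one shared view might look like.

    # Conceptual sketch (not Cribl's actual API): deduplicate and normalize
    # events collected from several monitoring agents before analysis.
    import hashlib
    import json

    def normalize(event: dict) -> dict:
        # Map agent-specific field names onto one common schema (illustrative mapping).
        return {
            "timestamp": event.get("ts") or event.get("time"),
            "host": event.get("host") or event.get("hostname"),
            "message": event.get("msg") or event.get("message", ""),
        }

    def dedup_key(event: dict) -> str:
        # Identical events reported by different agents hash to the same key.
        return hashlib.sha256(json.dumps(event, sort_keys=True).encode()).hexdigest()

    def consolidate(raw_events: list[dict]) -> list[dict]:
        seen: set[str] = set()
        unique_events = []
        for raw in raw_events:
            event = normalize(raw)
            key = dedup_key(event)
            if key not in seen:
                seen.add(key)
                unique_events.append(event)
        return unique_events

In this simplified view, every analytics tool downstream would read from the same consolidated stream rather than from its own agent's silo.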


Cribl also uses low-cost cloud storage that enables organizations to keep data for longer periods of time. Laliberte noted that keeping large volumes of data is fundamental for observability because an organization often doesn't know what it needs to look for. The ability to query all or a much greater amount of the data is therefore extremely useful.

"The platform also enables organizations to reduce the input into their collection of tools, which will save them money on licensing and storage costs," Laliberte said. "This is the essence of Cribl; it enables you to observe more but pay less while doing it."

How the Cribl data observability pipeline works

Clint Sharp, co-founder and CEO of Cribl, explained that a data observability pipeline can be thought of as a data router that forwards data from the source to a destination.

LogStream is that data router: it processes the data it receives from the source before forwarding it. The processing includes what Sharp referred to as "enrichment techniques," which add value to the data. For example, if the data is security-related, it can be mapped against a threat list.
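
Sharp's threat-list example can be illustrated with a minimal sketch. The code below is hypothetical Python, not LogStream's configuration language or API; it shows the general idea of tagging security events against a list of known-bad IP addresses as they pass through a pipeline.

    # Hypothetical sketch of the enrichment idea Sharp describes, written in
    # plain Python rather than LogStream's actual configuration language.
    THREAT_LIST = {"203.0.113.7", "198.51.100.23"}  # example known-bad IPs

    def enrich(event: dict) -> dict:
        # Tag security events whose source IP appears on the threat list,
        # so the downstream analytics tool receives extra context.
        event["threat_match"] = event.get("src_ip") in THREAT_LIST
        return event

    event = {"src_ip": "203.0.113.7", "action": "login_failed"}
    print(enrich(event))
    # {'src_ip': '203.0.113.7', 'action': 'login_failed', 'threat_match': True}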

[Screenshot: Cribl's LogStream enables users to create data observability pipelines from different sources, including log data that is normalized and enriched before being forwarded to analytics and visibility dashboards such as Splunk.]

Sharp explained that Cribl can collect observability data from any number of log or metric sources, including open source options such as Prometheus and Fluentd. That data is often then consumed by another platform, such as the Elastic Stack, Splunk or Grafana.

LogStream sits between the data source and the destination, helping to enrich and filter the data.
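
As a rough illustration of that routing role, the sketch below (hypothetical Python, not LogStream code) filters incoming events and forwards each one to a destination chosen by a simple rule.

    # Hypothetical sketch of a pipeline sitting between sources and destinations.
    # Function and destination names are illustrative, not Cribl's API.
    def keep(event: dict) -> bool:
        # Drop debug-level noise so only useful events reach a destination.
        return event.get("level") != "debug"

    def route(event: dict) -> str:
        # Send metrics to one backend and logs to another (illustrative rule).
        return "metrics_store" if event.get("type") == "metric" else "log_index"

    def pipeline(events: list[dict]) -> dict[str, list[dict]]:
        outputs: dict[str, list[dict]] = {"metrics_store": [], "log_index": []}
        for event in filter(keep, events):
            outputs[route(event)].append(event)
        return outputs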

With the new Packs feature in LogStream 3.0, Sharp said, users can now build and share reusable configuration bundles for data observability pipelines.

"Packs are really this reusable content that's allowing us to build up a knowledge base from our existing users that can be shared amongst customers, which will help them get to better data faster," Sharp said.

Enterprise Strategy Group (ESG) is a division of TechTarget.

