Streaming database startup Materialize today released a public preview of its cloud database-as-a-service offering.
Materialize, based in New York City, was founded in 2019 and has spent the last few years building out its database platform, which provides a streaming data capability that enables users to execute standard SQL queries.
Materialize is able to handle streaming data sources, including Apache Kafka. At the foundation of the platform is the Timely Dataflow open source stream data processor technology, which enables users to directly query streaming data.
With the new Materialize Cloud offering, organizations will get a managed cloud service for operating and managing the streaming SQL database.
Until now, users have deployed and managed Materialize on their own. One such organization is online alcohol delivery platform Drizly.
"We're really excited about the Materialize Cloud offering," said Dennis Hume, staff data engineer at Drizly. "We're still self-hosting Materialize, but we're looking to move potentially to the Materialize Cloud in the future."
Using streaming SQL to improve e-commerce
Emily Hawkins, data infrastructure lead at Drizly, explained that Drizly uses multiple tools within its data architecture, including a MySQL database, Snowflake for data analytics, Looker for business intelligence, dbt for data transformation and Confluent as a streaming platform.
A challenge has been how to better make use of the company's streaming data for its applications' business logic.
For example, a key issue that many e-commerce vendors face is cart abandonment. Drizly needed a way to respond in real time when a potential customer doesn't complete a sale.
Drizly uses Materialize to ingest incoming Kafka data streams. Hawkins said that with Materialize, Drizly enables an event-driven data architecture. When a certain event is identified in the data stream, another series of actions can be triggered.
"If someone 'adds to cart' and if they don't check out in 30 minutes, we say it is an abandoned cart," Hawkins explained. "We can get data from Materialize and then that kind of triggers our CRM tool to send out some sort of notification to that user saying, 'You haven't checked out, you forgot these items in your cart.'"
Hawkins said that of particular value to her organization is the fact that Materialize works with standard SQL, which is already familiar to her team members. As a result, it has been faster and easier to get started with Materialize and build out its use for Drizly.
How Materialize works as a streaming SQL database
While streaming data is a familiar concept to many thanks to Apache Kafka, Materialize CEO Arjun Narayan said that in many cases, it has been hard to use that data.
"Materialize is a database that delivers incremental view updates for standard SQL on top, of fast-changing streams of data," Narayan said.
Timely Dataflow provides a materialized view of the incoming data. With a materialized view, data is computed into a format that can be queried with SQL.
For SQL queries, Materialize has built out a PostgreSQL-compatible layer so users can use the same SQL queries they would against a PostgreSQL database for the Materialize streaming data.
"What's nice about this is you get all the benefits of the next-generation stream processor, but from the user-experience end of it, you can just take queries that run on PostgreSQL and have them just run on top of Timely Data Flow," Narayan said.