Latest Databricks tools use AI to simplify AI development
Fueled by user feedback, the vendor's newest features include a framework that automates building agents and a no-code interface for creating the pipelines that feed applications.
Databricks on Wednesday unveiled Agent Bricks and Lakeflow Designer, new tools that use AI to make it easier for customers to develop AI applications.
Agent Bricks is an automation framework that addresses problems many developers face when trying to not only develop AI-powered agents but also move them into production. Lakeflow Designer, meanwhile, is a low-code/no-code environment for building extract, transform and load (ETL) pipelines for AI development.
Agent Bricks is in beta testing, while Lakeflow Designer is in preview. Both were introduced during Data + AI Summit, Databricks' user conference in San Francisco.
Given that they simplify AI development, the new features will likely be significant additions for Databricks users once generally available, according to Kevin Petrie, an analyst at BARC U.S.
"These announcements help Databricks users apply AI and GenAI applications to their proprietary data sets, and thereby gain competitive advantage," he said.
Based in San Francisco, Databricks is a data management vendor whose platform has grown during the past few years to include an environment for users to build generative AI (GenAI) and agentic applications.
New capabilities
Agents are AI applications with reasoning capabilities and context awareness. Unlike chatbots, which can only react to user prompts, agents can act autonomously to surface insights and execute tasks. Given their potential to make workers better informed and more productive, agents are the latest trend in AI.
However, many enterprises are struggling to move agents beyond development and testing and into production. By some estimates, about three-quarters of all AI projects, agentic AI included, fail.
High cost and low quality are two of the main hindrances, according to Joel Minnick, vice president of marketing at Databricks.
Contributing to agents' high cost and low quality, Minnick continued, are the absence of any way to benchmark their performance, which turns development into a trial-and-error exercise, and a lack of enough data to keep an application from hallucinating.
"We got that feedback [about cost and quality] from customers and said, 'Let's solve this,'" he said. "Agent Bricks is the answer, providing customers with a whole new way to build agentic systems."
Agent Bricks essentially automates developing and deploying agents.
After users provide a high-level description of the agent's purpose, Agent Bricks automatically generates task-specific evaluations. It also produces LLM judges to assess quality, creates synthetic data that mimics an enterprise's proprietary data so the agents have the requisite amount of data, and applies performance optimization techniques to keep development costs under control.
Finally, the framework provides users with choices related to quality and cost. For example, one option could be an agent that results in 95% accuracy at a certain cost, while a second option could be an agent that results in 85% accuracy but costs 25% less to develop.
Once that choice is made, Databricks' Model Serving takes over and automatically puts the agent into production.
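To make that workflow concrete, here is a minimal, hypothetical sketch in Python of the quality-versus-cost choice described above. The names and structures are invented for illustration; they are not the actual Agent Bricks API.

```python
from dataclasses import dataclass

@dataclass
class AgentOption:
    """One point on the quality/cost curve the framework presents.

    Hypothetical structure for illustration; not the Agent Bricks API.
    """
    name: str
    accuracy: float    # estimated score from task-specific evaluations
    build_cost: float  # relative cost to develop and serve

# Hypothetical output of the automated pipeline: after generating
# evaluations, LLM judges and synthetic data from a high-level task
# description, the framework surfaces candidate agents to pick from.
options = [
    AgentOption("high-quality", accuracy=0.95, build_cost=1.00),
    AgentOption("budget", accuracy=0.85, build_cost=0.75),  # 25% cheaper
]

# The user picks the tradeoff that fits; deployment is then automated.
chosen = max(options, key=lambda o: o.accuracy)
print(f"Deploying {chosen.name}: {chosen.accuracy:.0%} estimated accuracy")
```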
Beyond helping simplify development, Agent Bricks is significant because of who it targets, according to Petrie. Databricks and Snowflake have long been rivals. However, Databricks historically has targeted trained experts while Snowflake has targeted less technical users.
"Agent Bricks reinforces Databricks' strategy of helping data scientists build agentic AI applications," Petrie said. "This contrasts with Snowflake, which is more focused on providing premade agents and agentic templates to less technically skilled data scientists."
While Agent Bricks aims to help Databricks users deploy agents, Lakeflow Designer is designed to give nontechnical users some of the same data engineering capabilities previously available only to trained experts.
Low-code/no-code tools have long enabled nontechnical users to do some data modeling and other data management work, but they're usually limited in scope. To build complex data and AI pipelines with features such as CI/CD (continuous integration and continuous delivery), coding skills and other expertise are generally required. In addition, as pipelines age and new demands such as increased scale are placed on them, they sometimes break.
The result is more work for data engineers, who must build new pipelines while also fixing older ones that may have been developed years earlier, with now-outdated tools, by engineers no longer with the company.
Lakeflow, introduced in June 2024, is a data engineering environment for code-first users.
Lakeflow Designer is a no-code environment that provides nontechnical users with the same data engineering capabilities that Lakeflow provides experts, automatically translating natural language instructions to SQL code and applying data governance capabilities through Unity Catalog. In addition, Databricks Assistant provides AI-powered support, ensuring queries are properly structured, the right data is being used, and any errors get fixed along the way.
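As a rough illustration of that translation step, the snippet below pairs a plain-English instruction with the SQL a no-code designer might generate from it. Both the instruction and the query are invented for this sketch and are not actual Lakeflow Designer output.

```python
# Hypothetical example: a natural language instruction and the SQL a
# no-code pipeline designer might compile it into. Table and column
# names are invented for illustration.
instruction = (
    "Load yesterday's orders, drop rows with missing customer IDs, "
    "and save the result as a clean table for the sales team."
)

generated_sql = """
CREATE OR REPLACE TABLE sales.clean_orders AS
SELECT *
FROM raw.orders
WHERE customer_id IS NOT NULL
  AND order_date = current_date() - INTERVAL 1 DAY
"""

print(f"Instruction: {instruction}")
print(f"Generated SQL: {generated_sql}")
```

Governance matters as much as generation in a tool like this: because the output runs through Unity Catalog, the generated pipeline is subject to the same controls as hand-written code.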
Unlike Agent Bricks, which caters to Databricks' historical audience of trained experts, Lakeflow Designer targets a different set of users and could potentially enable a different set of workers to use the vendor's platform, according to Petrie.
"Lakeflow Designer targets less technically skilled users, namely data analysts, who need to democratize their access to company data sets," he said. "This overlaps more with Snowflake's strategy."
As with Agent Bricks, customer feedback gave Databricks the impetus for developing Lakeflow Designer, according to Minnick. In particular, customers wanted to make business analysts more independent while reducing the burden on trained engineers.
In addition to Agent Bricks and Lakeflow Designer, Databricks introduced the following new capabilities:
Full support for Apache Iceberg tables in Unity Catalog, including native support for Apache Iceberg REST Catalog APIs (a client-side sketch follows this list).
MLflow 3.0, a now generally available update to Databricks' platform for managing the lifecycle of machine learning and other AI applications.
The general availability of Lakeflow.
An integrated user interface in Lakeflow that unifies previously disparate tasks.
No-code data ingestion connectors between Lakeflow and Google Analytics, ServiceNow, SQL Server, SharePoint, PostgreSQL and Secure File Transfer Protocol.
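For readers curious what the Iceberg REST Catalog support looks like from a client, here is a minimal sketch using the open source PyIceberg library. The endpoint path, token and catalog names are placeholders and assumptions for illustration; consult Databricks' documentation for the exact connection details.

```python
from pyiceberg.catalog import load_catalog

# Connect to Unity Catalog through the Iceberg REST Catalog API.
# The URI, token and warehouse values are placeholders, not verified
# endpoints; check the Databricks docs for your workspace.
catalog = load_catalog(
    "unity",
    **{
        "type": "rest",
        "uri": "https://<workspace-host>/api/2.1/unity-catalog/iceberg",
        "token": "<databricks-access-token>",
        "warehouse": "<catalog-name>",
    },
)

# Standard Iceberg client operations then work against Unity Catalog.
print(catalog.list_namespaces())
table = catalog.load_table("my_schema.my_table")
print(table.schema())
```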
The new capabilities unveiled during Databricks' Data + AI Summit come one week after rival Snowflake also revealed a swath of new features aimed at improving its AI development environment.
As Snowflake continues to improve in AI after a slow initial response to sharply rising interest in the technology, competition is becoming less about which vendor offers a complete AI development environment and more about how vendors differentiate themselves. Databricks does so by integrating MLflow with its other capabilities, according to Petrie.
"The more AI teams can holistically manage the entire lifecycle of models and applications, from design to training, deployment, monitoring and optimization, the better they can iterate over time," he said. "This iteration is critical to adjust to changing business requirements and lessons learned."
Looking ahead
Just as Agent Bricks and Lakeflow Designer use AI to reduce the technical expertise required to use the Databricks platform, the vendor's product plans include additional tools aimed at enabling more people within organizations to work with data and AI, according to Minnick.
"What we're always in pursuit of is continuing to lower the technical bar to be successful," he said.
That focus is wise, according to Petrie.
Databricks already has a strong base of trained experts. One way to add customers would be to target less technical users with more tools like Lakeflow Designer, which provide the same capabilities as other Databricks tools through a simplified, AI-powered user interface.
"I'd recommend that Databricks cater to less technically skilled data scientists and developers by giving them premade agents or agentic templates," Petrie said. "This would broaden their addressable market and help them take share from competitors such as Snowflake."
Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than 25 years of experience. He covers analytics and data management.