Databricks partners with Mistral AI to aid GenAI development

The data lakehouse vendor joins Microsoft and Snowflake in partnering with -- and investing in -- the startup to provide customers with access to Mistral's open source language models.

Databricks on Thursday unveiled a partnership with generative AI specialist Mistral AI, including integrations with the vendor's language models and an investment in the company.

With the move, Databricks follows closely behind Microsoft and Snowflake, which also recently formed partnerships with Mistral AI that include integrations and investments in the startup, which was founded in April 2023.

Microsoft on Feb. 26 revealed its partnership and integrations with Mistral AI, including a $16.2 million investment. Snowflake followed on March 5 by revealing an integration and partnership that includes an undisclosed investment.

Like Snowflake, Databricks kept the amount of its investment in Mistral AI private.

Based in Paris, Mistral AI is an AI vendor offering both open source and proprietary large language models (LLMs). Its competitors include more established generative AI vendors such as OpenAI, Anthropic and Cohere.

In just 11 months, Mistral AI has raised more than $550 million in funding and developed LLMs that perform well against more established LLMs in benchmark testing.

However, perhaps Mistral AI's greatest attraction as a partner for data platform vendors is that it represents a new alternative among generative AI platforms, according to Doug Henschen, an analyst at Constellation Research.

Given that generative AI is still an emerging technology and vendors are quickly improving their LLM capabilities, the more choices data platform vendors can provide their customers, the better.

"Part of the attraction is just having more model options, and Mistral AI is out there … issuing claims of superior performance as compared with popular alternatives," Henschen said. "Model choice is a good thing that every AI platform vendor wants to add."

In addition to Mistral AI, Databricks provides access to language models and model services including Anthropic, AWS Bedrock, Azure OpenAI, BGE, Llama 2 and OpenAI, among others.

Once accessed, the models can be imported to Databricks, where they can be managed using the vendor's security and governance capabilities.
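For a concrete picture of what accessing one of those models can look like, the following is a minimal sketch using MLflow's deployments client, which Databricks supports for querying hosted foundation models. The endpoint name is an assumption based on Databricks' pay-per-token naming conventions and should be replaced with whatever endpoint actually exists in a given workspace.

```python
# A minimal sketch, assuming a Databricks workspace with Foundation Model APIs
# enabled and the mlflow package installed. The endpoint name is an assumption;
# substitute an endpoint that actually exists in your workspace.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")  # picks up workspace credentials

response = client.predict(
    endpoint="databricks-mixtral-8x7b-instruct",  # assumed pay-per-token endpoint name
    inputs={
        "messages": [
            {"role": "user", "content": "Summarize this quarter's support tickets in three bullets."}
        ],
        "max_tokens": 256,
    },
)

# Chat-style endpoints generally return an OpenAI-compatible payload
print(response["choices"][0]["message"]["content"])
```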

"More is always better," Henschen said. "Google Cloud, for one, has more than 100 options in its Model Garden."

Kevin Petrie, an analyst at Eckerson Group, meanwhile noted that in addition to Mistral Large, a proprietary LLM that has performed well in comparisons with other LLMs, Mistral AI provides the open source models Mistral 7B and Mixtral 8x7B. Those open source models -- Mixtral 8x7B is an LLM, while Mistral 7B is a small language model -- make the vendor's capabilities an attractive option beyond just the performance of its flagship LLM.

Small language models use less compute power than LLMs, Petrie noted. In addition, they can ingest data more efficiently than their larger counterparts while still delivering accurate responses.

"Mistral represents an emerging class of what we call small language models," Petrie said. "These require far fewer parameters than the early wave of large language models … [and] save money compared with some of the larger alternatives."

In addition, Mistral AI's models can be customized as enterprises begin using language models to build AI applications trained with their own data to understand their unique business needs, Petrie continued.

"Mistral also is small in the sense that it helps companies prompt and fine-tune its open-source models on their own small, domain-specific datasets," he said. "This domain-specific focus represents the next wave of innovation in GenAI."

The partnership

Based in San Francisco, Databricks is a data lakehouse pioneer whose platform was built on the open source Apache Spark framework. Founded in 2013, the vendor remains an active participant in the open source community, making much of its product development open source.

For example, Databricks in 2023 worked with the Delta Lake community to build Delta Lake 3.0, an open source storage format that aims to unify data across different table formats.

Mistral AI similarly has its roots in open source development.

As a result, culture played a role in Databricks and Mistral AI forming a partnership, according to Prem Prakash, head of AI/ML product marketing at Databricks.

"Mistral also shares one of our core beliefs that open source solutions will fuel innovation and transparency in generative AI development and research," he said.

In keeping with that shared belief in the value of open source capabilities, the partnership between Databricks and Mistral AI includes the full integration of Mistral AI's open source language models, Mistral 7B and Mixtral 8x7B, with the Databricks Data Intelligence Platform.

Databricks, however, is not yet integrated with Mistral Large, according to the vendor.

The open source models are now available in the Databricks Marketplace, where Databricks customers can use them in the Mosaic AI Playground. There, customers can experiment with the models as they develop their own generative AI applications, trained on proprietary data for business use.

Once built, customers can deploy and operationalize their customized models through Mosaic AI Model Serving.
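As one illustration of what deploying and operationalizing a model can mean in practice, the sketch below calls a served model over Databricks Model Serving's REST interface. The workspace URL, endpoint name and token are placeholders, and the request shape should be verified against a workspace's own serving documentation.

```python
# A minimal sketch of querying a model serving endpoint over REST. The workspace
# URL and endpoint name are placeholders; the /serving-endpoints/<name>/invocations
# path follows Databricks' documented pattern but should be verified for your setup.
import os
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
ENDPOINT_NAME = "my-mixtral-chat-endpoint"                        # hypothetical endpoint

resp = requests.post(
    f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={
        "messages": [
            {"role": "user", "content": "Draft a short product update for enterprise customers."}
        ],
        "max_tokens": 200,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```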

Like Henschen, Petrie noted that the true value of the integrations between Databricks and Mistral AI doesn't lie as much in which models are available to whom but in providing customers with choice so they can discover the models that best fit their unique needs.

"Databricks correctly recognizes that companies want the flexibility to choose their own language models," he said. "It's an innovation arms race, and companies need to experiment with multiple models to figure out which ones work best for different use cases."

As a result, Databricks is building an ecosystem in which customers can choose the tools that work best for their needs, Petrie continued.

Henschen, meanwhile, noted that even within the integration between Databricks and Mistral AI, there is choice.

Beyond adding another vendor for Databricks customers to select for their generative AI development, the integration adds multiple Mistral AI models from which to choose.

"Having more model options is always a good thing," Henschen said.

Mistral Large has performed well against other LLMs in benchmark testing, while Mixtral 8x7B outperformed OpenAI's GPT-3.5 in testing, he continued. Mistral 7B, meanwhile, is a smaller model that handles high volumes of data with low latency, Henschen noted.

"Each one has specific strengths," he said.

More choice, in fact, was a motivating factor for Databricks forming a partnership with Mistral AI and developing integrations with the startup's open source models, according to Prakash.

"Databricks wants to give customers the tools and flexibility to choose the right model for the right job, including popular open source models," he said.

More than 1,000 enterprises were using Mistral AI models on Databricks before the integration, Prakash continued. As a result, Mistral AI was a logical partner for the data platform vendor.

"This partnership was a natural next step for us to further help our customers," he said.

Focus on AI

While the partnership with Mistral AI adds to Databricks' generative AI ecosystem from a technological perspective, it also represents the continuation of the commitment the data platform vendor has made to both traditional AI and generative AI.

In the 16 months since OpenAI released ChatGPT, which represented a significant improvement in LLM capabilities, Databricks has aggressively built a platform for AI.

The vendor developed Dolly, an LLM, in March 2023. Three months later, Databricks acquired MosaicML for $1.3 billion to add generative AI development capabilities. In October 2023, the vendor unveiled new LLM and GPU optimization capabilities to help users improve their generative AI outcomes. The following month, it revealed plans to combine its existing lakehouse platform with AI and rename its flagship tool the Data Intelligence Platform. In December, Databricks introduced a suite of tools, including retrieval-augmented generation (RAG) capabilities, to enable customers to train AI models.


With those steps, Databricks has been quicker to make AI a priority than some of its competitors, including Snowflake, perhaps its biggest rival, which has only recently made moves that signal a heightened commitment to AI.

However, given that generative AI is still in its early stages, Databricks will have to continue developing capabilities, acquiring others and forming partnerships to remain competitive, according to Henschen.

"I'd call Databricks a leader, but it has to keep up with some big, deep-pocketed [peers], including AWS, Microsoft and Google," he said. "Access to models is just one aspect of the competition."

Enabling customers to develop customized models will likely be the more important competition, Henschen continued. Databricks' acquisition of MosaicML addresses that need.

"That's likely to be the larger and more important battleground long term. But we haven't begun to see the competitive landscape shake out there," Henschen said.

Moving forward, Databricks would be wise to add even more model-building and fine-tuning options, according to Henschen.

Petrie, meanwhile, said he'd like the vendor to expand its vector search capabilities.

Databricks offers vector search capabilities that feed RAG pipelines, but it could provide more options than it currently does through partnerships and integrations with specialists.

"I'd like to hear more about Databricks' strategy in the vector database segment," Petrie said. " The question is the degree to which Databricks will compete or partner with vector database companies such as Pinecone and Weaviate."

Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.
