Snowflake on Tuesday unveiled a series of new capabilities aimed at helping customers securely develop generative AI applications within the confines of the vendor's data cloud.
The features were revealed during Snowflake Summit 2023, the vendor's user conference in Las Vegas.
Among them are the private preview of Snowpark Container Services that will enable developers to access generative AI software and the launch of a new partnership with Nvidia that gives users access to Nvidia's graphics processing units (GPUs).
In addition, Snowflake unveiled the private preview of Document AI, its own large language model (LLM) aimed at enabling users to derive deeper insights from documents. The vendor also showcased a new application framework in public preview on AWS in which developers can access applications built by others within the Snowflake ecosystem and share and monetize their own custom-built applications.
Together, the new capabilities demonstrate Snowflake's prioritization of generative AI and commitment to making it easy for customers to use the technology to enhance their data and analytics operations, according to Doug Henschen, an analyst at Constellation Research.
"Snowflake made it very clear … that it is intent on helping its customers to build their own generative capabilities with various partners," he said. "The theme across all these announcements is that it's enabling customers to do that work using the customer's data within the Snowflake environment and without having to move or copy that data."
Snowflake and generative AI
Snowflake is a data cloud vendor whose primary competitors are data lakehouse vendor Databricks and cloud computing giants AWS, Google and Microsoft.
Its platform is designed to enable customers to not only store their data in the cloud but also query and work with their data -- using the BI platform of their choice -- where it's stored in Snowflake. Previously, users had to extract, transform and load their data into their BI platform each time it was needed.
Generative AI, meanwhile, promises wider use of data management and analytics tools within organizations, thanks to true natural language processing capabilities as well as increased engineering and analysis efficiency resulting from a reduced need to write code.
As a result of its potential, in the months following OpenAI's November 2022 launch of ChatGPT -- which marked a major advancement in generative AI and LLM capabilities -- many Snowflake competitors unveiled integrations with OpenAI and other means of enabling customers to access generative AI.
Until recently, however, Snowflake did not reveal any plans for adding generative AI to its existing capabilities. Instead, it continued its strategy of developing industry-specific versions of its data cloud designed to increase the efficiency of organizations in industry verticals, such as retail and financial services.
Then in late May, Snowflake acquired Neeva, a search engine vendor whose platform was fueled by generative AI and LLM technology. Now it has unveiled a roadmap focused squarely on enabling customers to access generative AI.
Christian Kleinerman, Snowflake's senior vice president of product, stated during a virtual press briefing on June 22 that generative AI is now the vendor's priority.
"We want to be the platform of choice for building generative AI experiences, assistants, co-pilots and user applications," he said.
Toward that end, Snowflake is partnering with Nvidia.
Nvidia is a specialist in AI software and hardware. Its NeMo platform is designed to enable users to build their own LLMs. The vendor also provides GPUs that enable users to embed generative AI in cloud applications.
Through the partnership between Snowflake and Nvidia, Snowflake customers will be able to use Nvidia's tools to create their own generative AI applications within Snowflake's data cloud, which Snowflake says provides built-in security and governance measures that public generative AI and LLM platforms lack.
Along with its partnership with Nvidia, Snowflake revealed plans to update Snowpark, its platform for developers.
Snowpark Container Services will be the vehicle through which developers can securely access Nvidia's capabilities as well as a range of other AI and machine learning features that they can build into their data applications.
Given its role as the access point for Snowflake's integrations with generative AI and LLM tools, the eventual launch of Snowpark Container Services is a welcome move for Snowflake customers, according to Henschen.
"The Snowpark Container Services announcement is a lynchpin for the partnership announcements with Nvidia and other partners," Henschen said. "That makes Snowpark Container Services an important catalyst for a lot of the capabilities that are being promised."
Beyond what Snowpark Container Services promises when eventually available to the public, it will serve as Snowflake's platform for adding new AI and LLM capabilities for developers, according to Torsten Grabs, Snowflake's senior director of product management.
One tool on Snowflake's roadmap that the platform will eventually support is an automatic translator like those released by data management vendors Monte Carlo and Dremio. The tool will convert natural language commands and queries to code such as Python and SQL, he noted during the media briefing.
"It will serve as the underlying infrastructure to add additional productivity enhancements for users," Grabs said.
In addition to its partnership with Nvidia and Snowpark Container Services, Snowflake unveiled other tools during the user conference:
- Document AI, Snowflake's first internally developed LLM built as a result of the vendor's September 2022 acquisition of Applica. The tool uses Applica's generative AI technology to enable users to better understand text documents and convert the unstructured data of such documents into a format in which it can be combined with structured data and used to inform analysis.
- Iceberg Tables in private preview so users can work with data in Snowflake using the Apache Iceberg format for open tables.
- The general availability of the Snowflake Performance Index (SPI) to enable customers to quantify the efficiency of their Snowflake deployments and better analyze and understand their cloud computing costs.
- The Snowflake Native App Framework, through which more than 25 applications developed by Snowflake customers and partners are available to date to install from the Snowflake Marketplace.
Iceberg Tables and the SPI each resulted from requests by customers, according to Kleinerman, who acknowledged that cloud costs can be unpredictable when not monitored closely.
Henschen, meanwhile, said the capabilities all combine to create an environment where developers can securely build their organizations' needed applications.
"[They] are aimed at helping customers use their data to develop their own generative AI, data science capabilities and cloud-native apps, all without having to take data outside of the customer's Snowflake Data Cloud," he said.
Beyond the new capabilities, Snowflake on June 26 revealed an expanded partnership with Microsoft to enable product integrations centered around AI, applications development and data governance. Microsoft is one of the main investors in OpenAI, committing $10 billion to the generative AI vendor in January 2023 after an initial investment in 2019.
While Snowflake did not reveal specifics about its generative AI roadmap beyond the text-to-code tool mentioned by Grabs and those capabilities that are now in preview, the vendor's roadmap will continue to focus on generative AI, according to Kleinerman.
Specifically, it will continue to concentrate on bringing generative AI capabilities to organizations' data where it's stored in Snowflake.
"Think of Snowflake as the way to bring generative AI applications to the enterprise data and without having to ship enterprise data to a third-party end point," he said. "Unambiguously, we want to be the place of choice for building these experiences."
Henschen, meanwhile, said he'd like Snowflake to focus on doing more to help customers control cloud computing costs.
While the SPI is a start, he noted that there's more the vendor can do to show users the value of all the new capabilities it has under development and to demonstrate that those capabilities won't make users' Snowflake deployments more expensive.
"They make some pretty sweeping statements about how much simpler, easier, more efficient, and secure their approach makes things. But total cost is a very important customer consideration in this economic environment," Henschen said. "Snowflake should go beyond the rather high-level, imprecise SPI and deliver measures or case examples that really show those efficiencies."
Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.