AWS, Hugging Face and the growing generative AI competition
The tech giant and LLM vendor team up as the generative AI rivalry between Microsoft and Google intensifies. The partnership is Amazon's response to the competition.
AWS and Hugging Face formed a partnership to make the training, fine-tuning and deployment of large language and vision models more accessible to developers.
The pact between the tech giant and the open source large language model (LLM) developer, revealed on Feb. 21, is Amazon's indirect response to the sudden emergence of a high-stakes AI chatbot war between competitors Microsoft and Google.
The two vendors said the partnership will make it easier for developers to use AWS services and deploy Hugging Face models for generative AI applications.
AWS customers can now use Hugging Face models on AWS through SageMaker JumpStart and Hugging Face AWS Deep Learning Containers, along with AWS tutorials for deploying models on AWS Trainium chips (for training machine learning models) or AWS Inferentia chips (for running inference on large language models).
Hugging Face AWS Deep Learning Containers let developers deploy generative AI applications that can perform tasks such as summarizing text, generating code, and creating images at scale within hours, according to the vendors.
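For a sense of what deploying through these containers looks like in practice: once a model is hosted on a SageMaker endpoint, the Hugging Face inference containers accept a simple JSON request body with an `inputs` field and an optional `parameters` object. A minimal sketch of building such a request for a summarization endpoint follows; the field names reflect the Hugging Face inference format, while the specific values are illustrative, not from the article.

```python
import json

# Sketch: the JSON request body a Hugging Face inference container accepts.
# "inputs" carries the text to process; the optional "parameters" object
# is forwarded to the underlying pipeline (values here are illustrative).
def build_summarization_request(text: str, max_length: int = 60) -> str:
    payload = {
        "inputs": text,
        "parameters": {"max_length": max_length},
    }
    return json.dumps(payload)

body = build_summarization_request("Long article text to summarize...")
print(body)
```

In a real deployment, this body would be sent to the endpoint (for example, via the SageMaker runtime's invoke-endpoint API), and the container would return the model's summary as JSON.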
The AWS and Hugging Face collaboration comes as competition grows quickly in the generative AI market after Microsoft earlier this month incorporated AI upstart OpenAI's ChatGPT into its Bing search engine and Google responded with Bard.
For AWS, allying with Hugging Face lets it rapidly enter the competitive arena, said Forrester Research analyst William McKeon-White.
Because Google, Amazon and Microsoft have been in direct competition for years, a pivot toward a specific new market must involve all three, he said.
"Where one goes, the others have to go. Otherwise, they risk losing out on the massive market," he said.
LLMs, exemplified by ChatGPT, disrupt the traditional business model of how people get information from the internet. That applies to some extent to Amazon's own internal search engine, which is critical to the tech giant's consumer business. Entering more deeply into LLM technology could help the company keep its search methodology current, McKeon-White said.
"It doesn't matter if you're an e-commerce or if you're a search engine or if you're just a web page with a bunch of information on it," he said. "They scrape your content. So suddenly, people don't need to go and visit your website or stack rank your e-commerce capability against others."
Meanwhile, this foray into LLM technology buys Amazon time.
"This is AWS' short-term answer to what Microsoft has done with OpenAI. And of course, Google has done a lot of this in house," said Daniel Newman, an analyst at Futurum Research. "In terms of commercialization Microsoft and Google are ahead. But it will be a mistake to count Amazon out as being able to catch up."
A good partner
Moreover, Hugging Face is a good partner with whom to enter the LLM market because most of the vendor's products are on AWS and not on Google Cloud Platform or Azure, said Constellation Research analyst Andy Thurai.
Hugging Face has had a technology relationship with AWS for some time, he noted, with its deep learning and machine learning pre-configured containers optimized for AWS.
While not competing directly with ChatGPT and Bard, AWS can field LLMs developed through its alliance with Hugging Face and compete commercially in the enterprise arena, Thurai said.
The strategic partnership with Hugging Face also lets AWS train the next generation of BLOOM, an open source AI model comparable in size and scope to the LLM underlying ChatGPT, on Trainium. AWS then has room to test and train the model while avoiding the criticism of racist or otherwise offensive, inaccurate or unpredictable behavior that has accompanied the release of some generative AI systems.
"This allows AWS not to enter the race directly yet compete with ChatGPT and Bard while potentially limiting any liabilities that might arise from it," Thurai added.
While the public has yet to see the full range of Hugging Face's capabilities, linking the AWS developer ecosystem and Hugging Face could provide the open source vendor with a vast developer community, Newman said.
"What developers are going to be able to do with something like Hugging Face and the tranches of their own proprietary data could provide a really meaningful opportunity to see what enterprise AI in the future looks like or AI commercialized with datasets that aren't just on the internet," he said.
The partnership also provides visibility for AWS infrastructure technology such as SageMaker, Trainium and Inferentia, Newman said.
In addition, Hugging Face can now use that infrastructure to train and maintain its LLMs.
"Any of these organizations providing the LLM generative capability need patrons and large amounts of revenue they can't generate themselves," McKeon-White said.