Starting today, Microsoft is offering the Azure OpenAI Service in preview. Microsoft revealed the news during its Ignite 2021 virtual conference.
Microsoft's partnership with OpenAI began in 2019, when it invested $1 billion in the AI research organization, which was co-founded by Elon Musk and others in 2015. Last year, Microsoft gained an exclusive license to OpenAI's GPT-3 language model.
What Azure OpenAI Service offers
With the Azure OpenAI Service, Microsoft said customers will have access to enterprise capabilities such as security, compliance and scalability on the Azure public cloud.
Sid Nag, an analyst at Gartner, said that if Microsoft realizes the potential of GPT-3, a deep learning-based language prediction model, the technology will be appealing to enterprises.
"Enterprise-hardening things like security, compliance and scalability are definitely attractive features," Nag said.
Microsoft said the new service can be applied to a range of use cases, including summarization, content generation and code generation.
According to Andy Thurai, an analyst at Constellation Research, enterprises can use the OpenAI service on Azure for common language conversion, math functions or even writing short stories and novels. It can also auto-complete images, producing a quality image from a partial, incomplete or deteriorated one, which speeds up automated image, photo and video editing.
Another notable feature is labeling of raw data, Thurai said.
"Labeling of raw data has become quite expensive," he said. "Potentially, this can help tame that cost by auto labeling using AI."
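To make the service's API-driven usage concrete, the sketch below builds (but does not send) the rough shape of a summarization request to an Azure OpenAI completions deployment. This is a hedged illustration, not Microsoft's documented API: the resource name, deployment name and api-version string are all placeholders.

```python
import json
import urllib.request

# Placeholders -- substitute your own Azure resource and deployment.
RESOURCE = "https://my-resource.openai.azure.com"  # hypothetical resource endpoint
DEPLOYMENT = "my-gpt3-deployment"                  # hypothetical deployment name
API_VERSION = "2022-12-01"                         # assumed api-version string


def build_completion_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Construct (without sending) a completions request against an
    Azure OpenAI deployment, following the general REST shape of
    POST .../openai/deployments/{deployment}/completions."""
    url = (f"{RESOURCE}/openai/deployments/{DEPLOYMENT}"
           f"/completions?api-version={API_VERSION}")
    body = json.dumps({
        "prompt": prompt,
        "max_tokens": 60,
        "temperature": 0.2,
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )


req = build_completion_request(
    "Summarize: Azure OpenAI Service is now in preview.", "YOUR-KEY")
```

In practice a client would send `req` with `urllib.request.urlopen` (or use an SDK) and read the generated text from the JSON response.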
"As we see more and more enterprises move mission-critical apps, they're also looking for additional cloud-native function, API controls and such," Nag said. "That's an important benefit that Microsoft would have insofar as attracting clients to use this service."
The Azure OpenAI service is a step toward making Microsoft competitive in AI language services. However, others in the market are already ahead, said Holger Mueller, an analyst at Constellation Research.
"The competition is really out there between whatever AWS is doing and what Google is doing," Mueller said.
Google is generally ahead because the vendor started with neural networks for speech, he continued. He said Google has a big advantage because it operates its speech technologies at internet scale, making them cheaper and faster, and it also boasts its Tensor platform.
The Azure OpenAI service is "a jumpstart for Microsoft to get going to be competitive," Mueller said.
Mueller added that while Microsoft may try to compete on the pricing of Azure OpenAI, enterprises with speech-intensive applications may care more about the infrastructure behind their applications than about price.
Both Google and AWS -- Microsoft's chief competitors -- have an advantage over Microsoft because they have their own proprietary chip architectures, Mueller said.
"The hardware, which runs on specific development of algorithms on silicon … that's what really matters," he added.
While Microsoft may be behind on the hardware side, the vendor recently partnered with AI hardware giant Nvidia on a new natural language generation model: the Megatron-Turing Natural Language Generation model. At 530 billion parameters, the model is more than triple the size of GPT-3, which has 175 billion parameters, Thurai said.
Mueller said that Microsoft may be able to compete with AWS and Google on the hardware side if it develops its own proprietary chip architecture for running its own algorithms.
Other challenges for Azure OpenAI
Microsoft's challenge isn't only in hardware. According to Thurai, one possible concern with the OpenAI service is that while the GPT-3 model works well for simple queries and translation, it remains limited for complicated queries.
Some have also criticized GPT-3 for containing various built-in biases.
"There's also a concern about negative, racist, sexist and biased comments and language, which worries AI ethics professionals," Thurai said.
Also -- like other AI applications that consume massive amounts of compute and storage -- the model is "non-environment friendly," he noted.