New Microsoft AI chip pressures Nvidia and hardware market

The tech giant's new chip reflects a need to cut costs while scaling large models. The move could force AI hardware providers such as Nvidia to make changes.

Microsoft might soon release its own AI chip.

The tech giant has been developing what it calls the Athena chip since 2019, according to a source with direct knowledge of the project. Athena will reportedly power AI chatbots such as ChatGPT.

Tuesday's news comes as Microsoft has spent early 2023 incorporating ChatGPT into office applications and the Bing search bar.

It also comes amid reports that Google, which makes its own Tensor Processing Units (TPUs), is searching for ways to refine its search business.

A search for options

The revelation that Microsoft is plunging into AI hardware illustrates a pattern among big cloud service providers: they are trying to cut costs by lessening their reliance on AI hardware and software vendor Nvidia and longtime chipmakers Intel and AMD, even as they push their own AI innovations, said Chirag Dekate, an analyst at Gartner.

For example, Google's engineers realized that to pursue the kind of AI innovation they wanted, they needed to build their own custom application-specific integrated circuits (ASICs), which led them to create their TPUs.

Similarly, Amazon has developed its Trainium and Inferentia processors for training and running inference on AI models at a fraction of the cost. Some major foreign cloud providers are also getting into the AI hardware game, including China-based Tencent, which revealed on April 17 that it has started mass-producing some of its AI chips.

So as Microsoft continues to forge ahead in the generative AI market largely through its alliance with research lab OpenAI, an AI chip enables the tech giant to train larger and more accurate AI models while reducing the cost to customers, according to Dekate.

"It enables them to continue to develop leadership-class generative AI capabilities," he said. "If they execute this to perfection and scale, and if they have multiple generations of this, chances are Microsoft will enable not just creating larger models, but also training them at a fraction of the costs, at faster time scales than they otherwise would have required."

The effect on Nvidia

While Microsoft's new AI chip might help it save costs, it will be a different story for Nvidia, a leader in AI hardware.

"This is going to create some unique pressures for companies like Nvidia," said Futurum Research analyst Daniel Newman.

The AI chipmaker and software provider might have to consider not only offering lower-cost alternatives to its GPUs, but also reducing the hardware's energy and power consumption.

"GPUs tend to be the power hogs comparatively to some of these application-specific chips," Newman said. "When you're amping up the volume or utilization, you are also going to be impacting carbon, the amount of power that the data centers are using."

Amping down power

The power consumption dilemma will likely shift the market away from chasing the highest performance or the most power and toward delivering the right amount of performance for a specific workload. Despite that gradual shift, however, Nvidia is still in good shape to remain dominant because of its market position, Newman said.

"They're in a good place, but I do think they're going to have new pressure," he said. "Lots of new pressure, lots of new competition, lots of new entrants, both big companies and smaller. They're going to try to create lower-cost and more power-efficient silicon to run generative AI workloads."

Moreover, the shift in the chip market means users are craving choice.

"These innovations are not necessarily designed to be replacing a GPU strategy from a cloud perspective," Dekate said, adding that the move toward ASICs won't mean users will no longer need GPUs.

Instead, tech giants such as Microsoft will most likely offer both ASICs and GPUs. In addition, customers moving from on-premises to the cloud will be attracted to providers that offer a range of silicon environments.

"It's the 'rising tide raises all boats' sort of dynamic in the marketplace, as opposed to a zero-sum game of replacement dynamic," Dekate said.

Microsoft declined to comment for this story. The chip project was first reported by The Information.

Esther Ajao is a news writer covering artificial intelligence software and systems.
