Intel's data center strategy rests on accelerator chips

Intel will release three generations of 200 Gbps, 400 Gbps and 800 Gbps infrastructure processing units over the next four years. The first two generations will include both ASICs and FPGAs.

Intel has laid out a four-year data center strategy around three generations of infrastructure processing units that would compete with chips from AMD, AWS and Nvidia.

Intel plans to release the first generation, comprising two 200 Gbps IPUs, in 2022. Subsequent generations would include 400 Gbps IPUs in 2023 and 2024. Each generation would consist of an application-specific integrated circuit (ASIC) and a more programmable field-programmable gate array (FPGA).

The company, which introduced the chips at the Intel Vision conference this week, plans to release 800 Gbps IPUs in 2025 and 2026. It did not disclose further details about those chips.

Intel designed the 200 Gbps ASIC IPU, code-named Mount Evans, to offload network virtualization, firewall and virtual routing functions. The IPU can also deliver storage performance in line with the data transfer speeds of devices that support the non-volatile memory express (NVMe) storage protocol.

The second 200 Gbps IPU, code-named Oak Springs Canyon, is an FPGA-based device that customers can program to meet the needs of their networks. It pairs Intel's Xeon D processor with an Agilex FPGA, letting customers select which workloads to offload to the accelerator chip.
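
Intel did not describe a programming interface for that workload selection in its announcement. As a purely hypothetical sketch of the idea, the Python snippet below shows how an operator-facing offload policy might look; the OffloadPolicy class, device name and workload labels are invented for illustration and do not reflect Intel's actual tooling.

```python
from dataclasses import dataclass, field

# Hypothetical policy object: Intel has not published tooling for
# Oak Springs Canyon, so the class and workload names here are invented
# purely to illustrate choosing which functions run on the IPU.
@dataclass
class OffloadPolicy:
    device: str
    offloaded: set = field(default_factory=set)

    def offload(self, workload: str) -> None:
        """Mark a workload to run on the IPU instead of the host CPU."""
        self.offloaded.add(workload)

    def runs_on_host(self, workload: str) -> bool:
        return workload not in self.offloaded


policy = OffloadPolicy(device="oak-springs-canyon-0")
for workload in ("vswitch", "firewall", "nvme-virtualization"):
    policy.offload(workload)

print(policy.runs_on_host("vswitch"))        # False: handled by the IPU
print(policy.runs_on_host("web-frontend"))   # True: stays on the host CPUs
```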

The two IPUs build on Intel's first IPU, released in 2021. The chips handle security, network and storage virtualization functions, freeing CPUs to dedicate their processing power to application performance.
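
To make the benefit of that offloading concrete, here is a minimal back-of-the-envelope sketch in Python. The core count and the 30% "infrastructure tax" figure are illustrative assumptions, not numbers from Intel.

```python
# Rough illustration of the offload argument: infrastructure work such as
# virtual switching, firewalling and storage virtualization consumes host
# CPU cycles that an IPU could absorb instead.

TOTAL_CORES = 64       # assumed core count of a host server
INFRA_TAX = 0.30       # assumed share of cycles spent on infrastructure tasks

cores_consumed_by_infra = TOTAL_CORES * INFRA_TAX
cores_for_apps_without_ipu = TOTAL_CORES - cores_consumed_by_infra

# With those functions offloaded to an IPU, the host keeps its cores for
# tenant applications (ignoring any residual driver overhead on the host).
cores_for_apps_with_ipu = TOTAL_CORES

print(f"Without an IPU: {cores_for_apps_without_ipu:.0f} cores left for applications")
print(f"With an IPU:    {cores_for_apps_with_ipu} cores left for applications")
```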

The 400 Gbps IPUs, code-named Mount Morgan and Hot Springs Canyon, are scheduled for release in 2023 and 2024, respectively. Mount Morgan is Intel's next-generation ASIC IPU, while Hot Springs Canyon is the next-generation FPGA IPU.

Intel positioned the IPUs to compete with data processing units (DPUs) such as Nvidia's BlueField-2 and AWS's in-house Nitro chip. AMD acquired accelerator company Pensando last month to compete in the market for DPUs, which are deployed on network interface cards to offload security and network services from CPUs.

Each product has a distinct competitive angle. Pensando focuses on networking functions, Nvidia tends to emphasize AI, and Intel leans toward machine learning and real-time telemetry. AWS offers its Nitro chip to customers of its cloud computing platform.

The DPU market is still nascent, said IDC analyst Shane Rau. However, IDC expects it to become more important as systems are tasked with more complex workloads than CPUs alone can handle efficiently.

"[Intel has] ground to make up, but so do their competitors," Rau said. "They're all trying to adapt to this new landscape created by the deluges of data and new data types."

Habana Gaudi2: Intel has optimized the Habana Gaudi2 processor for training AI models.

Intel's IPU strategy is the latest in a series of moves to regain its leadership position in the chip market. Those moves include a $44 billion investment in new or updated production facilities in Arizona, Ohio and New Mexico. Intel has also thrown its weight behind the CHIPS Act, a legislative initiative to boost domestic investment in the semiconductor industry.

Also at Intel Vision, the company announced that its deep learning processor team, Habana Labs, is launching two new processors for AI workloads: the Habana Gaudi2 and the Habana Greco. The 7-nanometer Gaudi2, optimized for training AI models, would compete with Nvidia's A100 80 GB GPU. The 7-nanometer Greco is designed for AI inference.

Madelaine Millar is a news writer covering network technology at TechTarget. She has previously written about science and technology for MIT's Lincoln Laboratory and the Khoury College of Computer Sciences, and has covered community news for Boston Globe Media.
