DDN adds more performance to AI storage offering

At Nvidia's GTC conference this week, DDN introduced its highest-performing AI storage and expanded support for nearly all the cloud providers offering GPU as a service.

DataDirect Networks has boosted the performance of its high-end AI storage platform through changes to networking and memory. It has also deepened its partnerships with Nvidia and with cloud providers offering GPU as a service, at a time when demand for high-end performance is on the rise.

At this week's Nvidia GTC conference, DDN launched AI400X2 Turbo, which is aimed at AI workloads and provides a 33% bandwidth improvement over the AI400X2 in the same form factor. The uptick in performance comes from an increase in memory and better networking. DDN is also expanding its support for the cloud providers in Nvidia's Partner Network, growing its presence where AI workloads are running.

DDN was founded in 1998 as a high-speed storage vendor focused primarily on high-performance computing, supercomputing and academic environments. It is the underlying storage for Nvidia DGX SuperPod. With the acquisition of Tintri in September 2018 and the release of the Infinia object store in November 2023, DDN has been expanding its product suite for more common enterprise use cases, according to Simon Robinson, an analyst at TechTarget's Enterprise Strategy Group.

But with the rise of AI over the last 18 months, DDN's customer base could continue to evolve, he added. Twenty-six years ago, most enterprises didn't need high-performance storage, but that is changing.

"It is not that DDN is necessarily coming down [performance-wise] into the enterprise market," Robinson said. "But the market is coming up to them."

More performance, similar box

The AI400X2 Turbo has a maximum bandwidth of 120 GBps, compared with the AI400X2's 90 GBps, achieved through faster networking and more memory. The AI400X2 Turbo joins DDN's A3I appliance line, which has been used as the storage for Nvidia's Selene supercomputer, and can also serve large AI systems such as the models that power ChatGPT, according to the vendor.

Robinson noted that DDN's news plays into today's high-performance storage market dynamic. In the past, workloads were categorized as either read or write, random or sequential, and large files or small files. Now the market is about delivering performance across multiple vectors, he said.

"It's all about saturating the GPUs," he said, which is what DDN is after with the updated AI400X2.

Mitch Lewis, an analyst at Futurum Group, cautioned customers about how AI storage is marketed. He said there needs to be a clearer delineation between storage for large-scale foundational models and storage for smaller AI applications such as inferencing. DDN's AI400X2 Turbo is focused on the former.

"It is targeted at the really high-end, deep-into-AI kinds of customers," he said.

More partners, more options

In an expansion of the storage it supplies to Nvidia DGX SuperPod customers, DDN will also support cloud partners in the Nvidia Partner Network that host large-scale GPU services, such as Lambda Labs and Scaleway, with its AI storage appliances and ExaScaler file system.

"This demonstrates that DDN is a leader in [large AI deployments and GPU clouds] and meeting high-end AI requirements," Lewis said.

Robinson echoed this point, noting that almost every service provider offering GPU as a service now uses DDN storage.

Even with the new products and partners, high-performance computing and AI still make up a relatively limited but rapidly growing market, according to Robinson.

"Now is their time to shine," he said.

Enterprise customers need performance, but scalable systems and data management are probably top of mind for AI storage in broad enterprise use, Lewis said. The AI400X2 Turbo is a niche product in that respect, because it is aimed at high-performance computing and AI workloads, which still aren't widespread in the enterprise.

Regardless, Lewis said, "AI is the biggest thing going on in the world."

Not all companies are operating at the high end, but AI is garnering a lot of attention, he said.

Adam Armstrong is a TechTarget Editorial news writer covering file and block storage hardware, and private clouds. He previously worked at StorageReview.com.
