Dell EMC AI reference architecture builds on Isilon all-flash
For its AI product strategy, Dell EMC is pursuing the same approach it takes with other storage products. Namely, that it’s better to have multiples of similar storage gear than to leave a gap in the portfolio.
The latest Dell EMC AI offering is a scale-out reference architecture based on all-flash Isilon F800 NAS, the result of an expanded partnership with GPU maker Nvidia.
Several major storage vendors are partnering with Nvidia to launch AI products built around the turnkey Nvidia DGX-1 appliance. If the new product rings a bell, that's because it resembles the existing Dell EMC AI Ready Solution for Deep Learning, which combines Isilon F800 storage and Dell PowerEdge C4140 servers, each with four Nvidia Tesla V100 GPUs. The difference here is the absence of the PowerEdge hardware.
The reference architecture gives enterprises experimenting with AI more flexibility, said Varun Chhabra, a Dell EMC senior director of marketing for unstructured storage products.
“Many of our customers are just starting out on their AI journey, and it’s important for us to support them wherever they are,” Chhabra said.
Isilon AI for massive clusters
The building block pairs one Isilon F800 with up to eight Nvidia DGX-1 servers. Each Isilon chassis contains four storage nodes, 60 high-performance SSDs, eight 40 Gigabit Ethernet connections and two Arista 7060CX2-32S network switches.
Dell EMC's OneFS file system allows a cluster to scale to 144 Isilon nodes across 36 chassis, for up to 33 PB of raw flash. Dell EMC AI customers can mix in hybrid Isilon boxes to more than double the raw capacity, to 68 PB.
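Those scaling figures are easy to sanity-check. A minimal sketch, assuming 15.36 TB drives (the largest F800 SSD option; the article does not state the drive size):

```python
# Back-of-the-envelope scaling for a maxed-out all-flash Isilon cluster.
# Chassis, node and SSD counts come from the article; the per-drive
# capacity is an assumption (15.36 TB, the largest F800 SSD option).
CHASSIS_MAX = 36
NODES_PER_CHASSIS = 4
SSDS_PER_CHASSIS = 60
SSD_TB = 15.36  # assumed drive size, not stated in the article

nodes = CHASSIS_MAX * NODES_PER_CHASSIS            # 144 nodes
raw_tb = CHASSIS_MAX * SSDS_PER_CHASSIS * SSD_TB   # ~33,178 TB raw
print(f"{nodes} nodes, {raw_tb / 1000:.1f} PB raw flash")
# -> 144 nodes, 33.2 PB raw flash
```

Under that drive-size assumption, the arithmetic lands on the 144-node, 33 PB figures Dell EMC cites.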
DGX-1 servers draw their compute power from eight integrated Tesla V100 GPUs linked in a high-performance NVLink mesh. The design is intended to sidestep the bottlenecks of PCI Express-based interconnects.
Each Nvidia DGX-1 server is rated to deliver a petaflop of processing performance. Dell EMC claims the Isilon F800-DGX combo supports millions of concurrent connections to ingest data at 540 Gbps.
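The per-server rating implies the aggregate compute of a fully populated building block. A rough sketch, taking the vendor's ~1 petaflop rating per DGX-1 at face value:

```python
# Aggregate compute for one building block: the reference architecture
# pairs up to eight DGX-1 servers with a single Isilon F800.
DGX_PER_F800 = 8
PFLOPS_PER_DGX = 1.0  # vendor rating cited in the article

total_pflops = DGX_PER_F800 * PFLOPS_PER_DGX
print(f"~{total_pflops:.0f} petaflops per building block")
# -> ~8 petaflops per building block
```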
Chhabra said customers in automotive, financial services and life sciences are using Dell EMC AI storage to meet an "insatiable demand" to train models and run inference on large data sets.
“The initial AI thrust for customers was ‘We need to get GPUs,’ so they’d buy a lot of servers and GPUs (to process data). Now customers realize there is a big role for shared storage to play in the AI space,” Chhabra said.
Dell EMC will provide product support, but customers buy the validated Isilon reference design through the channel. The product is available now.
Pure Storage and NetApp also partner with Nvidia for AI capabilities in unstructured data storage systems.