Nvidia extends its AI reach with cloud development hub

Nvidia debuted a development platform to speed the creation and delivery of AI projects, along with a monthly subscription plan that provides access to its DGX SuperPod servers.

Nvidia has rolled out a cloud development hub that provides developers with tools to more quickly move AI projects from prototype to production.

The new offering, called the Nvidia Base Command Platform, supports multi-team AI development workflows hosted on premises or in the cloud. That flexibility lets more researchers and data scientists work simultaneously across a range of computing resources, which can raise productivity among experienced developers.

In related news, Nvidia also made the Base Command Platform available through a monthly subscription, jointly offered by Nvidia and NetApp. The subscription gives developers access to Nvidia's DGX SuperPod AI computers, along with NetApp's data management software.

"More and more development teams are working together to design software collaboratively, something that has been exacerbated by COVID," said Manuvir Das, head of Nvidia's enterprise computing group. "We have packaged up the software they need to better democratize AI for teams working across geographies."

Nvidia said on Aug. 2 that the Base Command Platform is now available in North America.

Brad Anderson, executive vice president and general manager of NetApp hybrid cloud services, added that the majority of larger enterprises see AI as essential to their organizations' success, although the complexity of the technology has prevented many from integrating it into their existing products. The new subscription offering serves to alleviate that pain point, according to Anderson.

One analyst sees the subscription plan as a sensible way for Nvidia to expose its range of hardware and software products to new users.

"Much like what it did with the DGX A100, Nvidia is providing accessibility to traditionally expensive technology, only this time hosting it in the cloud," said Dan Newman, founding partner and principal analyst of Futurum Research and CEO of Broadsuite Media Group. "In the end, more time can be spent extracting value from data and AI, and less on dealing with hardware and infrastructure."

Base Command offers a single view across a geographically dispersed AI development team, making it easier for users to share resources through either a graphical interface or command-line APIs and to track work through integrated monitoring and reporting dashboards.

Available AI and data tools that help researchers plan and schedule workloads and refine models include Nvidia's full NGC catalog of AI and analytics software, APIs for integration with MLOps software, and Jupyter notebooks, the company said.

Available now to early-access customers, the subscription starts at $90,000 per month. The fee may prove expensive for some shops, but large companies may see it as the right price of entry to the AI market.

"While seemingly high, the cost is far less than outright buying a full DGX," said Mike Leone, senior analyst at Enterprise Strategy Group, a division of TechTarget. "For organizations just getting started with AI and who don't have access to a DGX system, this significantly lowers the barrier for entry by delivering a 'best of breed' experience on a proven AI system without the operational burdens."
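Leone's break-even point can be sketched with simple arithmetic. This is an illustration only: the $199,000 figure was the announced launch list price of a single DGX A100 (the SuperPod's building block; actual quotes vary), and the 20-system count is the low end of the SuperPod reference design; networking, storage, power, and staffing costs are deliberately ignored.

```python
# Rough break-even sketch for Leone's point. All figures are assumptions
# for illustration: $199,000 was the announced launch price of a single
# DGX A100, and a DGX SuperPod clusters roughly 20 or more such systems.
MONTHLY_SUBSCRIPTION = 90_000   # Base Command subscription, USD/month
DGX_A100_LIST_PRICE = 199_000   # single DGX A100, USD (launch price)
SUPERPOD_MIN_SYSTEMS = 20       # low end of the SuperPod design

# Hardware-only cost of a minimal SuperPod.
superpod_hardware_only = DGX_A100_LIST_PRICE * SUPERPOD_MIN_SYSTEMS

# Months of subscription that match that hardware outlay
# (ignoring data center, power, and operations costs).
break_even_months = superpod_hardware_only / MONTHLY_SUBSCRIPTION

print(f"Minimal SuperPod hardware: ${superpod_hardware_only:,}")
print(f"Subscription months to match it: {break_even_months:.1f}")
```

Under these assumptions, a shop could run the subscription for well over three years before the fees matched the hardware outlay alone of even a minimal SuperPod, which is the substance of Leone's "lowers the barrier" argument.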

Nvidia also announced at the annual Computex conference this week that several top-tier hardware manufacturers, including HPE, Dell Technologies, Lenovo and Supermicro, have made AI-optimized servers available. The new x86 systems all use Nvidia Ampere architecture GPUs. Systems using the company's BlueField DPUs will arrive later this year, followed by Arm-based servers in 2022. Google Cloud, meanwhile, plans to add support for the Base Command Platform in its marketplace later this year to deliver a "hybrid AI experience" for customers.

Nvidia's Das said the x86-based systems provide the necessary industry-standard platforms to assist larger enterprises in more quickly completing AI projects.

"Enterprises, no matter what industry they are in, need a traditional data center infrastructure to support what in many cases is data-intensive work," Das said.

But while Nvidia's latest offerings solidify its position as a market leader in GPUs, ESG's Leone said that as the vendor gets deeper into the software side of things, it will face stiffer competition at the low end.

"Nvidia's AI Enterprise brings together the essential AI software tools and frameworks in a tightly integrated way," Leone said. "I would argue it is the most robust AI software stack in the market. But the greatest challenge NVIDIA has is this software may be too robust for where many organizations are today in their AI journeys."

As Editor at Large with TechTarget's News Group, Ed Scannell is responsible for writing and reporting breaking news, news analysis and features focused on technology issues and trends affecting corporate IT professionals. He worked for 26 years at InfoWorld and Computerworld, covering enterprise-class products and technologies from larger IT companies including IBM and Microsoft, and served for three years as Editor of Redmond, overseeing that magazine's editorial content.

Next Steps

Nvidia partners with Google Cloud to boost AI for 5G app dev
