Created in the 20th century to sell storage to engineers, NetApp has survived for 25 years to remain the largest standing data storage company not tied to a server vendor. Founder Dave Hitz credits that survival to the company’s “enormous capacity to change” as the IT landscape changes.
“People ask me, why are you still alive after 25 years? That’s a very real question,” Hitz, currently a NetApp executive vice president, said during a press event last month. “NetApp has survived 25 years because we have an amazing ability for radical change when we need it.”
Hitz said his company has previously pivoted to survive disruptions caused by the rise of the internet, the internet crash and virtualization. Each posed a threat to NetApp when it first emerged, he said, and NetApp adjusted its storage to take advantage of it. Now the NetApp cloud pivot is the adjustment that can make or break the company.
“Each of these transitions were things that were going to kill us,” he said. “Here we are again, possibly the biggest transition of all, into cloud computing and again it’s the thing that’s going to kill us. We hear, ‘We’re all doomed, everything’s going to move into the cloud, there’s no room for NetApp.’ I don’t think it’s true. It could be true if we don’t respond.”
Of course, you don’t have to be a bull-castrating genius to figure out the cloud is the key for today’s storage companies. Every large storage company has the cloud in its strategy and barely a month goes by when we don’t see a startup come along promising to provide cloud-like storage for enterprises, and to connect on-premises storage to public clouds.
So what is the NetApp cloud strategy?
Hitz said NetApp “way underestimated how pervasive the cloud would be on all enterprise computing,” just like it misjudged how flash would impact enterprise storage. (NetApp originally bet on flash as cache instead of solid-state drives in storage arrays before shipping its successful All-Flash FAS array in 2016.) But he said the NetApp cloud plan consists of doing what it does best — data management.
“We think data is the hardest part [of the cloud],” Hitz said. “It is very easy to go to Amazon or Azure, fire up 1,000 CPUs, run them for an hour or day or week, [and] then turn them off. It’s not easy to get them the data they need, and after they make a bunch of data, it’s not easy to get it back and keep track of it. Those are the hard parts. And that’s right in the center of our wheelhouse.”
Channeling NetApp’s history, CEO George Kurian said he saw his job when he took over in 2015 as leading the company through transition. “As the world around us changed, NetApp needed to change fundamentally,” he said.
He sees a strong NetApp cloud strategy as the key to initiating that change. “Many customers are engaged with us to help them build hybrid architectures, whether it’s between on-prem and public cloud, between two public clouds or migrating one of their sites to a colocation,” Kurian said.
Kurian cites SolidFire — an all-flash array platform built for cloud providers — as the “backbone of the next-gen data center.” NetApp acquired SolidFire in 2016 as much for its cloud platform as to fill a need for all-flash storage.
NetApp cloud software-defined storage (SDS) and services include Private Storage for Cloud, ONTAP Cloud, Data Fabric, AltaVault cloud backup and others. NetApp also has a Cloud Business Unit, which includes development, product management, operations, marketing and sales.
Senior vice president Anthony Lye joined the company last March to run the NetApp Cloud Business Unit. “The whole purpose of my organization is to build software that runs on hyper-scale platforms,” Lye said. “The software can be consumed by NetApp or non-NetApp customers, on hybrid or multi-cloud environments.”
The NetApp cloud portfolio will go a long way toward determining whether the vendor gets to keep its survivor status.