Google Cloud Platform is looking to challenge its hyperscaler rivals and smaller vendors with a new, high-performance storage service and a native backup-as-a-service offering.
Google Cloud Hyperdisk, a new variant of GCP's Persistent Disk block storage service, delivers higher IOPS and greater throughput than prior GCP block storage services. Google Cloud representatives said the service aims to match the performance of on-premises storage without the latency or performance concerns users may encounter in the cloud.
Hyperdisk, unveiled Thursday, will enter general availability later this calendar year alongside a handful of new GCP storage products and services including automated storage tiering, Kubernetes storage services, and new backup and disaster recovery options.
A majority of the new services will be useful to existing customers but don't add up to significant draws for customers outside the GCP ecosystem, said Marc Staimer, president of Dragon Slayer Consulting in Beaverton, Oregon.
"This is a typical shotgun announcement," Staimer said. "None of them stand out on their own, but they're important features."
Data center in the cloud fast lane
Google Cloud Hyperdisk decouples cloud block storage from VMs, enabling customers to customize performance based on application demands, according to Guru Pangal, vice president and general manager of storage at Google Cloud.
The service is akin to storage pooling capabilities of on-premises SANs to disaggregate hardware from capacity, Pangal noted, but adds in additional services and capabilities tied to the GCP platform.
"Now [storage and VMs] are decoupled," Pangal said. "This was there in the on-premises world but not as prevalent in the hyperscaler world."
Google Cloud Hyperdisk is aimed at applications that use artificial intelligence or machine learning, as well as more traditional but demanding databases such as SAP HANA, an in-memory relational database.
Existing block storage services under GCP's Persistent Disk branding will remain available, albeit at a step down in performance and price compared with Hyperdisk.
Kubernetes and backups
GCP also expanded storage offerings for Kubernetes users with new products unveiled Thursday including Filestore Enterprise with multishares support and Backup for Google Kubernetes Engine (GKE).
When setting up an instance, the new multishares capability lets administrators using the Filestore Enterprise service automatically set aside a portion of the storage for GKE container clusters without additional configurations or service interruptions. Backup for GKE is a managed service that enables backups and disaster recovery of GKE container clusters.
GCP is also releasing Google Backup and Data Recovery into general availability. Built on the Actifio backup and recovery software that Google acquired in 2020, this managed service is available through the Google Cloud Console, unlike the previously released Actifio GO. It enables users to back up databases and applications, including Google Cloud VMware Engine and Compute Engine.
Similar services exist within the other hyperscalers, but GCP remains a popular choice for developer teams that are beginning to lean heavily on containers and Kubernetes, said Dave Raffo, a senior analyst at Evaluator Group in Boulder, Colo.
Native Kubernetes backup services are quickly becoming a mandatory part of infrastructure planning due to Kubernetes' popularity and the increased reliance on it for business-critical applications, he added.
"AWS and Azure already have backup as a service, so [GCP is] catching up there," Raffo said.
Both AWS and Azure backup services are more limited in capabilities compared with GCP's pitch, Staimer said, specifically in how backups are made and in the services provided.
GCP's native backup-as-a-service offering is a direct challenge to third-party backup and recovery services already on its platform, he said. Such a service might be of limited use to multi-cloud customers since it exclusively protects data in GCP, he added.
"It's a gauntlet they threw down," Staimer said.
Storage auto classification and insights
Cloud Storage Autoclass and Storage Insights, also unveiled Thursday, should enable users to save on cloud storage costs and give them better insight into the data they're paying to keep, according to Pangal.
The Cloud Storage Autoclass service enables users to set policies for moving data to different tiers of storage based on access, usage or other standards within GCP. The company sees the service as a way to integrate GCP storage into multi-cloud environments.
Users can perform these class shifts manually, but the Autoclass service automates much of the process, such as developing policy scripts, and eliminates the fees normally incurred when moving data among GCP's four storage tiers, or classes, from hot to cold.
The announcement of the new service also comes on the heels of a new set of pricing standards for cloud storage that GCP plans to enact before the end of September.
Storage Insights provides information and API connections to object storage services within Google Cloud, enabling customers such as MSPs or developers to create dashboards and consoles to manage data in their own products.
Tim McCarthy is a journalist living on the North Shore of Massachusetts. He covers cloud and data storage news.