Public cloud workload success requires IT leadership
IT must change with the times and adapt to the reality that others within the organization can now procure and provision cloud resources without their input.
Few innovations have redefined IT as much as the public cloud. According to Enterprise Strategy Group (ESG) research, 85% of IT organizations now use public cloud services, whether IaaS or software as a service, and 81% of public cloud infrastructure users work with more than one cloud service provider. Multi-cloud is our modern-day IT reality. Amid this mass adoption of public cloud services, an interesting phenomenon is occurring: 41% of IaaS users have brought at least one public cloud workload back on premises. While this may seem like an indictment of public cloud services, it isn't -- quite the contrary.
At ESG, we recently conducted an extensive investigation into the decisions that led to migrating workloads back on premises. Among the factors driving and influencing these migrations, one theme stood out: In their enthusiasm to benefit from public cloud infrastructure, companies often commit workloads en masse without the necessary due diligence, only to discover later that some simply don't fit. In most cases, the cost of moving workloads and data back and forth also proved significant.
What can we learn from the organizations that have prematurely shifted workloads to the cloud, only to be forced to move them back at a later date? Well, incongruities between cloud expectations and actual performance arise for a variety of reasons:
- Cloud decision-makers are often not IT decision-makers. While IT plays a role in a majority of cloud provider selections, there are still a significant number of businesses where IT is left out of the decision process. This is an issue.
- Factors that influence public cloud workload success often differ from those historically used to drive on-premises decisions. On premises, IT has been a game of managing aggregates -- ensuring storage, compute and network deliver the performance, capacity and bandwidth needed at the data center level. Workload analysis is typically additive: Does the current infrastructure have enough performance or capacity headroom, or should we add more? Because public cloud infrastructure services offer far more granular resource deployment than on-premises infrastructure, decisions should instead be made on a workload-by-workload basis. Individual workload characteristics, such as performance and capacity, determine whether the cloud is cost-effective, as the sketch following this list illustrates.
- The cloud introduces new rules and interfaces. One issue is the ease with which cloud infrastructure users can procure and provision resources. That ease has accelerated adoption, sure, but it has also opened cloud services up to parts of the organization that lack the expertise to assess workload requirements such as performance and data sensitivity.
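To make the workload-by-workload point concrete, here is a minimal sketch of that kind of per-workload comparison. Every unit cost, workload name and figure below is a hypothetical placeholder, not ESG data or any provider's actual pricing; a real analysis would plug in the provider's rate card and the organization's amortized on-premises costs.

```python
# Minimal per-workload placement sketch. All rates and workloads below are
# hypothetical placeholders, not real provider pricing or ESG figures.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    capacity_gb: int           # storage capacity the workload needs
    peak_iops: int             # rough performance requirement
    egress_gb_per_month: int   # data pulled back out of the cloud each month

# Hypothetical unit costs per month; substitute real figures.
CLOUD_PER_GB = 0.10            # $/GB-month of cloud capacity
CLOUD_EGRESS_PER_GB = 0.09     # $/GB of egress
ONPREM_PER_GB = 0.04           # $/GB-month, amortized hardware plus operations
ONPREM_PER_IOPS = 0.002        # $/IOPS-month for performance-tier gear

def monthly_costs(w: Workload) -> tuple[float, float]:
    """Return (cloud, on_prem) estimated monthly cost for one workload."""
    cloud = w.capacity_gb * CLOUD_PER_GB + w.egress_gb_per_month * CLOUD_EGRESS_PER_GB
    on_prem = w.capacity_gb * ONPREM_PER_GB + w.peak_iops * ONPREM_PER_IOPS
    return cloud, on_prem

workloads = [
    Workload("cold-archive", capacity_gb=50_000, peak_iops=500, egress_gb_per_month=100),
    Workload("oltp-database", capacity_gb=2_000, peak_iops=40_000, egress_gb_per_month=4_000),
]

for w in workloads:
    cloud, on_prem = monthly_costs(w)
    better = "cloud" if cloud < on_prem else "on premises"
    # The decision is made per workload, not for the data center in aggregate.
    print(f"{w.name}: cloud ${cloud:,.2f} vs. on premises ${on_prem:,.2f} -> {better}")
```

Even a rough model like this makes the point: the capacity-heavy archive and the IOPS-heavy database may well belong in different places, a distinction that aggregate-level planning hides.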
What to do?
IT needs to take the lead on cloud. For most companies this is already the case, but the perception of IT as a business inhibitor persists. Line-of-business teams and developers still bypass IT to use cloud resources at a sizable percentage of companies. While the IT community often refers to these as shadow IT activities, many cloud users see this as a feature rather than a bug. They believe they're serving their company's best interests by bypassing a slow, outdated IT infrastructure team and its processes in favor of a leaner, faster, more agile approach.
The reaction from IT decision-makers is often that business must change, and the IT team should make technology decisions. The second part of that statement is accurate, but the first portion simply doesn't work anymore. IT needs to change.
Here's how:
- Understand and manage IT at a workload level. Specifically, IT needs to index heavily on application performance and data sensitivity requirements. These two points represent the most common issues with public cloud workloads, and often the culprit is that the necessary analysis wasn't done upfront.
- Architect processes that don't impede cloud access. This may seem counterintuitive, but if IT impedes or delays access to the cloud, the rest of the business will continue to bypass IT on its way there. Focus on what matters; it will vary by public cloud workload and organization, but data sensitivity, compliance and performance requirements should take precedence. A lightweight screening sketch follows this list.
- Use tools from cloud providers to assist with gaps. Cloud providers acknowledge the hurdles businesses encounter with cloud services and offer tools to help close them.
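To illustrate the second point -- screening rather than blocking -- here is a minimal sketch of a pre-deployment check. The request fields, sensitivity labels and thresholds are hypothetical; the idea is simply to fast-track low-risk workloads and route regulated or latency-critical ones to a human review instead of stalling every request.

```python
# Minimal screening sketch: fast-track low-risk cloud requests, flag the rest
# for review. Field names, labels and thresholds are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class CloudRequest:
    workload: str
    data_sensitivity: str      # e.g., "public", "internal", "regulated"
    compliance_scope: bool     # subject to regulatory audit (HIPAA, PCI, etc.)?
    max_latency_ms: int        # latency the application can tolerate

def screen(req: CloudRequest) -> str:
    """Return 'approve' or 'review'; never an outright block."""
    if req.data_sensitivity == "regulated" or req.compliance_scope:
        return "review"        # data sensitivity and compliance take precedence
    if req.max_latency_ms < 5:
        return "review"        # latency-critical workloads need a closer look
    return "approve"           # everything else proceeds without delay

requests = [
    CloudRequest("marketing-site", "public", False, 100),
    CloudRequest("claims-processing", "regulated", True, 50),
]
for r in requests:
    print(f"{r.workload}: {screen(r)}")
```

A gate like this keeps IT in the loop on the workloads that warrant scrutiny without reintroducing the delays that drive teams toward shadow IT.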
As an example of provider tooling that fills such gaps, earlier this year AWS launched a service called Zelkova, which employs automated reasoning to analyze cloud policies, understand them and determine their consequences. Amazon may have noticed organizations struggling with proper public cloud workload adoption and built a technology to reduce the complexity and guesswork involved.
One of Zelkova's goals is to improve confidence in security configurations through its standout Public/Non-Public identifier. Amazon S3 uses Zelkova technology to check each bucket policy and identify whether an unauthorized user can read from or write to the bucket. A bucket is flagged as Public when Zelkova finds public requests that can access it; Non-Public means Zelkova has verified that all public requests are denied.
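That Public/Non-Public determination is also available programmatically, so IT can fold it into its own checks. Below is a minimal sketch using the AWS SDK for Python (boto3) and its get_bucket_policy_status call, which returns the IsPublic flag S3 computes for a bucket policy; the bucket names are hypothetical, and AWS credentials are assumed to be configured.

```python
# Minimal sketch: read S3's own public/non-public determination per bucket.
# Bucket names are hypothetical; AWS credentials must already be configured.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def bucket_is_public(bucket_name: str) -> bool:
    """Return True if S3 reports the bucket policy as public."""
    try:
        status = s3.get_bucket_policy_status(Bucket=bucket_name)
        return status["PolicyStatus"]["IsPublic"]
    except ClientError as err:
        # A bucket with no bucket policy raises NoSuchBucketPolicy; treat it
        # as non-public here, since no policy grants public access.
        if err.response["Error"]["Code"] == "NoSuchBucketPolicy":
            return False
        raise

for name in ["example-finance-data", "example-public-assets"]:  # hypothetical buckets
    label = "Public" if bucket_is_public(name) else "Non-Public"
    print(f"{name}: {label}")
```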
This is an incredibly valuable service, given the significant shortage of skilled cybersecurity professionals. The cloud introduces new paradigms for data security, and any tool that simplifies securing data reduces both the risk involved in using a hybrid cloud ecosystem and its burden on the business.
This is your company's data, however, and tools like Zelkova are just that: tools. They provide a valuable layer of protection when using cloud resources for a public cloud workload, and they should expedite IT processes and help ensure cloud adoption happens quickly and securely. But they don't replace internal due diligence. Ultimately, IT must lead the business to the cloud, not let the business pass it by.
Related Resources
- Rethinking Storage Modernization – Hewlett Packard Enterprise
- Panzura unveils first offering after Moonwalk acquisition – Panzura
- BlackPearl S3: Hybrid Cloud Object Storage that seamlessly extends to tape – Spectra Logic
Dig Deeper on Cloud storage
- What is cloud repatriation and what are the leading causes?
- On-prem vs cloud storage: Four decisions about data location (By: Stephen Pritchard)
- S3 Express One Zone set to power generative AI workloads
- Amazon adds new S3 features for data lakes, hybrid cloud