Public cloud storage comes in many shapes and sizes, as the major providers offer a range of services to fit most enterprise needs.
But, even with various choices from AWS, Microsoft, Google and others, IT admins still need to do their part to maintain performance, control costs and protect data. Apply these three public cloud storage tips to ensure you optimize the services that house critical business data.
Monitor and manage performance
Storage has a big impact on application performance, so enterprises need to pay attention to the public cloud services and classes they use. Admins should evaluate the benefits and limitations of each offering to find the ones that meet their workload requirements. For example, apps that frequently access data and need low latency would be better off with Amazon S3 Standard or Google Cloud Multi-Regional Storage. If application demands change, shift to a different service tier.
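The tier-selection logic above can be sketched as a small helper. This is a minimal illustration, not AWS guidance: the access-frequency thresholds are hypothetical, though the storage class names (`STANDARD`, `STANDARD_IA`, `GLACIER`) are real S3 classes.

```python
# Hypothetical helper: pick an S3 storage class from a rough access profile.
# The thresholds below are illustrative assumptions, not provider guidance.

def choose_storage_class(accesses_per_month: int, latency_sensitive: bool) -> str:
    """Return an S3 storage class name for the given access profile."""
    if latency_sensitive or accesses_per_month >= 30:
        return "STANDARD"        # frequent access, low latency
    if accesses_per_month >= 1:
        return "STANDARD_IA"     # infrequent access, lower storage cost
    return "GLACIER"             # rarely accessed, archival pricing

print(choose_storage_class(100, True))  # → STANDARD
```

In practice, moving an existing S3 object to a different class can be done with a copy request that sets a new `StorageClass`, or automatically with a lifecycle transition rule.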
If a cloud storage service falters or fails, it can cripple an application. Native monitoring tools, such as Amazon CloudWatch, Azure Monitor and Google Cloud Stackdriver, can check usage and performance metrics to help optimize your workloads. Use insights from these tools to determine whether to store application data in a closer region or if app design changes are necessary.
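As a concrete example of pulling a usage metric, the parameters below query CloudWatch for S3's daily `BucketSizeBytes` metric. The bucket name is a placeholder; the actual call would be `boto3.client("cloudwatch").get_metric_statistics(**params)`.

```python
from datetime import datetime, timedelta, timezone

# Sketch of a CloudWatch query for a real S3 storage metric (BucketSizeBytes).
# "example-bucket" is a placeholder bucket name.
now = datetime.now(timezone.utc)
params = {
    "Namespace": "AWS/S3",
    "MetricName": "BucketSizeBytes",
    "Dimensions": [
        {"Name": "BucketName", "Value": "example-bucket"},   # placeholder
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    "StartTime": now - timedelta(days=7),
    "EndTime": now,
    "Period": 86400,            # daily datapoints; S3 reports this metric once a day
    "Statistics": ["Average"],
}
```

Azure Monitor and Stackdriver expose comparable capacity and request metrics through their own query APIs.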
For hybrid environments, consider tools to speed the connection between a local data center and the public cloud. These types of tools, such as AWS Storage Gateway and Azure StorSimple, are typically used for backup and disaster recovery tasks. Enterprises can also opt for a private direct connection between on-premises systems and public cloud with services such as AWS Direct Connect, Azure ExpressRoute and Google Cloud Interconnect.
Purge unneeded data
While public cloud storage is relatively inexpensive, old and unneeded data can unnecessarily add to your cloud bill, complicate compliance and impact agility. Mitigate these risks with a data purge policy.
With proper categorization and time-based deletion policies, you can automate data removal.
First, identify files that should never be deleted -- such as those subject to standards and regulations, like the Health Insurance Portability and Accountability Act. After that, determine the business value of the rest of the data. Some data types, such as logs tied to former employees, can accumulate for years with no business value. Create policies that automatically delete these classifications after a set retention period.
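On S3, such a policy can be expressed as a lifecycle rule. The sketch below builds a configuration that expires objects carrying a hypothetical retention tag after a year; the rule ID, tag key/value and retention window are illustrative. The real call is `s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=config)`.

```python
# Sketch of an S3 lifecycle configuration that expires tagged objects.
# The rule ID, tag and 365-day window are illustrative assumptions.
config = {
    "Rules": [
        {
            "ID": "expire-temporary-data",                       # illustrative rule name
            "Filter": {"Tag": {"Key": "retention", "Value": "temporary"}},
            "Status": "Enabled",
            "Expiration": {"Days": 365},                         # delete after one year
        }
    ]
}
```

Azure Blob Storage and Google Cloud Storage offer equivalent lifecycle-management features.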
However, remember to distinguish between backup data and archives. Public cloud storage typically comes in three tiers -- primary, backup and archive -- which all have different fees. Some of the lower tiers, such as Google Cloud Storage Coldline and Amazon S3 Glacier, can have minimum storage duration requirements and early deletion fees, which should factor into a purge policy. To cut backup data costs, consider compression, deduplication or global data reduction.
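The early-deletion effect can be modeled with simple arithmetic: on a tier with a minimum storage duration, deleting early still incurs charges for the full minimum. The rate and the 90-day minimum below are illustrative assumptions; check current provider pricing before relying on the numbers.

```python
# Rough cost model for a minimum-storage-duration tier such as S3 Glacier
# or Cloud Storage Coldline. Rate and 90-day minimum are illustrative.

def tier_storage_cost(size_gb: float, rate_per_gb_month: float,
                      days_stored: int, min_days: int = 90) -> float:
    """Deleting early still incurs charges for the full minimum duration."""
    billable_days = max(days_stored, min_days)
    return size_gb * rate_per_gb_month * (billable_days / 30)

# 100 GB at a hypothetical $0.004/GB-month, deleted after 30 days:
# billed as if stored the full 90 days.
print(round(tier_storage_cost(100, 0.004, 30), 2))  # → 1.2
```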
Take snapshots for data protection
Cloud storage snapshots -- which capture the state of a storage system at a particular point in time -- are a common method of data protection. If there is a problem, roll back to a snapshot to return to that prior state.
Enterprises can choose how often they take snapshots, but there isn't a standard frequency, as workloads have different requirements. For less volatile workloads, such as virtual desktops, take snapshots every hour. For critical workloads that are more volatile -- like databases -- opt for continuous snapshots to ensure minimal data loss.
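On AWS, for example, an EBS snapshot is taken with the EC2 `CreateSnapshot` API. The sketch below builds the request arguments; the volume ID and tag values are placeholders, and the actual call would be `boto3.client("ec2").create_snapshot(**req)`.

```python
# Sketch of the arguments for an EBS snapshot request.
# The volume ID and workload tag are placeholder values.

def snapshot_request(volume_id: str, workload: str) -> dict:
    """Build kwargs for ec2.create_snapshot, tagging the snapshot by workload."""
    return {
        "VolumeId": volume_id,
        "Description": f"scheduled snapshot for {workload}",
        "TagSpecifications": [
            {
                "ResourceType": "snapshot",
                "Tags": [{"Key": "workload", "Value": workload}],
            }
        ],
    }

req = snapshot_request("vol-0123456789abcdef0", "database")
```

AWS also offers Amazon Data Lifecycle Manager to run snapshot schedules like these automatically, so the frequency decision becomes a policy setting rather than a cron job.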
While it's fairly straightforward to take a cloud storage snapshot, the process has its downsides. For example, it can impact performance, because snapshots consume bandwidth and add I/O load. To maintain performance, enterprises may have to pay premium prices for solid-state drive-based cloud instances and storage. Also, storing snapshots increases costs, especially if you take them continuously.