Azure Backup service best practices to streamline protection

Is Azure Backup the right protection for your organization? Explore three important considerations for using the Microsoft service to back up your data.

Azure Backup is a Microsoft service that enables organizations to back up their data to the Azure cloud and provides comprehensive data protection.

Although the Azure Backup service supports seamless backups, there are some best practices that organizations should follow for optimal data protection. For example, it is important to implement this protection in a way that does not exhaust your available bandwidth. You also need to be aware of the cost of protecting all of your virtual machine instances.

Consider the bandwidth requirements

One of the first things that you must consider prior to setting up the Azure Backup service is bandwidth requirements. If you already have a sizable amount of data that needs to be protected, for example, then it may be completely impractical to upload all of that data to the cloud.

Depending on your available bandwidth, it could take weeks to upload a few dozen terabytes of data to Azure Backup. As such, it is a good idea to take a look at Microsoft's offline seeding option. This option allows you to create the initial backup in your own data center and then ship the backup to Microsoft for ingestion into the Azure cloud.
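
To get a rough sense of whether offline seeding is worth the effort, you can estimate how long the initial upload would take over your own link. The short Python sketch below illustrates the arithmetic; the 30 TB data set, 100 Mbps link and 70% usable-bandwidth figures are placeholder assumptions, not measurements.

# Rough estimate of how long an initial backup upload would take, to help
# decide whether offline seeding makes sense. All figures are assumptions.
def upload_days(data_tb, bandwidth_mbps, efficiency=0.7):
    """Days needed to upload data_tb terabytes over a bandwidth_mbps link."""
    data_bits = data_tb * 1e12 * 8                   # terabytes -> bits
    usable_bps = bandwidth_mbps * 1e6 * efficiency   # leave headroom for other traffic
    return data_bits / usable_bps / 86_400           # seconds -> days

print(f"{upload_days(30, 100):.0f} days")            # 30 TB over 100 Mbps: roughly 40 days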

You must also consider the volume of data that your organization creates each day. There have been many documented instances of organizations having a data change rate that exceeds what their bandwidth can handle. In such situations, it is necessary to adopt data reduction techniques or to acquire additional bandwidth.
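
A similar back-of-the-envelope check shows whether the daily change rate fits within a nightly backup window. Again, the change rates, window length and efficiency factor below are illustrative assumptions rather than recommendations.

# Check whether the daily data change rate can be uploaded within the
# nightly backup window. The rates and window length are assumptions.
def fits_in_window(daily_change_gb, bandwidth_mbps, window_hours=8, efficiency=0.7):
    """True if daily_change_gb can be pushed to the cloud within the window."""
    usable_bytes = bandwidth_mbps * 1e6 * efficiency * window_hours * 3600 / 8
    return daily_change_gb * 1e9 <= usable_bytes

print(fits_in_window(200, 100))   # 200 GB/day over 100 Mbps: True (fits)
print(fits_in_window(500, 100))   # 500 GB/day over 100 Mbps: False (needs data reduction or more bandwidth)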

Don't give up your on-premises backups

Cloud backup has its advantages and disadvantages. On one hand, cloud backup stores data remotely, thereby insulating the organization against regional disasters. On the other hand, that remoteness can also make data restorations difficult and time-consuming.

According to Microsoft, the Azure Backup service is designed to be used with System Center Data Protection Manager (DPM). It is also possible to use one DPM server to protect another. This means an organization could conceivably write its backups to an on-premises DPM server that stores backups locally, and then use a secondary DPM server to back up all of that data to the Azure cloud.

The advantage of this technique is that it allows data to be restored from a local source, rather than requiring every restore operation to pull from cloud data. Local restores complete far more quickly than they would if the data had to be downloaded from Azure. The copy stored in the Azure Backup service can still be used to recover from a more catastrophic failure in which restoring from a local backup is not an option.

One important thing to keep in mind about this backup architecture is that it is not suitable for every situation. If an organization operates almost entirely in the cloud, it probably does not make sense to deploy an on-premises System Center DPM server. In that situation, it would be better to back up cloud resources directly to Azure Backup. Incidentally, Azure Backup supports geo-redundant storage, so backups remain available even if an entire Azure region fails.

Consider the costs

Although the cloud has a reputation for being a less expensive alternative to on-premises operations, there are considerable costs associated with using the Azure Backup service. The costs are based on factors such as the size of the data set being protected and whether you use locally redundant storage or geo-redundant storage.

Microsoft charges a flat fee for each instance that is being protected, plus a charge for the backup storage consumed. The flat fee works out to about $10 per 500 GB (or fraction thereof) of instance size, but is only $5 for instances smaller than 50 GB. Hence, a 1.2 TB instance would incur a fee of $30 per month.

Storage fees account for the remainder of the cost. These fees range from just over 2 cents to just under 5 cents per gigabyte, per month, depending on the amount of data and the type of storage being used. The storage cost of backing up a 1.2 TB instance to locally redundant storage would be about $29.41 per month. Add the instance fee, and the total cost of backing up that instance works out to $59.41 per month, plus any applicable taxes and fees.
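
The arithmetic behind this example can be reproduced with a short Python sketch. The protected-instance fee tiers come from the description above; the $0.0245-per-gigabyte locally redundant storage rate is an assumption chosen to approximate the quoted figures, so check current Azure pricing before budgeting.

import math

# Back-of-the-envelope monthly cost for protecting one instance with Azure Backup.
# The per-gigabyte storage rate is an assumed value; actual Azure pricing varies.
def instance_fee(instance_gb):
    """Flat fee: $5 for instances under 50 GB, else $10 per 500 GB or fraction thereof."""
    if instance_gb < 50:
        return 5.0
    return 10.0 * math.ceil(instance_gb / 500)

def monthly_cost(instance_gb, storage_gb, per_gb_rate=0.0245):
    """Instance fee plus consumed backup storage."""
    return instance_fee(instance_gb) + storage_gb * per_gb_rate

# A 1.2 TB (1,200 GB) instance backed up to locally redundant storage:
print(f"${monthly_cost(1200, 1200):.2f}")   # $30 instance fee + ~$29.40 storage = ~$59.40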
