Backing up an application's data is reasonably easy. It's something we've been doing since the advent of databases. But anyone who has been responsible for managing backups and recovery knows the big challenge is getting data back into an application's database within an acceptable time frame.
Traditional database backups involve exporting the database's contents to a file system, where the export is backed up, usually on an automated schedule. Restoring data involves retrieving the backups and then importing the data back into the database. Run against a live system, the process is long and slow.
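That dump-and-restore cycle can be sketched with SQLite from the Python standard library. This is an illustration of the pattern, not any particular vendor's tool; the function names and file paths are invented for the example:

```python
import sqlite3


def dump_to_file(db_path: str, dump_path: str) -> None:
    """Export the database contents as SQL statements (the export step)."""
    con = sqlite3.connect(db_path)
    try:
        with open(dump_path, "w") as f:
            # iterdump() serialises the entire database, one statement
            # at a time -- an inherently serial, full-copy operation.
            for stmt in con.iterdump():
                f.write(stmt + "\n")
    finally:
        con.close()


def restore_from_file(db_path: str, dump_path: str) -> None:
    """Rebuild a database by replaying the exported SQL (the restore step)."""
    con = sqlite3.connect(db_path)
    try:
        with open(dump_path) as f:
            con.executescript(f.read())
    finally:
        con.close()
```

Even in this toy form, the shape of the problem is visible: every byte is serialised out and replayed back in, so restore time grows with the size of the database.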
When databases were small, this was a viable approach. But with the explosion of data creating massive databases, this process no longer delivers the rapid response businesses demand. And because applications are often tightly coupled to their underlying data, it may not be possible to take a database offline to back up or restore it at all.
A better approach is an agent that can interact with the data in real time, while the application is active. Application data management executes backups and restores through an API that communicates with the application in real time and is aware of the application's overall environment. That includes whether the application runs on virtualised infrastructure in a cloud environment, constraints such as latency, and how associated middleware interacts with the application.
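As a minimal illustration of backing up a database while it stays online, SQLite's backup API (exposed in the Python standard library as `Connection.backup`) copies a live database page by page, yielding between pages so the application's own transactions are not blocked. A real application data management agent would layer environment awareness on top of a mechanism like this; the function name and paths here are invented for the sketch:

```python
import sqlite3


def live_backup(source_db: str, backup_db: str) -> None:
    """Copy a database while the application keeps reading and writing.

    Unlike a dump-to-disk export, the source stays online throughout.
    """
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(backup_db)
    try:
        # pages=1 copies incrementally, pausing between pages so other
        # connections to the source database can continue working.
        src.backup(dst, pages=1)
    finally:
        src.close()
        dst.close()
```

The point of the sketch is the contrast with the export/import cycle: the backup runs alongside the application rather than requiring it to stop.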
Application data management takes the complex environment applications operate in into account when backing up and restoring data, minimising the impact on business operations.
The serial nature of the dump-to-disk approach and the relatively long time required to execute a data restoration have led some legacy backup vendors to acquire backup and restore solutions from smaller developers to deliver point solutions. But that has left enterprises deploying a different solution for each of their applications, creating further complexity. While such point solutions may promise integration, they are often only loosely connected back to a central engine.
Where operations teams have purchased separate solutions for each application, they can find that the software used to back up one application's data causes issues with the software for another. They also need expertise in multiple backup and restore systems.
Solving this problem starts by reducing the number of different products used for backing up and restoring application data. By embracing more holistic solutions, enterprise IT teams can reduce the number of different tools they need to be proficient with and get a better overall view of how their application data is being managed.
That means looking for tools that use open APIs and can share indexes. Automation and database federation also help simplify application data management and help organisations avoid lock-in with a specific vendor.
This approach is also agnostic as to whether you are using on-premises or cloud solutions, and it facilitates moving from one application to another. Since you have greater control over your data, it can be easier to move data between service providers or locations.
If you are using software as a service (SaaS), having your own application data management approach provides further insurance against a failure of the service provider's systems. While many SaaS providers do back up your data, unless you are paying for a specific service level for backup and restoration of your application data, the service is provided only on a "best endeavours" basis.
Keeping control of your application data is critical. Your data is a key corporate asset and should be treated as such.