Data may or may not be the new oil, depending on who you ask. But there is at least one clear similarity between oil and data: When disaster strikes, cleaning up the mess is an expensive and arduous task.
A recently released study commissioned by Dell EMC put the cost of downtime in the millions for relatively small amounts of data. That's not much of a surprise because other surveys on data protection challenges have produced similar results.
More surprising is that, while vendors commonly tout ransomware-fighting tools in their data protection products these days, IT pros feel unarmed in the battle. Nearly half of the 2,200 respondents in the Dell EMC survey cited a lack of data protection products for emerging technologies as a pain point. Organizations that use new technologies, such as artificial intelligence, machine learning and the internet of things, or new deployments, such as SaaS applications and persistent volumes for containers, report having a hard time finding vendors that can protect their data.
The cost of downtime
Disruptions like ransomware attacks cost organizations more than money. The attacks also damage productivity and staff morale, which are difficult to quantify but still have an impact.
Bob Bender, CTO at Founders Federal Credit Union, was called in to assist another financial institution that was under a ransomware attack. He was instructed to help the organization pay off the attackers if needed. While there, he watched IT staff work around the clock for more than 52 hours to restore what the criminals had encrypted.
"Ransomware is nothing like what they tell you," Bender said. "Seeing these professionals that couldn't do anything -- the event shocked us and we took action."
The organization did not end up paying the ransom, but during the attack, only one of the backup vendors it worked with sent an expert to help. Witnessing this led Bender to seriously question every contract he had with his own data protection vendors, which eventually resulted in a refresh of Founders' data protection lineup.
The Dell EMC survey found that the average estimated cost of 20 hours of downtime is about $500,000, and losing 2.13 TB of data costs an average of nearly $1 million.
Data protection challenges include high impact of data loss
Dell EMC commissioned market research firm Vanson Bourne to run the Global Data Protection Index survey. The study polled 2,200 IT decision-makers across 18 countries and highlighted the data protection challenges they faced. It is the third such study Dell EMC has published, and the first since 2016.
The new study, with data gathered between 2016 and 2018, found that more customers are making money off their growing stores of data, but this has also led to a higher business impact if they lose access to that data.
The survey found 75% of respondents are either monetizing their data or investing in tools to do so in the future.
"They're actually now trying to figure out how to drive business value through IT and IT capabilities," said Rüya Barrett, vice president of marketing in data protection at Dell EMC. "Across the board, the impact of disruption is increasing. The consequences of data loss and data unavailability have gone beyond the cost of downtime. It's really opportunity loss."
More than three-quarters of those surveyed said they experienced "some kind" of disruption in the past year, including 28% who said a ransomware attack prevented access to information. The percentage of those who said they could not recover from their current backup methods jumped from 14% in 2016 to 27% in the new survey of data protection challenges.
Data protection vendors are aware of the problem, and often make security enhancements part of product updates. Most backup products have ways to deal with ransomware. Veritas and Storware have released products for container backup, and backup options for SaaS applications such as Salesforce and Microsoft Office 365 are becoming common. However, not every vendor has a product for these environments, and many customers are hesitant to add further complexity to their infrastructure.
"How do you secure data that isn't static, but streaming real-time from ever-increasing endpoints?" Barrett asked. "How do you support new deployment models?"