Sponsored Content

Sponsored content is a special advertising section provided by IT vendors. It features educational content and interactive media aligned to the topics of this web site.


Is your data security strategy ready for AI?

Many enterprises have data security strategies that were shaped by accidents of history rather than actively chosen. Someone picked a backup policy for the on-premises warehouse in 2018; HR chose a cloud data security option when it started its first AWS workloads three years later; edge sites got whatever the regional integrator recommended.

Each decision was rational at the time, but each was made in a silo. Scale this across hundreds of projects and half a dozen budget cycles, and a company's data ends up scattered across the data center, the edge, and multiple clouds, all protected under policies that have nothing to do with each other.

AI is changing the data protection game

Fragmentation of data security may have been tolerable in the past, but AI is changing the arithmetic. This technology has made enterprise data more central to competitive outcomes, raising the stakes for cyber resilience considerably.

To work, AI must draw data from disparate locations into shared pipelines. When datasets that once sat under separate data protection policies start feeding the same project, silos shatter by default.

Last August, 89% of organizations in an Omdia research study said that AI initiatives make data protection and data resilience more important to them. More than eight out of ten (83%) said outright that AI success is impossible if the underlying data isn't secured and protected.

But here's the really uncomfortable part: Almost half of all data center teams find scoping protection requirements for AI more challenging than for traditional workloads. Cloud teams aren't far behind. The awareness is there, but the planning often isn't.

Part of the problem is how teams prioritize challenges. I've repeatedly watched organizations rush to secure GPU capacity before realizing the compute is useless without the right data in the right pipelines. Data security gets relegated to a third-order problem. By the time people think about it, the architecture has been set and the budget largely allocated.

Preparing for the new AI order

Wherever you sit on your AI journey, the cyber resilience question is heading your way. Getting ahead of it gives you more options to solve it.

Merge your tools, not your data

The natural instinct is to consolidate by moving everything into a data lake and reducing the surface area. Some data movement is necessary, but in practice organizations almost never retire their old environments when they build new ones; there's too much sunk cost and too many dependencies. You can't treat an established enterprise like a greenfield startup. Instead, the data lake gets added on top of what already exists, and data ends up living in even more places than before.

The realistic path isn't consolidating where data lives; it's consolidating the tooling used to protect it. Invest in protection that spans on-premises, cloud, and edge infrastructure from a single management source. This matters for security as much as efficiency, because the more complex an environment, the more likely something is missed.
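To make the tooling-consolidation idea concrete, here is a minimal sketch of what a single policy definition applied across on-premises, cloud, and edge assets might look like. The asset inventory, policy fields, and names are all hypothetical illustrations, not the API of any particular product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProtectionPolicy:
    """One policy definition, managed from a single control point."""
    name: str
    backup_interval_hours: int
    retention_days: int
    immutable_copies: bool

# Hypothetical inventory: the same data estate the article describes,
# spread across the data center, the cloud, and the edge.
assets = [
    {"id": "warehouse-db", "location": "on-prem"},
    {"id": "hr-payroll", "location": "cloud"},
    {"id": "branch-cache", "location": "edge"},
]

# A single "gold" policy instead of a separate tool (and policy)
# per environment.
gold = ProtectionPolicy("gold", backup_interval_hours=4,
                        retention_days=90, immutable_copies=True)

def assign_policy(assets, policy):
    """Map every asset to the same policy, regardless of where it lives."""
    return {a["id"]: policy.name for a in assets}

plan = assign_policy(assets, gold)
print(plan)  # every asset is governed by one policy
```

The point of the sketch is the shape, not the code: when protection is expressed once and applied everywhere, a new edge site or cloud account inherits the policy instead of getting its own ad hoc one.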

Understand who backs up what

While teams are more aware of this than they were five years ago, there's also a stubborn blind spot around the shared responsibility model. I still encounter organizations that assume their cloud provider handles backup. It doesn't. That misunderstanding gets especially dangerous with AI workloads, which frequently have data-locality and sovereignty requirements that pull infrastructure on-premises while the broader pipeline still stretches across the public cloud.

Then there are the new data types AI generates: vector embeddings, training checkpoints, and inference logs. Whether these need traditional backup depends on the workload. A regulated lender tracing model decisions probably needs those embeddings preserved, while a team building an internal chatbot-based product guide may not. But at the macro level, AI means more data, and more data means more to protect. Platforms that can protect both traditional application data and AI-specific artifacts through a single policy framework, with immutable copies and cyber recovery options, will reduce risk and complexity over time.
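The decision rule described above can be sketched in a few lines. This is a hypothetical heuristic matching the article's examples, assuming that the deciding factor is regulatory traceability rather than the artifact type itself; the artifact names are illustrative.

```python
# Hypothetical rule: whether an AI artifact needs traditional backup
# depends on the workload's traceability requirements, not the artifact type.
def needs_backup(artifact_type: str, regulated: bool) -> bool:
    # Artifacts that can, in principle, be regenerated from source data.
    reproducible = {"vector_embeddings", "inference_logs"}
    if regulated:
        # A regulated workload tracing model decisions preserves everything.
        return True
    return artifact_type not in reproducible

# A regulated lender keeps its embeddings; an internal chatbot may not.
print(needs_backup("vector_embeddings", regulated=True))      # True
print(needs_backup("vector_embeddings", regulated=False))     # False
print(needs_backup("training_checkpoints", regulated=False))  # True
```

In practice the rule would be richer than one boolean, but the shape holds: classify the workload first, then derive the protection requirement from it, under the same policy framework that covers traditional application data.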

County of Kaua’i customer story: Protecting paradise with smart solutions

The County of Kaua’i needed to safeguard the island’s critical infrastructure by implementing a robust, modern data center with advanced cyber resilience. Dell PowerProtect Data Manager and PowerProtect Cyber Recovery proactively protect the county’s systems and community from natural and man-made threats.


Invest, invest, invest

If your organization has been allocated budget for AI infrastructure, make data security and cyber resilience a line item from day one. And when you invest, look for protection that covers your cloud footprint alongside your on-prem environment.

One solution that can span both beats three or four that cover things piecemeal. Even if your first AI project sits neatly in a single facility today, it won't stay that way. Build that cyber resilience foundation now, while the architecture is still taking shape and before fragmented decisions solidify into the next generation of technical debt.

Source: Complete Survey Results: IT Transformed: Inside the Convergence of Hybrid Cloud and AI, June 2025. All data in this article is from this study.
