
Top 2020 data center industry trends include edge, private cloud

IT infrastructure isn't just an on-premises data center anymore. The '2020 State of the Data Center' report shows growing interest in private cloud, edge and colocation.

Video killed the radio star, but cloud won't kill the data center. IT resources continue to grow as organizations generate large amounts of data and find new ways to expand end-user connectivity.

AFCOM's "2020 State of the Data Center" report provided admins and IT managers with a snapshot of the latest data center industry trends and helped readers see how other organizations build their infrastructure.

This year's report highlighted growth in private cloud adoption, increases in rack density and extensive edge deployments. The growing complexity of data center infrastructure brings questions around security, staffing and operations.

"The shift on the vendor and supplier side has been trying to package together hardware solutions with software management tools to try and make the IT operations manager job as easy as possible," said Kuba Stolarski, research director of infrastructure systems, platforms and technologies at IDC.

Organizations step back from public cloud

More IT teams are reconfiguring cloud infrastructure as organizations realize that public cloud has disadvantages related to cost, management and data control. Seventy-two percent of respondents said they plan to migrate to private cloud and colocation, while 52% said they currently have some type of private cloud.

"One of the reasons why we saw so many enterprises going public cloud is because they did not have the expertise to create a private cloud on their own, especially because private cloud can be very human resource-intensive and requires expertise -- whether that's internal or external," said Peter Panfil, vice president of global power at Vertiv.

Even with this migration to private cloud, IT departments don't limit themselves to one type of cloud architecture. For the first time, this year's report included multi-cloud infrastructure. Twenty-five percent of respondents said they currently use a multi-cloud setup, and 44% said they look to implement one in the next one to three years.

Data type can drive the decision of which cloud type an organization uses, meaning IT teams may require multiple cloud architectures. Using several cloud types lets organizations keep specialized information under tighter control while still providing outward-facing applications for customers. This approach is well suited to sectors such as healthcare and government, which handle a mix of public-facing and regulated data.
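As a rough illustration of that data-driven placement, here is a minimal sketch that maps a workload's data classification to a cloud target. The classification labels, target names and rules are illustrative assumptions, not anything prescribed by the report.

```python
# Minimal sketch: route a workload to a cloud target by data classification.
# Labels, targets and rules are hypothetical examples.

CLOUD_PLACEMENT_RULES = {
    "public": "public_cloud",      # outward-facing apps and content
    "internal": "private_cloud",   # business data the org wants to control
    "regulated": "private_cloud",  # compliance-bound records (healthcare, government)
}

def place_workload(data_classification: str) -> str:
    """Return the cloud target for a workload, defaulting to private cloud."""
    return CLOUD_PLACEMENT_RULES.get(data_classification, "private_cloud")

print(place_workload("public"))     # public_cloud
print(place_workload("regulated"))  # private_cloud
```

A real policy engine would also weigh cost, latency and data residency rules, but the placement decision ultimately reduces to a lookup like this.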

"When I look at the biggest advantage of cloud, it is full of data and controlled access. You decide what you're going to allow to have happen and what you're going to restrict," Panfil said.

Data center infrastructure: Doing more with less space

The report found a decrease in overall data center square footage but an increase in rack density: the average grew from 7.3 kW per rack to 8.2 kW.

Increased rack density can bring long-term cost savings and higher infrastructure utilization, especially because organizations may not have the land or capital to build out new data centers. It also means much more revenue capability from the same data center footprint.
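The arithmetic behind that claim is straightforward. A short sketch using the report's density figures and a hypothetical 100-rack room:

```python
# Back-of-the-envelope math: the same room delivers more capacity
# when average rack density rises. Room size is a hypothetical example.

racks = 100                # hypothetical room size
old_density_kw = 7.3       # average kW per rack, per the report
new_density_kw = 8.2

old_capacity = racks * old_density_kw  # 730 kW
new_capacity = racks * new_density_kw  # 820 kW

print(f"Extra capacity in the same footprint: {new_capacity - old_capacity:.0f} kW "
      f"({(new_density_kw / old_density_kw - 1) * 100:.1f}% more)")
# -> Extra capacity in the same footprint: 90 kW (12.3% more)
```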

Server components have also increased in capacity over the past 10 years, and a single storage drive can now hold up to 18 TB, Stolarski said. These larger component capacities are particularly useful for hyperscalers that decide to increase computing power across their infrastructure.

Organizations that increase density and look to use more software-defined infrastructure can run into challenges with traditional blade servers. Blade servers brought plug-and-play hardware to data center infrastructure but little in the way of management software. Effective management requires either an additional software layer on top of the hardware or completely standardized servers.

"There's a new way of running the modern equipment, which, in the past, was turning it on, make sure it's working and let it run. Now, I need to be careful. … I want to have maximum efficiency, and I have choices to make that I never had before, and how those choices interact with legacy systems can affect availability," said Gregory Ratcliff, chief innovation officer at Vertiv.

For denser rack power management, admins should look for tools that offer insight into power distribution, support security protocols, and allow remote access and management.
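One small piece of what such a tool does can be sketched in a few lines: compare per-rack power draw against a budget and flag offenders. The sketch below assumes the readings have already been collected from the PDUs (via SNMP, a vendor API or a DCIM export, not shown here); the rack names and values are hypothetical.

```python
# Minimal sketch of a rack power check against a per-rack budget.
# Assumes per-rack kW readings were already pulled from the PDUs.

RACK_BUDGET_KW = 8.2  # per-rack budget; the report's average density

def flag_overloaded_racks(readings_kw: dict[str, float],
                          budget_kw: float = RACK_BUDGET_KW) -> list[str]:
    """Return the IDs of racks drawing more than the configured budget."""
    return [rack for rack, kw in readings_kw.items() if kw > budget_kw]

# Hypothetical snapshot of per-rack draw, in kW.
snapshot = {"rack-a01": 7.9, "rack-a02": 8.6, "rack-b01": 5.1}
print(flag_overloaded_racks(snapshot))  # ['rack-a02']
```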

Colocation customers get smarter

Colocation gives organizations a short-term and cost-effective way to scale, especially as applications become more sensitive to latency. It's more beneficial to have a distributed infrastructure network; sometimes that can be done through colocation, Stolarski said.

Organizations are becoming much savvier when it comes to colocation, particularly regarding power consumption and efficiency. This has caused colocation providers to rethink how they build and manage their data centers. Now, customers are more interested in talking about available power for purchase, instead of just buying up space.

"[Customers] want to balance their power consumption and pay for what they're using. And as [needs] grow more and more sophisticated, they're putting pressure on the data center providers to optimize the space," said Drew Leonard, vice president of strategy for Evoque Data Center Solutions.

Infrastructure optimization includes the use of denser server hardware to support up to 10 kW cabinets; power management software to effectively distribute varying power needs across infrastructure; and extensive, carrier-neutral network hardware that can make it easier for customers to ingress and egress data.

Over the next few years, colocation service providers can investigate what 5G network support and artificial intelligence can do for their businesses, both externally and internally.

"The ability to provide high-level compute in the data center [and] access to 5G networks is going to be huge. Artificial intelligence will need to be closer to the end user [and] closer to the source of the data so that it can provide the lowest latency, highest output. There's also leveraging internal artificial intelligence to help us run our data centers more efficiently to optimize the infrastructure across the board," Leonard said.

A growing interest in edge

The data center industry is increasing its investment in edge computing, with internal initiatives and colocated infrastructure.

The report showed that 55% of respondents are planning for edge computing use within the next three years, while 20% said they are already implementing the technology. Deployment sizes vary: 32% of respondents said they already have 11 to 20 active locations, and 70% said they have planned for 40 or more edge sites.

Big data, machine learning, connected devices and media streaming use cases drive interest in edge computing. Placing data processing as close as possible to the source reduces latency and cost.
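A minimal sketch of that idea: aggregate raw readings at the edge site and ship only a compact summary upstream, rather than every sample. The threshold and payload shape are assumptions for illustration.

```python
# Minimal sketch of edge-side processing: reduce raw sensor samples to a
# compact summary before transmission. Threshold and fields are illustrative.

def summarize_at_edge(samples: list[float], alert_threshold: float = 90.0) -> dict:
    """Reduce raw samples collected at an edge site to a small summary payload."""
    return {
        "count": len(samples),
        "mean": sum(samples) / len(samples),
        "max": max(samples),
        "alerts": sum(1 for s in samples if s > alert_threshold),
    }

raw = [71.2, 69.8, 93.4, 70.1, 70.5]  # collected locally at the edge site
payload = summarize_at_edge(raw)       # a few fields instead of every sample
print(payload)  # {'count': 5, 'mean': 75.0, 'max': 93.4, 'alerts': 1}
```

Sending a handful of summary fields instead of every raw sample is where the latency and bandwidth savings come from.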

But distributed, cloud-based edge offerings bring security concerns. Respondents cited hesitations around company data security (58%), total cost of ownership (52%) and network reliability (42%). Admins can use edge computing-specific software and internal data center infrastructure management to guard edge site data, set protocols and receive network outage alerts -- but humans can only do so much.

As the sector matures, admins who manage edge deployments will require software with automation and analytics features to filter alerts, provide insight into deployment operations and support rapid site rollouts. These capabilities can reduce the amount of in-the-weeds management and let admins focus on big-picture initiatives for smarter, more flexible infrastructure.
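For a sense of what that alert filtering looks like, here is a minimal sketch that suppresses duplicate alerts from the same site within a quiet window, so admins only see new problems. The site names and window length are assumptions.

```python
# Minimal sketch of alert deduplication: pass an alert through only if
# this site/type pair hasn't fired within the quiet window.

from datetime import datetime, timedelta

QUIET_WINDOW = timedelta(minutes=15)
_last_seen: dict[tuple[str, str], datetime] = {}

def should_notify(site: str, alert_type: str, now: datetime) -> bool:
    """Return True if the alert is new or the quiet window has elapsed."""
    key = (site, alert_type)
    last = _last_seen.get(key)
    _last_seen[key] = now
    return last is None or now - last > QUIET_WINDOW

t0 = datetime(2020, 6, 1, 12, 0)
print(should_notify("edge-dallas", "power", t0))                          # True
print(should_notify("edge-dallas", "power", t0 + timedelta(minutes=5)))   # False (suppressed)
print(should_notify("edge-dallas", "power", t0 + timedelta(minutes=30)))  # True (window elapsed)
```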
