There comes a time when every IT department has to refresh its server and storage hardware. There are no hard and fast rules about when this should take place. Sometimes decisions are driven by the availability of new technologies, such as all-flash storage or next-generation Intel processors.
Other times decisions can be driven by changing workload requirements, new applications or the simple reality that the organization can measurably save money and improve application performance by refreshing the equipment in place.
Deciding how and when to get maximum value out of refreshing servers and storage is one of the biggest challenges—and opportunities—facing IT teams. Too soon and you’ve wasted money; too late and you’ve wasted opportunity.
Neglecting to upgrade your server infrastructure in a timely fashion can reduce peak performance by up to 39%, add up to 40% in application management costs, and increase server administration costs by up to 148%, according to IDC.
A smarter way to refresh
Given the accelerated pace of technology innovation, there is more pressure than ever on IT when undergoing a refresh. For example, which workloads need all-flash storage? Which need the highest-density servers? Which will be best served by hyper-converged infrastructure? Which can be moved to a hybrid or public cloud?
These are critical questions IT decision-makers must answer on the path to IT modernization and transformation. The only real way to answer them is with real data culled and analyzed from your current IT environment and workloads.
When you are armed with facts, you can make much better decisions about refresh timing and technology. Workload data that is particularly useful includes:
- Capacity: How much capacity is being used by each workload, and are there peaks and valleys in their usage?
- Performance: How many IOPS are required by different workloads, and how and when do these requirements fluctuate?
- Reliability: Is there enough performance and capacity to reliably support business-critical applications?
- Environment: Which operating systems and virtualization platforms are in use? How do they impact memory, CPU, network and storage performance?
- Planning: How will requirements change as the organization brings on new applications, workloads and personnel?
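The kind of analysis these metrics feed can be sketched in a few lines. This is a minimal, illustrative example only; the sample values and field layout are assumptions, not output from any particular monitoring tool.

```python
# Hypothetical workload samples: (hour, IOPS, capacity used in GB).
# All numbers are illustrative assumptions for the sketch.
samples = [
    (0, 1200, 800),
    (6, 4500, 820),
    (12, 9800, 860),
    (18, 3100, 840),
]

# Peaks and averages are what matter for sizing: a refresh must cover
# the busiest hour, not just the typical one.
peak_iops = max(iops for _, iops, _ in samples)
avg_iops = sum(iops for _, iops, _ in samples) / len(samples)
peak_capacity_gb = max(cap for _, _, cap in samples)

print(f"Peak IOPS: {peak_iops}, average IOPS: {avg_iops:.0f}")
print(f"Peak capacity used: {peak_capacity_gb} GB")
```

Even this toy summary shows why averages alone mislead: here the peak demand is more than double the average, and it is the peak that determines whether a candidate platform is adequate.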
You can also make smarter refresh decisions by modeling your workloads against the equipment you are considering. This gives greater insight into the value of the refresh and helps you make the right purchase decision the first time, saving both time and money.
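One simple form this modeling can take is a headroom check: does a candidate platform cover the observed peaks with room to grow? The function below is a hedged sketch; the specs, peak figures and 30% headroom factor are all assumptions for illustration.

```python
# Hypothetical sizing check: does a candidate server/array cover observed
# workload peaks plus growth headroom? All figures are illustrative.
def fits(candidate_iops, candidate_capacity_gb,
         peak_iops, peak_capacity_gb, headroom=1.3):
    """Return True if the candidate covers observed peaks plus headroom."""
    return (candidate_iops >= peak_iops * headroom and
            candidate_capacity_gb >= peak_capacity_gb * headroom)

# Observed peaks from monitoring (assumed values).
peak_iops, peak_capacity_gb = 9800, 860

print(fits(20000, 2000, peak_iops, peak_capacity_gb))  # True: ample headroom
print(fits(10000, 1000, peak_iops, peak_capacity_gb))  # False: IOPS too tight
```

Real sizing exercises factor in far more (latency profiles, data reduction, failure domains), but the principle is the same: compare measured demand, not guesses, against candidate hardware.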
Leverage a data-driven model
Live Optics, a vendor-agnostic software solution from Dell EMC, is a valuable tool for providing the real-world data required to make smarter refresh decisions. It is a free product that gives IT deep insights into performance, workload simulations, utilization and support. Live Optics is lightweight, remote and agentless, collecting data about IT environments without affecting performance or compromising security. It also enables IT to model project requirements to gain a deeper knowledge of workloads.
When it comes to refreshing IT infrastructure, there is a lot of money potentially at stake. IDC says IT organizations can save millions of dollars in capital and operations costs annually, thanks to higher server performance, consolidation, management efficiency and improved reliability.
In the past, refresh decisions were complicated by a lack of real-world data that could be analyzed and acted upon in near-real time. That is no longer the case. Looking ahead, when it’s time to consider a refresh, any IT team in any environment can now use real data about real workloads to drive cost savings and operational efficiencies.