Our seasoned analysts couple their industry-leading B2B research with in-depth buyer intent data for unparalleled insights about critical technology markets.
Clients trust us across their GTMs—from strategy and product development to competitive insights and content creation—because we deliver high-quality, actionable support.
Browse our extensive library of research reports, research-based content, and blogs for actionable data and expert analysis of the latest B2B technology trends, market dynamics, and business opportunities.
IT organizations are continuously looking for new ways to drive operational and cost efficiencies, and the data center is no exception. Data center staff want to take advantage of the benefits of a subscription-based cloud consumption model, but often don't have the ability or budget to move their data or compute resources to the cloud. As a result, IT leaders are looking for ways to consume resources in a pay-as-you-go cloud model, but within an on-premises infrastructure.
ESG research backs this up, with 48% of IT leaders saying they would prefer to pay for on-premises data center infrastructure through a consumption-based model.1
48% of organizations say they would prefer to pay for infrastructure via a consumption-based model such as a variable monthly subscription based on hardware utilization, which is up from 42% last year.
IBM has been paying attention to these market trends, and in the third quarter of 2021 it introduced new ways to expand storage consumption options with its storage-as-a-service offerings.
IBM offering storage as a consumption model
In a recent briefing covering IBM's storage launch, Eric Herzog stated, "IBM storage-as-a-service offers a lower price than the public cloud." I like this pay-as-you-go consumption model compared to a forced move where everything goes straight to the cloud. In the IBM storage-as-a-service model, organizations can adopt cloud yet keep some storage on-premises under the same subscription financial model. The beauty of this approach is that it offers a hybrid cloud solution with a concierge service that continually health-checks the client's systems.
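To make the consumption idea concrete, here is a purely hypothetical sketch of how a pay-as-you-go storage bill might be metered against a committed base capacity. The rates, capacities, and function names are invented for illustration and do not reflect IBM's actual pricing or terms:

```python
# Hypothetical illustration of a consumption-based (pay-as-you-go) storage bill.
# All rates and capacities below are invented for illustration only and do not
# reflect IBM's actual pricing.

def monthly_storage_bill(used_tib: float,
                         committed_tib: float = 100.0,
                         base_rate_per_tib: float = 20.0,
                         overage_rate_per_tib: float = 30.0) -> float:
    """Charge a flat rate on the committed base capacity, plus a metered
    rate on any usage above the commitment."""
    base_charge = committed_tib * base_rate_per_tib
    overage_tib = max(0.0, used_tib - committed_tib)
    return base_charge + overage_tib * overage_rate_per_tib

print(monthly_storage_bill(used_tib=85))    # under commitment: pay the base only
print(monthly_storage_bill(used_tib=130))   # 30 TiB of metered overage
```

The point of the sketch is simply that the customer's monthly cost tracks actual utilization, on-premises, the way a cloud bill would.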
It is an interesting approach, and it shows that IBM is keeping up with, and responding to, industry trends in how customers want to purchase IT resources.
Not surprisingly, IBM is also responding to the future promise of quantum computing.
Quantum computing is not a myth!
In a different briefing this week, IBM's Chief Quantum Exponent Robert Sutor explained that "…quantum computers will solve some problems that are completely impractical for classical computers…" The conversation with Robert was impressive. He walked me through IBM's history in the quantum computing space and noted, "There are differences when describing quantum computing." Sutor offers a lot of depth on this topic in his book, Dancing with Qubits: How quantum computing works and how it can change the world.
The IBM Institute for Business Value (IBV) has been deeply engaged with quantum research for many years. In June of this year, IBM announced advancements with its Quantum System One, delivering its first quantum computer system outside of the US to Europe's largest application-oriented research organization, Fraunhofer-Gesellschaft. A second system is slated for delivery to the APAC region soon. IBM, as usual, is at the forefront of innovation and driving paradigm shifts.
Many people, even tech-savvy ones, do not fully understand the potential of quantum computing's processing power. This diagram illustrates the gap between today's classical computing and how quantum computing will enhance hybrid cloud and the overall heterogeneous computational fabric. These cloud-based, open-source development environments will make using quantum computers "frictionless," according to IBM.
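IBM's open-source Qiskit framework is one example of such a development environment. As a minimal illustration (not something covered in the briefing), the sketch below builds a two-qubit entangled "Bell state" circuit and runs it on a local simulator, using the Qiskit API as it existed around 2021:

```python
# Minimal Qiskit sketch (API circa 2021): build and simulate a two-qubit Bell state.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)      # two qubits, two classical bits
qc.h(0)                        # put qubit 0 into superposition
qc.cx(0, 1)                    # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])     # measure both qubits

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                  # expect roughly half '00' and half '11'
```

Even this toy circuit hints at the difference from classical bits: the two measured bits come back correlated ("00" or "11") essentially every run, which has no classical-bit analogue.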
The number of qubits (or quantum bits) in a computer makes a big difference in computational performance, according to Robert. By 2023, IBM plans to expand its current quantum computer from 65 qubits to over 1,000 qubits. It is an ambitious goal and exciting at the same time. It is not quite like leaping through space and time like Sam Beckett did in the TV show Quantum Leap, but IBM's future focus in this area is impressive, nonetheless.
The next phase in quantum computing? Beyond its continued technological evolution, IBM should evangelize its benefits more broadly so that organizations are well positioned to take advantage of all it has to offer. Hopefully, this time next year, I'll speak with IBM again and see the beginning of more widespread adoption of quantum computing.
IBM seems to be doing the right things
I like the direction that IBM is taking with its storage roadmap. The performance improvements to the IBM DS8900F analytics machine and the IBM TS7770 all-flash VTL boost AI responsiveness and resiliency, which seems to provide an advantage over other systems in the same class. We were also briefed on the importance of IBM's strategy with FlashSystem Safeguarded Copy and how it improves data resilience by providing an immutable point-in-time copy of production data, logically air-gapped offline data, and separate privileges for admins.
Paul’s POV
In my opinion, IBM is listening to customers. Its roadmap in storage as well as quantum is aggressive and should keep it competitive. Customers will ultimately decide for themselves whether these are the right approaches, but from my vantage point, IBM is checking the right boxes to address many common business pain points today as well as those expected in the future. It will be interesting to see how the competitive landscape responds to these new advances in IBM's portfolio.
Data teams and developers continue to serve as the linchpin for businesses, helping them gain insight from growing data sets more rapidly and reliably. With improving data analytics for real-time business intelligence (BI) and customer insight consistently ranking among the business priorities driving significant technology spending, how are organizations enabling more end-users to actually leverage data? Skills gaps, collaboration, and accessibility have created several barriers to democratizing analytics across organizations, and pressure is being placed on data and software teams to make business intelligence easier to leverage and consume. But given today's dynamic business environment and constantly shifting priorities, the timeliness of delivery and the accessibility of simplified analytics are being scrutinized. Embedded analytics is increasingly becoming the answer.
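To illustrate what "embedded" means in practice, here is a purely hypothetical sketch in which an application serves a chart-ready metric from its own API rather than sending users to a separate BI tool. The endpoint, table, and column names are invented for illustration and do not refer to any particular product:

```python
# Hypothetical sketch of "embedded analytics": the application itself serves an
# aggregated metric that an in-app chart can render, instead of routing users
# to a standalone BI tool. Database, table, and column names are invented.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/orders/daily-revenue")
def daily_revenue():
    # Aggregate raw order data into a chart-ready time series.
    conn = sqlite3.connect("app.db")
    rows = conn.execute(
        "SELECT order_date, SUM(amount) FROM orders "
        "GROUP BY order_date ORDER BY order_date"
    ).fetchall()
    conn.close()
    return jsonify([{"date": d, "revenue": total} for d, total in rows])

if __name__ == "__main__":
    app.run()  # the application's front end embeds a chart that calls this endpoint
```

The design point is that the analytics live inside the workflow the end-user already occupies, which is what lowers the accessibility barrier described above.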
Can an IT systems vendor really transform into a cloud service provider? With everything we saw from HPE and HPE GreenLake at this year’s HPE Discover 2021, I think we are going to find out pretty soon. With this year’s announcements, it’s clear HPE understands that modern IT wants a cloud-like experience everywhere, and HPE means to deliver it.
ESG conducted a comprehensive online survey of information security and IT professionals from private- and public-sector organizations in North America (United States and Canada), Europe, Asia, Central/South America, and Africa between March 1, 2021 and April 7, 2021. To qualify for this survey, respondents were required to be information security and IT professionals from ISSA’s member list.
This Master Survey Results presentation focuses on the lives and experiences of cybersecurity professionals, including performance assessments of their cybersecurity leaders, as well as suggestions for what organizations can do to help cybersecurity professionals succeed.
Listen in as Bob Laliberte and I discuss these two important research topics and capture technology and business insights into the future of the data center.
Distributed Cloud is a connected ecosystem of cloud services, developer-ready infrastructure, and legacy IT that provides consistent operations for modern applications.
Digital Ecosystems: How carriers, telcos, MSPs, and CSPs are innovating and optimizing edge computing applications.
Key research questions will include:
Where are businesses recognizing the value of interconnection?
How are application architectures leveraging a distributed ecosystem?
What are the challenges and priorities in deploying cloud-native application architectures that are acting as a catalyst for services outside the data center?
Listen in as Bob Laliberte and I discuss these two important research topics:
Distributed Cloud is a connected ecosystem of cloud services, developer-ready infrastructure, and legacy IT that provides consistent operations for modern applications.
Observability from Code to Cloud: Optimizing cloud-native application architectures via AIOps-driven automation and FinOps cost optimization.
Key research topics will include:
Are IT operations teams and developers functioning congruently with modern application processes and architectures?
Where are businesses recognizing the benefits of automation coupled to observability and intelligence?
How are businesses building a cloud operating model?
Organizations today are digitally transforming their IT environments to become more agile and responsive to market and customer demands. While this transformation typically spans people, process, and technology, the technology piece underpins many of the new processes and ensures employees can collaborate and be productive.
While AI is still considered nascent, the impact it is having on organizations that embrace it early and often is profound. This is a key reason organizations continue placing bets on AI. Even as skills gaps remain when it comes to incorporating AI into the business, organizations simply cannot afford to wait to adopt the technology, as they risk being disrupted by competitors already using AI. With the rise of AI tools that simplify and automate several, if not all, aspects of the AI lifecycle, expect adoption of AI to keep accelerating for years to come.
Organizations continue to prioritize AI investments with a goal of achieving a more data-centric future. While business objectives point to several areas where AI can help improve businesses both internally and externally, time to value continues to be scrutinized as organizations make massive investments in people, processes, and technology in support of AI initiatives. Opportunities to reduce time to value continue to pave the way for AI technology vendors that can help simplify the adoption and use of AI technology to support a growing number of use cases throughout the business.
Though the cyclical AI lifecycle is riddled with complexity, the last mile of AI is proving to be the greatest challenge for organizations in their quest to leverage AI. Diverse and distributed application environments, the rate at which growing data sets change and create data drift, and the dynamic needs of the business all contribute to organizations' AI deployment challenges. Both new and mature businesses leveraging AI continue to prioritize opportunities to simplify the last mile of AI (deploying AI into production) with a goal of reducing the time it takes to get from trained model to production. This has paved the way for the emergence of technology that better enables businesses to deploy, track, manage, and iterate on a growing number of ML models in production environments.
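As one concrete illustration of that last mile, the sketch below shows how a team might flag data drift by comparing training and production feature distributions with a standard statistical test. The specific test, threshold, and toy data are illustrative assumptions, not any particular vendor's approach:

```python
# Minimal sketch of one "last mile" concern: detecting data drift between the
# data a model was trained on and the data it now sees in production.
# Uses a two-sample Kolmogorov-Smirnov test per feature; the threshold is illustrative.
import numpy as np
from scipy.stats import ks_2samp

def detect_drift(train: np.ndarray, live: np.ndarray, p_threshold: float = 0.01):
    """Flag features whose production distribution differs significantly
    from the training distribution."""
    drifted = []
    for i in range(train.shape[1]):
        stat, p_value = ks_2samp(train[:, i], live[:, i])
        if p_value < p_threshold:
            drifted.append((i, stat, p_value))
    return drifted

# Toy example: feature 1 shifts in production, feature 0 does not.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(5000, 2))
live = np.column_stack([rng.normal(0.0, 1.0, 2000),
                        rng.normal(0.8, 1.0, 2000)])
print(detect_drift(train, live))   # typically flags feature index 1
```

Checks like this are the kind of tracking and iteration work that model-deployment tooling aims to automate once many models are running in production.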