Our seasoned analysts couple their industry-leading B2B research with in-depth buyer intent data for unparalleled insights about critical technology markets.
Clients trust us across their GTMs—from strategy and product development to competitive insights and content creation—because we deliver high-quality, actionable support.
Browse our extensive library of research reports, research-based content, and blogs for actionable data and expert analysis of the latest B2B technology trends, market dynamics, and business opportunities.
Data is the fuel of the business, creating new value and offering greater insight. A data lake environment plays a foundational role in extracting value from data, and artificial intelligence initiatives are no exception. As demand for faster access to data increases, massive scalability alone is no longer enough for modern data lake environments, so these environments increasingly leverage flash-level performance as well.
The recent and massive uptick in AI investments has been driven by the fact that these projects are, by and large, successful. This success often fuels an increase in the number of AI objectives, which places greater demands on IT and the underlying infrastructure. To ensure continued success with AI, the right infrastructure must be in place to consolidate data storage and accelerate its usage across the entire data pipeline.
IT organizations are continuously looking for new ways to drive operational and cost efficiencies, and the data center is no exception. Data center staff want to take advantage of the benefits of a subscription-based cloud consumption model, but often don't have the ability or budget to move their data or compute resources to the cloud. As a result, IT leaders are looking for ways to leverage resource consumption in a pay-as-you-go cloud model but within an on-premises infrastructure.
ESG research backs this up: 48% of organizations say they would prefer to pay for on-premises data center infrastructure through a consumption-based model, such as a variable monthly subscription based on hardware utilization, up from 42% last year.1
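To make that model concrete, here is a minimal sketch of how a variable monthly subscription based on hardware utilization might be computed. The rates, capacities, and function names are hypothetical, purely for illustration; actual vendor pricing structures vary.

```python
# Hypothetical illustration of a pay-as-you-go storage bill:
# a committed base capacity at a reserved rate, plus on-demand
# usage above the base billed at a higher variable rate.
# All rates and capacities are made-up numbers for illustration only.

BASE_CAPACITY_TB = 100         # committed minimum, billed every month
BASE_RATE_PER_TB = 20.00       # $/TB/month for committed capacity
ON_DEMAND_RATE_PER_TB = 28.00  # $/TB/month for usage above the base

def monthly_bill(used_tb: float) -> float:
    """Return the month's charge for the measured utilization."""
    base_charge = BASE_CAPACITY_TB * BASE_RATE_PER_TB
    overage_tb = max(0.0, used_tb - BASE_CAPACITY_TB)
    return base_charge + overage_tb * ON_DEMAND_RATE_PER_TB

for used in (80, 100, 140):
    print(f"{used} TB used -> ${monthly_bill(used):,.2f}")
```

The point of the structure is that the customer pays a predictable floor for committed capacity and only pays variable charges when utilization actually rises above it.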
IBM has been paying attention to these market trends, and in the third quarter of 2021 it introduced new ways to expand storage consumption options with IBM storage-as-a-service offerings.
IBM offering storage as a consumption model
In a recent briefing covering IBM's storage launch, Eric Herzog stated, "IBM storage-as-a-service offers a lower price than the public cloud." I like this pay-as-you-go consumption model better than a forced migration in which everything moves straight to the cloud: with IBM storage-as-a-service, organizations can move to the cloud yet keep some storage on-premises under the same subscription financial model. The beauty of this approach is that it offers a hybrid cloud solution with a concierge service that continually health-checks the client's systems.
It is an interesting approach, and it shows that IBM is keeping pace with storage industry trends in how customers want to purchase IT resources.
Not surprisingly, IBM is also responding to the future promise of quantum computing.
Quantum computing is not a myth!
In a different briefing this week, IBM's Chief Quantum Exponent Robert Sutor explained that "…quantum computers will solve some problems that are completely impractical for classical computers…" The conversation with Robert was impressive. He walked me through IBM's history in the quantum computing space and said, "There are differences when describing quantum computing." Sutor offers much more depth on this topic in his book, Dancing with Qubits: How quantum computing works and how it can change the world.
The IBM Institute for Business Value (IBV) has been deeply engaged with quantum research for many years. In June of this year, IBM announced advancements with its Quantum System One, delivering its first quantum computer system outside of the US to Europe's largest application-oriented research organization, Fraunhofer-Gesellschaft. A second system is being delivered to the APAC region soon. IBM, as usual, is at the forefront of innovation and driving paradigm shifts.
Many people, even tech-savvy ones, do not fully understand the potential of quantum computing processing power. A diagram IBM shared illustrates the gap between today's classical computing and the way quantum computing will enhance hybrid cloud and the overall heterogeneous computational fabric. These cloud-based, open-source development environments will make using quantum computers "frictionless," according to IBM.
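IBM's open-source Qiskit framework is the best-known example of such an environment (Qiskit is not named in the excerpt above, so treat this as my illustration). A minimal sketch shows how little code is involved in describing a two-qubit entangled circuit:

```python
# A minimal Bell-state circuit in Qiskit, IBM's open-source quantum SDK.
# Building a circuit is ordinary Python; the same code can target a
# simulator or, via IBM's cloud services, real quantum hardware.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)   # two qubits, two classical bits
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])  # measure both qubits

print(qc.draw())            # ASCII diagram of the circuit
```

That low barrier to entry is exactly what "frictionless" access to quantum hardware looks like in practice.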
The number of qubits (or quantum bits) in a computer makes a big difference in computational performance, according to Robert. By 2023, IBM plans to expand its current quantum computer from 65 qubits to over 1,000 qubits. It is an ambitious goal and exciting at the same time. It is not quite like leaping through space and time like Sam Beckett did in the TV show Quantum Leap, but IBM's future focus in this area is impressive, nonetheless.
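As a back-of-the-envelope illustration of why qubit count matters (my own arithmetic, not from the briefing): an n-qubit state is described by 2^n complex amplitudes, so the classical description doubles with every added qubit.

```python
# Back-of-the-envelope: an n-qubit state has 2**n complex amplitudes,
# so each added qubit doubles what a classical simulator must track.
for n in (65, 1000):
    amplitudes = 2 ** n
    print(f"{n} qubits -> 2**{n} amplitudes "
          f"(~{len(str(amplitudes))} decimal digits)")
```

At 65 qubits that is roughly a 20-digit number of amplitudes; at 1,000 qubits it is a number with about 302 digits, far beyond anything a classical machine could ever enumerate.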
The next phase in quantum computing? Beyond its continued technological evolution, IBM should evangelize quantum's benefits more broadly so that organizations are well positioned to take advantage of all it has to offer. Hopefully, this time next year, I'll speak with IBM again and see the beginnings of more widespread adoption of quantum computing.
IBM seems to be doing the right things
I like the direction IBM is taking with its storage roadmap. The performance improvements to the IBM DS8900F analytics machine and the IBM TS7770 all-flash VTL boost AI responsiveness and resiliency, which seems to give them an advantage over other systems in their class. We were also briefed on the importance of IBM's FlashSystem Safeguarded Copy strategy and how it improves data resilience by providing immutable point-in-time copies of production data, isolated logical air-gapped offline data, and separate privileges for admins.
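As a conceptual sketch only (this is an illustration of the idea, not IBM's actual interface), the Safeguarded Copy approach can be modeled as time-locked, immutable point-in-time copies whose deletion requires both a separate security-admin role and an expired retention lock:

```python
# Conceptual model of safeguarded, immutable point-in-time copies.
# Purely illustrative; not IBM's API or command set.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)          # frozen: the record itself is immutable
class SafeguardedCopy:
    volume: str
    taken_at: datetime
    locked_until: datetime       # cannot be deleted before this time

class SafeguardManager:
    def __init__(self, retention_days: int):
        self.retention = timedelta(days=retention_days)
        self._copies: list[SafeguardedCopy] = []

    def take_copy(self, volume: str) -> SafeguardedCopy:
        now = datetime.utcnow()
        copy = SafeguardedCopy(volume, now, now + self.retention)
        self._copies.append(copy)
        return copy

    def delete_copy(self, copy: SafeguardedCopy, role: str) -> None:
        # Separation of duties: only a dedicated security admin may
        # delete, and even then only after the retention lock expires.
        if role != "security_admin":
            raise PermissionError("storage admins cannot delete safeguarded copies")
        if datetime.utcnow() < copy.locked_until:
            raise PermissionError("copy is still under its retention lock")
        self._copies.remove(copy)
```

The design point is that neither a compromised storage-admin account nor ransomware running in production can alter or remove a copy before its lock expires.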
Paul’s POV
In my opinion, IBM is listening to customers. Its roadmap response in both storage and quantum is aggressive and will keep the company competitive. Customers will ultimately decide for themselves whether these are the right approaches, but from my vantage point, IBM is checking the right boxes to address many common business pain points, both current and expected. It will be interesting to see how the competitive landscape responds to these advances in IBM's portfolio.
Can an IT systems vendor really transform into a cloud service provider? With everything we saw from HPE and HPE GreenLake at this year’s HPE Discover 2021, I think we are going to find out pretty soon. With this year’s announcements, it’s clear HPE understands that modern IT wants a cloud-like experience everywhere, and HPE means to deliver it.
June is proving to be a busy month in the tech world. Cisco is no exception, having announced a new launch within its portfolio. I had the opportunity to attend Cisco's Future Cloud event, and here is my summary of Cisco's future direction. The company focused on innovation in these main areas:
At this year’s Pure Accelerate we were offered a glimpse into the future of data storage. Contrary to what you might expect, the event didn’t center on some new array or a new version of flash. Rather, the focus was on how Pure is transforming the way it delivers storage technology, how its storage supports modern application environments, and how its customers consume and manage its storage.
Recently, I had the pleasure of attending an analyst briefing with Ben Bolles, Vice President of Products, and Mike Koponen, Senior Director, Product Marketing and Strategic Alliances, at Pivot3, discussing the company's latest launch. It was interesting to see the advances in the hyperconverged infrastructure space and the differentiation Pivot3 is bringing to market.
When we look at Pivot3's core areas, it is clear the company is competing in a large market while keeping its focus on video surveillance. This is a market segment that is clearly gaining adoption, not only at the data center core but also at the edge and in the cloud.
IBM Think 2021 highlighted a number of areas IT leaders may want to consider for new and existing initiatives and infrastructure deployments. This year's event amplified the importance of growth in artificial intelligence as it relates to IBM Cloud Pak, digital transformation and digital business models, IBM Watson Orchestrate self-service automation, public and hybrid cloud, and application and infrastructure modernization, to name a few. Our research at Enterprise Strategy Group aligns nicely with the messages IBM delivered this year. Listen as Mark Peters, Practice Director; Scott Sinclair, Senior Analyst; Mike Leone, Senior Analyst; and I discuss and share our thoughts on this year's IBM Think 2021 event.
2021 is poised to be a transformational year for Dell Technologies, with Dell Technologies APEX playing the lead role. At this year's Dell Technologies World, Dell announced the availability of what is expected to be just the start of a broad portfolio of managed IT services. Combined with the announced spin-off of both VMware and Boomi, Dell Technologies is finding its focus, one I expect will lead to benefits for both Dell and its customers, though there is still much to be done.
KubeCon + CloudNativeCon Europe 2021 demonstrated that containers and Kubernetes are growing up, and fast. The adoption of Kubernetes and its overall business impact, along with the way the growth of applications and cloud adoption expands beyond the developer, was a key message at the event. Kubernetes is ready and is expanding its presence in the overall business ecosystem, with deployments both in the cloud and at the edge.
HPE made a big announcement this week on the data storage front, introducing Unified DataOps. At a high level, I am a big supporter of innovation along the lines of what HPE is doing—namely, providing technologies that fuel the transition from buying systems to realizing positive business outcomes. That’s because more and more, I see business leaders trying to achieve two seemingly conflicting goals: