Our seasoned analysts couple their industry-leading B2B research with in-depth buyer intent data for unparalleled insights about critical technology markets.
Clients trust us across their GTMs—from strategy and product development to competitive insights and content creation—because we deliver high-quality, actionable support.
Browse our extensive library of research reports, research-based content, and blogs for actionable data and expert analysis of the latest B2B technology trends, market dynamics, and business opportunities.
This year’s KubeCon + CloudNativeCon did not disappoint! As a hybrid event, it drew both in-person and virtual attendees, and the event coordinators did a great job of weaving the two audiences together seamlessly. At this year’s North America event, the coverage areas focused on application definition/development, orchestration/management, runtime, and provisioning. There was also a “special” category for offerings supporting cloud-native technology that did not fit into the other groupings.
The powerhouse VMware is doing it again—leading the way in the industry with innovation focused on the next generation of multi-cloud offerings. The question is: Will it be enough to differentiate VMware from the pack?
Let’s take a closer look. VMware entered the market as a virtualization leader and grew in strength throughout the early 2000s. As the market evolved, so did VMware. The company expanded into the next phase of the market as a leader in the private cloud space. The established footprint of the virtualization business provided a natural transition into the private cloud arena, but as the industry continues to evolve to multi-cloud, where does that leave VMware? VMware’s next chapter will be key to maintaining its growth as the established industry leader.
VMware’s Chapter 3: The Next Endeavor
At VMworld 2021, VMware announced its focus on several new areas with the intent to simplify multi-cloud operations. To no one’s surprise, multi-cloud adoption is taking off. According to ESG’s 2021 Data Infrastructure Trends research survey, 48% of respondents indicated their organizations have a cloud-first approach. Management of heritage and cloud-native applications, as well as microservices, is a large part of success for IT organizations. Like most vendors, VMware is taking on multi-cloud and application management services as its next area of evolution.
Multi-cloud support spans several areas. Application platforms; cloud management, security, and networking; and digital workspaces are key cross-cloud service areas where VMware is focusing its cloud support. With its recent announcement, it is clear that VMware is emphasizing cloud infrastructure, demonstrating the company’s intention to amplify support for services that build, run, and secure applications across any cloud.
VMware is announcing several new areas to support these initiatives, including:
Project Arctic – Focuses on capacity on demand, deploying VMware cross-cloud services rapidly, and growing multi-cloud support without disruption.
Project Cascade – Enables open multi-cloud consumption by leveraging Kubernetes, empowering developers, and allowing DevOps teams to consume infrastructure through both virtual machines and containers. This lets IT abstract resource pools across clouds and avoid lock-in by leveraging open platforms.
VMware Project Cascade
Project Capitola – Addresses the growing memory needs of applications. It provides cost-effective scale for memory tiers, improves memory resiliency, and simplifies memory infrastructure operations.
Project Ensemble – Provides a unified view across vRealize cloud management, giving users a single place to provision and configure resources.
Data protection is a key part of the multi-cloud strategy. VMware announced hybrid cloud disaster recovery, which includes a 30-minute recovery point objective (RPO), accelerated ransomware recovery, and integrated data protection for cloud virtual machines. VMware also announced continued investment at the edge, with enhancements and introductions to VMware Edge, VMware SASE, and the VMware Cloud platform.
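To put that 30-minute RPO in context, here is a minimal back-of-the-envelope sketch of what a recovery point objective implies for worst-case data loss. The 12 GB/hour change rate is a hypothetical figure chosen for illustration, not a VMware specification:

```python
# Back-of-the-envelope: how much data is at risk under a given RPO.
# The 12 GB/hour change rate is a hypothetical example, not a VMware figure.

def worst_case_loss_gb(rpo_minutes: float, change_rate_gb_per_hr: float) -> float:
    """Data written (and therefore lost) in the window between recovery points."""
    return change_rate_gb_per_hr * (rpo_minutes / 60.0)

# A workload writing 12 GB/hour, protected with a 30-minute RPO,
# can lose at most ~6 GB of data in a failover.
print(f"Worst-case loss: {worst_case_loss_gb(30, 12):.1f} GB")
```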
What’s New With VMware?
VMware announced several enhancements and improvements to vSAN and vSphere as part of this launch. vSAN 7 Update 3 builds on VMware’s simple, reliable, and future-ready messaging with enhancements that provide a developer-ready infrastructure, simplify operations, and add further platform capabilities in support of that vision.
VMware Increased Availability for Cloud-native Applications
With vSphere 7 Update 3, VMware adds improved AI and developer-ready infrastructure delivery, better scalability, and simplified operations, all in service of its vision to manage, deploy, and deliver with ease.
Paul’s POV
VMware is an industry leader. There is no doubt that VMware’s past success will help accelerate its future vision, but VMware has its work cut out for it. The future direction VMware laid out is where the industry is going. In fact, 92% of respondents to ESG’s 2021 Data Infrastructure Trends survey leverage public cloud infrastructure services. However, VMware is not the only vendor looking to achieve this goal; it is one of many in the market trying to “crack the multi-cloud code.” This is a very different market from the early 2000s, when VMware seized the virtualization opportunity. VMware will need to grow and align partnerships in this space to fulfill its vision. Those partnerships will not only accelerate its go-to-market strategy and drive adoption but will also reduce the competition it faces.
ESG conducted a comprehensive online survey of IT and data storage professionals from private- and public-sector organizations in North America (United States and Canada) between June 22, 2021 and June 30, 2021. To qualify for this survey, respondents were required to be professionals responsible for evaluating, purchasing, and managing data storage technology for their organization.
This Master Survey Results presentation focuses on investigating storage trends for both on- and off-premises technology environments, including challenges, opportunities, and evolving strategies for both.
Today’s world is complex, and that complexity seems to grow by the day. No industry understands complexity better than healthcare. Rules, compliance, and regulation are only the tip of the iceberg when it comes to day-to-day challenges. Demands on healthcare systems continue to fluctuate and grow across the industry, whether at large hospitals, remote satellite sites, or a combination of both. The ability to pass patient data from location to location can often mean the difference in saving lives.
Data is the fuel of the business, creating new value and offering greater insights. A data lake environment plays a foundational role in helping extract value from data, and artificial intelligence initiatives are no exception. As the demand for faster access to data increases, massive scalability is no longer enough for modern data lake environments, so these environments increasingly leverage flash-level performance as well.
The recent and massive uptick in AI investments has been driven by the fact that these projects are, by and large, successful. This success often fuels an increase in the number of AI objectives, which places greater demands on IT and the underlying infrastructure. To ensure continued success with AI, the right infrastructure must be in place to consolidate data storage and accelerate its usage across the entire data pipeline.
IT organizations are continuously looking for new ways to drive operational and cost efficiencies and the data center is no exception. Data center staff want to take advantage of the benefits of a subscription-based cloud consumption model, but often don’t have the ability or budget to move their data or compute resources to the cloud. As a result, IT leaders are looking for ways to leverage resource consumption in a pay-as-you-go cloud model but within an on-premises infrastructure.
ESG research backs this up, with 48% of IT leaders saying they would prefer to pay for on-premises data center infrastructure through a consumption-based model.1
48% of organizations say they would prefer to pay for infrastructure via a consumption-based model such as a variable monthly subscription based on hardware utilization, which is up from 42% last year.
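For readers unfamiliar with how a “variable monthly subscription based on hardware utilization” is typically structured, here is a minimal sketch of a consumption-based bill: a committed baseline plus a premium rate on burst usage above it. All rates and capacities below are hypothetical assumptions for illustration, not any vendor’s actual pricing:

```python
# Minimal sketch of a consumption-based (pay-as-you-go) storage bill.
# The committed baseline, rates, and usage numbers below are hypothetical.

BASE_COMMIT_TB = 100          # capacity the customer commits to each month
BASE_RATE_PER_TB = 20.0       # $ per TB-month for committed capacity
BURST_RATE_PER_TB = 28.0      # $ per TB-month for usage above the commitment

def monthly_bill(used_tb: float) -> float:
    """Charge the committed baseline, plus a premium rate on burst usage."""
    burst_tb = max(0.0, used_tb - BASE_COMMIT_TB)
    return BASE_COMMIT_TB * BASE_RATE_PER_TB + burst_tb * BURST_RATE_PER_TB

if __name__ == "__main__":
    for used in (80, 100, 130):   # TB actually consumed in the month
        print(f"{used:>3} TB used -> ${monthly_bill(used):,.2f}")
```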
IBM was paying attention to these market trends, and in the third quarter of 2021 it introduced new ways to expand storage consumption options with IBM storage-as-a-service offerings.
IBM offering storage as a consumption model
In a recent briefing covering IBM’s storage launch, Eric Herzog stated, “IBM storage-as-a-service offers a lower price than the public cloud.” I like this pay-as-you-go consumption model better than a forced all-in move to the cloud. In the IBM storage-as-a-service model, organizations can move to the cloud yet keep some storage on-premises under the same subscription financial model. The beauty of this approach is that it offers a hybrid cloud solution with a concierge service that continually health-checks the client’s systems.
It is an interesting approach and shows that IBM is keeping up with, and responding to, storage industry trends in how customers want to purchase IT resources.
Not surprisingly, IBM is also responding to the future promise of quantum computing.
Quantum computing is not a myth!
In a different briefing this week, IBM’s Chief Quantum Exponent Robert Sutor explained that “…quantum computers will solve some problems that are completely impractical for classical computers…” The conversation with Robert was impressive. He walked me through IBM’s history in the quantum computing space and noted, “There are differences when describing quantum computing.” Sutor offers a lot of depth on this topic in his book, Dancing with Qubits: How quantum computing works and how it can change the world.
The IBM Institute for Business Value (IBV) has been deeply engaged with quantum research for many years. In June of this year, IBM announced advancements with its Quantum System One, delivering its first quantum computer system outside of the US to Europe’s largest application-oriented research organization, Fraunhofer-Gesellschaft. A second system is being delivered to the APAC region soon. IBM, as usual, is at the forefront of innovation and driving paradigm shifts.
Many people, even tech-savvy ones, do not fully understand the potential of quantum computing’s processing power. This diagram illustrates the “gap” between today’s classical computing and how quantum computing will enhance hybrid cloud and the overall heterogeneous computational fabric. These cloud-based, open-source development environments will make using quantum computers “frictionless,” according to IBM.
The number of qubits (or quantum bits) in a computer makes a big difference in computational performance, according to Robert. By 2023, IBM plans to expand its current 65-qubit quantum computer to over 1,000 qubits. It is an ambitious goal and exciting at the same time. It is not quite like leaping through space and time like Sam Beckett did in the TV show Quantum Leap, but IBM’s future focus in this area is impressive nonetheless.
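To see why the jump from 65 to over 1,000 qubits is such a big deal, recall that an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the state space a classical simulator would have to track. Here is a quick sketch of that arithmetic (my illustration, not an IBM benchmark):

```python
# Why qubit count matters: an n-qubit register is described by 2**n complex
# amplitudes, so each additional qubit doubles the state space a classical
# simulator would have to track.

for qubits in (10, 65, 1000):
    amplitudes = 2 ** qubits
    print(f"{qubits:>4} qubits -> 2^{qubits} "
          f"({len(str(amplitudes))}-digit number of amplitudes)")
```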
The next phase in quantum computing? Beyond its continued technological evolution, IBM should evangelize its benefits more broadly so that organizations are well positioned to take advantage of all it has to offer. Hopefully, this time next year, I’ll speak with IBM again and see the beginnings of more widespread adoption of quantum computing.
IBM seems to be doing the right things
I like the direction IBM is taking with its storage roadmap. The performance improvements to the IBM DS8900F analytics machine and the IBM TS7770 all-flash VTL improve AI responsiveness and resiliency, which seems to provide an advantage over other systems in the same class. We were also briefed on the importance of IBM’s strategy with FlashSystem Safeguarded Copy and how it improves data resilience by providing immutable point-in-time copies of production data, logically air-gapped offline data, and separate privileges for admins.
Paul’s POV
In my opinion, IBM is listening to customers. Its roadmap response in storage as well as quantum is aggressive yet will keep it competitive. Customers will ultimately decide for themselves whether these are the right approaches, but from my vantage point, IBM is checking the right boxes to address many common business pain points, both today and those expected in the future. It will be interesting to see how the competitive landscape responds to these new advances in IBM’s portfolio.
Can an IT systems vendor really transform into a cloud service provider? With everything we saw from HPE and HPE GreenLake at this year’s HPE Discover 2021, I think we are going to find out pretty soon. With this year’s announcements, it’s clear HPE understands that modern IT wants a cloud-like experience everywhere, and HPE means to deliver it.
June is proving to be a busy month in the tech world. Cisco is no exception and has announced a new launch within its portfolio. I had the opportunity to attend Cisco’s Future Cloud event, and here is my summary of Cisco’s future direction and the main areas of innovation it emphasized.
At this year’s Pure Accelerate we were offered a glimpse into the future of data storage. Contrary to what you might expect, the event didn’t center on some new array or a new version of flash. Rather, the focus was on how Pure is transforming the way it delivers storage technology, how its storage supports modern application environments, and how its customers consume and manage its storage.
Recently, I had the pleasure of attending an analyst briefing about Pivot3’s latest launch with Ben Bolles, Vice President of Products, and Mike Koponen, Senior Director of Product Marketing and Strategic Alliances. It was interesting to see the advances in the hyperconverged infrastructure space and the differentiation Pivot3 is bringing to market.
When we look at Pivot3’s core areas, it is clear the company is competing in a large market, yet it keeps its focus on video surveillance. This market segment is clearly gaining adoption, not only at the data center core but also at the edge and in the cloud.