Our seasoned analysts couple their industry-leading B2B research with in-depth buyer intent data for unparalleled insights about critical technology markets.
Clients trust us across their GTMs—from strategy and product development to competitive insights and content creation—because we deliver high-quality, actionable support.
Browse our extensive library of research reports, research-based content, and blogs for actionable data and expert analysis of the latest B2B technology trends, market dynamics, and business opportunities.
The world is complex, and it seems to grow more complex by the day. No industry understands complexity better than healthcare. Rules, compliance, and regulation are only the tip of the iceberg when it comes to day-to-day challenges. Demands on healthcare systems around the world continue to grow across the industry, whether at large hospitals, remote satellite sites, or a combination of both. The ability to pass patient data from location to location can often mean the difference in saving lives.
It’s noisy out there. Hundreds of TV shows that your friends insist you “HAVE to watch,” a concerning number of crime-related podcasts (seriously, why are there so many?), and seemingly non-stop communication with one another – text, email, social media, carrier pigeon… There are only so many hours in the day, and a lot of things fighting for your time and attention. It can be difficult to cut through the noise and see clearly what should be prioritized, both in your personal life and at work. The cybersecurity space is noisy, too.
Millions of events happen behind the scenes of an organization every day, and security teams need to identify and research the most important cyber threats at any given time. But which threats are the most serious? And where are they all coming from?
There is a plethora of cybersecurity tools designed to cut through this noise and address these cyber threats. ESG (Enterprise Strategy Group) has validated the economic value of several of these solutions. The ESG Economic Validation process considers market and industry analysis, quantitative and qualitative research, customer proof points, and more before drawing conclusions about a solution.
Here are a few ESG Economic Validations of cybersecurity solutions that help organizations to cut through the noise and focus on what’s most important: protecting their environments.
Google Chronicle Security Analytics: ESG validated significant economic savings for organizations that leverage Chronicle’s pricing model and Google Cloud Platform’s economies of scale.
Gigamon Network Visibility: ESG validated Gigamon Visibility and Analytics Fabric and discovered favorable results in the areas of digital transformation, reduction in tooling costs, reduction in the time needed to analyze traffic for security, and reduction in complexity.
Trend Micro XDR: ESG evaluated Trend Micro Vision One with XDR for security effectiveness, business enablement, and cost reduction, and validated that it makes it easier for organizations to identify which threats are most concerning so that they can allocate their resources and focus accordingly.
Anomali Threat Intelligence Platform: ESG completed a quantitative economic validation and modeled analysis of the Anomali suite of products and found that it lowers the operational cost of SecOps, improves security effectiveness, reduces risk to the organization, and improves SecOps productivity and satisfaction.
Enterprise Strategy Group is an IT analyst, research, validation, and strategy firm that gives the global IT community access to market intelligence and actionable insight. The Validation Team creates assets, such as Economic and Technical Validation Reports, videos, webinars and more, that help to communicate the technological and economic value of IT products.
ESG conducted a comprehensive online survey of IT professionals from private- and public-sector organizations in North America (United States and Canada) between May 21, 2021 and June 3, 2021. To qualify for this survey, respondents were required to be professionals familiar with their organization’s entire network environment.
This Master Survey Results presentation focuses on the impact of modern, distributed cloud environments on network infrastructure and strategies, spanning data centers, campus, and branch/edge locations.
The cybersecurity skills shortage continues with no end in sight, but collaborative research between Enterprise Strategy Group and ISSA suggests that organizations could and should be doing more to address it.
See the data behind these trends and more with this infographic.
Over the last few years, there has been a disconnect: organizations rely on their SaaS vendors to protect, back up, and recover their data when they should actually be performing their own backups. Younger organizations have more faith in their SaaS providers, while older organizations have the experience to back up and protect their own data. Some of the bigger names in the as-a-service industry are mission-critical to countless organizations, but they still require organizations to put their own data protection strategies in place.
Business intelligence is the key to data-driven success, according to new Enterprise Strategy Group research, with organizations on the leading edge of data analytics usage revealing that they enjoy a variety of competitive advantages.
See the data behind these trends and more with this infographic, The Path to Data Leadership: Embracing Business Intelligence to Achieve Data-driven Success.
Data is the fuel of the business, creating new value and offering greater insights. A data lake environment plays a foundational role in helping extract value from data, and artificial intelligence initiatives are no exception. As the demand for faster access to data increases, massive scalability is no longer enough for modern data lake environments, so these environments increasingly leverage flash-level performance as well.
The recent and massive uptick in AI investments has been driven by the fact that these projects are, by and large, successful. This success often fuels an increase in the number of AI objectives, which places greater demands on IT and the underlying infrastructure. To ensure continued success with AI, the right infrastructure must be in place to consolidate data storage and accelerate its usage across the entire data pipeline.
In early 2021, the Enterprise Strategy Group and the Information Systems Security Association (ISSA) conducted the fifth annual research project focused on the lives and experiences of cybersecurity professionals. This year’s report is based on data from a global survey of 489 cybersecurity professionals.
The cybersecurity skills gap discussion has been going on for over 10 years, and the data gathered for this project confirms that there has been no significant progress toward a solution during the five years it has been closely researched. The skills crisis has impacted over half (57%) of organizations. The top ramifications of the skills shortage include an increasing workload (62%), unfilled open job requisitions (38%), and high burnout among staff (38%). Further, 95% of respondents state that the cybersecurity skills shortage and its associated impacts have not improved over the past few years, while 44% say it has only gotten worse.
IT organizations are continuously looking for new ways to drive operational and cost efficiencies, and the data center is no exception. Data center staff want to take advantage of the benefits of a subscription-based cloud consumption model but often don’t have the ability or budget to move their data or compute resources to the cloud. As a result, IT leaders are looking for ways to leverage resource consumption in a pay-as-you-go cloud model but within an on-premises infrastructure.
ESG research backs this up, with 48% of IT leaders saying they would prefer to pay for on-premises data center infrastructure through a consumption-based model.
48% of organizations say they would prefer to pay for infrastructure via a consumption-based model such as a variable monthly subscription based on hardware utilization, which is up from 42% last year.
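To make this concrete, here is a rough, hypothetical sketch of how a consumption-based, on-premises storage arrangement is often structured: a committed baseline capacity billed at a fixed rate, with metered usage above that baseline billed at an on-demand rate. The function name, capacities, and rates below are made up for illustration and do not reflect any vendor's actual pricing.

```python
# Hypothetical consumption-based billing sketch (illustrative rates only):
# a committed baseline capacity is billed every month, and any capacity
# consumed above that baseline is metered at a higher on-demand rate.

def monthly_bill(used_tib: float,
                 baseline_tib: float = 100.0,
                 baseline_rate: float = 20.0,   # $/TiB for committed capacity
                 overage_rate: float = 30.0) -> float:  # $/TiB above baseline
    """Return the month's charge given the capacity actually consumed."""
    committed = baseline_tib * baseline_rate
    overage = max(0.0, used_tib - baseline_tib) * overage_rate
    return committed + overage

print(monthly_bill(80))    # under the baseline: pay the commitment -> 2000.0
print(monthly_bill(140))   # 40 TiB over the baseline -> 2000 + 1200 = 3200.0
```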
IBM was paying attention to these market trends, and in the third quarter of 2021, it introduced new ways to expand storage consumption options with IBM storage-as-a-service offerings.
IBM offering storage as a consumption model
In a recent briefing covering IBM’s storage launch, Eric Herzog stated, “IBM storage-as-a-service offers a lower price than the public cloud.” I like this pay-as-you-go consumption model compared to a forced, all-at-once move to the public cloud. In the IBM storage-as-a-service model, organizations are able to move to the cloud yet keep some storage on-premises using the same subscription financial model. The beauty of this approach is that it offers a hybrid cloud solution with a concierge service that continually health-checks the client’s systems.
It is an interesting approach and shows that IBM is keeping up with, and responding to, storage industry trends in how customers are looking to purchase IT resources.
Not surprisingly, IBM is also responding to the future promise of quantum computing.
Quantum computing is not a myth!
In a different briefing this week, IBM’s Chief Quantum Exponent, Robert Sutor, explained that “…quantum computers will solve some problems that are completely impractical for classical computers…” The conversation with Robert was impressive. He walked me through IBM’s history in the quantum computing space and said, “There are differences when describing quantum computing.” Sutor offers a lot of depth on this topic in his book, Dancing with Qubits: How quantum computing works and how it can change the world.
The IBM Institute for Business Value (IBV) has been deeply engaged with quantum research for many years. In June of this year, IBM announced advancements with its Quantum System One, delivering its first quantum computer system outside of the US to Europe’s largest application-oriented research organization, Fraunhofer-Gesellschaft. A second system is being delivered to the APAC region soon. IBM, as usual, is at the forefront of innovation and driving paradigm shifts.
Many people, even tech-savvy ones, do not fully understand the potential of quantum computing’s processing power. IBM’s diagram illustrates the gap between today’s classical computing and how quantum computing will enhance hybrid cloud and the overall heterogeneous computational fabric. These cloud-based, open-source development environments will make using quantum computers “frictionless,” according to IBM.
The number of qubits (or quantum bits) in a computer makes a big difference in computational performance, according to Robert: each added qubit doubles the size of the state space the machine can work with. By 2023, IBM will expand its current quantum computer from 65 qubits to over 1,000 qubits. It is an ambitious goal and exciting at the same time. It is not quite like leaping through space and time like Sam Beckett did in the TV show Quantum Leap, but IBM’s future focus in this area is impressive, nonetheless.
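To put that growth in perspective, here is a minimal back-of-the-envelope sketch, my own illustration rather than anything from IBM, of why such machines quickly become impractical to simulate classically: a full state-vector simulation has to hold 2^n complex amplitudes for n qubits, so memory demand doubles with every qubit added.

```python
import math

# Back-of-the-envelope estimate: a classical simulator storing a full
# n-qubit state vector needs 2**n complex amplitudes at roughly 16 bytes
# each (complex128). Memory therefore doubles with every additional qubit.

def state_vector_bytes_log10(n_qubits: int) -> float:
    """log10 of the bytes needed to hold a full n-qubit state vector."""
    return n_qubits * math.log10(2) + math.log10(16)

for n in (30, 65, 1000):
    print(f"{n:>4} qubits -> ~10^{state_vector_bytes_log10(n):.0f} bytes")
```

At 65 qubits that is already on the order of 10^21 bytes, hundreds of exabytes, which is why Robert’s “completely impractical for classical computers” point holds even before IBM reaches 1,000 qubits.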
The next phase in quantum computing? Beyond continued technological evolution, IBM should evangelize quantum computing’s benefits more broadly so that organizations are well positioned to take advantage of all it has to offer. Hopefully, this time next year, I’ll speak with IBM again and see the beginning of more widespread adoption of quantum computing.
IBM seems to be doing the right things
I like the direction that IBM is taking with its storage roadmap. The performance improvements to the IBM DS8900F analytics machine and the IBM TS7770 all-flash VTL improve AI responsiveness and resiliency, which seems to provide an advantage over other systems in the same class. We were also briefed on the importance of IBM’s strategy with FlashSystem Safeguarded Copy and how it improves data resilience by providing immutable point-in-time copies of production data, logically air-gapped offline data isolation, and separation of administrative privileges.
Paul’s POV
In my opinion, IBM is listening to customers. Its roadmap response in storage as well as quantum is aggressive and will keep it competitive. Customers will ultimately decide for themselves whether these are the right approaches, but from my vantage point, IBM is checking the right boxes to address many common business pain points, both those felt right now and those expected in the future. It will be interesting to see how the competitive landscape responds to these new advances in IBM’s portfolio.