Our seasoned analysts couple their industry-leading B2B research with in-depth buyer intent data for unparalleled insights about critical technology markets.
Clients trust us across their GTMs—from strategy and product development to competitive insights and content creation—because we deliver high-quality, actionable support.
Browse our extensive library of research reports, research-based content, and blogs for actionable data and expert analysis of the latest B2B technology trends, market dynamics, and business opportunities.
Whip it good! Old friend Dimitri Vlachos from Devo stopped by the ESG video studio to kick off our 2019 SOAPA video series. If you are unfamiliar with Devo, the company describes itself as follows:
Devo delivers real-time operational and business insights from analytics on streaming and historical data to operations, IT, security, and business teams at the world’s largest organizations.
As organizations struggle with the complexity and sheer number of security tools in use, the dream of an integrated platform is understandably appealing. Surely life would be less complex with fewer tools to manage, systems designed and built to work together, and fewer vendors to deal with. But an integrated platform brings new challenges and tradeoffs of its own, and these require some planning and effort.
Over the past few years, Enterprise Strategy Group has promoted the security operations and analytics platform architecture (SOAPA). Just what is SOAPA? A multi-layered, heterogeneous architecture designed to integrate disparate security analytics and operations tools. This architecture glues incongruent security analytics tools together to improve threat detection, and then tightly couples security analytics with operations tools to accelerate and automate risk mitigation and incident response. After all, you can have great security analytics for investigations, threat hunting, and root-cause analysis, but this all means diddlysquat if you can’t use these analytics to make and execute timely incident response and risk mitigation decisions.
Now we’ve certainly seen security analytics and operations come together in the market. In 2016, IBM acquired Resilient Systems to marry security operations with QRadar. Splunk followed suit in 2018 when it gobbled up Phantom. Most recently, Palo Alto Networks acquired Demisto to accelerate its Application Framework strategy and serve as a critical step forward in the company’s aim to deliver immediate threat prevention and response for security teams.
SOAPA and these industry efforts have a similar purpose: Turn security insights into automated actions through closed-loop communications between security analytics engines and controls. When a security analytics system discovers a cyber-attack in progress, it can immediately alert Cisco firewalls, Blue Coat proxies, FireEye sandboxes, Trend Micro IDS/IPS, or McAfee endpoints with specific instructions on which files, hashes, network packets, and IP addresses to block.
This is a great strategy for security operations process automation, but unfortunately there’s a catch. To make this work, security analytics systems must be programmed to “talk” to every type of security control out there. Not surprisingly, this is a slow and arduous process with lots of nuances. You may be covered if you have a Check Point firewall, but out of luck if you are a Forcepoint NGFW customer.
Enter OpenC2, a standards effort from OASIS. The OpenC2 forum’s mission is to define a language at a layer of abstraction that will enable unambiguous command-and-control of cyber defense technologies.
Remember that connection I described between security analytics engines and actual security controls? OpenC2 would standardize this communication. Security analytics tools could then communicate in a common way with any security technology control from any vendor. No more need for custom coding, integration, proprietary APIs, etc.
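To make this concrete, here is a minimal sketch of what a standardized command might look like, loosely modeled on the OpenC2 action/target pattern. The actuator URL, the flagged network range, and the surrounding Python scaffolding are illustrative assumptions, not a real product API or a verbatim excerpt of the specification.

```python
import requests  # third-party HTTP library; assumed to be installed

# Illustrative OpenC2-style command: tell a packet-filtering actuator to block
# traffic to a network range that the analytics engine has flagged. The
# action/target shape follows OpenC2's general pattern; the endpoint URL and
# the address range below are hypothetical placeholders.
command = {
    "action": "deny",
    "target": {"ipv4_net": "203.0.113.0/24"},    # range flagged by analytics
    "args": {"response_requested": "complete"},  # ask the actuator to confirm
}

# Hypothetical actuator endpoint. The promise of a standard like OpenC2 is
# that the same command body could be sent to any conformant firewall, proxy,
# or endpoint agent without vendor-specific integration code.
resp = requests.post(
    "https://firewall.example.com/openc2/command",
    json=command,
    timeout=10,
)
print(resp.status_code, resp.text)
```

The key point is that the vendor on the receiving end of that request no longer matters; any conformant control should be able to interpret the same command.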
Who would benefit from OpenC2? How about everyone! Users could have a standard way to modify security controls to defend against cyber-attacks or mitigate new risks as they arise. Security analytics vendors could go beyond threat detection to automate responses. Meanwhile, security tools’ efficacy could vastly improve if they could be “programmed” with new rule sets from security analytics engines in real time.
Our cyber-adversaries are well-organized, patient, and good at finding our vulnerabilities. What’s our response? We rely on point tools and manual processes and then hope for the best. As they say down south, “that dog don’t hunt.” To have any hope against hackers, cyber-criminals, and nation-states, we need to be able to use data analysis to accelerate our insights and then turn these insights into accurate and timely responses. Since OpenC2 has the potential to help here, I’m encouraging the cybersecurity diaspora to join the effort.
There’s been a lot of speculation as to how much Windows Defender (now Microsoft Defender) will impact the endpoint security market. Windows devices remain the most prevalent corporate endpoints, and they also remain the most vulnerable and most often attacked. With a growing set of endpoint alternatives perceived as more secure available to business users, Microsoft is clearly motivated to close this gap by fortifying the security of these devices in a way that preserves its dominance in the corporate endpoint market. With a majority of organizations now utilizing Defender in a meaningful way, Microsoft is beginning to gain a foothold as a layered security control on the endpoint, but why aren’t organizations ready to go all in with Defender? And with the recent announcement of Mac support, and Linux support on the way, can Microsoft up its game, become a significant player in endpoint security, and change the endpoint security landscape moving forward?
I remember giving a presentation when I first started working in cybersecurity in 2003 (note: it was called information security back then). I talked about the importance of good security hygiene, focusing on deploying secure system configurations, managing access controls, and performing regular vulnerability scans.
When it came to the Q&A portion of my presentation, a gentleman in the first row raised his hand. He mentioned that his company was diligent about vulnerability scanning but then asked me: “How do you determine which vulnerabilities to prioritize and which ones to ignore?”
I don’t remember exactly how I responded but I am certain that my answer wasn’t very good.
The vulnerability management dilemma from 2003 remains a big problem to this day. As part of a recent ESG research project on cyber risk management, 340 cybersecurity and IT professionals were asked to identify their organization’s biggest vulnerability management challenges. Here are some of the results:
43% of respondents indicate that one of their biggest vulnerability management challenges is prioritizing which vulnerabilities to remediate. Sound familiar?
42% of respondents indicate that one of their biggest vulnerability management challenges is tracking vulnerability and patch management over time. In other words, many organizations find it difficult to manage processes from vulnerability scanning, to trouble ticketing, to change management, to patching, to incident closure. Oh, and these processes require strong collaboration between security and IT operations personnel. As Bruce Schneier says, “security is a process, not a product.” In this case, the processes are broken.
42% of respondents indicate that one of their biggest vulnerability management challenges is patching vulnerabilities in a timely manner. It’s not uncommon for a large enterprise to have thousands or even tens of thousands of vulnerabilities at any time. Little wonder why it’s difficult to keep up.
41% of respondents indicate that one of their biggest vulnerability management challenges is tracking the cost and effectiveness of their vulnerability management program. Security budgets continue to rise but CFOs want some reasonable metrics around what they are getting for their money. Looks like many organizations remain clueless when it comes to vulnerability management ROI.
40% of respondents indicate that one of their biggest vulnerability management challenges is keeping up with the volume of vulnerabilities. As I mentioned above, thousands to tens of thousands of vulnerabilities.
By the way, we’ve tried to improve vulnerability management by prioritizing vulnerabilities with high CVSS scores, those with known exploits, or those from mission-critical software vendors. But based upon this data, it looks like we haven’t progressed much in the past 16 years. Given the number of applications, devices, and systems on the network today, many organizations face greater cyber risk today than they did in the early 2000s simply because of these and other continuing vulnerability management challenges.
Fortunately, I finally have an answer to the question posed in 2003.
Question: How do you determine which vulnerabilities to prioritize and which ones to ignore?
Answer: Let data analytics be your guide. In other words, take all your vulnerability scanning data and analyze it across a multitude of parameters including asset value, known exploits, exploitability, threat actors, CVSS score, similar vulnerability history, etc. This data analysis can be used to calculate risk scores, and these risk scores can help guide organizations on which vulnerabilities should be patched immediately, which ones require compensating controls until they can be patched, which ones can be patched on a scheduled basis, and which ones can be ignored.
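As a rough illustration of what that scoring might look like in practice, here is a minimal sketch that blends severity, exploit availability, and asset value into a single score and a triage decision. The field names, weights, and thresholds are illustrative assumptions, not any vendor’s actual algorithm, and the CVE identifiers are placeholders.

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss: float              # base CVSS score, 0-10
    exploit_available: bool  # a public exploit or active campaign is known
    asset_value: int         # business value of the affected asset, 1 (low) to 5 (critical)

def risk_score(v: Vulnerability) -> float:
    """Blend severity, exploitability, and asset value into a 0-100 score.
    Weights are illustrative placeholders, not a published scoring model."""
    severity = (v.cvss / 10) * 40               # up to 40 points for raw severity
    exploit = 35 if v.exploit_available else 5  # known exploits dominate the score
    exposure = (v.asset_value / 5) * 25         # up to 25 points for asset criticality
    return round(severity + exploit + exposure, 1)

def triage(v: Vulnerability) -> str:
    score = risk_score(v)
    if score >= 75:
        return "patch immediately"
    if score >= 50:
        return "apply compensating controls, patch this cycle"
    if score >= 25:
        return "patch on the regular schedule"
    return "accept / ignore for now"

findings = [
    Vulnerability("CVE-0000-0001", cvss=9.8, exploit_available=True, asset_value=5),
    Vulnerability("CVE-0000-0002", cvss=6.5, exploit_available=False, asset_value=2),
]
for f in findings:
    print(f.cve_id, risk_score(f), "->", triage(f))
```

In a real program, the same idea would be fed by scanner output and threat intelligence feeds rather than hand-entered records.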
Of course, few organizations will have the resources or data science skills to put together the right vulnerability management algorithms on their own, but vendors like Kenna Security, RiskSense, and Tenable are all over this space. Furthermore, SOAR vendors like Demisto, Phantom, Resilient, ServiceNow, and Swimlane are working with customers on runbooks to better manage the operational processes.
After all this time, I’m still convinced that strong cybersecurity hygiene is a critical practice for cyber risk mitigation. I’m glad that we’ve finally made some progress on ways to make this happen.
ESG’s Master Survey Results provide the complete output of syndicated research surveys in graphical format. In addition to the data, these documents provide background information on the survey, including respondent profiles at an individual and organizational level. It is important to note that these documents do not contain analysis of the data.
This Master Survey Results presentation focuses on the state of the artificial intelligence (AI) market with an IT and data-centric lens to uncover drivers, challenges, budgets, and personas across the AI data pipeline.
ESG’s Master Survey Results provide the complete output of syndicated research surveys in graphical format. In addition to the data, these documents provide background information on the survey, including respondent profiles at an individual and organizational level. It is important to note that these documents do not contain analysis of the data.
This Master Survey Results presentation focuses on the state of cyber risk management today and how it is changing in order to better support organizational missions and initiatives.
Enterprise data warehouses (EDWs) are often deemed the most valuable asset in the data center, serving as the backbone of the business. The ongoing insight gained from these solutions has justified the significant up-front capital investments and ongoing operational costs, but the rigidity of the traditional EDW is forcing organizations to reevaluate their approach to analytics and business intelligence.
While legacy EDW solutions were all about throwing as much computational power as possible at a relatively static data set, with the inflow of new and valuable sources of data and the emergence of all-encompassing analytics initiatives, the success of today’s EDW solutions depends more on operational and resource agility than raw horsepower. Being able to dynamically adjust to the needs of the business, integrate into operational processes, and quickly react to emerging opportunities can place an organization at a distinct competitive advantage. Today’s EDW solutions must act as a global repository of information, provide the agility to scale up or down on demand, and seamlessly integrate with other analytics tools and services used throughout a data-driven organization.
Over the last two years, Enterprise Strategy Group has conducted detailed studies quantifying the economic value of Google data analytics services. The first evaluated Google BigQuery compared to on-premises Hadoop and AWS Redshift. The second focused on Google Dataproc compared to DIY Spark and Hadoop approaches. I’m happy to share the next iteration of our economic analysis, extending the BigQuery study to incorporate a comparison to legacy enterprise data warehouses, both on-premises and in the cloud.
Through publicly available pricing and in-depth qualitative customer interviews, ESG was able to establish a base set of assumptions that power a dynamic model, incorporating up-front capital investments, deployment and migration costs, expected monthly cloud costs, administrative costs, and operational costs associated with legacy on-premises EDWs, cloud-based EDWs, and Google BigQuery.
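To show how those cost components roll up into a comparison, here is a toy sketch in the spirit of that model. Every figure is a made-up placeholder rather than ESG’s actual assumptions or any vendor’s real pricing; the point is simply the structure of the calculation.

```python
# Toy three-year cost comparison in the spirit of the model described above.
# Every number below is an illustrative placeholder, NOT ESG's pricing data
# or actual Google/vendor list prices.

YEARS = 3

def legacy_on_prem_edw() -> float:
    capital = 1_200_000   # up-front hardware and software licenses
    deployment = 150_000  # installation, integration, tuning
    annual_ops = 300_000  # power, support, administrative staff
    return capital + deployment + annual_ops * YEARS

def cloud_edw() -> float:
    migration = 100_000   # one-time migration effort
    monthly = 45_000      # reserved compute plus storage
    annual_admin = 120_000
    return migration + (monthly * 12 + annual_admin) * YEARS

def bigquery() -> float:
    migration = 80_000    # one-time migration effort
    monthly = 30_000      # on-demand queries plus storage
    annual_admin = 60_000 # lighter administration (serverless)
    return migration + (monthly * 12 + annual_admin) * YEARS

costs = {
    "Legacy on-prem EDW": legacy_on_prem_edw(),
    "Cloud-based EDW": cloud_edw(),
    "Google BigQuery": bigquery(),
}
baseline = costs["Legacy on-prem EDW"]
for name, total in costs.items():
    savings = (1 - total / baseline) * 100
    print(f"{name}: ${total:,.0f} over {YEARS} years ({savings:.0f}% vs. on-prem)")
```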
The crux of the results shows that organizations can save up to 52% by using BigQuery over on-premises EDWs and up to 41% over cloud-based EDWs. Unlike legacy on-premises EDWs, BigQuery provides organizations with the key abilities that are essential to delivering a modern EDW solution, most notably the ability to integrate with other Google Cloud Platform services, including its market-leading AI-based solutions and services. Although we did not call it out directly in the published report, ESG’s models indicate that migrating an on-premises EDW solution to Google BigQuery may actually be more cost effective than simply continuing to operate the existing on-premises EDW solution. Stay tuned for more as we continue to expand our analysis throughout the year!
If you are in the cybersecurity market, you’ve heard (or read) about the point tools problem hundreds or thousands of times. Enterprise organizations base their cybersecurity defenses on dozens of point tools from different vendors. These point tools don’t talk to one another, making it difficult to get a complete end-to-end picture for situational awareness. This also leads to tremendous operational overhead as the cybersecurity staff is called upon to act as the glue between disparate tools.
In this edition of Data Protection Conversations, I speak with Justin Augat, VP of Product Management and Product Marketing of iland.
iland is a global cloud service provider of secure and compliant hosting for infrastructure (IaaS), disaster recovery (DRaaS), and backup as a service (BaaS) and delivers cloud services from its data centers throughout the Americas, Europe, Asia, and Australia.
In this edition of Data Protection Conversations, I speak with Joe Noonan, VP of Product Marketing and Product Management at Unitrends.
Unitrends recently joined the Kaseya family of IT solutions and delivers high-availability hardware and software to natively provide all-in-one enterprise backup and continuity.
I had a terrific week at RSA, meeting and talking with many of the world’s leading endpoint security and application security vendors. Every year, RSA provides a unique opportunity to take a fresh look at new and existing vendors, through in-person meetings with technical and marketing leaders, and checking out messaging through booths, signage, and materials.