Our seasoned analysts couple their industry-leading B2B research with in-depth buyer intent data for unparalleled insights about critical technology markets.
Clients trust us across their GTMs—from strategy and product development to competitive insights and content creation—because we deliver high-quality, actionable support.
Browse our extensive library of research reports, research-based content, and blogs for actionable data and expert analysis of the latest B2B technology trends, market dynamics, and business opportunities.
In part 2 of my SOAPA video with old friend Dimitri Vlachos from Devo, we discuss:
Devo use cases. Dimitri describes some of the most popular security use cases for Devo, including threat detection, security analytics/investigations, and threat hunting. I’ve got to hand it to Dimitri as he came up with one of the best SOAPA video soundbites ever, “You can’t use old tools to cover new security analytics needs.”
ESG’s Master Survey Results provide the complete output of syndicated research surveys in graphical format. In addition to the data, these documents provide background information on the survey, including respondent profiles at an individual and organizational level. It is important to note that these documents do not contain analysis of the data.
This Master Survey Results presentation focuses on the current people, process, and technology approaches to threat detection and response, specifically in the areas of endpoint detection and response, network traffic analysis, and managed detection and response.
Last week Intel hosted a group of press and analysts to learn about its Data Centric Innovation launch. Driven by the proliferation of cloud computing, growth in AI and analytics, and the cloudification of the network and the edge, Intel released a number of new technologies to help organizations process, store, and move all the data that is being created.
In 2017, my colleague Doug Cahill conducted research on endpoint security. Back then, the research indicated that 87% of organizations were considering a comprehensive endpoint security suite rather than several disconnected endpoint security point tools.
I spent a few days on the sunny Las Vegas strip with the team from Aruba last week at its annual customer event, Atmosphere 19. This year’s event drew almost two thousand attendees and eight hundred partners. Aruba continues to grow and shared stats to support that: over 1 billion switch ports, 15 million access points, 500,000 customers, roughly 90,000 Airheads, and more than 4,000 patents. HPE Aruba is now at $3 billion in revenue and looking toward future growth that would take it to $5 billion.
Chinese military strategist Sun Tzu is quoted as saying, “If you know the enemy and you know yourself, you need not fear the results of a hundred battles.” In cybersecurity terms, this means knowing the cyber-adversaries and the tactics, techniques, and procedures (TTPs) they use to attack your organization. Sun Tzu’s advice also extends to self-knowledge: you must understand your own technical, human, and even physical vulnerabilities in order to apply the best protection to critical assets.
Whip it good! Old friend Dimitri Vlachos from Devo stopped by the ESG video studio to kick off our 2019 SOAPA video series. If you are unfamiliar with Devo, the company describes itself as follows:
Devo delivers real-time operational and business insights from analytics on streaming and historical data to operations, IT, security, and business teams at the world’s largest organizations.
As organizations struggle with the complexity and sheer number of security tools in use, the dream of an integrated platform certainly seems like a good idea. Surely life would be less complex with fewer tools to manage, systems that were designed and built to work together, and fewer vendors to deal with. But there are new challenges and tradeoffs to consider that will require some planning and effort.
Over the past few years, Enterprise Strategy Group has promoted the security operations and analytics platform architecture (SOAPA). Just what is SOAPA? A multi-layered, heterogeneous architecture designed to integrate disparate security analytics and operations tools. This architecture glues incongruent security analytics tools together to improve threat detection, and then tightly couples security analytics with operations tools to accelerate and automate risk mitigation and incident response. After all, you can have great security analytics for investigations, threat hunting, and root-cause analysis, but it all means diddlysquat if you can’t use these analytics to make and execute timely incident response and risk mitigation decisions.
Now we’ve certainly seen security analytics and operations come together in the market. In 2016, IBM acquired Resilient Systems to marry security operations with QRadar. Splunk followed suit in 2018 when it gobbled up Phantom. Most recently, Palo Alto Networks acquired Demisto to accelerate its Application Framework strategy and take a critical step toward delivering immediate threat prevention and response for security teams.
SOAPA and these industry efforts have a similar purpose: Turn security insights into automated actions through closed-loop communications between security analytics engines and controls. When a security analytics system discovers a cyber-attack in progress, it can immediately alert Cisco firewalls, Blue Coat proxies, FireEye sandboxes, Trend Micro IDS/IPS, or McAfee endpoints with specific instructions on which files, hashes, network packets, and IP addresses to block.
This is a great strategy for security operations process automation, but unfortunately there’s a catch. To make this work, security analytics systems must be programmed to “talk” to every type of security control out there. Not surprisingly, this is a slow and arduous process with lots of nuances. You may be covered if you have a Check Point firewall, but out of luck if you are a Forcepoint NGFW customer.
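To make that catch concrete, here is a minimal Python sketch of the per-vendor adapter problem. Every class and method name below is invented for illustration; none of this is a real vendor SDK.

```python
# Hypothetical sketch: each security control speaks its own proprietary API,
# so the analytics engine needs a hand-built adapter per product.

class CheckPointAdapter:
    def block_ip(self, ip: str) -> None:
        # Would call Check Point's management API with its own payload format.
        print(f"[Check Point] add drop rule for {ip}")

class ForcepointAdapter:
    def deny_address(self, ip: str) -> None:
        # Same intent, entirely different API and vocabulary.
        print(f"[Forcepoint] deny {ip}")

# One more entry (and one more custom integration project) per vendor.
ADAPTERS = {
    "checkpoint": lambda fw, ip: fw.block_ip(ip),
    "forcepoint": lambda fw, ip: fw.deny_address(ip),
}

def block_everywhere(firewalls: dict, ip: str) -> None:
    """Push one block decision from the analytics engine to every control."""
    for vendor, fw in firewalls.items():
        ADAPTERS[vendor](fw, ip)

block_everywhere({"checkpoint": CheckPointAdapter(),
                  "forcepoint": ForcepointAdapter()}, "203.0.113.7")
```

Every new control means another hand-built adapter, and any product the analytics vendor hasn’t covered leaves that customer out of luck.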
Enter OpenC2, a standards effort from OASIS. The OpenC2 forum’s mission is to define a language at a layer of abstraction that will enable unambiguous command-and-control of cyber defense technologies.
Remember that connection I described between security analytics engines and actual security controls? OpenC2 would standardize this communication, allowing security analytics tools to talk in a common way with any security technology control from any vendor. No more custom coding, one-off integrations, or proprietary APIs.
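For a flavor of what that standardization looks like, here is a minimal sketch of an OpenC2 “deny” command built in Python. The field names follow my reading of the OASIS OpenC2 Language Specification v1.0 and its stateless packet filtering (SLPF) actuator profile; the transport layer (HTTPS, MQTT, etc.) is out of scope here.

```python
import json

# A minimal OpenC2 command: block traffic from a network range.
command = {
    "action": "deny",                           # what to do
    "target": {"ipv4_net": "203.0.113.7/32"},   # what to do it to
    "args": {"response_requested": "complete"}, # ask for a full status reply
    "actuator": {"slpf": {}},                   # any stateless packet filter
}

print(json.dumps(command, indent=2))
```

The point is that the same payload could drive a Check Point firewall or a Forcepoint NGFW alike; the actuator profile, not a vendor API, defines the contract.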
Who would benefit from OpenC2? How about everyone! Users could have a standard way to modify security controls to defend against cyber-attacks or mitigate new risks as they arise. Security analytics vendors could go beyond threat detection to automate responses. Meanwhile, security tools’ efficacy could vastly improve if they could be “programmed” with new rule sets from security analytics engines in real time.
Our cyber-adversaries are well-organized, patient, and good at finding our vulnerabilities. What’s our response? We rely on point tools and manual processes and then hope for the best. As they say down south, “that dog don’t hunt.” To have any hope against hackers, cyber-criminals, and nation-states, we need to be able to use data analysis to accelerate our insights and then turn these insights into accurate and timely responses. Since OpenC2 has the potential to help here, I’m encouraging the cybersecurity diaspora to join the effort.
There’s been a lot of speculation about how much Windows Defender (now Microsoft Defender) will impact the endpoint security market. Windows devices remain the most prevalent corporate endpoints, and they are also the most vulnerable and most often attacked. With a growing set of endpoint alternatives perceived as more secure available to business users, Microsoft is clearly motivated to close this gap by fortifying the security of its devices in a way that preserves its dominance in the corporate endpoint market. With a majority of organizations now utilizing Defender in a meaningful way, Microsoft is gaining a foothold as a layered security control on the endpoint. But why aren’t organizations ready to go all in with Defender? And with the recent announcement of Mac support, and Linux support on the way, can Microsoft up its game, become a significant player in endpoint security, and change the endpoint security landscape moving forward?
I remember giving a presentation when I first started working in cybersecurity in 2003 (note: it was called information security back then). I talked about the importance of good security hygiene, focusing on deploying secure system configurations, managing access controls, and performing regular vulnerability scans.
When it came to the Q&A portion of my presentation, a gentleman in the first row raised his hand. He mentioned that his company was diligent about vulnerability scanning but then asked me: “How do you determine which vulnerabilities to prioritize and which ones to ignore?”
I don’t remember exactly how I responded but I am certain that my answer wasn’t very good.
The vulnerability management dilemma from 2003 remains a big problem to this day. As part of a recent ESG research project on cyber risk management, 340 cybersecurity and IT professionals were asked to identify their organization’s biggest vulnerability management challenges. Here are some of the results:
43% of respondents indicate that one of their biggest vulnerability management challenges is prioritizing which vulnerabilities to remediate. Sound familiar?
42% of respondents indicate that one of their biggest vulnerability management challenges is tracking vulnerability and patch management over time. In other words, many organizations find it difficult to manage processes from vulnerability scanning, to trouble ticketing, to change management, to patching, to incident closure. Oh, and these processes require strong collaboration between security and IT operations personnel. As Bruce Schneier says, “security is a process, not a product.” In this case, the processes are broken.
42% of respondents indicate that one of their biggest vulnerability management challenges is patching vulnerabilities in a timely manner. It’s not uncommon for a large enterprise to have thousands or even tens of thousands of open vulnerabilities at any time. Little wonder it’s difficult to keep up.
41% of respondents indicate that one of their biggest vulnerability management challenges is tracking the cost and effectiveness of their vulnerability management program. Security budgets continue to rise but CFOs want some reasonable metrics around what they are getting for their money. Looks like many organizations remain clueless when it comes to vulnerability management ROI.
40% of respondents indicate that one of their biggest vulnerability management challenges is keeping up with the volume of vulnerabilities. As I mentioned above, thousands to tens of thousands of vulnerabilities.
By the way, we’ve tried to improve vulnerability management by prioritizing vulnerabilities with high CVSS scores, those with known exploits, or those from mission-critical software vendors. But based upon this data, it looks like we haven’t progressed much in the past 16 years. Given the number of applications, devices, and systems on the network, many organizations face greater cyber risk today than they did in the early 2000s simply because of these and other persistent vulnerability management challenges.
Fortunately, I finally have an answer to the question posed in 2003.
Question: How do you determine which vulnerabilities to prioritize and which ones to ignore?
Answer: Let data analytics be your guide. In other words, take all your vulnerability scanning data and analyze it across a multitude of parameters, including asset value, known exploits, exploitability, threat actors, CVSS score, similar vulnerability history, etc. This data analysis can be used to calculate risk scores, and these risk scores can help guide organizations on which vulnerabilities should be patched immediately, which ones require compensating controls until they can be patched, which ones can be patched on a scheduled basis, and which ones can be ignored.
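As a toy illustration of the idea, here is a short Python sketch. The weights, thresholds, and field names are invented for this example; commercial risk-scoring products use far richer models and live threat intelligence.

```python
# Toy analytics-driven vulnerability prioritization: blend a few signals
# into a 0-100 risk score, then map the score to a response bucket.

def risk_score(vuln: dict) -> float:
    """Weighted blend of severity, exploit availability, and asset value."""
    score = vuln["cvss"] * 10 * 0.4                  # CVSS scaled to 0-100
    score += (100 if vuln["exploit_known"] else 0) * 0.3
    score += vuln["asset_value"] * 0.3               # 0-100 business criticality
    return score

def triage(vuln: dict) -> str:
    """Map a risk score to one of the four response buckets."""
    s = risk_score(vuln)
    if s >= 75:
        return "patch immediately"
    if s >= 50:
        return "apply compensating controls, then patch"
    if s >= 25:
        return "patch on the normal schedule"
    return "ignore for now"

finding = {"cvss": 9.8, "exploit_known": True, "asset_value": 90}
print(triage(finding))  # -> "patch immediately"
```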
Of course, few organizations will have the resources or data science skills to put together the right vulnerability management algorithms on their own, but vendors like Kenna Security, RiskSense, and Tenable are all over this space. Furthermore, SOAR vendors like Demisto, Phantom, Resilient, ServiceNow, and Swimlane are working with customers on runbooks to better manage the operational processes.
After all this time, I’m still convinced that strong cybersecurity hygiene is a critical practice for cyber risk mitigation. I’m glad that we’ve finally made some progress on ways to make this happen.
This Master Survey Results presentation focuses on the state of cyber risk management today and how it is changing in order to better support organizational missions and initiatives.