CISO role in ASM could add runtime security, tokenization
Runtime security and tokenization stand to play a bigger role in attack surface management, a development that could influence security leaders' responsibilities.
Attack surface management is a sprawling cybersecurity field that aims to identify internal and external vulnerabilities, recommend countermeasures and watch for emerging threats. Enterprises looking to shore up the attack surface can deploy numerous ASM tools that scan, classify, remediate and monitor security issues, aligning with the CISO's traditional role of assessing threats and implementing controls.
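As a rough illustration of that scan-and-classify workflow, the sketch below probes a short, hypothetical asset list for exposed services and flags the ones a simple policy deems risky. The hosts, ports and severity rules are assumptions for illustration, not features of any particular ASM product.

```python
# Minimal sketch of ASM's scan-and-classify step: probe a known asset list
# for exposed services and flag risky ones for remediation. Hosts, ports and
# the risk policy here are illustrative, not taken from any ASM platform.
import socket

ASSETS = ["example.com"]                 # inventory normally comes from discovery
PORTS = {22: "ssh", 80: "http", 443: "https", 3389: "rdp"}
RISKY = {"ssh", "rdp"}                   # example policy: flag remote-admin services

def scan(host: str) -> list[dict]:
    findings = []
    for port, service in PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(1.0)
            is_open = s.connect_ex((host, port)) == 0
        if is_open:
            findings.append({
                "host": host,
                "port": port,
                "service": service,
                "severity": "high" if service in RISKY else "info",
            })
    return findings

for asset in ASSETS:
    for finding in scan(asset):
        print(finding)  # would feed classification, remediation and monitoring
```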
But cybersecurity leaders might also consider emerging aspects of ASM that promote more proactive security measures. Runtime security approaches, for example, protect applications and workloads while they are executing, letting security personnel address issues the moment they arise. Another example is tokenization, a process that replaces sensitive data with a randomly generated identifier called a token. The original data is kept in a secure data store or protected with encryption, which shrinks the attack surface and limits the damage of a successful data breach.
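Conceptually, vault-style tokenization can be sketched in a few lines: sensitive values are swapped for random tokens, and the originals are retrievable only through a protected store. The class and field names below are hypothetical and greatly simplified; production systems rely on hardened, encrypted token servers rather than an in-memory map.

```python
# Illustrative vault-style tokenization: sensitive values are replaced with
# random tokens, and the originals live only in a separate, access-controlled
# store. Names and structure are hypothetical, not any vendor's API.
import secrets

class TokenVault:
    """Maps random tokens to original values. In production this would be an
    encrypted, access-controlled token server, not an in-memory dict."""

    def __init__(self):
        self._store = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_urlsafe(16)  # bears no relation to the data
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]

vault = TokenVault()
card = "4111 1111 1111 1111"
token = vault.tokenize(card)             # downstream systems only ever see this
print(token)                             # e.g. tok_Q3v9...
print(vault.detokenize(token) == card)   # True, but only by going through the vault
```

Because the token is random, a stolen copy of the downstream data reveals nothing about the original value, which is the sense in which tokenization shrinks the exploitable attack surface.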
Upwind Security offers a runtime-based cloud security platform. Rinki Sethi, chief security and strategy officer at Upwind, believes runtime security will become critical for ASM as agentic AI becomes more prevalent in cybersecurity.
"If you are a true believer that the future of security is going to be agentic, which I believe, focusing on runtime security is going to be the most important thing when it comes to attack surface management," she said.
Sethi said agentic AI systems can consume runtime data and help organizations make decisions when issues occur rather than dealing with misconfigurations or other vulnerabilities after the fact. By comparison, a cybersecurity tool that belatedly identifies an issue that's been lingering in an IT environment for two weeks tells security managers something attackers already know, she added.
"You want to know your issues in real time, and if you don't operate that way, you are going to be missing a beat," Sethi said.
Sethi said Upwind's runtime focus was the main reason she decided to join the startup in June. She was previously the CISO at Bill, a financial operations platform provider and Upwind customer.
"There is a lack of education on runtime and why it's so important," she said. "The focus still seems to be on posture management as it relates to attack surface, and I think we need to change that into really focusing in on runtime."
There are some technology adoption considerations, however. For example, an Upwind security glossary advised organizations to aim for tools offering a breadth of infrastructure coverage, noting that "not every runtime security tool supports every cloud environment and system equally."
Runtime security and tokenization can reinforce traditional ASM measures for reducing the attack surface.
Tokenization limits the 'blast radius'
Capital One has invested in tokenization, building an in-house tokenization engine to protect sensitive data at scale. The financial services company now runs more than 100 billion tokenization operations a month across hundreds of its applications, noted Leon Bian, head of product for data security solutions at Capital One Software, a B2B software business within Capital One.
Earlier this year, the software unit launched a tokenization offering dubbed Databolt, which was adapted from Capital One's internal engine.
Bian said tokenization lines up with ASM's core principle of reducing exploitable assets.
"Tokenization minimizes high-risk data exposure and limits the blast radius of potential breaches," he said.
Bian described tokenization as a crucial but underutilized component of ASM, which traditionally focuses on discovering and managing exposed assets. But tokenization "proactively neutralizes" the value of those assets to attackers if they are breached, he added.
Enterprises that implement tokenization should follow best practices to fully realize the process' benefits, according to a Capital One Software blog post. Those practices include identifying the most critical data, understanding where data resides and how it flows, and securing the token server.
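The first of those practices, identifying the most critical data, can be pictured as a classification step in which only fields marked sensitive pass through the token server. The field names, policy and stand-in tokenize function below are illustrative assumptions, not Capital One's implementation.

```python
# Sketch of the "identify the most critical data" practice: tokenize only the
# fields classified as sensitive and leave low-risk fields untouched.
# Field names and the classification policy are illustrative assumptions.
import secrets

SENSITIVE_FIELDS = {"card_number", "ssn"}   # decided during data classification

def tokenize(value: str) -> str:
    # Stand-in for a call to a secured token server; returns a random token.
    return "tok_" + secrets.token_urlsafe(16)

def protect(record: dict) -> dict:
    return {
        field: tokenize(value) if field in SENSITIVE_FIELDS else value
        for field, value in record.items()
    }

order = {"order_id": "A-1001", "card_number": "4111 1111 1111 1111", "amount": "42.00"}
print(protect(order))   # card_number is replaced by a token before storage or sharing
```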
John Moore is a writer for Informa TechTarget covering the CIO role, economic trends and the IT services industry.