Guest Post

It's time to rethink security certification for OT devices

Security certifications don't protect OT devices from vulnerable processes and insecure-by-design practices. It's time to update security certs for the connected OT age.

Those who do not learn history are doomed to repeat it.

We've heard that saying countless times, but it has added importance when it comes to operational technology (OT) cybersecurity.

History repeats itself time and again as entities -- from manufacturers to third parties to end users -- follow the same legacy processes and execute the same strategies.

For the last three years, Forescout has conducted research on OT device security issues and spearheaded the largest security evaluation of TCP/IP stacks -- the software implementations of the communication protocols OT devices rely on to operate -- uncovering more than 95 new vulnerabilities. This continued with an evaluation of OT equipment and protocols earlier this summer in our OT:Icefall research, leading to the discovery of 56 additional vulnerabilities.

Similar conclusions could be drawn from all the research: Legacy processes, insecure-by-design practices and overreliance on existing certifications are the primary culprits and need to be addressed. One way to address them is to rethink how security certifications are granted and maintained.

Trouble with following the same processes and certifications

We live in a connected world that is constantly changing. The industries that run commerce, support our health and drive innovation are delivering value at a faster pace, thanks to OT devices.

This speed and constant change are precisely why following the same processes and relying on the same certifications no longer suffice.

To know where we want to go, it's imperative to examine what we've experienced up to this point. While it's not the case for every OT device, security is often a tier-two or tier-three priority before a device hits the market. Actions such as scanning for vulnerable code often take place, as do walkthroughs of components and protocols to ensure the device meets compliance requirements. These actions inform the security certification process, which is flawed because it's a static, point-in-time evaluation.

The trouble with this is that a device can go through a rigorous security risk assessment before it goes to market or is deployed onto a network, but that doesn't mean it stays secure for its lifetime. In addition, during that assessment, the security of the actual protocols and software components is rarely scrutinized to a satisfactory level. Our OT:Icefall research found that 74% of the product families affected by the vulnerabilities we discovered had already received some form of security certification.

This doesn't mean security certifications are meaningless. It means we must reevaluate the security certification process.

How to reevaluate security certifications

Security teams, manufacturers and regulatory agencies have become accustomed to certifications based on opaque security definitions and functional testing. They've also become used to playing hot potato with security liability: Government agencies have tried to place more of it on manufacturers, and manufacturers, in turn, have tried to shift it onto security teams. This is the problem. Security certification and the management of the long-term security risk posture of OT devices must take a more holistic approach and be a team sport.

Security certification in an OT world should encompass the following:

  • Well-defined and broadly accepted security requirements connected to realistic attacker models. Security certifications should clearly state what they certify against. Some schemes adopt levels of certification that correspond to attacker classes of increasing sophistication, but that sophistication is defined in generic terms, such as "moderate resources," "sophisticated means" and "specific skills." These vague terms invite interpretations that reflect an auditor's perceptions and expectations. Attacker models and their capabilities should be standardized. Additionally, lower levels of certification sometimes only account for issues such as unintentional misuse, which is too lax and enables insecure designs. Basic security requirements should include signed firmware and encrypted, authenticated protocols.
  • Rigorous testing of protocol implementations. Many certification schemes limit the evaluation of security requirements to functional testing, meaning features are verified to be present but their implementations are not inspected. This testing typically excludes proprietary protocols. As a result, a functional security assessment might conclude that authentication is present on an engineering interface even though the underlying protocol is unauthenticated and all checks are done client-side. Likewise, communication tests often only assess open protocols known to auditors. The specifications of all communication protocols should be provided to auditors during certification efforts, and ideally, these protocols should be evaluated at the implementation level to catch cases where a feature is present but implemented in a vulnerable way (see the first sketch after this list).
  • Certification of individual components of a connected device. Supply chain vulnerabilities are widespread. Since virtually every device is made up of a myriad of reusable software components, these components should be considered the basic unit of testing and certification. This could lead to libraries of trusted components and reusable certifications that would enable device manufacturers to pick from known-good designs and implementations.
  • Automatic certification invalidation. The discovery of vulnerabilities in a device should automatically invalidate its security-certified status until the issues are addressed and patched. This automatic invalidation could be built on recent technical developments, such as software bills of materials (SBOMs), the Common Security Advisory Framework (CSAF) and Vulnerability Exploitability eXchange (VEX); the second sketch after this list shows what that could look like.
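
To make the gap between functional and implementation-level testing concrete, here is a minimal sketch in Python. The device address, TCP port and request framing are invented placeholders rather than any vendor's real engineering protocol; the point is only that "is a login feature present?" and "does the device itself enforce it?" are different questions.

```python
# Minimal sketch contrasting a functional check with an implementation-level
# check of an OT engineering interface. The device address, TCP port and
# request framing are hypothetical placeholders, not a real vendor protocol.
import socket

DEVICE = ("192.0.2.10", 20000)           # documentation-range IP, made-up port
READ_CONFIG_REQUEST = b"\x01\x00RDCFG"   # invented, unauthenticated request frame


def functional_check() -> bool:
    """A functional test only confirms that a login feature is exposed, e.g.
    that the vendor's engineering tool prompts for credentials."""
    return True  # "authentication present" -- says nothing about enforcement


def implementation_check() -> bool:
    """An implementation-level test sends a privileged request without
    authenticating first. If the device answers, the login observed above is
    enforced only on the client side."""
    try:
        with socket.create_connection(DEVICE, timeout=5) as sock:
            sock.sendall(READ_CONFIG_REQUEST)
            reply = sock.recv(1024)
    except OSError:
        return True  # refused or timed out: the unauthenticated request was not honored
    return len(reply) == 0  # secure only if the device returns nothing


if __name__ == "__main__":
    print("functional test passed:", functional_check())
    print("device enforces authentication:", implementation_check())
```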

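As a sketch of what automatic invalidation could look like, the Python below matches the components listed in a CycloneDX-style SBOM against a set of advisory-affected package URLs and suspends the certificate on a hit. The SBOM snippet, component names and advisory set are illustrative stand-ins; a real pipeline would consume vendor CSAF advisories and VEX statements rather than a hard-coded list.

```python
# Minimal sketch of automatic certification invalidation: match components from
# a CycloneDX-style SBOM against advisory data and suspend the cert on a hit.
# The SBOM and the affected-component set are illustrative stand-ins for
# vendor-supplied SBOM, CSAF and VEX documents.
import json

SBOM_JSON = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "example-tcpip-stack", "version": "2.1.0",
     "purl": "pkg:generic/example-tcpip-stack@2.1.0"},
    {"type": "library", "name": "example-rtos-kernel", "version": "4.0.3",
     "purl": "pkg:generic/example-rtos-kernel@4.0.3"}
  ]
}
"""

# Stand-in for parsed advisory/VEX data: package URLs of components whose
# vulnerabilities are still marked "affected" rather than "not_affected".
AFFECTED_PURLS = {"pkg:generic/example-tcpip-stack@2.1.0"}


def certification_still_valid(sbom_text: str, affected: set[str]) -> bool:
    """Return False as soon as any SBOM component matches an open advisory."""
    sbom = json.loads(sbom_text)
    hits = [c["purl"] for c in sbom.get("components", []) if c.get("purl") in affected]
    for purl in hits:
        print(f"certification suspended: vulnerable component {purl}")
    return not hits


if __name__ == "__main__":
    print("certificate valid:", certification_still_valid(SBOM_JSON, AFFECTED_PURLS))
```
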
Once a certified device is running on and communicating with an enterprise's network, the real work of managing long-term security risk posture begins. Consistent monitoring and contextual risk assessment of OT devices by the security team are essential. Similarly, manufacturers should continually test their devices in new scenarios, reevaluate device components to pinpoint emerging risks and share that information with enterprise end users. When it comes to improving security posture, remember we're all on the same team.

About the author
Daniel dos Santos is the head of security research at Forescout's Vedere Labs, where he leads a team of researchers that identifies new vulnerabilities and monitors active threats. He holds a doctorate in computer science, has published more than 30 journal and conference papers on cybersecurity, and has spoken at conferences including Black Hat, Hack In The Box and x33fcon.
