With remote work here to stay alongside far-flung cloud and edge computing resources, securing remote access to IT systems has become a key part of DevSecOps strategy -- or it should, experts said.
For enterprises that rushed to support remote work with the onset of the COVID-19 pandemic, this might seem like a truism. Remote and hybrid work have persisted over the last two years even as pandemic emergency measures have receded. But amid a rising tide of cybersecurity hazards, securing remote access remains, at times, an overlooked discipline.
The latest cautionary tale emerged in February, when password management vendor LastPass published its second post-mortem report on a major breach it suffered in August 2022. That attack was traced to a senior DevOps engineer at the company who used a home computer to access sensitive storage resources in AWS, according to the report. Attackers were able to install keystroke logging software on that machine via outdated Plex media server software and steal the engineer's credentials, which played a key role in the breach.
"While everyone is concentrating on the fact that this employee's version of Plex wasn't updated, it's not clear that piece of software has a legitimate business purpose in this scenario," said Daniel Kennedy, an analyst at 451 Research, a division of S&P Global. "Why wasn't this employee issued a separate corporate device as part of the remote setup? And when issued, my assumption would be that entertainment and other nonbusiness apps wouldn't be permitted to be installed on a corporate-owned device, lowering the attack surface."
In this case, however, either the company or the engineer did not follow such an approach. And other enterprises still have their guard down about securing remote access for distributed teams, some because they still believe that remote and hybrid work are a temporary phenomenon, Kennedy said.
"There are still too many security folks I speak to who are treating this as a pandemic thing that will go away at some point, so they just have to hold out until then," he said. "There's a lot to think about, but the ground seems to be permanently shifted under everyone's feet, so it's time to have a strategy."
Even in organizations where work isn't remote, access to IT resources has become increasingly distributed among hybrid cloud and edge computing locations, and that trend shows no signs of slowing in the next few years, according to technologists.
Along multiple dimensions -- including geographic regions in which data must be stored to comply with regulations, an explosion of edge computing that has followed 5G mobile networks and the growing number of personal devices on which employees can work -- computing as a whole is becoming more fragmented than ever before, said Craig McLuckie, co-creator of Kubernetes and an entrepreneur-in-residence at Accel, a venture capital firm.
"That's not what we're used to right now," McLuckie said during a recent online panel discussion hosted by Docker Inc. about technology trends. "We're so used to the cloud, where the data is in one or more [availability zones] and that's it. Now, we're starting to look at this deeply federated space."
Remote access management tools multiply
Amid the twin trends of ever more remote work and ever more remote resources, vendors have taken note. From distributed identity and access management systems to products that rethink the traditional VPN, DevSecOps teams don't lack for the technical means of securing remote access.
New approaches are also still appearing. For example, Docker inked a partnership with Ambassador Labs last month to integrate Telepresence, a local Kubernetes development tool, with Docker Desktop. Telepresence for Docker bridges local developer machines to remote Kubernetes clusters for development and staging, sparing developers from individually managing replicated services both locally and remotely, which requires Kubernetes expertise and is prone to errors.
"This provides a shared cluster environment for developers to work together as a team on building applications for deployment to Kubernetes," Docker CEO Scott Johnston said in a press conference announcing the partnership March 23. "Docker extensions enable developers to use their favorite tools in an integrated, secure manner inside their Docker Desktop environment. We're also able to automatically ... produce images that run on multiple architectures."
The following week, incident response vendor PagerDuty followed with an updated PagerDuty Process Automation service that orchestrates resources from servers to laptops and edge devices using runbooks. This can include building automation jobs to update software on separate devices and laptops for security purposes. The new release accommodates deploying the PagerDuty Runner utility as a standalone entity in private networks that can be reached via HTTPS, instead of requiring riskier SSH commands or jump hosts to manage local resources, PagerDuty officials said.
"One of the biggest things organizations must do is look at their developer access controls and ensure that they are configured according to least privilege [within such systems]," said Katie Norton, an analyst at IDC. "Least privilege is of critical importance when it comes to securing the software supply chain, as it can reduce the attack surface by restricting lateral movement and containing bad actors."
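The default-deny posture Norton describes can be sketched in a few lines. This is a minimal, hypothetical example, not any specific vendor's access-control API: roles map to an explicit allowlist of actions, and anything not listed, including an unknown role, is refused.

```python
# Minimal least-privilege sketch: explicit allowlists, default deny.
# Role names and action strings are hypothetical illustrations.

ROLE_PERMISSIONS = {
    "developer": {"repo:read", "repo:write", "ci:trigger"},
    "release-engineer": {"repo:read", "artifact:publish"},
    "auditor": {"repo:read", "audit-log:read"},
}

def is_allowed(role: str, action: str) -> bool:
    """Default deny: unknown roles and unlisted actions are both refused."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The point of the structure is that lateral movement is blocked by omission: a developer credential stolen from an endpoint cannot publish artifacts, because that action was simply never granted.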
In other words, enterprises can no longer simply allow remote devices to access distributed resources because employees log in using personal credentials. For fully remote companies such as remote access vendor Teleport, that means following the same strict security hygiene for endpoints that has been customary on corporate networks.
"Instead of being able to monitor the network that people are on, you have to think about it from the endpoint alone," said Reed Loden, vice president of security at Teleport. "With zero trust, we're making sure that we're doing user identification and verification at every step, but the device is just as important."
An open source utility created at Facebook, Osquery, is useful to scan endpoints for outdated software packages and potential malware, Loden said. Teleport and other companies such as Arnica are also developing products to manage trusted devices and developer access.
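As a rough illustration of the kind of endpoint check Loden describes, an agent or scheduled job might run `osqueryi --json` with a SQL query against a package table and flag stale versions. The query, package name, version numbers and sample output below are illustrative assumptions; the sketch parses a sample of osquery-shaped JSON rather than shelling out to a real endpoint.

```python
import json
import shlex

# Hypothetical osquery check for an outdated package on a Linux endpoint.
# Package name and versions are illustrative, echoing the Plex example above.
SQL = "SELECT name, version FROM deb_packages WHERE name LIKE 'plexmediaserver%';"

def osqueryi_command(sql: str) -> str:
    """Build the osqueryi invocation an endpoint agent might run."""
    return f"osqueryi --json {shlex.quote(sql)}"

def flag_outdated(rows_json: str, minimum: str) -> list[str]:
    """Return package names whose version sorts below `minimum`.
    (Naive string comparison; a real check should use a version parser.)"""
    return [r["name"] for r in json.loads(rows_json) if r["version"] < minimum]

# Sample rows shaped like osqueryi --json output (values are made up).
sample = '[{"name": "plexmediaserver", "version": "1.19.3"}]'
```

Feeding the findings into an alerting or ticketing pipeline is what turns a scan like this into the continuous monitoring the article's sources call for.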
Whatever the mechanism used, zero-trust security and least-privilege access must be enforced at the user and device level in code as much as possible, said Robert Slaughter, CEO at IT defense contractor Defense Unicorns in Colorado Springs, Colo.
"For example, don't have a 'policy' for multifactor authentication -- enforce it by not letting people log in [without it]," he said. "It's hard to do, [but] all organizations can do is strive to continuously improve."
Despite tools, distributed DevSecOps a strategy issue
As with many other SecOps topics, the bulk of the problem with securing remote access ultimately isn't a lack of technical resources -- it's the difficulty of implementing and enforcing a cohesive strategy.
Tried-and-tested best practices for secure remote access have been well established, but the specific means of achieving them vary from environment to environment, even within the same company.
"Among different enterprises, and even different employees within those enterprises, risk profiles are so different that I tend to avoid trying to make blanket guidelines about 'securing remote work,'" 451's Kennedy said. "But I really want enterprises to think critically about threat modeling [and] securing remote access [for] user groups including privileged users, developers, financial folks who affect books and records -- wherever risk profiles start to look different than the average user."
Threat modeling is a multifaceted endeavor, encompassing corporate IT resources in multiple locations, network traffic flows, user authentication, equipment and device management, data loss prevention, and vulnerability assessments, according to Kennedy. Remote workers also have a list of best practices to follow, including using strong passwords, firewalls, secure Wi-Fi networks and VPN clients, and detecting phishing threats and other vulnerabilities.
"Remember, forcing this in code, with continuous monitoring and alerting, is key," Slaughter said. "Also important, across all of them, is the employee experience. If the UX is so terrible that they lose productivity, then they will opt out of using their corporate device."
Add all this to an already fully loaded plate for enterprise SecOps, where even traditional best practices for centralized IT resources such as patch management still aren't always followed, and a potentially pernicious threat starts to emerge, DevOps experts said.
"I still do not believe people have taken supply chain attacks seriously enough," said Chris Riley, senior manager of developer relations at marketing tech firm HubSpot in Cambridge, Mass. "You look at most engineering teams, and the security posture is not what you'd hope for from a least-privilege point of view. There's no access, and then there's almost all access -- there's no in-between for most of them."
Inconsistent security controls in nonproduction environments are an especially neglected issue, Riley said.
"Organizations have to address the inconsistency across development environments and force more standards and procedures around how environments are provisioned, how they're used, how they're accessed, etc.," he said. "These are all things that we know, that we've applied to production -- that we now need to apply to the delivery chain itself."
Beth Pariseau, senior news writer at TechTarget Editorial, is an award-winning veteran of IT journalism. She can be reached at [email protected] or on Twitter @PariseauTT.