CNCF surveys highlight cloud-native AI connections

Cloud-native infrastructure aligns closely with AI in two new CNCF surveys, thanks in part to platform engineering and other abstractions that make it more user-friendly.

ATLANTA -- Cloud-native technology has become more accessible to back-end developers, and AI development is closely tied to cloud-native infrastructure -- even though AI engineers might not know it.

So said the findings summarized in two new research reports released by the Cloud Native Computing Foundation (CNCF) during KubeCon + CloudNativeCon North America 2025 this week: the 2025 "State of Cloud Native Development" report and the "CNCF Technology Radar" report on AI. While a majority of back-end developers -- 58% -- said they considered themselves cloud-native, just 41% of AI engineers gave the same response.

CNCF senior technical program manager Bob Killen interpreted those findings during an interview on the IT Ops Query podcast.

"A large reason for this is that cloud-native [technology] has become more accessible to a larger set of users," Killen said. "It's no longer like you have to know what a container or Kubernetes is. There are a lot of things like internal developer platforms and a whole series of new CNCF tools that have made it much easier to consume."


Similarly, cloud-native workload portability has improved over time, Killen said, which helps explain a jump in hybrid cloud usage reported in the cloud-native development survey, from 22% in 2021 to 32% this year.

Somewhat paradoxically, increasing layers of platform engineering abstraction could also be the reason only 41% of AI engineers consider themselves cloud-native, Killen said.

"A lot of them are consuming SaaS-type services, where the SaaS service itself is running on a cloud-native platform … under the hood, but for a lot of the consumers, they don't have to interact with that," he said.

Still, there are strong connections between cloud-native technology and AI platforms, even if they're sometimes hidden behind the scenes, according to the Tech Radar report. The survey also found that some respondents were already using what Killen described as "young projects," such as the Model Context Protocol and the Agent2Agent protocol, in production.

Other cloud-native projects have adapted in the last year to better support AI workloads, from the Kubernetes dynamic resource allocation feature to Kueue, a scheduling system for AI/ML workloads, and Kubernetes JobSet, which follows a leader/worker paradigm more suited to batch jobs for AI, Killen said.
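For readers unfamiliar with the leader/worker paradigm Killen mentions, a JobSet manifest groups a coordinating leader Job with a set of worker Jobs that Kubernetes manages as one unit. The sketch below is illustrative only; it assumes the JobSet `v1alpha2` API, and the resource name and container image are placeholders, not anything referenced in the report.

```yaml
# Minimal JobSet sketch of a leader/worker batch workload.
# Names and images are hypothetical placeholders.
apiVersion: jobset.x-k8s.io/v1alpha2
kind: JobSet
metadata:
  name: training-run                 # hypothetical name
spec:
  replicatedJobs:
  - name: leader                     # single coordinating Job
    replicas: 1
    template:
      spec:
        parallelism: 1
        completions: 1
        template:
          spec:
            restartPolicy: Never
            containers:
            - name: leader
              image: example.com/trainer:latest   # placeholder image
  - name: workers                    # worker pods that follow the leader
    replicas: 1
    template:
      spec:
        parallelism: 3
        completions: 3
        template:
          spec:
            restartPolicy: Never
            containers:
            - name: worker
              image: example.com/trainer:latest   # placeholder image
```

Because the leader and workers live in one JobSet object, the platform can start, restart, or tear them down together rather than leaving that coordination to platform engineers.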


"In a lot of AI workloads, and this even goes back to the old days of HPC [high-performance computing], you need to schedule workloads to run at the exact same time, or you need to wait and make sure that you don't start a job until a certain number of nodes or pods are available," he said. "A lot of lower-level primitives that required a lot more management from platform engineers are now just handled natively by Kubernetes."
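The wait-for-capacity behavior Killen describes is roughly what Kueue provides: a Job is submitted in a suspended state, and Kueue unsuspends it only once quota for all of its pods is available. A minimal sketch, assuming a standard batch Job routed to a Kueue queue; the Job name, queue name, and image are hypothetical:

```yaml
# A batch Job handed to Kueue; it stays suspended until the queue
# can admit all four pods at once, rather than starting piecemeal.
apiVersion: batch/v1
kind: Job
metadata:
  name: train-job                          # hypothetical name
  labels:
    kueue.x-k8s.io/queue-name: team-queue  # hypothetical LocalQueue
spec:
  suspend: true          # Kueue admits the Job by flipping this off
  parallelism: 4
  completions: 4
  template:
    spec:
      restartPolicy: Never
      containers:
      - name: worker
        image: example.com/trainer:latest  # placeholder image
        resources:
          requests:
            nvidia.com/gpu: 1              # capacity Kueue waits for
```

This is the "don't start until enough nodes or pods are available" pattern handled by the scheduler itself, instead of by custom platform tooling.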

However, as the CNCF leader tasked with improving project governance processes and bringing end users and maintainers together in the community, Killen has plenty of work ahead of him to strengthen the bonds between cloud-native projects and AI.

"Right now, we are certainly seeing a lot of people just consuming as many resources as they can get, but we see hints of people starting to care a lot more about the cost of running those things," he said. "There's a lot more coming down the pipes in terms of optimizing those workloads and making sure that you are getting the most for your dollar."

Beth Pariseau, a senior news writer for Informa TechTarget, is an award-winning veteran of IT journalism covering DevOps. Have a tip? Email her or reach out @PariseauTT.
