New CNCF Annual Cloud Native Survey reveals near-universal adoption of Kubernetes
The Cloud Native Computing Foundation® (CNCF®), which builds sustainable ecosystems for cloud native software, released its Annual Cloud Native Survey results, revealing that Kubernetes has solidified its role as the ‘operating system’ for AI, with 82% of container users now running Kubernetes in production. The findings illustrate how Kubernetes has become the common denominator for cloud native scale, stability, and innovation, especially as organizations bring AI workloads into production environments.
Kubernetes has evolved beyond container orchestration to become the backbone of enterprise infrastructure, and its role in scaling AI workloads shows how integral cloud native technologies have become to production environments, especially as companies look to operationalize AI.
“Over the past decade, Kubernetes has become the foundation of modern infrastructure,” said Jonathan Bryce, executive director of CNCF. “Now, as AI and cloud native converge, we’re entering a new chapter. Kubernetes isn’t just scaling applications; it’s becoming the platform for intelligent systems. This community has the expertise to shape how AI runs at scale, and we have a massive opportunity to build something open, powerful, and impactful for the next ten years.”
Infrastructure Maturity Is Near-Universal
98% of surveyed organizations reported that they have adopted cloud native techniques, demonstrating that cloud native has clearly moved beyond the “early adopter” phase and established itself as the enterprise standard for deploying and managing modern applications at scale. This shift reflects increased confidence in Kubernetes and related tools, with most organizations now treating cloud native approaches as foundational rather than experimental.
- Production Kubernetes usage has surged: 82% of container users now run Kubernetes in production, up from 66% in 2023.
- Cloud native practices are the norm: 59% of organizations report that “much” or “nearly all” of their development and deployment is now cloud native.
- New adoption is slowing: only 10% of organizations are in the early stages of adoption or not using cloud native at all.
Kubernetes as the AI Platform
The survey highlights a major convergence between AI and cloud native infrastructure, positioning Kubernetes as the preferred platform for running inference workloads at scale.
- Kubernetes adoption for AI inference: 66% of organizations hosting generative AI models use Kubernetes to manage some or all of their inference workloads.
- AI deployment frequency remains cautious: While infrastructure is ready, only 7% of organizations deploy models daily; 47% deploy occasionally.
- Many organizations are still AI consumers: 44% report they do not yet run AI/ML workloads on Kubernetes, underscoring the early stage of AI production maturity.
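To make the inference finding concrete, the sketch below shows one common pattern: treating a model-serving container as an ordinary Kubernetes Deployment, created here with the official Python client. The image name, namespace, replica count, and GPU request are illustrative assumptions, not details from the survey.

```python
# Minimal sketch: deploying a generative-AI inference service to Kubernetes
# with the official Python client (the `kubernetes` package). All names and
# resource values below are illustrative placeholders.
from kubernetes import client, config

def deploy_inference_service() -> None:
    config.load_kube_config()  # or config.load_incluster_config() when running in a pod

    container = client.V1Container(
        name="llm-inference",
        image="registry.example.com/llm-inference:latest",  # hypothetical image
        ports=[client.V1ContainerPort(container_port=8080)],
        resources=client.V1ResourceRequirements(
            requests={"cpu": "2", "memory": "8Gi"},
            limits={"nvidia.com/gpu": "1"},  # assumes the NVIDIA device plugin is installed
        ),
    )

    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="llm-inference"),
        spec=client.V1DeploymentSpec(
            replicas=2,  # scaled horizontally like any other stateless workload
            selector=client.V1LabelSelector(match_labels={"app": "llm-inference"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "llm-inference"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

if __name__ == "__main__":
    deploy_inference_service()
```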
GitOps and Platform Engineering Define ‘Innovators’
The survey identifies a clear link between operational maturity and the use of standardized platforms, as teams increasingly adopt GitOps workflows and internal developer platforms to manage scale and complexity.
- GitOps is a hallmark of maturity: 58% of “cloud native innovators” use GitOps principles extensively, compared to only 23% of “adopters.”
- Developer platforms are accelerating: Backstage, the CNCF project for building internal developer portals, ranks as the #5 CNCF project by velocity.
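For context on what "using GitOps principles extensively" means in practice, here is a deliberately simplified Python sketch of the underlying loop: desired state is declared in a Git repository and a reconciler repeatedly converges the cluster toward it. Production teams rely on tools such as Argo CD or Flux rather than a script like this; the repository URL and paths are hypothetical.

```python
# Illustrative-only sketch of the GitOps pattern: Git holds the desired state,
# and a reconciliation loop applies it to the cluster on a fixed interval.
import subprocess
import time

REPO_URL = "https://github.com/example/platform-config.git"  # hypothetical repo
CLONE_DIR = "/tmp/platform-config"

def sync_once() -> None:
    # Pull the declared (desired) state from Git.
    subprocess.run(["git", "-C", CLONE_DIR, "pull", "--ff-only"], check=True)
    # Apply it declaratively; the API server reconciles live state toward it.
    subprocess.run(["kubectl", "apply", "-f", f"{CLONE_DIR}/manifests/"], check=True)

def reconcile_forever(interval_seconds: int = 60) -> None:
    # Initial clone; fails harmlessly if the directory already exists.
    subprocess.run(["git", "clone", REPO_URL, CLONE_DIR], check=False)
    while True:
        sync_once()
        time.sleep(interval_seconds)

if __name__ == "__main__":
    reconcile_forever()
```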
Observability Is the Second Most Active Frontier
OpenTelemetry has emerged as a dominant force in the ecosystem, reflecting how observability is evolving from a siloed tooling decision into a strategic pillar of cloud native operations.
- OpenTelemetry leads project velocity: It is now the second-highest-velocity CNCF project, with more than 24,000 contributors and widespread community support.
- Profiling adoption signals new priorities: Nearly 20% of respondents now report using profiling as part of their observability stack.
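As a concrete illustration of what OpenTelemetry adoption looks like at the code level, the minimal Python sketch below emits a single trace span to the console. In a real deployment an OTLP exporter pointed at an OpenTelemetry Collector would typically replace the console exporter; the service, span, and attribute names here are assumptions for illustration.

```python
# Minimal sketch of application-side OpenTelemetry tracing in Python,
# using the opentelemetry-api and opentelemetry-sdk packages.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Configure a tracer provider that exports spans to stdout; production setups
# usually swap in an OTLP exporter pointed at a collector.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("inference-service")  # instrumentation scope name (illustrative)

def handle_prediction(prompt: str) -> str:
    # Wrap each request in a span so latency and errors become observable.
    with tracer.start_as_current_span("handle_prediction") as span:
        span.set_attribute("app.prompt_length", len(prompt))
        return "ok"  # placeholder for real model inference

if __name__ == "__main__":
    handle_prediction("hello")
```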
Cultural Challenges Have Overtaken Technical Complexity
For the first time, the primary challenge to cloud native adoption is not technical—it’s organizational. As more teams standardize on cloud native tools, the biggest obstacles have shifted from tool complexity and training to internal communication, team dynamics, and leadership alignment.
- Culture outpaces technical hurdles: “Cultural changes with the development team” is now the top challenge, cited by 47% of respondents.
- Traditional blockers take a back seat: In 2025, respondents ranked lack of training (36%), security (36%), and complexity (34%) lower than in previous years.
What’s Next for Cloud Native
As Kubernetes becomes the platform of choice for AI workloads and organizations scale their deployments, the next wave of innovation will hinge on resolving cultural adoption barriers, investing in platform engineering, and evolving security and observability standards.
“Enterprises are aligning around Kubernetes because it has proven to be the most effective and reliable platform for deploying modern, production-grade systems at scale—including AI—and because of the ecosystem and community that support it,” said Hilary Carter, senior vice president of research at Linux Foundation Research. “This year’s data shows that the next phase of cloud native evolution will be as much about people and platforms as it is about the tech itself. Organizations that invest in both will have a clear advantage.”

