At the KubeCon North America 2025 conference held in Atlanta, leaders from the Cloud Native Computing Foundation (CNCF) forecast substantial growth in cloud-native computing spending, driven by the expanding use of AI inference workloads. The CNCF anticipates hundreds of billions of dollars in spending in this area over the next 18 months.
AI and Cloud-Native Computing Integration
The integration of AI workloads, particularly inference tasks, into cloud-native computing marks a new era in which intelligent applications demand scalable, reliable infrastructure. According to CNCF Executive Director Jonathan Bryce, AI is transitioning from being handled by a few 'training supercomputers' to widespread 'enterprise inference,' which he describes as fundamentally a cloud-native issue. He emphasized the role of platform engineers in building the open-source platforms that will enable enterprise AI.
Growth in AI Inference Workloads
CNCF CTO Chris Aniszczyk highlighted the merging of cloud-native and AI-native development as a pivotal moment. He cited data from Google showing a significant increase in internal inference jobs, which recently processed 1.33 quadrillion tokens per month, up from 980 trillion just months prior. This growth underscores the need for cloud-native projects, such as Kubernetes, to adapt to large-scale inference workloads. The latest Kubernetes release includes a dynamic resource allocation (DRA) feature that abstracts GPU and TPU hardware, enhancing its capability to manage AI tasks.
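As a rough illustration of how dynamic resource allocation surfaces in practice, the sketch below shows a pod requesting a GPU through a resource claim rather than a fixed device count. The image name, claim template name, and device class are hypothetical placeholders; the exact schema depends on the Kubernetes version and the vendor's DRA driver.

```yaml
# Hedged sketch: a pod that obtains a GPU via a DRA resource claim
# (names such as gpu-claim-template and the image are illustrative only).
apiVersion: v1
kind: Pod
metadata:
  name: inference-pod
spec:
  containers:
  - name: model-server
    image: example.com/inference:latest   # placeholder image
    resources:
      claims:
      - name: gpu                         # references the claim below
  resourceClaims:
  - name: gpu
    resourceClaimTemplateName: gpu-claim-template  # template provided by a DRA driver
```

The key shift from the older device-plugin model is that the scheduler resolves the claim against a device class advertised by the hardware driver, which is what allows GPU and TPU details to be abstracted away from the workload definition.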
Certification and Standardization Initiatives
To address the growing demand, the CNCF has introduced the Certified Kubernetes AI Conformance Program. This initiative aims to ensure AI workloads are as portable and reliable as traditional cloud-native applications. Aniszczyk noted that as AI moves into production, teams require consistent infrastructure. The program will establish shared guidelines to ensure AI workloads perform predictably across different environments, building on community-driven standards used in Kubernetes to support scalable AI adoption.
Economic Impact
The CNCF predicts that the integration of AI inference into cloud-native infrastructure and services will result in significant financial investments, potentially reaching hundreds of billions of dollars within the next 18 months. This surge is driven by enterprises eager to implement reliable and cost-effective AI services, spurred by the advancements and standardization efforts led by the CNCF.