Docker is expanding the Docker Compose spec to accommodate AI agents in an effort to bring AI development closer to existing software development workflows.
Updates from Docker Inc. to its Docker Compose tool this week aim to integrate AI agent applications into enterprise development, including deployments to production with a familiar command.
Docker Compose, first introduced in 2014, uses YAML-based Compose files to define and run multi-container applications. The open source Docker Compose spec followed in 2020 to standardize the Compose file format, which is organized under top-level elements such as services, networks, storage volumes, configuration data and secrets. This week, the Docker Compose spec added a new top-level element -- models.
This update enables Docker Compose users to define large language models for their agentic applications within Compose files, connect agents to tools via the Model Context Protocol (MCP) and configure agents from multiple application frameworks in the same Compose file. Users can also deploy these applications to cloud services using the same docker compose up command as other containerized applications. To support the update, Docker expanded its partnership with Google Cloud this week to enable deployments of AI agents using Docker Compose on Google Cloud Run with a new gcloud run compose up command.
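For illustration, a Compose file using the new element might look like the following minimal sketch. The service name, model reference and environment-variable behavior here are assumptions for illustration, not an official Docker example:

services:
  agent:
    build: .            # the agentic application itself
    models:
      - llm             # binds the model below to this service, typically by
                        # injecting its endpoint as environment variables (assumed)

models:
  llm:
    model: ai/smollm2   # illustrative model reference, pulled from a
                        # registry much like a container image

Because the model is declared as part of the application, the same docker compose up that starts the containers also provisions the model, whether locally or, through the new Google Cloud integration, on Cloud Run.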
Industry analysts said integrating AI applications, including AI agents, into the existing software development lifecycle will be a significant step forward for enterprise AI agent projects, which often stall at the proof-of-concept stage.
"Defining the entire application in a single YAML file is a critical precondition for the release pipeline to handle application code and AI models as two parts of one application that have to be released in sync," said Torsten Volk, an analyst at Enterprise Strategy Group, now part of Omdia. "Synchronizing the probabilistic AI part with the deterministic code-based part of the application is the only way to consistently achieve reliable results, and it is also critical to scale agentic AI apps across the enterprise."
Docker will support CrewAI, Embabel, Google's Agent Development Kit, LangGraph, Spring AI, and Vercel's AI SDK with Docker Compose integrations. Deployments for AI agents using Docker Compose files on Microsoft Azure Container Apps will be available soon, according to Docker officials. The company is in talks with AWS, but has not yet finalized a similar partnership with the hyperscaler for this update.
This week's update included another new feature called Docker Offload, which lets Docker Desktop users tap GPUs hosted in Docker's cloud from their local environment during the design and development phase of creating AI agents. Docker launched Docker Build Cloud in 2024 for cloud offload during the build stage of software delivery.
Finally, this week Docker open sourced its MCP Gateway, first introduced as the MCP Toolkit in April. The MCP Gateway became generally available with version 4.43 of Docker Desktop on July 3 and adds a security enforcement point between autonomous AI agents and the tools they access.
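Run as a container, the gateway sits between an agent and its MCP servers so that tool access flows through a single, auditable point. A hypothetical sketch of such a service follows; the image name, port and flag mirror the open source project's conventions but should be verified against the docker/mcp-gateway repository:

services:
  mcp-gateway:
    image: docker/mcp-gateway   # assumed image name from the open source project
    ports:
      - "8811:8811"             # agents connect here instead of reaching tools directly
    command:
      - --servers=fetch         # assumed flag: expose only vetted MCP servers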
The Docker Compose spec now supports defining AI models, tools and agents as part of an application.
AI development shakes up multi-cloud management
Docker highlighted that because its MCP Gateway and the Compose spec are open source, AWS users can wire the two together on their own, but one analyst said a supported integration with the largest public cloud provider will be essential.
"Maybe a startup would do what they're talking about," said Larry Carvalho, an independent consultant at RobustCloud. "But without support? I don't see any enterprise doing it."
Overall, Docker and its commercial Desktop product must also continue to demonstrate value beyond the existing customer base to remain relevant in the AI race, according to Carvalho.
Cloud computing market dynamics are shifting due to generative AI, he said. While Google Cloud, Azure and AWS remain dominant, providers such as Oracle Cloud Infrastructure have reported strong growth this year. CoreWeave, an AI-focused cloud provider, was the first this month to receive the latest Blackwell Ultra chips from its investor Nvidia, potentially shaking up the cloud market even more. Meanwhile, on-premises and self-managed deployments for AI inferencing appear to be gaining traction among some enterprises.
All of that opens up multi-cloud management opportunities for vendors such as Docker, if they position themselves as a neutral, centralized platform for developing and deploying AI applications, backed by a broad set of provider partnerships that support the new version of Compose, Carvalho said.
"Docker has a good story about abstracting complexity, but they could have a better story," he said.
Docker is part of the AWS Partner Network and the AWS ISV Accelerate program.
"While we haven't formally announced integration partnerships with AWS, Oracle or IBM at this time for Compose, we're actively in discussions with major cloud providers, including AWS, to explore deeper collaboration, especially where it benefits real-world developer workflows," a Docker spokesperson said in an emailed statement this week to Informa TechTarget.
Beth Pariseau, a senior news writer for Informa TechTarget, is an award-winning veteran of IT journalism covering DevOps. Have a tip? Email her or reach out @PariseauTT.