Compose Spec Updated with <models> for AI Workloads

16 points by pploug · 10 comments · 7/2/2025, 3:19:31 PM · github.com ↗

Comments (10)

AsmodiusVI · 15h ago
Just saw this in the release notes, super excited to try it out.

Compose supporting an AI agent stack out of the box could be a big deal. I just saw a Reddit thread about using Compose in production, and I'm thinking I can put these two together. Spinning up a whole multi-container setup for an LLM app (vector DB, orchestrator, backend, etc.) with a single `compose up` could be what actually gets this stuff to prod. I've been hacking around and struggling with exactly this. Curious to see how far I can push it, might finally be the clean setup I can share.
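As a rough sketch of what that stack might look like with the new top-level `models` element (service names, images, and the `ai/smollm2` model are illustrative assumptions, not from the spec announcement):

```yaml
services:
  backend:
    image: my-backend:latest   # hypothetical app image
    depends_on:
      - vectordb
    models:
      - llm                    # short syntax: attach the model declared below

  vectordb:
    image: qdrant/qdrant       # example vector DB for the RAG part of the stack

models:
  llm:
    model: ai/smollm2          # model artifact pulled like an image
```

With something like this, `docker compose up` brings up the app containers and makes the declared model available to the `backend` service in one shot.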

mikesir87 · 15h ago
Excited to see models get first-class support! Will be interesting to see what opportunities this opens with this being formalized (other tooling/integrations, etc.).

Note that Compose files using the previous model declarations (using providers) will still work.
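For reference, the older provider-based declaration that remains supported looked roughly like this (a sketch; the model name is an illustrative assumption):

```yaml
services:
  llm:
    provider:
      type: model
      options:
        model: ai/smollm2
```

The new top-level `models` element formalizes this, so models are declared alongside `services` rather than as a pseudo-service.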

pploug · 15h ago
The big question is still - how will the models reach production?
kiview · 15h ago
You can check out my blog post on publishing models to Docker Hub using Docker Model Runner :) https://www.docker.com/blog/publish-ai-models-on-docker-hub/
pploug · 15h ago
Well, I'm more wondering how I go from running models locally to a production app with the model available - I guess I'd swap it for a cloud offering?
kiview · 15h ago
You can run the same model in production, using Docker Model Runner in Docker CE. This is assuming the model works for your production use case :)
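One way this portability can work (a sketch based on my reading of the Compose model docs; the variable names I pick here are assumptions): the long syntax lets you control which environment variables carry the model's endpoint and identifier, so the app code reads env vars and doesn't care whether the backend is Model Runner locally or a hosted endpoint in prod.

```yaml
services:
  app:
    image: my-app:latest           # hypothetical app image
    models:
      llm:
        endpoint_var: AI_MODEL_URL   # env var the app reads for the model endpoint
        model_var: AI_MODEL_NAME     # env var the app reads for the model id

models:
  llm:
    model: ai/smollm2              # illustrative model reference
```

In production you could then point `AI_MODEL_URL` at a cloud-hosted OpenAI-compatible endpoint instead, without touching the app code.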
djdto · 15h ago
Following this.
nunez · 12h ago
This is brilliant.
pianoswim · 13h ago
super pumped to see models get this support through docker
asuarezfr · 15h ago
That sounds interesting