Inference Gateway 0.17.0 has just been released with automatic service discovery for A2A (Agent-to-Agent) resources. This means any agent or tool you deploy into your Kubernetes cluster can now register and deregister itself with the gateway, with no manual configuration required.
The goal is to make AI agent orchestration as seamless as possible without introducing unnecessary complexity such as service meshes, sidecars, or authentication setups. Support for identity-based workload discovery might come later.
The A2A spec lets you expose standard HTTP tools and even full MCP-compatible agents. The gateway watches for these agent resources and connects them dynamically, enabling composable AI services inside Kubernetes.
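To make the discovery flow concrete, here is a minimal sketch of what an agent resource might look like. Note that the API group, kind, and field names below are illustrative assumptions, not the gateway's actual schema; check the project's documentation for the real CRD.

```yaml
# Hypothetical A2A agent resource — the group/version, kind, and field
# names are assumptions for illustration only.
apiVersion: a2a.example.com/v1alpha1
kind: Agent
metadata:
  name: summarizer-agent
  namespace: agents
spec:
  # In-cluster endpoint the gateway would route to once discovered
  url: http://summarizer-agent.agents.svc.cluster.local:8080
  # Protocol the agent speaks, e.g. plain HTTP tools or MCP
  protocol: mcp
```

Under this model, applying the manifest would be enough for the gateway's watch to pick the agent up and connect it, and deleting the resource would deregister it, with no gateway restart or config edit in between.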
I’d love to hear feedback, especially from those building agent based workflows or orchestration systems.