Claude Code Observability via LiteLLM and OpenInference

Dylancouzon · 8/22/2025, 3:23:11 PM · github.com

Comments (2)

Dylancouzon · 5h ago
We (Dylan, Alex, Adam) built this to answer “what actually happened?” in Claude Code runs to debug things like nested tool calls, prompt bloat, token/cost spikes... It’s an open proxy on LiteLLM that emits OTEL + OpenInference spans to Arize-Phoenix.Giving you access to Full traces with internal prompt content, tool calls, streaming chunks, costs, and latency. No signup; works with the Claude Code CLI unchanged; MIT-licensed.

Any feedback welcome, hopefully folks find it useful. Full blog: https://arize.com/blog/claude-code-observability-and-tracing...
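
For anyone curious how a setup like this is typically wired, here is a rough sketch based on the public LiteLLM, Phoenix, and Claude Code docs, not necessarily this project's exact scripts. The model name, ports, and the `otel` callback choice are assumptions; check each tool's documentation:

```shell
# Hypothetical wiring sketch: Claude Code CLI -> LiteLLM proxy -> OTLP spans -> Phoenix.

# 1. Install and run Phoenix locally (serves a UI and an OTLP endpoint).
pip install arize-phoenix litellm
phoenix serve &

# 2. Minimal LiteLLM proxy config with an OpenTelemetry callback enabled.
#    Model name and callback are placeholders; adjust to your setup.
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514
      api_key: os.environ/ANTHROPIC_API_KEY
litellm_settings:
  callbacks: ["otel"]
EOF

# 3. Point the OTLP exporter at the local Phoenix instance and start the proxy.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:6006"
litellm --config litellm_config.yaml --port 4000 &

# 4. Route the unchanged Claude Code CLI through the proxy.
export ANTHROPIC_BASE_URL="http://localhost:4000"
claude "explain this repo"
```

The key point is step 4: because Claude Code honors a base-URL override, the CLI itself needs no changes; everything it sends passes through the proxy, where the callback turns each request into spans.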

unwoven · 5h ago
Adam here, was surprised that we didn't have this for Claude Code Max yet, I imagine so many people are using it!