Exposing OpenAI-Compatible APIs from GitHub Copilot Models


Comments (1)

privapps · 16m ago
I recently built a Go application called github-copilot-svcs https://github.com/privapps/github-copilot-svcs that provides an OpenAI-compatible API interface from GitHub Copilot.

Why This Matters: GitHub Copilot provides access to several LLMs, but other AI agents might make better use of them. The application also lets you use your Copilot subscription to power other AI applications, such as proofs of concept (POCs).

How It Works: The project is a reverse-engineered proxy of the GitHub Copilot API, exposed as an OpenAI-compatible service. It acts as a middleware layer, translating OpenAI API requests into calls to GitHub Copilot models, so any tool that supports the OpenAI Chat Completions API can integrate with it. Go's concurrency model and efficient HTTP handling keep response latency low, which makes it usable for real-time applications.
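To give a rough idea of the translation layer, here is a minimal sketch of such a proxy handler. This is not the project's actual code; the Copilot endpoint URL, header names, listening port, and the COPILOT_TOKEN environment variable are all assumptions for illustration:

```go
package main

import (
	"bytes"
	"io"
	"log"
	"net/http"
	"os"
)

// Assumed upstream endpoint; the real proxy reverse-engineers these details.
const copilotChatURL = "https://api.githubcopilot.com/chat/completions"

func chatCompletions(w http.ResponseWriter, r *http.Request) {
	body, err := io.ReadAll(r.Body)
	if err != nil {
		http.Error(w, "bad request", http.StatusBadRequest)
		return
	}

	// Forward the OpenAI-style JSON body upstream, swapping in a Copilot
	// bearer token for whatever API key the client sent.
	req, err := http.NewRequestWithContext(r.Context(), http.MethodPost, copilotChatURL, bytes.NewReader(body))
	if err != nil {
		http.Error(w, "upstream request failed", http.StatusInternalServerError)
		return
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+os.Getenv("COPILOT_TOKEN")) // hypothetical token source

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		http.Error(w, "upstream error", http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()

	// The upstream response is already OpenAI-shaped, so stream it back as-is.
	w.Header().Set("Content-Type", resp.Header.Get("Content-Type"))
	w.WriteHeader(resp.StatusCode)
	io.Copy(w, resp.Body)
}

func main() {
	http.HandleFunc("/v1/chat/completions", chatCompletions)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The real service has to handle auth token refresh, model listing, and streaming edge cases on top of this, but the core idea is the same: accept Chat Completions requests, forward them to Copilot, and relay the responses.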

Use Cases: While the models themselves come from GitHub Copilot, you can put other AI agents in front of them, such as QwenCode, Cline, Roo Code, Crush, or OpenCode, or use the API to power your own application.
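Since the proxy speaks the Chat Completions protocol, wiring up your own application is just a matter of pointing an OpenAI-style client at it. A minimal example, assuming the proxy listens on localhost:8080 and exposes /v1/chat/completions (check the README for the actual port and path):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Standard Chat Completions payload; the model name is whatever
	// Copilot makes available through the proxy.
	payload := []byte(`{"model":"gpt-4o","messages":[{"role":"user","content":"Hello"}]}`)

	resp, err := http.Post("http://localhost:8080/v1/chat/completions", "application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```

Tools like Cline or OpenCode typically only need a custom base URL and a dummy API key in their OpenAI provider settings to work the same way.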

I'd love to hear your thoughts on how this approach can be improved or extended. Whether you're building a new application or integrating it into an existing workflow, feel free to share your ideas or feedback here!