Show HN: Expose Copilot as a standard OpenAI-style API for your local toolchain (github.com)

🤖 AI Summary
Copilot Bridge is a VS Code extension that exposes your paid GitHub Copilot session as a local, OpenAI-compatible HTTP gateway. With the editor open, it runs a Polka server bound to 127.0.0.1 and implements /v1/chat/completions, /v1/models, and /health endpoints (including SSE streaming), letting you curl completions, point existing OpenAI clients at a new baseURL, or wire Copilot into scripts, CI, desktop shells (Raycast, Alfred), and other tooling without routing traffic to additional vendors. Under the hood it uses the VS Code Language Model API to discover Copilot models (e.g., gpt-4o-copilot or family keywords), normalizes conversation history, enforces concurrency limits (returning early 429s), and can require an optional bearer token.

This matters for the AI/ML community because it preserves local control, latency, and data locality while enabling rapid experimentation with prompts, agents, and automation flows that expect an OpenAI-style API. The tradeoffs and behaviors are documented: the bridge always returns exactly one choice, omits some OpenAI metadata fields, accepts but no-ops extra request options, streams deltas as SSE (clients should replace previous fragments), and stops when VS Code closes. It is explicitly single-user and loopback-only (do not expose it remotely). The project is Apache-2.0 licensed and aims to simplify integrating Copilot into local toolchains while maintaining compatibility with existing OpenAI-based workflows.
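Because the bridge speaks the OpenAI chat-completions wire format, a streaming client needs only a few lines of SSE parsing. Below is a minimal sketch of such a consumer, assuming the documented behavior that each streamed fragment replaces the previous one; the sample payload and chunk shape follow the generic OpenAI streaming format and are illustrative, not captured bridge output:

```python
import json

def latest_content(sse_lines):
    """Extract the assistant's text from OpenAI-style SSE chunks.

    Per the bridge's documented behavior, each streamed fragment
    replaces the previous one, so we keep only the most recent
    delta rather than concatenating fragments.
    """
    content = ""
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alives and comment lines
        payload = line[len("data: "):]
        if payload == "[DONE]":  # OpenAI-style stream terminator
            break
        chunk = json.loads(payload)
        # The bridge always returns a single choice.
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            content = delta["content"]
    return content

# Illustrative stream (hypothetical sample data):
stream = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello, world"}}]}',
    "data: [DONE]",
]
print(latest_content(stream))  # -> Hello, world
```

For non-streaming use, an existing OpenAI SDK client should work by pointing its base URL at the loopback gateway (e.g., `http://127.0.0.1:<port>/v1`, with the optional bearer token supplied as the API key); the exact port is whatever the extension reports.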