Show HN: Built an MCP server using Cloudflare's Code Mode pattern (github.com)

🤖 AI Summary
A developer released a local implementation of the "Code Mode" workflow for Model Context Protocol (MCP) servers that replaces brittle LLM tool-calling with a single execute_code primitive. Instead of exposing dozens of tools and forcing the model to issue correctly formatted calls, the LLM writes TypeScript/JavaScript that runs in a Deno sandbox and uses fetch() to hit a local proxy (default http://localhost:3001/mcp/*). The proxy transparently forwards requests to configured MCP servers and returns structured JSON results, while the execute_code endpoint runs the code (TypeScript by default) with network-only access, no filesystem or env access, temporary-file cleanup, and a 30s timeout.

Technically, the project ships endpoints for discovery (GET /mcp/servers, GET /mcp/{server}/tools) and invocation (POST /mcp/call with {server, tool, args}), plus a detailed JSON tool schema optimized for LLM consumption. It uses Bun for dependencies, Deno for sandboxed execution, and works with MCP-compatible clients (Claude Desktop, Cursor, Copilot).

Significance: by leveraging LLMs' superior ability to write and chain code, this pattern enables natural composition, richer client-side logic, and safer orchestration of MCP tools without complex tool-interface training. Future work ideas include injecting helper wrappers (mcp.tool(...)) and richer config/filtering, while the repo remains a practical experiment to test the workflow.
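As a rough illustration of the pattern, here is a sketch of the kind of code an LLM might emit inside execute_code. The proxy base URL and the endpoint shapes (GET /mcp/{server}/tools, POST /mcp/call with {server, tool, args}) come from the summary above; the specific server and tool names ("github", "search_issues") and the helper function names are illustrative assumptions, not part of the project.

```typescript
// Assumed proxy base URL (the project's documented default).
const MCP_BASE = "http://localhost:3001/mcp";

interface CallBody {
  server: string;
  tool: string;
  args: Record<string, unknown>;
}

// Build the JSON body POST /mcp/call expects: {server, tool, args}.
function buildCallBody(
  server: string,
  tool: string,
  args: Record<string, unknown>,
): CallBody {
  return { server, tool, args };
}

// Discovery: list the tools one configured MCP server exposes.
async function listTools(server: string): Promise<unknown> {
  const res = await fetch(`${MCP_BASE}/${server}/tools`);
  return res.json();
}

// Invocation: call a tool through the proxy, which forwards the
// request to the underlying MCP server and returns structured JSON.
async function callTool(
  server: string,
  tool: string,
  args: Record<string, unknown>,
): Promise<unknown> {
  const res = await fetch(`${MCP_BASE}/call`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCallBody(server, tool, args)),
  });
  if (!res.ok) throw new Error(`MCP call failed: ${res.status}`);
  return res.json();
}

// Hypothetical usage: because this is ordinary code, results can be
// inspected and chained with plain control flow instead of separate
// tool-call round trips, e.g.:
//   const tools = await listTools("github");
//   const hits = await callTool("github", "search_issues", { query: "bug" });
```

Since the sandbox allows network access only, plain fetch() against the local proxy is the sole bridge back to the MCP servers, which is what keeps the composition logic on the code side.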