Show HN: CodeMode – First library for tool calls via code execution (github.com)

🤖 AI Summary
CodeMode (UTCP/@utcp/code-mode) is a new library that turns LLMs into code executors rather than direct tool callers: instead of exposing hundreds of tool APIs to the model, you give it one TypeScript execution endpoint that can call registered tools (MCP, HTTP, file, CLI). In three lines you create a client, register tools, and run client.callToolChain(...) to execute multi-step workflows in a sandboxed Node VM. The system offers runtime interface discovery, TypeScript interface generation for IDE integration, streaming console logs, configurable timeouts (default 30s), and strict sandboxing (no filesystem or network access outside registered servers).

For practitioners this is significant: independent benchmarks and internal claims report large wins in latency, token usage, and cost — 60% faster execution, 68% fewer tokens, 88% fewer API round trips, and up to 98.7% context reduction for complex workflows; an independent Python benchmark projects ~$9.5k/year savings at 1,000 scenarios/day.

CodeMode batches many API calls into a single execution, avoids repeated context re-processing, supports mixed tool ecosystems, and preserves observability and error handling (returning {result, logs}). The approach is especially useful for complex or batch workflows (8+ tools), where traditional tool orchestration requires many LLM iterations and API round trips.
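The pattern the summary describes can be sketched with Node's built-in `vm` module: tools are registered as plain functions, a model-generated script runs once inside a sandboxed context, and the host returns `{result, logs}`. This is an illustrative sketch of the technique, not CodeMode's actual API — `runToolChain` and the tool names here are made up for the example.

```typescript
import * as vm from "node:vm";

type ToolFn = (...args: unknown[]) => unknown;

// Run a model-generated script in a sandbox that exposes only the
// registered tools and a log-capturing console -- no fs, no network.
function runToolChain(
  tools: Record<string, ToolFn>,
  script: string,
  timeoutMs = 30_000 // mirrors the summary's 30s default timeout
): { result: unknown; logs: string[] } {
  const logs: string[] = [];
  const sandbox = {
    tools,
    console: { log: (...args: unknown[]) => logs.push(args.join(" ")) },
  };
  const context = vm.createContext(sandbox);
  // The script's completion value becomes the result.
  const result = vm.runInContext(script, context, { timeout: timeoutMs });
  return { result, logs };
}

// Two "tool calls" batched into one execution instead of two LLM round trips.
const tools: Record<string, ToolFn> = {
  add: (a, b) => (a as number) + (b as number),
  double: (x) => (x as number) * 2,
};

const { result, logs } = runToolChain(
  tools,
  `
  const sum = tools.add(2, 3);
  console.log("sum:", sum);
  tools.double(sum);
  `
);
// result === 10, logs === ["sum: 5"]
```

The key idea is that the model emits one script covering the whole workflow, so intermediate values (like `sum` above) flow between tools inside the sandbox instead of round-tripping through the model's context.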