Show HN: AutoDocs – Reduce AI costs and never manage context again (github.com)

🤖 AI Summary
AutoDocs is an open-source tool from Sita that automatically generates repository‑wide, dependency‑aware documentation by parsing a codebase with tree‑sitter (for ASTs) and SCIP (for symbol resolution). It builds a dependency graph over files, definitions, calls, and imports, topologically sorts it, and then traverses the graph to produce high‑signal summaries and docs. The stack comprises a Python FastAPI ingestion/search service that writes per‑repo SQLite analysis stores, a Next.js web UI for chat and exploration, and an MCP HTTP server (codebase-qna) so agentic tools can deep‑search repos. It integrates LLM summaries and embeddings (OpenAI/OpenRouter compatible), supports configurable rate limits for batched requests, and ships as a local Docker Compose setup (pnpm, uv, and Postgres required); hosted and enterprise offerings are available via a waitlist.

For AI/ML teams and agent builders this matters because dependency‑aware summarization preserves the true context and call order of code, cutting redundant context sent to LLMs, lowering inference costs, and improving answer accuracy for code Q&A. The MCP endpoint lets coding agents query a scoped knowledge store rather than re‑ingest or stream raw files.

Current caveats: deployment is local-first, only TypeScript/JavaScript and Python are supported (mixed-language repos are not handled yet), and GitHub API usage may require a PAT. Licensed under Apache 2.0, AutoDocs appears aimed at integration into agent workflows and CI to keep docs and code Q&A current with minimal manual maintenance.
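To make the dependency-aware pass concrete, here is a minimal Python sketch of the core idea: build a file-level dependency graph, topologically sort it with the standard library's graphlib, and summarize each file only after its dependencies so their summaries are available as context. The file names, graph, and summarize_file stub are hypothetical; AutoDocs derives the real graph from tree-sitter/SCIP analysis and calls an LLM for the summaries.

```python
# Minimal sketch (not AutoDocs' implementation): dependency-aware summarization
# over a hand-specified file dependency graph.
from graphlib import TopologicalSorter

# Map each file to the files it depends on (imports). Hypothetical repo layout.
deps: dict[str, set[str]] = {
    "utils.py": set(),
    "db.py": {"utils.py"},
    "api.py": {"db.py", "utils.py"},
    "main.py": {"api.py"},
}

def summarize_file(path: str, dep_summaries: dict[str, str]) -> str:
    """Stand-in for an LLM call: a real implementation would send the file's
    source plus the summaries of its dependencies as context."""
    context = "; ".join(f"{d}: {s}" for d, s in dep_summaries.items()) or "no dependencies"
    return f"summary of {path} (given {context})"

summaries: dict[str, str] = {}
# static_order() yields dependencies before dependents, matching the
# topological traversal described in the summary above.
for path in TopologicalSorter(deps).static_order():
    dep_summaries = {d: summaries[d] for d in deps[path]}
    summaries[path] = summarize_file(path, dep_summaries)

for summary in summaries.values():
    print(summary)
```

Because leaves like utils.py are summarized first, each later call sees already-distilled descriptions of its dependencies instead of their raw source, which is the mechanism the summary credits with reducing redundant context sent to the LLM.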