Show HN: MyLocalAI now has Google Search – Local AI with web access (github.com)

🤖 AI Summary
MyLocalAI is an open-source, fully local Next.js chat app that adds real-time Google Search and web scraping to a local LLM pipeline. Built with Next.js 15, LangGraph agent orchestration, and MCP (Model Context Protocol) tools, it routes user queries to web tools (Google Search, URL scraping) that the agent can call, then feeds the results to a locally hosted LLM (served via Ollama; Qwen 3 14B is the recommended model). Responses stream back to the UI over Server-Sent Events (SSE) for low-latency, incremental output; the UI surfaces tool calls as they happen, renders Markdown with code highlighting, and persists conversation threads via an SQLite checkpointer. No cloud API keys are required: you run Ollama and the app on your own machine.

This matters because it gives researchers and practitioners a reproducible, privacy-focused way to combine up-to-date web evidence with local models and agent reasoning.

Key technical points: LangGraph handles complex reasoning and tool orchestration, MCP provides a pluggable tool SDK, the SSE streaming endpoint is exposed at a /langraph_backend route, and tools live under app/mcp_server with a transport route for registration. System requirements are Node 18+, Ollama, and a recommended 16GB+ of RAM for 14B models (7B/4B models are suggested for lower-resource machines). The repo is developer-friendly (change models in app/page.tsx, add tools under app/mcp_server/tools), MIT-licensed, and targets workflows that need current facts without sending data to external LLM APIs.
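Since the summary says responses stream over SSE from a /langraph_backend route, here is a minimal TypeScript sketch of how a client might consume such a stream. The endpoint path, request body shape, and event payload format are assumptions for illustration, not the repo's actual API contract.

```typescript
// Minimal sketch of consuming an SSE chat stream with fetch (Node 18+ or browser).
// The endpoint path and JSON body below are assumptions, not MyLocalAI's real API.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("/langraph_backend", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: prompt }), // hypothetical request shape
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE frames are separated by a blank line; payload lines start with "data:".
    const frames = buffer.split("\n\n");
    buffer = frames.pop() ?? ""; // keep any incomplete frame for the next chunk
    for (const frame of frames) {
      for (const line of frame.split("\n")) {
        if (line.startsWith("data:")) {
          process.stdout.write(line.slice(5).trimStart()); // incremental output
        }
      }
    }
  }
}

streamChat("What changed in Next.js 15?").catch(console.error);
```

This is the standard pattern for low-latency incremental output: the server flushes each token as an SSE frame, and the client renders it immediately instead of waiting for the full completion.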
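The summary also notes that tools live under app/mcp_server/tools behind a transport route. As a rough illustration of what adding a web tool could look like, here is a hypothetical URL-scraping tool in plain TypeScript; the ToolDefinition interface, tool name, and registration details are invented for this sketch and will differ from the project's actual MCP SDK wiring.

```typescript
// Hypothetical shape of a web tool, for illustration only; the real project
// registers tools through the MCP SDK, whose types differ from this sketch.
interface ToolDefinition<I, O> {
  name: string;
  description: string; // shown to the agent so it knows when to call the tool
  run: (input: I) => Promise<O>;
}

// A minimal URL scraper: fetch a page and strip tags down to plain text
// so the agent can feed current web content to the local model.
export const scrapeUrl: ToolDefinition<{ url: string }, { text: string }> = {
  name: "scrape_url",
  description: "Fetch a web page and return its visible text content.",
  run: async ({ url }) => {
    const res = await fetch(url);
    if (!res.ok) throw new Error(`Fetch failed: ${res.status}`);
    const html = await res.text();
    const text = html
      .replace(/<script[\s\S]*?<\/script>/gi, "")
      .replace(/<style[\s\S]*?<\/style>/gi, "")
      .replace(/<[^>]+>/g, " ")
      .replace(/\s+/g, " ")
      .trim();
    return { text: text.slice(0, 8000) }; // cap output to fit the model context
  },
};
```

The description field matters more than it looks: the agent decides whether to call a tool based on that text, so a precise description is what makes LangGraph's tool routing reliable.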