🤖 AI Summary
Local Browser AI is an open-source Chrome/Edge extension that demonstrates the new browser Prompt API by running small language models (SLMs) locally in the side panel. The project uses plain JavaScript, requires minimal permissions, makes no remote calls and does no tracking, and is available on GitHub under the MIT license. It showcases the benefits of Edge AI—privacy (prompts never leave your machine), low latency, and offline usage—in a familiar chat UI, while relying on the browser to handle model download and caching so that many web apps can share one local model instead of each shipping a multi-gigabyte download.
Technically, the Prompt API exposes a global `LanguageModel` object whose `create()` method returns a stateful session: you send prompts and the browser manages the internal conversation history (which isn't directly readable). Sessions can be cloned but not reconfigured, and system prompts are passed as options at creation time. Only temperature and topK are configurable per session, so you must recreate a session to change them. Chrome currently supports multimodal inputs (audio/image transcription) and languages such as en/es/ja; context windows depend on hardware (example: RTX 4070 → Chrome ~9k tokens, Edge ~4k). Requirements are nontrivial—you'll need sufficient RAM/VRAM and disk (model downloads are 4–6+ GB; browsers recommend ~20–22 GB free), and the Prompt API is only available to extensions today. The extension is a practical reference for building local, privacy-first AI web apps while highlighting the trade-offs of Edge AI: upfront hardware/disk cost and model quality vs. speed, cost savings, and control.
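The session lifecycle described above can be sketched roughly as follows — a minimal, hedged example assuming a browser (or extension context) where the Prompt API is enabled; the `LanguageModel.create()` / `session.prompt()` names follow the published API surface, and the system-prompt text is illustrative:

```javascript
// Minimal sketch of a Prompt API session (assumes Chrome/Edge with the
// Prompt API enabled, e.g. inside an extension's side panel script).
async function askLocalModel(question) {
  // Feature-detect: the global LanguageModel object only exists in
  // supporting browsers, so fall back gracefully elsewhere.
  if (typeof LanguageModel === 'undefined') {
    return null; // Prompt API not available in this environment
  }

  // System prompts are passed once, at creation time, via initialPrompts.
  // temperature and topK are fixed for the session's lifetime — to change
  // them you must create a new session.
  const session = await LanguageModel.create({
    initialPrompts: [{ role: 'system', content: 'Answer briefly.' }],
    temperature: 0.7,
    topK: 3,
  });

  // The browser appends each exchange to internal history that your
  // code cannot read back directly.
  const answer = await session.prompt(question);

  session.destroy(); // release the model resources held by this session
  return answer;
}
```

Because session parameters are immutable, a settings UI (like the extension's) typically tears down the old session and calls `LanguageModel.create()` again with the new temperature/topK rather than mutating anything in place.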