Show HN: Rebuilt Bible search app to run 100% client-side with Transformers.js (www.biblos.app)

🤖 AI Summary
A developer rebuilt a Bible search and summarization app to run entirely in the browser using Transformers.js, loading the complete Scripture text and semantic-search models client-side so queries, retrieval, and summaries happen without any server round trip. On first use the page downloads the models and data (so initial startup can take a minute); after that it performs semantic embedding, contextual search, and short-form summarization locally, preserving user privacy and removing backend costs and reliance on cloud APIs.

For the AI/ML community this is a practical demo of on-device transformer inference for a real-world retrieval-plus-generation workflow. Transformers.js uses WebAssembly/WebGPU to run smaller or quantized transformer models in the browser, enabling vector-based semantic search and lightweight summarization without a remote vector database. The design highlights the trade-offs: privacy and low latency versus limits on model size, memory, and accuracy, which push toward smaller architectures, quantization, caching, or progressive loading.

The project is significant as a blueprint for domain-specific, offline-capable semantic search apps (legal, medical, documentation) and shows how open-source client-side ML tooling can democratize deployment patterns previously limited to server-side inference.
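To make the retrieval side concrete, here is a minimal sketch of client-side semantic search with Transformers.js. It is not the app's actual code: the package name `@xenova/transformers`, the model `Xenova/all-MiniLM-L6-v2`, and the helper functions are assumptions chosen for illustration. The idea is the one the summary describes: embed the query in the browser with a small quantized model, then rank pre-embedded verses by cosine similarity with no remote vector database.

```ts
// Minimal sketch, assuming the Transformers.js v2 package name
// `@xenova/transformers` and an off-the-shelf sentence-embedding model.
import { pipeline } from '@xenova/transformers';

// Load a small quantized embedding model once; the weight download is the
// slow "first use" step the summary mentions, and the browser caches it.
const embed = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2', {
  quantized: true,
});

// Embed text into a normalized vector (mean pooling over token embeddings).
async function embedText(text: string): Promise<Float32Array> {
  const output = await embed(text, { pooling: 'mean', normalize: true });
  return output.data as Float32Array;
}

// With normalized vectors, cosine similarity reduces to a dot product, so a
// linear scan over a few thousand verse embeddings stays fast in-browser.
function dot(a: Float32Array, b: Float32Array): number {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += a[i] * b[i];
  return sum;
}

// Rank pre-embedded verses against a query, entirely client-side.
async function search(
  query: string,
  verses: { ref: string; vector: Float32Array }[],
  topK = 5,
) {
  const q = await embedText(query);
  return verses
    .map((v) => ({ ref: v.ref, score: dot(q, v.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK);
}
```

In a setup like this, the verse embeddings would be computed once (ahead of time or on first load) and cached, so only the query needs to be embedded per search; summaries could then be generated from the top hits by a second, generation-capable pipeline.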