Google's AI Search Live is now available to all US app users (www.engadget.com)

🤖 AI Summary
Google has rolled out Search Live to all US users of the Google app on iOS and Android, bringing real-time, multimodal search interactions to the mainstream. Activated via a new "Live" icon under the search bar or from Google Lens, Search Live lets you stream your phone's camera to Search in "AI Mode," so the system can see and interpret the scene, answer follow-up questions, surface relevant links, and provide live guidance. Camera sharing is enabled by default in Lens for instant back-and-forth visual conversations; the feature is available only in English for this wider rollout.

For the AI/ML community, this is a notable step toward production-grade, low-latency multimodal systems that tightly couple visual grounding with conversational search and retrieval. It highlights demand for models and pipelines that can handle continuous video input, real-time image understanding, query grounding, and dynamic retrieval — all while meeting latency and scalability constraints. The launch also underscores practical considerations around user consent, privacy, and moderation when deploying always-on camera-driven agents, and will likely drive further work on efficient multimodal architectures, streaming inference, and robust visual-language alignment.