Firefox Forcing LLM Features (equk.co.uk)

🤖 AI Summary
Mozilla has been shipping LLM/AI features in Firefox that are enabled by default and, crucially, often lack a clear GUI toggle for users to turn them off. Reported issues include unexpected context-menu entries like “ask an AI chatbot,” high CPU/RAM usage from local AI features, and ambiguous Terms of Service language that raised privacy concerns. Power users discovered that the only reliable way to disable the functionality is via about:config or by editing prefs.js in the profile, flipping flags such as browser.ml.enable, browser.ml.chat.enabled, browser.ml.chat.menu, and extensions.ml.enabled to false; others have automated scripts to apply these prefs across profiles.

This matters to the AI/ML community because it highlights trade-offs between shipping on-device intelligence and preserving user control, performance, and trust. Enabling local models or AI-assisted UI by default can increase resource consumption and surface data-handling questions, potentially undermining adoption or prompting forks and migrations to other browsers.

The practical takeaway: developers deploying embedded LLM features should provide explicit, discoverable controls, document resource/performance impacts and data flows, and offer easy opt-out mechanisms. For users, the current workaround is manual prefs editing or switching to Firefox forks that strip the AI features; the original author has published automation scripts on GitHub to simplify that process.
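As a concrete illustration of the prefs-based workaround, here is a minimal sketch of a script that writes the four flags mentioned above into user.js for every profile under a given Firefox profile root. This is not the author's published script; the function name and the profile-detection heuristic (any subdirectory containing a prefs.js) are assumptions. Writing to user.js rather than prefs.js is deliberate: user.js is re-applied at every startup, whereas Firefox rewrites prefs.js itself.

```shell
#!/bin/sh
# Disable Firefox AI/LLM features across all profiles by appending
# user_pref lines to each profile's user.js.
# disable_firefox_ai is a hypothetical helper name; the profile root
# is typically "$HOME/.mozilla/firefox" on Linux.
disable_firefox_ai() {
  root="$1"
  for profile in "$root"/*/; do
    # Treat a directory as a real profile only if it has a prefs.js.
    [ -f "$profile/prefs.js" ] || continue
    cat >> "$profile/user.js" <<'EOF'
user_pref("browser.ml.enable", false);
user_pref("browser.ml.chat.enabled", false);
user_pref("browser.ml.chat.menu", false);
user_pref("extensions.ml.enabled", false);
EOF
    echo "updated ${profile%/}user.js"
  done
}

# Typical invocation on Linux:
# disable_firefox_ai "$HOME/.mozilla/firefox"
```

Restart Firefox after running it; the prefs take effect on the next launch, and removing the lines from user.js (plus resetting them in about:config) reverses the change.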