Netlify's new default-enabled "AI gateway" broke our app (typzet.com)

🤖 AI Summary
On October 1 Netlify rolled out a default-enabled "AI gateway" that automatically injects LLM provider API keys into deployments and routes model calls through Netlify's own gateway and credit system. The immediate effect for one team: a Gemini-based "prompt generation" endpoint began returning 401 Unauthorized even though no code had changed, while other Gemini features using the same API key kept working. After 12+ hours of debugging they discovered that Netlify's gateway had injected its own GEMINI_API_KEY at build time, overriding the app's build-time variable and causing the authentication failures; disabling the gateway didn't help, and the broken endpoint used Google's genai SDK, suggesting the SDK's env-var handling interacted poorly with Netlify's injection.

This incident matters for the AI/ML community because platform-managed LLM features can silently alter authentication, routing, billing and SDK behavior, breaking apps, changing vendor telemetry, and potentially charging usage through a platform-level credits system.

Key technical takeaways: Netlify's claim that it won't override existing env vars doesn't cover build-time variable patterns, SDKs may react differently to injected keys, and toggling the gateway off may not take effect reliably. Workarounds include explicitly setting provider keys in the runtime/build environment so the platform won't inject its own, auditing post-deploy behavior after platform updates, and treating managed LLM gateways as an operational dependency that can affect authentication, billing and telemetry.
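One way to harden against silent key injection is to stop relying on the SDK's environment-variable fallback and pass the key explicitly. The sketch below is a hypothetical Netlify Function, assuming the @google/genai TypeScript SDK and an app-controlled variable named MY_GEMINI_API_KEY chosen so it cannot collide with anything the platform injects; it is illustrative, not the affected app's actual code.

```ts
// Hypothetical Netlify Function sketch: pass the app-controlled key explicitly so a
// platform-injected GEMINI_API_KEY in the environment is never picked up by the SDK.
import { GoogleGenAI } from "@google/genai";

// MY_GEMINI_API_KEY is an illustrative, app-chosen name that a platform gateway
// has no reason to inject or override.
const ai = new GoogleGenAI({ apiKey: process.env.MY_GEMINI_API_KEY });

export default async (req: Request): Promise<Response> => {
  const { prompt } = await req.json();

  // Model name is illustrative; the summary doesn't say which Gemini model was used.
  const result = await ai.models.generateContent({
    model: "gemini-2.0-flash",
    contents: prompt,
  });

  return new Response(JSON.stringify({ text: result.text }), {
    headers: { "content-type": "application/json" },
  });
};
```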
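The "audit post-deploy behavior" workaround can also be partly automated. A minimal sketch, assuming a placeholder endpoint URL and the 401 failure mode described above, could run as a post-deploy step:

```ts
// Hypothetical post-deploy smoke check: call the Gemini-backed endpoint once after
// each deploy and fail loudly on auth errors, so a platform-side change such as key
// injection surfaces immediately instead of after hours of debugging.
const ENDPOINT =
  process.env.SMOKE_URL ?? "https://example.com/.netlify/functions/generate-prompt"; // placeholder

const res = await fetch(ENDPOINT, {
  method: "POST",
  headers: { "content-type": "application/json" },
  body: JSON.stringify({ prompt: "ping" }),
});

if (res.status === 401) {
  // The failure mode from the incident: upstream auth broke with no code change.
  throw new Error("Smoke check failed: 401 Unauthorized from the prompt-generation endpoint");
}
if (!res.ok) {
  throw new Error(`Smoke check failed: HTTP ${res.status}`);
}
console.log("Smoke check passed");
```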