Quinnypig/Yeet (github.com)

🤖 AI Summary
Quinnypig released a tiny, zero-config toolchain, yeet and yoten, that uses Anthropic's Claude to analyze a local project and emit exact deployment commands. Run yeet in your project directory with no arguments: it asks Claude what you're shipping, infers the target platform (e.g., Vercel, Fly.io, Railway, Render, Netlify), recommends the shell commands to deploy (such as installing vercel and running vercel --prod), and saves the resulting deployment URL. yoten then hits that URL, reports HTTP status and response time, and delivers a tongue-in-cheek quality verdict: fire 🔥, mid, or cooked 💀. The tool requires Python 3.12+, is built with uv, is distributed via pip/uvx, and needs ANTHROPIC_API_KEY in your environment.

Why it matters: yeet is a pragmatic, opinionated use of LLMs for developer ergonomics, automating stack detection and deployment orchestration with minimal friction. It showcases both the upside (fast prototyping, fewer manual steps) and the limits (opaque LLM decisions, brittle or incorrect platform picks, no configurability). In practice it's handy for demos and experiments but not a drop-in production solution: it relies on Claude's accuracy, creates real deployment risk when Claude guesses wrong, and intentionally trades configurability for "vibes." Open source, MIT-ish, and built for laughs as much as utility.
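Based on the summary above, the workflow might look roughly like this; the exact command names and flags are assumptions inferred from the description, not verified against the repo:

```shell
# Hypothetical usage sketch. yeet needs an Anthropic API key in the
# environment; the key value below is a placeholder.
export ANTHROPIC_API_KEY="your-key-here"

# The summary says yeet is distributed via pip/uvx and takes no arguments,
# so from the project root the flow would presumably be:
#
#   uvx yeet    # ask Claude what this project is, infer a platform, deploy
#   uvx yoten   # hit the saved deployment URL; report status, latency, verdict
#
# (Left commented out here since they require the tool and a live key.)
```

Both steps are deliberately zero-config: there are no flags to tune, which is the "trades configurability for vibes" point made above.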