Show HN: Runprompt – run .prompt files from the command line (github.com)

🤖 AI Summary
Runprompt is a single-file Python CLI tool for executing .prompt files directly from the command line, designed to make prompt-driven workflows reproducible and scriptable. You install it with curl and chmod, write a .prompt file, then feed input via stdin: piped JSON fields become template variables, while raw text is available through the special {{STDIN}} variable. Prompts can be chained by piping the JSON output of one run into another, so you can extract structured data from text and immediately use those fields as template variables for generation. Examples: echo '{"name":"World"}' | ./runprompt hello.prompt or cat article.txt | ./runprompt summarize.prompt.

Technically, it supports multiple providers and models using a provider/model-name convention (anthropic/..., openai/..., googleai/..., openrouter/...) and reads API keys from provider-specific environment variables (e.g., ANTHROPIC_API_KEY, OPENAI_API_KEY) or through OpenRouter's single key. Frontmatter values in prompt files can be overridden on the command line, or via RUNPROMPT_-prefixed environment variables for site-wide defaults. It also includes an output-schema feature that enforces and emits JSON (optional fields are marked with a trailing ?), a verbose flag that shows request/response details, and a tests folder with usage examples, making it useful for tooling, automation, and reproducible prompt engineering.
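Based on the summary's description of frontmatter plus {{...}} template variables, a minimal hello.prompt might look like the sketch below. The exact frontmatter key name and the model identifier are assumptions; only the anthropic/ provider-prefix convention comes from the summary.

```
---
model: anthropic/claude-3-5-haiku
---
Say hello to {{name}}.
```

Running the summary's own example, echo '{"name":"World"}' | ./runprompt hello.prompt, would then substitute the JSON field name into the template; piping plain text instead would expose it to the prompt as {{STDIN}}.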