🤖 AI Summary
Flemma (formerly Claudius) is a Neovim plugin that turns the editor into a multi-model AI workspace, designed for technical writers, researchers, creators, and tinkerers rather than as a coding assistant. The project is mid-refactor/rename, so expect breaking changes for now, but the core idea is stable: persistent .chat buffers with streaming conversations, reusable prompt templates, local attachments, cost/token reporting, and ergonomic Vim-native commands that unify Anthropic Claude, OpenAI (including the GPT‑5 family), and Google Vertex AI. It is pitched as a fast, local-first playground: you keep prompts, history, and backups in Git, reuse existing muscle memory, branch chat sessions by duplicating buffers, and avoid browser/session fragility and opaque vendor workbenches.
Technically, Flemma hooks into Neovim 0.11+ (Tree-sitter folding), uses curl with Server‑Sent Events (SSE) for streaming responses, and shells out to the file CLI for MIME detection of @./path attachments. Templates carry Lua or JSON frontmatter (with registerable parsers), and Vertex streams <thinking> blocks that Flemma folds and highlights; the current reasoning effort is also surfaced in lualine. Secrets come from environment variables or the Linux Secret Service; Vertex supports a service-account flow and gcloud token refresh. Commands like :Flemma send/switch/cancel, buffer-local mappings, message text objects, presets, hooks, usage notifications, and contributor tooling (a Nix dev shell, headless tests) make it a programmable, auditable in-editor AI environment for non-coding workflows.
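The curl + Server‑Sent Events mechanism mentioned above is a simple line-oriented protocol, and can be sketched in a few lines of Python. This is an illustrative parser only, not Flemma's actual implementation; the event name and JSON payload shape below are assumptions modeled loosely on streaming chat APIs, not the plugin's real wire format.

```python
import json

def parse_sse(stream):
    """Yield the data payload of each Server-Sent Event.

    `stream` is any iterable of text lines, e.g. the stdout of
    `curl -N` (unbuffered) against a streaming API endpoint.
    Per the SSE format, `data:` fields accumulate until a blank
    line terminates the event.
    """
    data_lines = []
    for line in stream:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            data_lines.append(line[len("data:"):].lstrip())
        elif line == "" and data_lines:
            # A blank line ends the event; multi-line data joins with "\n".
            yield "\n".join(data_lines)
            data_lines = []

# Hypothetical stream: each event carries a JSON delta of response text.
raw = [
    "event: content_block_delta\n",
    'data: {"delta": {"text": "Hel"}}\n',
    "\n",
    "event: content_block_delta\n",
    'data: {"delta": {"text": "lo"}}\n',
    "\n",
]
chunks = [json.loads(d)["delta"]["text"] for d in parse_sse(raw)]
print("".join(chunks))  # → Hello
```

A streaming client appends each decoded chunk to the chat buffer as it arrives, which is what makes the conversation render incrementally instead of waiting for the full response.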