🤖 AI Summary
closed-circuit-ai (ccai) is an open-source, local-first platform that gives developers a lightweight, workspace-driven server for building LLM apps: it serves local web front-ends alongside Jupyter notebook back-ends and ships a native "chat" app (chat.ipynb) for prompt orchestration and tool wiring. Notably, ccai uses conflict-free replicated data types (CRDTs) to provide reliable bidirectional messaging with eventual consistency, enabling real-time collaborative editing across clients and persistence of changes made offline. The whole stack is small (roughly a 150 MB install and 200 MB of RAM) and runs on Python ≥ 3.10, making it practical for on-device prototyping, privacy-preserving deployments, and collaborative development.
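To make the CRDT claim concrete, here is a toy sketch of the convergence property being described. This is not ccai's implementation (the summary doesn't say which CRDT library it uses); it just shows why CRDT merges let two clients edit offline and still end up in the same state:

```python
from dataclasses import dataclass, field

@dataclass
class GSetReplica:
    """Grow-only set CRDT: merge is set union, which is commutative,
    associative, and idempotent, so replicas converge no matter the
    order (or repetition) in which updates are exchanged."""
    items: set = field(default_factory=set)

    def add(self, item: str) -> None:
        # Local update; works while disconnected from other replicas.
        self.items.add(item)

    def merge(self, other: "GSetReplica") -> None:
        # Conflict-free merge: just take the union.
        self.items |= other.items

a, b = GSetReplica(), GSetReplica()
a.add("msg-1 from client A")   # edits made independently, e.g. offline
b.add("msg-2 from client B")
a.merge(b)
b.merge(a)                     # sync in either order
assert a.items == b.items      # both replicas converge to the same state
```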
Technically, ccai exposes an OpenAI-compatible /v1/chat/completions endpoint and is designed to integrate with local LLM servers such as llama.cpp (recommended). Installation is via pip (pip install closed-circuit-ai), and the ccai CLI takes flags such as --expose to accept external connections, plus SSL options and kernel API settings. The bundled notebook demonstrates dynamic tool registration (create_tool_definition) and async tool execution (e.g., a typed WeatherTool), letting you declare tools that models can call; a sketch of the overall pattern follows below. One caveat: some providers (Ollama, llama.cpp) have deviated from the official chat spec and may be incompatible out of the box, but chat.py/chat.ipynb are editable, so you can adapt ccai to different backend implementations. Overall, ccai is a compact, extensible local environment for building and testing collaborative, tool-enabled LLM apps.
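Since the endpoint is OpenAI-compatible, any standard client should be able to talk to it. The sketch below uses the official openai Python package rather than ccai's own create_tool_definition helper (whose exact signature the summary doesn't show); the host, port, model name, and get_weather tool are all assumptions for illustration:

```python
# A minimal sketch, not ccai's documented API: port 8080, model name
# "local", and the get_weather tool are assumptions; adapt to your setup.
import asyncio
import json

from openai import OpenAI  # pip install openai

# ccai serves an OpenAI-compatible endpoint; base_url is an assumption here.
client = OpenAI(base_url="http://127.0.0.1:8080/v1", api_key="unused")

# Standard OpenAI tool schema: this JSON description is what the model sees.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Async tool execution, analogous to the typed WeatherTool example.
async def get_weather(city: str) -> str:
    await asyncio.sleep(0)  # stand-in for a real async lookup
    return json.dumps({"city": city, "temp_c": 21})

resp = client.chat.completions.create(
    model="local",  # whatever your local backend (e.g., llama.cpp) serves
    messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
    tools=[weather_tool],
)

# If the model chose to call the tool, execute it and print the result.
tool_calls = resp.choices[0].message.tool_calls
if tool_calls:
    args = json.loads(tool_calls[0].function.arguments)
    print(asyncio.run(get_weather(**args)))
```

In a full loop you would append the tool result as a "tool" message and call the endpoint again so the model can compose its final answer; chat.ipynb handles that orchestration for you.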