🤖 AI Summary
Gego is an open-source "GEO" (Generative Engine Optimization) tracker that automates sending prompts to multiple LLMs, extracts keywords from their responses, and provides analytics to compare models, prompts, and trends over time. It supports OpenAI, Anthropic, Ollama, Google, Perplexity, and custom providers, and runs as a CLI, an API server (default port 8989), or a Docker service. Gego uses a hybrid storage layer: SQLite for configuration (LLMs, schedules) and MongoDB for analytics (prompts, responses). Features include cron-based scheduling, automatic keyword extraction (no predefined list required, with an optional exclusion file), latency/token/error metrics, a 3-attempt retry policy with 30s delays, configurable logging levels, and a pluggable architecture for adding providers or backends.
For the AI/ML community this is useful for prompt engineering, model benchmarking, SEO/marketing research, and monitoring how brands and concepts surface across assistants over time. Technically it is a Go 1.21 project with REST endpoints for managing LLMs, prompts, and schedules; responses are indexed in MongoDB for trend queries and top-keyword statistics. The combination of multi-LLM execution, automated keyword extraction, persistent analytics, and operational tooling (health checks, logs, Docker Compose) makes Gego a lightweight, reproducible platform for longitudinal evaluation and competitive analysis of generative systems.
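The automated keyword extraction with an optional exclusion file could work along these lines: tokenize the response, drop excluded and very short words, and rank the rest by frequency. A minimal sketch, assuming a simple frequency-based approach; Gego's actual algorithm may differ.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
	"unicode"
)

// topKeywords returns the n most frequent words in text, skipping
// words in the exclude set (analogous to Gego's exclusion file) and
// words shorter than 3 characters. Ties break alphabetically so the
// result is deterministic. Illustrative only.
func topKeywords(text string, exclude map[string]bool, n int) []string {
	counts := map[string]int{}
	words := strings.FieldsFunc(strings.ToLower(text), func(r rune) bool {
		return !unicode.IsLetter(r) && !unicode.IsNumber(r)
	})
	for _, w := range words {
		if len(w) > 2 && !exclude[w] {
			counts[w]++
		}
	}
	keys := make([]string, 0, len(counts))
	for k := range counts {
		keys = append(keys, k)
	}
	sort.Slice(keys, func(i, j int) bool {
		if counts[keys[i]] != counts[keys[j]] {
			return counts[keys[i]] > counts[keys[j]]
		}
		return keys[i] < keys[j]
	})
	if n > len(keys) {
		n = len(keys)
	}
	return keys[:n]
}

func main() {
	exclude := map[string]bool{"the": true, "and": true}
	fmt.Println(topKeywords("Gego tracks prompts and prompts across models", exclude, 2))
}
```

Because no predefined keyword list is required, everything not excluded is a candidate, which matches the summary's description of extraction without a fixed vocabulary.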