ChatGPT Token Counter for your chat history (Token-Counter-nu.vercel.app)

🤖 AI Summary
A new open-source tool lets you analyze token usage across your entire ChatGPT history by importing the conversations.json export from Settings → Data Controls → Export. Once you download the archive that OpenAI emails you, the app reads conversations.json locally in your browser and estimates input, output, and total tokens per conversation using the gpt-4-turbo tokenizer. You can sort chats by title or by input, output, or total tokens, and inspect metadata such as creation and update dates. All parsing and tokenization run offline in the browser, and the project is open to contributions.

This matters for users, developers, and researchers who want transparent insight into prompt length, model context consumption, and potential cost drivers without sending data to third parties. The estimator is useful for auditing long threads, trimming histories to fit context windows, and optimizing prompts, but it is explicitly an estimate, not an exact billing record. Local processing preserves privacy and makes it a practical utility for anyone managing token budgets, migrating between models, or studying real-world prompt patterns.
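To make the per-conversation accounting concrete, here is a minimal Python sketch of the same idea: walk the conversations.json export, split message text by author role, and sum token estimates per conversation. The export structure assumed here (a list of conversations, each with a `title` and a `mapping` of message nodes carrying `author.role` and `content.parts`) matches OpenAI's export format as commonly documented, but treat it as an assumption; and where the app uses the gpt-4-turbo tokenizer, this sketch substitutes a rough ~4-characters-per-token heuristic to stay dependency-free.

```python
import json


def load_export(path: str) -> list:
    """Load the conversations.json file from an OpenAI data export."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)


def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # The actual app uses the gpt-4-turbo tokenizer; swap in a real
    # tokenizer (e.g. tiktoken's cl100k_base) for closer estimates.
    return len(text) // 4 if text else 0


def summarize(conversations: list) -> list:
    """Return per-conversation input/output/total token estimates,
    sorted by total, mirroring the app's sortable table."""
    rows = []
    for conv in conversations:
        input_tokens = output_tokens = 0
        # Assumed export shape: mapping of node-id -> {"message": {...}}.
        for node in conv.get("mapping", {}).values():
            msg = node.get("message") or {}
            role = (msg.get("author") or {}).get("role")
            parts = (msg.get("content") or {}).get("parts") or []
            text = " ".join(p for p in parts if isinstance(p, str))
            if role == "user":
                input_tokens += estimate_tokens(text)
            elif role == "assistant":
                output_tokens += estimate_tokens(text)
        rows.append({
            "title": conv.get("title") or "(untitled)",
            "input": input_tokens,
            "output": output_tokens,
            "total": input_tokens + output_tokens,
        })
    return sorted(rows, key=lambda r: r["total"], reverse=True)
```

Because everything runs on a local copy of the export, this preserves the same privacy property as the browser app: no conversation text leaves the machine.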