🤖 AI Summary
Minecrafter and engineer Sammyuri released CraftGPT, a working small language model implemented entirely inside Minecraft using redstone logic. The in-game contraption occupies 1,020 × 260 × 1,656 blocks (about 439 million blocks) and implements tokenizers, matrix multipliers and other LLM primitives without command blocks or data packs. The model itself has 5,087,280 parameters, a 240-dimension embedding, a 1,920-token vocabulary, 6 layers and a 64-token context window; it was trained in Python on the TinyChat conversational dataset. Most weights are quantized to 8 bits, with embedding and LayerNorm weights stored at 18 and 24 bits respectively.
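To make the mixed-precision scheme concrete, here is a minimal sketch of symmetric linear quantization, the textbook technique that "8-bit weights, wider formats for sensitive tensors" usually refers to. This is a generic illustration, not CraftGPT's actual encoding; the function names and the sample weights are invented for the example.

```python
def quantize(weights, bits):
    """Symmetric linear quantization of a list of floats to signed
    `bits`-bit integers. Returns the integer codes and the scale
    needed to map them back to real values."""
    qmax = 2 ** (bits - 1) - 1                      # e.g. 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(codes, scale):
    """Recover approximate float weights from integer codes."""
    return [c * scale for c in codes]

weights = [0.83, -1.20, 0.05, 0.47]
q8, s8 = quantize(weights, 8)     # coarse grid: visible rounding error
q24, s24 = quantize(weights, 24)  # fine grid: error is vanishingly small
```

The round-trip error is bounded by half the scale, which is why wider formats (the 18- and 24-bit tensors here) are reserved for values like LayerNorm parameters, where small errors compound across layers.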
CraftGPT is a proof of concept and a spectacle more than a practical chatbot: even on MCHPRS, a high-performance redstone server that accelerates the tick rate roughly 40,000×, a single reply typically takes about two hours, and outputs are often off-topic or ungrammatical. Its significance lies in demonstrating how LLM building blocks—tokenization, matrix ops, quantization and layer norms—map onto extremely constrained, alternative compute substrates. For AI/ML practitioners the project is a vivid, tangible educational tool about resource trade-offs, latency, and quantization effects, but it also underlines why conventional hardware remains essential for usable models: compute density and throughput, not just logical correctness, determine practical LLM performance.
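The "matrix ops on a constrained substrate" point can be sketched in a few lines: once weights and activations are quantized to integers, a matrix-vector product reduces to pure integer multiply-accumulates, exactly the primitive a redstone ALU can realize from binary adders. This is a generic int8-style sketch under that assumption, not CraftGPT's actual circuit.

```python
def int_matvec(W_q, x_q):
    """Integer-only matrix-vector product: each output element is a
    sum of integer products, i.e. a chain of multiply-accumulates."""
    return [sum(w * x for w, x in zip(row, x_q)) for row in W_q]

W_q = [[3, -1],   # quantized weight rows (int8-range codes)
       [2, 4]]
x_q = [5, 7]      # quantized activation vector
acc = int_matvec(W_q, x_q)   # -> [8, 38]
```

In a real quantized pipeline the integer accumulators would then be rescaled by the product of the weight and activation scales; the point here is only that no floating-point unit is required, which is what makes the operation feasible in redstone at all.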