Show HN: MacMind – A transformer neural network in HyperCard on a 1989 Macintosh (github.com)

🤖 AI Summary
In a striking combination of retro computing and modern machine learning, a developer has built MacMind, a transformer neural network written entirely in HyperTalk on a 1989 Macintosh SE/30. The single-layer, single-head transformer has 1,216 parameters and learns the bit-reversal permutation, the index-reordering step used in the Fast Fourier Transform (FFT), by training on random examples. Every component of the network, from token embeddings to backpropagation, is implemented in HyperTalk, making it fully inspectable and modifiable, in sharp contrast to the black-box nature of contemporary AI models.

MacMind's significance lies in demystifying AI: the principles governing large language models such as GPT-4 are fundamentally the same as those at work in this minimal model, just at a vastly different scale. Key components such as self-attention and gradient descent are laid out explicitly, showing that these mechanisms are mathematics, not magic. The project highlights the enduring relevance of foundational algorithms and serves as an accessible educational tool for anyone seeking to understand AI fundamentals.
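The bit-reversal permutation the network is trained to learn is easy to state directly. Below is a minimal Python sketch of the target function itself (not the project's HyperTalk code, which may be structured differently): index i in an array of length 2^n maps to the index whose n-bit binary representation is i's bits reversed.

```python
def bit_reverse(i: int, n_bits: int) -> int:
    """Reverse the n_bits-bit binary representation of i."""
    result = 0
    for _ in range(n_bits):
        result = (result << 1) | (i & 1)  # shift out i's low bit into result
        i >>= 1
    return result

# The permutation for an 8-element FFT (3 bits per index):
perm = [bit_reverse(i, 3) for i in range(8)]
print(perm)  # [0, 4, 2, 6, 1, 5, 3, 7]
```

This is the reordering a radix-2 FFT applies to its input before the butterfly stages; the transformer learns to reproduce this mapping from random (input, output) example pairs rather than computing it arithmetically.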