Show HN: MacMind – A transformer neural network in HyperCard on a 1989 Macintosh (github.com)

🤖 AI Summary
MacMind is a project that implements a transformer neural network entirely in HyperTalk, the scripting language of HyperCard, running on a 1989 Macintosh SE/30. The 1,216-parameter model is trained, via backpropagation and self-attention (the same core mechanisms used in contemporary large language models), to perform the bit-reversal permutation, the input-reordering step of the Fast Fourier Transform (FFT). MacMind lets users watch the training process interactively, inspect all of the math involved, and modify key parameters, presenting learning as something tangible rather than the "black box" it is often perceived to be.

For the AI/ML community, MacMind serves as a reminder that small-scale and large-scale models are governed by the same underlying mathematics: by independently discovering the FFT's routing pattern, a 1,216-parameter network demonstrates the same principles at work in models many orders of magnitude larger. The project makes sophisticated concepts accessible and reinforces the idea that machine learning can be understood, and experimented with, at a fundamental level, without the complexity of modern tools and frameworks.
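For context, the target function the model learns, the bit-reversal permutation, can be sketched in a few lines. This is a minimal illustration of the permutation itself, not code from the project's HyperTalk implementation:

```python
def bit_reverse(n: int, bits: int) -> int:
    # Reverse the lowest `bits` bits of n, e.g. 0b001 -> 0b100 when bits=3.
    result = 0
    for _ in range(bits):
        result = (result << 1) | (n & 1)
        n >>= 1
    return result

# For an 8-point FFT (3 address bits), inputs are reordered like so:
order = [bit_reverse(i, 3) for i in range(8)]
# order == [0, 4, 2, 6, 1, 5, 3, 7]
```

Learning this mapping from examples means the network must discover the same routing structure that the FFT hard-codes by design.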