Surrey Uni show AI systems based on the human brain's wiring save energy (epsomandewelltimes.com)

🤖 AI Summary
Researchers at the University of Surrey’s Nature-Inspired Computation and Engineering (NICE) group have published a Neurocomputing paper introducing Topographical Sparse Mapping (TSM) and an enhanced variant (ETSM), new wiring strategies for artificial neural networks inspired by the brain’s sparse, spatially organized connections. Instead of fully connecting each layer (the dense “all-to-all” pattern used in most deep learning models), TSM restricts each artificial neuron’s connections to nearby or functionally related neurons, mirroring the brain’s topographical layout. ETSM adds a biologically inspired pruning step during training—trimming away unnecessary connections as learning progresses—so networks avoid the energy and compute waste of redundant links without needing continual rewiring.

The practical results are striking: Surrey’s enhanced model reached up to 99% sparsity while matching or exceeding accuracy on benchmark tasks, trained faster, used less memory, and consumed under 1% of the energy typical of conventional networks. The work currently applies the mapping to input layers, but extending it deeper could yield even greater parameter and energy reductions and enable more efficient neuromorphic implementations. For the AI/ML community, TSM/ETSM offer a scalable path to much greener, cheaper models—achieving comparable performance by redesigning connectivity patterns rather than just increasing scale or compute.
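The paper itself is not reproduced here, so the following is only a minimal PyTorch sketch of the two ideas the summary describes: a locality-restricted ("topographical") connection mask on an input layer, plus a pruning step applied during training. The class and function names, the 1-D locality window, and the magnitude-based pruning criterion are illustrative assumptions, not Surrey's actual TSM/ETSM implementation.

```python
import torch
import torch.nn as nn


class TopographicalSparseLinear(nn.Module):
    """Linear layer whose weights are masked so each output unit connects
    only to a local window of input units -- a rough stand-in for the
    topographical mapping applied to the input layer."""

    def __init__(self, in_features: int, out_features: int, window: int = 16):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)

        # Fixed locality mask: map inputs and outputs onto [0, 1] and keep
        # only connections whose positions are within `window` input steps.
        in_pos = torch.linspace(0, 1, in_features)
        out_pos = torch.linspace(0, 1, out_features)
        dist = (out_pos[:, None] - in_pos[None, :]).abs()
        self.register_buffer("mask", (dist <= window / in_features).float())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Masked weights: connections outside the local window stay at zero.
        return nn.functional.linear(x, self.weight * self.mask, self.bias)


def magnitude_prune(layer: TopographicalSparseLinear, keep_fraction: float = 0.05) -> None:
    """Crude stand-in for the enhanced variant's training-time pruning:
    keep only the largest-magnitude surviving weights and zero the rest
    by tightening the layer's mask."""
    with torch.no_grad():
        scores = (layer.weight * layer.mask).abs()
        active = scores[scores > 0]
        if active.numel() == 0:
            return
        k = max(1, int(keep_fraction * active.numel()))
        threshold = torch.topk(active, k).values.min()
        layer.mask.mul_((scores >= threshold).float())
```

In this sketch, `magnitude_prune` would be called periodically during training to progressively tighten the mask toward very high sparsity; the real ETSM pruning rule and schedule are described in the paper.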