🤖 AI Summary
A surprising demo implements a working neural network entirely in jq (the JSON query language), using a small C helper (make_json.c) to convert MNIST into JSON and then piping the records through jq for training and testing. The example reportedly reaches about 94% accuracy on MNIST (error_rate ≈ 0.0593) and emits final stats as JSON. While not competitive on speed (runs can take days), the project is a proof‑of‑concept showing that a declarative, JSON‑centric tool can express feedforward and backpropagation logic.
Technically, the implementation treats the network state as the accumulator in a reduction over input records, since jq's data is immutable. Each record has fields "input", "expected", and a "train" boolean: when train is true the record drives backpropagation; when false it only updates the error count. Configuration is passed as a JSON argument (example: config-trivial.json), the network code lives in neural_net.jq, and the runner is example/example.jq. Progress and results are emitted as JSON/debug lines (e.g., total/train/test/errors), and the final JSON contains error_rate, num_errors, and the counts. The project highlights jq's expressive power, portability (state is easily saved and loaded as JSON), and educational value, while also illustrating the practical limits of an interpreter designed for data transformation rather than numeric performance.
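The reduce-as-training-loop pattern can be sketched directly in jq. The snippet below is a hypothetical, stripped-down stand-in, not the project's neural_net.jq: the accumulator is the whole "network" state (here just counters in place of real weights), and each record's "train" flag selects between the update branch and the error-counting branch.

```shell
# Hypothetical sketch of the pattern described above (not the project's code):
# the network state is the accumulator of a reduce over input records.
printf '%s\n' \
  '{"input":[0,1],"expected":[1],"train":true}' \
  '{"input":[1,0],"expected":[0],"train":false}' |
jq -n -c '
  reduce inputs as $rec (
    {weights: [], num_errors: 0, total: 0};   # initial "network" state
    .total += 1
    | if $rec.train
      then .                                  # real code would backpropagate here
      else .num_errors += 1                   # real code would score the prediction
      end
  )'
```

Because the state threads through the reduction as a plain JSON value, it can be written to disk at any point and reloaded later, which is exactly the portability the summary notes.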