The Little Book of Deep Learning (2024) (fleuret.org)

🤖 AI Summary
The Little Book of Deep Learning released v1.2 on May 19, 2024: a compact, phone-friendly primer aimed at readers with a STEM background. The freely available book (non-commercial Creative Commons license) has seen broad uptake, with roughly 600,000 downloads in a little over a year, and is also offered as a low-cost $9 paperback. The author calls out an unauthorized $40 reprint being sold elsewhere.

This update tightens many earlier explanations and, importantly, adds a new chapter on the low-resource methods practitioners care about today. Technically, v1.2 introduces a focused treatment of practical efficiency and adaptation techniques: prompt engineering, quantization, low-rank adapters (e.g., LoRA-type ideas), and model merging. It also adds notes on the quadratic cost of the attention operator, contrasts the O(T) runtime of standard RNNs with O(log T) approaches that exploit parallel scan, and includes a fine-tuning subsection and an introduction to RLHF.

Earlier fixes improve the descriptions of convolution equivariance and the original Transformer layout, and several figures and clarifications across dropout, normalization, and architectural examples were updated. For the AI/ML community, this makes the book a concise, up-to-date resource on both core concepts and modern cost- and resource-aware techniques, useful for research, engineering, and teaching.
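The low-rank adapter idea mentioned above can be sketched briefly: instead of updating a full weight matrix W, LoRA-style methods keep W frozen and learn a small factored update B·A of rank r much smaller than the layer dimensions, so only r·(d_in + d_out) parameters are trained. A minimal NumPy sketch (the shapes, names, and scaling here are illustrative assumptions, not the book's notation):

```python
import numpy as np

d_in, d_out, r = 64, 64, 4  # rank r much smaller than d_in, d_out
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable, small random init
B = np.zeros((d_out, r))                   # trainable, zero init: update starts at 0

def forward(x, alpha=1.0):
    # The full update B @ A is never materialized: the low-rank
    # path (x @ A.T) @ B.T is added to the frozen path x @ W.T.
    return x @ W.T + alpha * (x @ A.T) @ B.T

x = rng.standard_normal((2, d_in))
# With B = 0, the adapted layer matches the frozen one exactly.
assert np.allclose(forward(x), x @ W.T)

# Trainable parameters: r*(d_in + d_out) instead of d_in*d_out.
print(A.size + B.size, "vs", W.size)  # 512 vs 4096
```

The zero init on B is the usual trick that makes the adapted model start out identical to the pretrained one, so fine-tuning begins from the frozen model's behavior.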
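The O(T)-vs-O(log T) contrast rests on the fact that a linear recurrence h_t = a_t·h_{t-1} + b_t composes associatively when each step is represented as an affine map (a, b), so a prefix scan can evaluate all T states in O(log T) parallel steps. A toy sketch (Hillis-Steele style; the function names are illustrative, and the doubling loop only simulates the parallelism):

```python
def combine(f, g):
    # Composition of affine maps: applying f=(a1,b1) then g=(a2,b2)
    # sends h to a2*(a1*h + b1) + b2 = (a1*a2)*h + (a2*b1 + b2).
    a1, b1 = f
    a2, b2 = g
    return (a1 * a2, a2 * b1 + b2)

def sequential(a, b, h0=0.0):
    # O(T) reference loop: h_t = a_t * h_{t-1} + b_t.
    h, out = h0, []
    for at, bt in zip(a, b):
        h = at * h + bt
        out.append(h)
    return out

def parallel_scan(a, b):
    # Inclusive scan over the pairs (a_t, b_t). Each while-iteration
    # is one parallel step, so the depth is O(log T) given enough
    # processors; here the inner loop just simulates that step.
    pairs = list(zip(a, b))
    step = 1
    while step < len(pairs):
        nxt = pairs[:]
        for i in range(step, len(pairs)):
            nxt[i] = combine(pairs[i - step], pairs[i])
        pairs = nxt
        step *= 2
    return [bt for _, bt in pairs]  # equals h_t when h_0 = 0

a = [0.5, 0.5, 0.5, 0.5]
b = [1.0, 1.0, 1.0, 1.0]
assert parallel_scan(a, b) == sequential(a, b)  # [1.0, 1.5, 1.75, 1.875]
```

The same associative-combine trick underlies recent parallel formulations of recurrent layers, which is what makes them competitive with attention on long sequences.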