The Continuous Tensor Abstraction: Where Indices Are Real (arxiv.org)

🤖 AI Summary
The paper introduces the "continuous tensor" abstraction, a programming model in which tensor indices may be real numbers (e.g., A[3.14]) and algebraic expressions range over continuous domains (e.g., C(x,y) = A(x,y) * B(x,y)). To make infinite domains tractable, the system implements piecewise-constant tensors and a new compact tensor storage format, backed by a compiler pipeline that automatically generates optimized kernels. This lets problems from computational geometry and graphics (domains traditionally outside tensor languages) be expressed directly in tensor notation while remaining practically executable. For the AI/ML community, the abstraction unifies spatial, geometric, and signal-processing workloads with tensor programming, improving both productivity and performance. The authors report runtimes competitive with or superior to hand-optimized libraries: a 9.20× speedup on 2D radius search with ~60× fewer lines of code, 1.22× on genomic interval overlap queries with ~18× LoC savings, and 1.69× on trilinear interpolation for Neural Radiance Fields with ~6× LoC savings. Implications include simpler expression of geometry-first ML pipelines (e.g., NeRFs, spatial queries), automated kernel optimization across continuous domains, and a path toward compact, high-performance implementations of tasks that mix discrete and continuous reasoning.
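To make the abstraction concrete, here is a minimal sketch (not the paper's implementation; all names are hypothetical) of how a piecewise-constant tensor can support real-valued indexing and pointwise multiplication over a continuous domain: each tensor stores sorted breakpoints plus one value per piece, lookup is a binary search, and C(x) = A(x) * B(x) is computed by merging the two breakpoint sets.

```python
import bisect
from dataclasses import dataclass

@dataclass
class PiecewiseConstTensor:
    # breakpoints[i] marks the start of piece i; values[i] holds on
    # [breakpoints[i], breakpoints[i+1]); the last piece extends to +inf.
    breakpoints: list[float]  # sorted
    values: list[float]       # same length as breakpoints

    def __call__(self, x: float) -> float:
        # Real-valued indexing: locate the piece containing x.
        i = bisect.bisect_right(self.breakpoints, x) - 1
        if i < 0:
            return 0.0  # left of the domain: treat as zero (sparse default)
        return self.values[i]

def pointwise_mul(a: PiecewiseConstTensor,
                  b: PiecewiseConstTensor) -> PiecewiseConstTensor:
    # C(x) = A(x) * B(x): the product is piecewise constant on the
    # merged breakpoint set, so multiply one sample per merged piece.
    pts = sorted(set(a.breakpoints) | set(b.breakpoints))
    return PiecewiseConstTensor(pts, [a(p) * b(p) for p in pts])

# Usage: A is 1 on [0,1), 2 on [1,2), 3 on [2,inf); B is 10 then 20 from 1.5.
A = PiecewiseConstTensor([0.0, 1.0, 2.0], [1.0, 2.0, 3.0])
B = PiecewiseConstTensor([0.0, 1.5], [10.0, 20.0])
C = pointwise_mul(A, B)
print(A(3.14), C(0.5), C(1.7))  # pieces containing each real index
```

A real system would of course add multidimensional pieces and compile such expressions to fused kernels rather than interpret them, but the merge-then-evaluate structure above is the core idea behind computing with piecewise-constant representations of infinite domains.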