AI is helping to decode animals' speech. Will it also let us talk with them? (www.nature.com)

🤖 AI Summary
Recent studies show that animal vocal systems are far more structured than once thought, and researchers are leaning on AI to detect and interpret those patterns at scale. In one field study, 700 recorded calls from 30 adult bonobos revealed the animals combining a finite set of calls into four distinct composite meanings, including "non-trivial" combinations in which one call modifies the other; chimpanzees and birds show similar compositional behavior.

Teams such as Earth Species Project and Project CETI are using machine learning to link sounds with behaviors, cluster call types, and generate realistic sequences. CETI researchers trained generative models to mimic sperm-whale codas and identified coda "vowels" whose frequency contours (rise, fall, fall-rise, rise-fall) resemble the dynamics of human vowels and diphthongs. Technically, AI enables detection of subtle temporal, spectral, and combinatorial structure (phonetics, syntax-like ordering, and putative phonemic contrasts) that human listeners miss, and it can synthesize calls for playback experiments that probe meaning.

The significance is twofold: these findings blur the boundary between human and non-human communication, and they offer a path toward mapping, and potentially replying to, animal signals, with implications for cognition, conservation, and ethics. But major hallmarks of human language (displacement, full productivity, duality of patterning, and robust recursion in natural use) remain unconfirmed, so AI's role for now is to reveal complex building blocks, not to declare that animals already possess full human-style language.
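The four contour categories lend themselves to a simple illustration. Below is a minimal Python sketch, not CETI's actual method, that labels a fundamental-frequency track as rise, fall, fall-rise, or rise-fall by comparing the net pitch change in each half of the call; the function name `classify_contour`, the jitter threshold `eps`, and the synthetic contours are all illustrative assumptions.

```python
import numpy as np

def classify_contour(f0, eps=2.0):
    """Label a pitch track (Hz over time) as one of four contour shapes.

    Compares the net frequency change over each half of the call:
    rise/fall if both halves move the same way, fall-rise or
    rise-fall if the direction reverses at the midpoint.
    eps is the minimum change (Hz) counted as real movement, not jitter.
    """
    mid = len(f0) // 2
    first = f0[mid - 1] - f0[0]    # net change over the first half
    second = f0[-1] - f0[mid]      # net change over the second half
    if first > eps and second < -eps:
        return "rise-fall"
    if first < -eps and second > eps:
        return "fall-rise"
    if first > eps or second > eps:
        return "rise"
    return "fall"

# Synthetic demo contours standing in for extracted pitch tracks
t = np.linspace(0.0, 1.0, 100)
examples = {
    "steady rise": 400 + 200 * t,
    "steady fall": 600 - 200 * t,
    "v-shaped":    500 + 300 * np.abs(t - 0.5),  # falls, then rises
    "arched":      650 - 300 * np.abs(t - 0.5),  # rises, then falls
}
for name, f0 in examples.items():
    print(f"{name:12s} -> {classify_contour(f0)}")
```

A real pipeline would first extract the pitch track from field recordings (e.g., with a pitch estimator such as `librosa.pyin`) and would likely learn the contour categories from data rather than hard-coding them, but the sketch shows how little structure is needed to separate the four reported shapes.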