Scientists are on the verge of deciphering animal communication using artificial intelligence, with recent breakthroughs suggesting we may soon be able to “talk” with other species for the first time. The development represents a fundamental shift from decades of scientific reluctance to acknowledge non-human language, driven by AI’s ability to detect patterns in massive datasets of animal sounds and behaviors.
The big picture: Multiple research teams are racing to crack interspecies communication, spurred by the new Coller Dolittle Challenge offering $100,000 annually and a $10 million grand prize for breakthrough discoveries.
- The challenge, established by Tel Aviv University and Jeremy Coller’s philanthropic foundation, has already identified promising candidates including cuttlefish sign language, dolphin “names,” and whale communication patterns.
- This year’s winner studied bottlenose dolphins in Florida, identifying 22 distinct non-signature whistles used across a pod of 170 animals spanning six generations.
Key breakthroughs across species: Researchers have documented sophisticated communication systems previously unknown to science.
- Cuttlefish: Sophie Cohen-Bodénès at Washington University discovered these marine animals use four distinct arm gestures—dubbed “up,” “side,” “roll,” and “crown”—that other cuttlefish respond to, even when detecting only water vibrations.
- Dolphins: Laela Sayigh’s team at Woods Hole Oceanographic Institution found dolphins use unique signature whistles as names, with one widespread whistle appearing to mean “What was that?” when encountering something unexpected.
- Nightingales: Max Planck Institute researchers discovered these birds can instantly adjust their pitch to imitate other individuals, showing speech-like flexibility never before seen in non-human animals.
- Whales: Project CETI, a nonprofit organization dedicated to listening to sperm whales, identified 156 click patterns forming the whales’ “phonetic alphabet,” with AI revealing these clicks acoustically resemble human vowels.
Why AI changes everything: Machine learning has revolutionized animal communication research by processing vast amounts of data at unprecedented speed and scale.
- “AI allows us to really scale up experiments. It allows us to process data much faster,” says Frants Jensen at Aarhus University. “That really is a phenomenal game changer.”
- The technology has revealed statistical patterns in humpback whale songs similar to human language structure and helped identify repeating features across years of recordings (the sketch after this list illustrates this style of unsupervised pattern-finding).
- However, AI isn’t foolproof—it often fails when tested on data with different acoustic characteristics, requiring scientists to verify that patterns actually mean something to the animals.
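To make the pattern-finding described above concrete, here is a purely illustrative Python sketch, not code from any of the projects in this article: it fabricates simple acoustic features for synthetic calls (duration, peak frequency, and frequency sweep are assumptions chosen for the toy) and uses off-the-shelf clustering to recover recurring call types, a drastically simplified stand-in for how recurring whistles or click patterns are grouped in large recording archives.

```python
# Illustrative only: a toy version of the unsupervised pattern-finding
# described above, using synthetic data rather than real recordings.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features per call: duration (s), peak frequency (kHz),
# frequency-modulation range (kHz). Three synthetic "call types".
call_types = [
    (0.4, 12.0, 3.0),   # short, mid-frequency, modest sweep
    (1.2, 8.0, 1.0),    # long, low-frequency, nearly flat
    (0.3, 18.0, 6.0),   # short, high-frequency, steep sweep
]
calls = np.vstack([
    rng.normal(loc=center, scale=(0.05, 0.5, 0.3), size=(200, 3))
    for center in call_types
])

# Standardize the features, then cluster. In real studies the number of
# call types is itself unknown and has to be validated, not assumed.
features = StandardScaler().fit_transform(calls)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for k in range(3):
    members = calls[labels == k]
    dur, freq, sweep = members.mean(axis=0)
    print(f"cluster {k}: {len(members)} calls, "
          f"~{dur:.2f} s, ~{freq:.1f} kHz, ~{sweep:.1f} kHz sweep")
```

Real analyses work from spectrograms or learned embeddings of thousands of hours of audio, and, as the researchers caution, any cluster found this way still has to be checked against what the animals actually do with those calls.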
The complexity challenge: True animal communication involves far more than just vocal sounds, making full decoding extremely difficult.
- Animals communicate through color changes, scent releases, touch, facial expressions, and even the weak electric pulses emitted by some fish species.
- Context remains crucial—orangutans can delay alarm calls by up to 20 minutes and alter acoustics to indicate how much time has passed since an event, creating interpretation challenges.
- Japanese tits use different note combinations where the order changes the overall meaning, adding grammatical complexity to their communications.
What experts are saying: The scientific community remains divided on whether any non-human species truly possesses language.
- “We’ve arrogantly convinced ourselves that we are the only living things worth listening to,” said Jeremy Coller at the inaugural awards ceremony.
- “We wrote the definition of language and other animals will never be able to pass it, but if you see language as a continuum, then whales have it,” argues David Gruber, founder of Project CETI.
- Luke Rendell at the University of St Andrews takes a middle position: “There are things you absolutely need, like a theory of mind… You also need a vocal system that’s flexible enough, and the ability to remember the past and think about the future.”
Which species will be first: Experts differ on which animals stand the best chance of being fully decoded.
- Whales and dolphins lead due to decades of existing recordings, though dolphins’ complexity—including burst-pulse sounds and echolocation—may slow progress.
- Birds are strong contenders because their vocal learning systems closely resemble human brain structures, with budgerigars showing brain maps similar to those found in humans.
- Yossi Yovel, who chairs the Coller Dolittle Challenge, predicts success with social birds: “I would go to study jays. That’s where I would put my bet.”
Why this matters beyond science: Successfully decoding animal communication could fundamentally change how humans relate to other species and understand the world.
- The breakthrough might open up unfamiliar perspectives, such as what it is like to communicate through echoes or to read meaning into colors, much as the discovery that bee vision extends into the ultraviolet changed how we picture other animals’ senses.
- “I think that anything we learn about animals makes us appreciate them more,” says Yovel. “Study on communication probably drives many people to think: ‘Oh wow, they’re like us!’”
The bottom line: If these researchers are right, we may soon be able to talk with other species. The open question is which will be first.