Talking with dogs, decoding whale sounds and interpreting bird calls could all be possible in the coming years as artificial intelligence applications learn to translate different creatures’ communications, animal researchers said.

Scientists have started using AI tools to analyze vast quantities of data on various species’ communications — including sounds, postures and facial expressions — to determine if they can understand and talk to animals in human terms.

“The door has been opened to using machine learning to decode languages that we don’t already know how to decode,” said Aza Raskin, who co-founded the Earth Species Project, a nonprofit aiming to develop AI models that let humans have “conversations” with animals. He predicts this will be possible within the next two years.

“The plot twist is that we will be able to communicate [with animals] before we understand” them, Raskin told Scientific American. “It wouldn’t surprise me if we discovered [expressions for] ‘grief’ or ‘mother’ or ‘hungry’ across species.”



Christian Rutz, a behavioral ecologist at the University of St Andrews, agreed.

With new AI developments, “people realize that we are on the brink of fairly major advances in regard to understanding animals’ communicative behavior,” he said.

The research and possible breakthroughs go well beyond just translating animals’ sounds. Con Slobodchikoff, an animal language researcher, is aiming to develop an AI model that interprets dogs’ barks as well as their facial expressions for owners.

“We are so fixated on sound being the only valid element of communication that we miss many of the other cues,” he said. Despite this added complexity, Slobodchikoff is confident that machine learning will soon reveal more about what pets are trying to communicate.


Dogs communicate beyond just their bark, Slobodchikoff said.  (iStock)

AI advances are also helping researchers interpret the communications of animals beyond traditional pets.

Shane Gero, the lead biologist for Project CETI, for example, is using AI to decode sperm whale sounds. His team is using underwater microphones to track codas — specific patterns of whale clicks — and plans to use AI to translate them.

Gero started by feeding codas his team had manually decoded to an algorithm, which was able to correctly identify a subset of whales 99% of the time. CETI hopes to eventually create a “whale chatbot.”


Project CETI aims to translate sperm whales’ clicks into a language humans can understand.  (Photo by Alexis Rosenfeld/Getty Images)


The Cornell Lab of Ornithology, meanwhile, has developed a tool that can accurately identify and differentiate sounds from over 1,000 bird species. The Earth Species Project plans to test how zebra finches respond to AI-generated bird calls.

“We’ll be able to pass the finch, crow or whale Turing test,” Raskin said, referring to the ability to convince animals that they are communicating with a member of their own species.
