Can artificial intelligence help us communicate with animals?


By Creative Media News

A dolphin handler signals “together”, followed by “create”, with her hands. The two trained dolphins submerge, exchange sounds, then surface on their backs with their tails raised. They have invented a trick of their own and performed it in tandem, just as requested. According to Aza Raskin, this does not prove the existence of language.

“However, it certainly makes sense that if they had access to a rich, symbolic mode of communication, this task would be significantly easier for them.”


Raskin is the co-founder and president of Earth Species Project (ESP), a California-based non-profit organization with an audacious goal: to decode non-human communication using a form of artificial intelligence (AI) called machine learning, and to make all the knowledge publicly available, thereby enhancing our relationship with other living species and aiding in their protection.


A 1970 album of whale song galvanized the movement that led to the ban on commercial whaling. What might a Google Translate for the animal kingdom bring about?

The organization, which was founded in 2017 with the backing of major funders such as LinkedIn co-founder Reid Hoffman, published its first scientific paper in December 2021. Its objective is to decode animal communication within our lifetimes.

Raskin says the ultimate goal is to decipher animal communication and discover non-human language. Along the way, and of equal importance, the group is developing technology to support biologists and conservation efforts.

Humans have always been fascinated by animal vocalizations and have long studied them. Various primates give alarm calls that differ according to the predator; dolphins address one another with signature whistles; and some songbirds can rearrange elements of their calls to convey different messages. However, most experts stop short of calling it a language, because no animal communication meets all the criteria.


Until recently, decoding has relied heavily on meticulous observation. But interest has surged in applying machine learning to the massive volumes of data that modern animal-borne sensors can now capture. “People are beginning to use it,” says Elodie Briefer, an associate professor at the University of Copenhagen who studies vocal communication in mammals and birds. “But we do not yet understand how far we can go.”

Briefer co-developed an algorithm that analyses pig grunts to determine whether the animal is experiencing a positive or a negative emotion. Another, called DeepSqueak, evaluates the ultrasonic calls of rats to judge whether they are stressed. A further initiative, Project CETI (Cetacean Translation Initiative), aims to translate the communication of sperm whales using machine learning.

However, ESP says its approach is different, because it is not focused on decoding the communication of a single species, but that of all of them. While Raskin agrees that social species, such as primates, whales, and dolphins, are more likely to engage in symbolic communication, the objective is to build tools that could apply across the entire animal kingdom. “We’re species-neutral,” Raskin says. “The technologies we develop… apply to the entirety of biology, from worms to whales.”

Raskin says the “motivating intuition” for ESP comes from research showing that machine learning can translate between different, sometimes distant, human languages without any prior knowledge.

The process begins with an algorithm that represents words as points in a geometric space. In this multidimensional representation, the distance and direction between points (words) encode how they are meaningfully related to one another (their semantic relationship).

For example, the distance and direction between “king” and “man” are the same as between “queen” and “woman.” (The mapping is achieved not by knowing what the words mean, but by observing, for instance, how frequently they occur near one another.)
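
As a rough illustration of this idea, here is a minimal sketch in Python. The gensim library and the GloVe model name are illustrative choices, not tools mentioned in the article:

```python
# Minimal sketch of the "king - man + woman ≈ queen" relationship using
# pre-trained word embeddings. The gensim library and the GloVe model
# name are assumptions for illustration only.
import gensim.downloader as api

# Load 100-dimensional GloVe vectors trained on Wikipedia text.
vectors = api.load("glove-wiki-gigaword-100")

# The offset from "man" to "king" roughly equals the offset from
# "woman" to "queen", so vector arithmetic recovers the analogy.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # expected top match: 'queen'
```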

It was later observed that these “shapes” are similar across languages. Then, in 2017, two groups of researchers, working independently, developed a technique that enabled translation by aligning the shapes: to get from English to Urdu, align their shapes and find the point in Urdu closest to the word’s point in English. “You can translate most words reasonably well,” Raskin says.
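
The alignment step can be pictured as an orthogonal Procrustes problem: given some paired example vectors, find the rotation that maps one embedding space onto the other. The sketch below is a simplification under that assumption; the 2017 methods are more sophisticated and can work without any paired examples:

```python
# Sketch: align two embedding spaces with a single orthogonal rotation
# (the Procrustes solution). X holds vectors from language A and Y the
# vectors of their known translations in language B; both are assumed
# inputs, and the real 2017 methods need no such seed dictionary.
import numpy as np

def align_embeddings(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Return the orthogonal matrix W minimizing ||X @ W - Y||."""
    U, _, Vt = np.linalg.svd(X.T @ Y)  # SVD of the cross-covariance
    return U @ Vt

# Toy demonstration: build a fake "other language" by rotating random
# vectors, then check that the rotation is recovered.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))                    # 500 seed word vectors
W_true, _ = np.linalg.qr(rng.normal(size=(100, 100)))
Y = X @ W_true
print(np.allclose(X @ align_embeddings(X, Y), Y))  # True
```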

The goal of ESP is to develop these kinds of representations of animal communication, working both on individual species and on many species at once, and then to explore questions such as whether there is overlap with the universal human shape.

Raskin says we do not know how animals experience the world, but some appear to share emotions, such as sadness and happiness, with humans, and may communicate about them with other members of their species. “I don’t know which will be more incredible: the portions where the shapes intersect and we can converse or interpret directly, or the portions where we can’t.”

Animals also communicate through means other than sound, he notes. Bees, for instance, signal the location of flowers through a “waggle dance.” Translation across different channels of communication will be needed as well.

Raskin concedes that the objective is “like going to the moon,” but the plan is not to get there all at once. Rather, ESP’s roadmap involves solving a series of smaller problems on the way to the larger goal. This should yield general tools that can help researchers applying AI to unlock the secrets of the species they study.

For instance, ESP recently released a study (and shared its code) on the so-called “cocktail party problem” in animal communication, in which it is difficult to determine which animal in a group is vocalizing in a noisy social setting.

Raskin says that, to our knowledge, no one has previously performed this end-to-end disentangling of animal sound. The AI-based model developed by ESP, which was tested on dolphin signature whistles, macaque coo calls, and bat vocalizations, performed best when the calls came from individuals the model had been trained on; with larger datasets, however, it was able to disentangle mixtures of calls from animals outside the training cohort.
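
To see what “disentangling” means in this setting, here is a classical illustration of the cocktail-party problem using independent component analysis. The synthetic “calls” and the ICA approach are stand-ins for exposition only; ESP’s published model is a trained neural network:

```python
# Classical cocktail-party illustration: recover two overlapping signals
# from two mixed recordings with independent component analysis. The
# synthetic "calls" and the ICA approach are illustrative assumptions;
# ESP's actual system is a deep neural network.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 8000)
call_a = np.sin(2 * np.pi * 440 * t)           # stand-in for caller A
call_b = np.sign(np.sin(2 * np.pi * 120 * t))  # stand-in for caller B
sources = np.c_[call_a, call_b]

# Two microphones each hear a different blend of both callers.
mixing = np.array([[0.7, 0.3], [0.4, 0.6]])
recorded = sources @ mixing.T

# Unmix the recordings back into individual "voices".
separated = FastICA(n_components=2, random_state=0).fit_transform(recorded)
print(separated.shape)  # (8000, 2): one column per recovered caller
```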

Another project involves using AI to generate novel animal calls, with humpback whales as the test species. The new calls, created by splitting vocalizations into micro-phonemes (distinct units of sound lasting a hundredth of a second) and using a language model to “speak” something whale-like, can then be played back to the animals to observe their responses.

Raskin says that if the AI can distinguish between random and semantically meaningful changes, it will bring us closer to meaningful communication. “It is having the AI speak the language, even though we do not yet know what it means.”
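
One hedged sketch of how such a generator might be assembled: chop audio into roughly 10 ms frames, quantize them into discrete “micro-phoneme” tokens, fit a simple sequence model over the tokens, and sample novel sequences. Every name below is hypothetical, and a bigram Markov chain stands in for the far more capable language model the article describes:

```python
# Hypothetical sketch: discretize audio into ~10 ms "micro-phoneme"
# tokens with k-means, then sample novel token sequences from a bigram
# model. A toy stand-in for ESP's actual language-model approach.
import numpy as np
from sklearn.cluster import KMeans

def tokenize(audio: np.ndarray, sr: int, frame_ms: int = 10, k: int = 32):
    """Split audio into frames and map each frame to a discrete token."""
    frame = int(sr * frame_ms / 1000)
    n = len(audio) // frame
    frames = audio[: n * frame].reshape(n, frame)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(frames)
    return km.labels_, km.cluster_centers_  # token sequence + codebook

def sample_novel_call(tokens: np.ndarray, k: int, length: int = 100):
    """Fit bigram transition counts, then sample a new token sequence."""
    counts = np.ones((k, k))  # add-one smoothing
    for a, b in zip(tokens[:-1], tokens[1:]):
        counts[a, b] += 1
    probs = counts / counts.sum(axis=1, keepdims=True)
    rng = np.random.default_rng(0)
    seq = [int(tokens[0])]
    for _ in range(length - 1):
        seq.append(int(rng.choice(k, p=probs[seq[-1]])))
    return seq  # decode by concatenating each token's codebook frame
```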

Another project uses self-supervised machine learning, which does not require human experts to label the data in order to find patterns, to build an algorithm that estimates the number of call types a species has at its disposal.

In an early test case, it will mine audio recordings made by a team led by Christian Rutz, a biology professor at the University of St Andrews, to produce an inventory of the vocal repertoire of the Hawaiian crow – a species that, as Rutz discovered, is capable of making and using tools for foraging and is believed to have a significantly more complex set of vocalizations than other crow species.
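
One plausible way to frame that estimate, sketched below under the assumption that each recorded call has already been embedded as a feature vector: cluster the embeddings for a range of candidate repertoire sizes and keep the size that scores best. This is an illustrative approach, not ESP’s published method:

```python
# Assumed-input sketch: estimate repertoire size by clustering call
# embeddings and picking the cluster count with the best silhouette
# score. Illustrative only; not ESP's published method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def estimate_call_types(embeddings: np.ndarray, max_k: int = 30) -> int:
    """Return the cluster count that best fits the call embeddings."""
    best_k, best_score = 2, -1.0
    for k in range(2, max_k + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embeddings)
        score = silhouette_score(embeddings, labels)
        if score > best_score:
            best_k, best_score = k, score
    return best_k  # estimated number of distinct call types
```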

Rutz is particularly enthusiastic about the project’s conservation value. The Hawaiian crow now survives only in captivity, where it is being bred for reintroduction into the wild.

The hope is that, by comparing recordings made at different times, it will be possible to track whether the species’ call repertoire is being eroded in captivity, which could have consequences for its reintroduction; any loss might then be addressed through intervention. “It could provide a step change in our ability to help these birds recover,” says Rutz, adding that manually recognizing and classifying the sounds would be time-consuming and prone to error.

Meanwhile, another project aims to automatically infer the functional meaning of vocalizations. It is being pursued with the laboratory of Ari Friedlaender, a professor of ocean sciences at the University of California, Santa Cruz.

The lab studies how hard-to-observe marine creatures interact underwater, and it runs one of the world’s largest tagging programs. Small electronic “biologging” devices attached to the animals record their location, their type of motion, and even what they see (the devices can incorporate video cameras). The lab also draws on data from strategically placed sound recorders in the ocean.

ESP intends to first apply self-supervised machine learning to the tag data to automatically infer what an animal is doing (feeding, resting, traveling, or socializing, for example), and then to add the audio data to see whether functional meaning can be assigned to the calls associated with that behavior. (Playback experiments could then be used to validate any findings, along with calls that have been decoded previously.)
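
A minimal sketch of that two-stage idea, assuming windowed motion features and per-window call labels as inputs, with a supervised classifier standing in for the self-supervised models described here:

```python
# Assumed-input sketch of the two-stage idea: classify behavior from
# biologging motion features, then tally which call types co-occur with
# each behavior. A supervised random forest stands in for the
# self-supervised models the article describes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_behavior_model(windows: np.ndarray, labels: np.ndarray):
    """windows: (n, n_features) motion summaries; labels: behavior names."""
    return RandomForestClassifier(n_estimators=200, random_state=0).fit(windows, labels)

def calls_by_behavior(model, windows: np.ndarray, call_types: np.ndarray) -> dict:
    """Count call types per predicted behavior; a strong association
    hints at functional meaning, to be confirmed with playback trials."""
    table: dict = {}
    for behavior, call in zip(model.predict(windows), call_types):
        table.setdefault(behavior, {}).setdefault(call, 0)
        table[behavior][call] += 1
    return table
```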

Initially, this technology will be applied to data on humpback whales; the lab has tagged multiple animals in the same group so that it is possible to observe how signals are transmitted and received. Friedlaender states that he “reached the limit” of what currently available techniques could extract from the data. “We expect that the work that ESP is capable of will yield new insights,” he says.

However, not everyone is as enthusiastic about the capacity of AI to reach such lofty goals. Robert Seyfarth, retired professor of psychology at the University of Pennsylvania, has spent over four decades studying the social behavior and vocal communication of monkeys in their natural habitat.

While he believes machine learning can be useful for specific tasks, such as identifying an animal’s vocal repertoire, he is skeptical that it will add much to the discovery of the meaning and function of vocalizations.

The issue, he explains, is that while many animals can have sophisticated, complex societies, their repertoire of sounds is much smaller than that of humans.

The upshot is that the same sound can signify different things in different circumstances, and meaning can be established only by understanding the context: who the calling individual is, how they are related to others, where they stand in the hierarchy, and with whom they have interacted. “I believe these AI methods are insufficient,” Seyfarth says. “You must go out and observe the animals.”

There is also doubt about the assumption that the shape of animal communication will overlap meaningfully with that of human communication. Applying computer-based analyses to human language, with which we are so intimately familiar, is one thing, Seyfarth says.

Doing it to other species, however, can be “very different.” “It’s an intriguing concept, but it’s a bit of a reach,” says Kevin Coffey, a neuroscientist at the University of Washington and co-creator of the DeepSqueak algorithm.

Raskin acknowledges that AI alone may not be sufficient to unlock interspecies communication. However, he cites evidence indicating that numerous species communicate in ways “more intricate than humans could have ever dreamed.”

The obstacles, he argues, have been our inability to gather enough data and analyze it at scale, together with our limited perception. “These are the instruments that enable us to remove our human lenses and comprehend entire communication systems,” he explains.
