Dog or elephant translators: Can technology improve human-animal communication?



Of the more than 8 million species on the planet, only one understands human language. After decades of searching for ways to communicate with animals, many scientists have turned to artificial intelligence to discover patterns in the sounds animals make and in their behavior, understand what those patterns mean, and try to interact with them. Despite promising progress across multiple studies, creating interpreters for elephants, dogs, or whales presents many challenges.

Eva Meagher, author of a book on animal language, explains that animals “talk all the time, to each other and in multispecies environments, to survive, make friends, negotiate social norms, flirt, and have a good life.” “There is scientific evidence that they have complex languages, cultures, and inner lives, and that they fall in love with their partners and grieve for them,” says the expert.

As she recounts in her book, dolphins call each other by name; prairie dogs describe intruders in great detail; bats love to gossip; and grammatical structures can be found in the songs of some birds. Wild chimpanzees understand each other through dozens of different gestures, while bees dance to communicate and are able to recognize and remember human faces.

Studying the language and behavior of animals is “important not only to learn how they communicate with each other, but also how they communicate with us.” Some animals, such as dogs, birds, and horses, are able to learn words. For example, a border collie can learn more than 1,000 words, according to a study published in the journal Behavioural Processes. There are also animals that respond to tone of voice and body language, notes Melody Jackson: “Soft tones indicate friendship, while strong or forceful tones can signal a threat.” Touch can be used as a “reward for dogs and horses.”

Artificial intelligence to talk to animals

Many scientists have turned to artificial intelligence and other technologies to understand and improve this connection. “Sensors can help us record, analyze and interpret many different animal signals, even those that may be difficult for humans’ limited sensory systems to detect,” says Clara Mancini, an animal-computer interaction researcher at the UK’s Open University.

Wild Dolphin Project researchers have spent more than 30 years compiling a database of dolphin behavior and the sounds they make, of which there are three kinds: whistles, used to communicate over long distances and between mothers and calves when they separate; clicks, used as a kind of sonar; and so-called burst pulses, several closely spaced clicks that dolphins use to socialize and fight at close range. The goal of the project is to create machine learning algorithms that detect patterns in these sounds and to develop systems capable of generating “words” in order to interact with dolphins in the wild.
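Pattern detection of the kind the project describes often starts with unsupervised clustering: reduce each recording to acoustic features and let an algorithm group similar sounds. The sketch below is purely illustrative — the synthetic "calls", the single peak-frequency feature, and the tiny k-means loop are assumptions, not the project's actual pipeline.

```python
# Illustrative sketch: grouping recorded sounds by spectral similarity,
# the kind of unsupervised pattern detection the article describes.
# All data and feature choices here are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # assumed sample rate (Hz)

def synth_tone(freq, n=2048):
    """Stand-in for a recorded vocalization: a noisy pure tone."""
    t = np.arange(n) / SR
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(n)

def peak_freq(signal):
    """One simple spectral feature: the dominant frequency."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.fft.rfftfreq(len(signal), 1 / SR)[np.argmax(spectrum)]

# Two unlabeled groups of calls: low-pitched vs high-pitched.
calls = [synth_tone(f) for f in (400, 420, 390, 3000, 3100, 2950)]
features = np.array([[peak_freq(c)] for c in calls])

# Minimal 1-D k-means with 2 clusters, no external libraries.
centers = np.array([features.min(), features.max()], dtype=float)
for _ in range(10):
    labels = np.argmin(np.abs(features - centers), axis=1)
    centers = np.array([features[labels == k].mean() for k in (0, 1)])

print(labels)  # low- and high-pitched calls fall into separate clusters
```

Real systems would use richer features (whistle contours, click timing) and far larger datasets, but the principle — features in, clusters out — is the same.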

There are many similar projects. Elephant Voices researchers have created an online catalog of elephant sounds and behaviors recorded in Kenya and Mozambique. In it they describe, for example, that when these animals come out of the water after playing, they usually make trumpet-like sounds. Another team of researchers has developed software to automatically detect, analyze, and classify the ultrasonic sounds of rodents. It is called DeepSqueak, and it has also been used on lemurs, whales, and other marine animals.
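A first step in any such detector is simply finding where in a long recording the calls are. A common baseline — not DeepSqueak's actual method, which uses neural networks — is to flag windows whose energy in the ultrasonic band stands out from the background. Sample rate, band limits, and threshold below are all illustrative assumptions.

```python
# Hypothetical baseline detector: flag audio windows whose energy in an
# ultrasonic band exceeds a multiple of the typical (median) energy.
# This is NOT DeepSqueak's real algorithm, just the simplest analogue.
import numpy as np

rng = np.random.default_rng(1)
SR = 250_000  # ultrasonic recordings need very fast sampling; assumed value

def band_energy(window, lo=20_000, hi=100_000):
    """Total spectral power between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(window), 1 / SR)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].sum()

# Fake recording: background noise with a 50 kHz "call" in the middle third.
n = 3 * 4096
audio = 0.05 * rng.standard_normal(n)
t = np.arange(n) / SR
audio[4096:8192] += np.sin(2 * np.pi * 50_000 * t[4096:8192])

windows = audio.reshape(3, 4096)
energies = np.array([band_energy(w) for w in windows])
threshold = 5 * np.median(energies)
detected = energies > threshold
print(detected)  # only the middle window exceeds the threshold
```

Once candidate windows are found, a classifier (as in DeepSqueak) can decide which species or call type each one belongs to.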

While some scientists have developed systems to detect when chickens are making distress calls, others are trying to understand dogs. “We conducted studies of dog whines recorded with microphones on their collars and used machine learning to determine whether a whine is sad, because the dog misses its owner, for example, or happy, because it is anticipating a play session,” says Melody Jackson, a Georgia Institute of Technology professor and expert in dog-computer interaction.
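The supervised setup Jackson describes — labeled recordings in, an emotion label out — can be sketched very compactly. Everything below is invented for illustration: the premise that "sad" whines are lower-pitched, the single pitch feature, and the nearest-centroid classifier are stand-ins for whatever features and models the actual studies used.

```python
# Toy supervised classifier for whine recordings, assuming (purely for
# illustration) that "sad" whines are lower-pitched than "happy" ones.
import numpy as np

rng = np.random.default_rng(2)
SR = 16_000  # assumed sample rate

def whine(pitch, n=4096):
    """Synthetic stand-in for a collar-microphone recording."""
    t = np.arange(n) / SR
    return np.sin(2 * np.pi * pitch * t) + 0.05 * rng.standard_normal(n)

def pitch_feature(signal):
    """Dominant frequency of the recording."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.fft.rfftfreq(len(signal), 1 / SR)[np.argmax(spectrum)]

# Labeled training examples, reduced to one feature each.
train = {"sad": [whine(f) for f in (350, 380, 360)],
         "happy": [whine(f) for f in (900, 950, 870)]}
centroids = {label: np.mean([pitch_feature(s) for s in sigs])
             for label, sigs in train.items()}

def classify(signal):
    """Nearest-centroid classification on the pitch feature."""
    f = pitch_feature(signal)
    return min(centroids, key=lambda lab: abs(centroids[lab] - f))

print(classify(whine(370)))   # -> sad
print(classify(whine(920)))   # -> happy
```

Real models would use many acoustic features and far more data, but the structure — labeled examples, extracted features, a trained decision rule — is the core of the approach.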

Challenges of creating “translators”

Although some researchers have identified the structure and part of the meaning of certain animals’ vocalizations, creating “translators” presents multiple challenges. First of all, understanding the semantic and emotional meaning of what animals are communicating is a very complex task, Mancini argues: “We are not in their minds and we do not have the same physical, sensory and cognitive properties through which they experience the world.” Failing to take these differences and complexities into account carries a risk: that animal communication will be underestimated and misinterpreted.

Added to this is the fact that current technologies require wearable or environmental sensors that are not always practical. Jackson notes that suitable cameras are sometimes unavailable and that filming a moving animal well enough for video analysis is very difficult. Moreover, interpreting their communication based solely on vocalizations excludes other channels that may be important to understanding what they mean, such as behavior, according to Mancini.

Animals also communicate with their actions, gestures, and even facial expressions. For example, when two groups of elephants come together and fold their ears while flapping them quickly, they are expressing a warm greeting that is part of their welcome ceremonies, according to the Elephant Voices behavior chart. Sheep’s facial expressions, meanwhile, can indicate that they are in pain. In fact, computer scientists at the University of Cambridge have developed an artificial intelligence system that analyzes their faces to detect when they are in pain.

Some researchers study dogs’ postures and behaviors to predict how they are feeling, and sometimes turn to biometrics to identify changes in heart rate, breathing, and temperature that could offer clues to their emotional state, says Jackson. While some of these dog-interpretation systems use body-worn sensors to measure position and movement, others use cameras to record and analyze video.
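The biometric side of this is often a matter of comparing a sensor reading against the animal's own baseline. The sketch below is a minimal illustration of that idea, with invented numbers and an arbitrary threshold; real systems fuse several signals and calibrate per animal.

```python
# Minimal sketch of baseline-deviation monitoring: flag a heart-rate
# reading that sits well above the dog's own resting baseline.
# The data and the 3-sigma threshold are illustrative assumptions.
import statistics

resting_bpm = [72, 75, 70, 74, 73, 71, 76, 72]  # toy baseline samples
baseline = statistics.mean(resting_bpm)
spread = statistics.stdev(resting_bpm)

def flag_arousal(current_bpm, k=3.0):
    """True when heart rate is more than k standard deviations above baseline."""
    return current_bpm > baseline + k * spread

print(flag_arousal(74))   # -> False
print(flag_arousal(110))  # -> True
```

The same pattern applies to breathing rate or temperature: establish the individual's normal range, then treat large deviations as cues worth interpreting.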

Vests for dogs and robotic bees

The ability to communicate with animals can be useful in many contexts. For example, Jackson’s team has developed technology that allows a human handler to remotely direct a search and rescue dog using vibrating motors attached to a vest. “We’ve also created wearable computers that allow a service dog to call emergency services with a GPS location if its owner is having a seizure,” she says.

Humans may never be able to sing like a whale or buzz like a bee, but machines might. In fact, a team of German researchers has created a biomimetic robot called RoboBee that mimics the dances bees use to communicate. The results have been a success: with this robot, they claim to have been able to recruit real bees and guide them to specific destinations.

These developments are very promising, but it is still too early to predict whether animal translators will ever exist. Jackson believes that as computers and sensors become smaller and more capable, small wearable and implantable systems will be developed that provide more clues about animal behavior and about how to achieve two-way communication. She concludes: “Our understanding of animals’ vocalizations, movements, and gestures could one day be automated, and technology could give us the means to mimic their behaviors and allow us to communicate concepts to them.”
