Focus on three research projects to better understand animal behavior

Projects involving artificial intelligence and animals are gradually multiplying. Whether it is to fight the extinction of African elephants, to recognize marine species in an aquatic reserve, or to reduce the pollution that threatens species in the Pacific islands, AI is increasingly being used. Today, we would like to present three research projects that aim to better understand animal behavior.

Artificial intelligence to track animal demographics and behavior in Africa

A team of researchers and engineers from the CNRS Biometry and Evolutionary Biology Laboratory (LBBE – CNRS / Univ Claude Bernard / Vetagro Sup) and the Centre for Functional and Evolutionary Ecology (CEFE – CNRS / Univ Montpellier / IRD / EPHE) has been monitoring Hwange National Park in Zimbabwe for several years. The objective is to understand the behavior of animals living in southern Africa. The researchers use several techniques to recognize individual lions, giraffes and hyenas, in particular by their coat patterns.

An article describing the research team’s methods was published last March in the scientific journal Methods in Ecology and Evolution. It is signed by Vincent Miele, together with Simon Chamaillé-Jammes, a researcher at the CNRS and the University of Paul Valéry-Montpellier III, as well as Gaspard Dussert, Bruno Spataro, Dominique Allainé and Christophe Bonenfant, all researchers at the CNRS and the University of Lyon 1. The team automated the individual recognition of giraffes in thousands of photos using artificial intelligence and computer vision techniques.

For this species, the principle is as follows, as described by the CNRS: “From 4,000 photographs of giraffes brought back from the field, the team developed an automated processing chain that detects similar patterns in the coats on the giraffes’ flanks using a combination of artificial intelligence techniques. The final system is based on deep learning and consists of projecting the photos into a high-dimensional mathematical space in which a distance can be measured between each pair of photos. Photos of the same individual therefore end up very close together in this space.

In practice, this visual capture/recapture system makes it possible not only to recognize giraffes already known from the monitoring program and to reconstruct their life histories, but also to detect previously unknown individuals. The time saved compared with manual processing is considerable, since the system handles several photos per second. This work contributes to the automation of individual animal recognition and to the implementation of ecological monitoring. Made freely available to the community, this visual capture/recapture system (and its future evolutions) may also prove relevant for studying other species beyond the giraffe.”
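To make the distance-based matching described above concrete, here is a minimal sketch of the final recognition step, under the assumption that an embedding model has already turned each photo into a unit-length vector. The function name, threshold and random vectors below are illustrative; they do not reproduce the team’s actual open-source pipeline.

```python
# Minimal sketch of distance-based individual recognition in an embedding space.
# Assumes photos have already been embedded as L2-normalized vectors by some model.
import numpy as np

def match_individual(photo_embedding, known_embeddings, known_ids, threshold=0.6):
    """Return the id of the closest known giraffe, or None for a new individual.

    known_embeddings: (n, d) array of L2-normalized reference vectors.
    photo_embedding:  (d,) L2-normalized vector for the new photo.
    threshold:        illustrative cosine-distance cut-off, not a published value.
    """
    # Cosine distance between the new photo and every reference photo.
    distances = 1.0 - known_embeddings @ photo_embedding
    best = int(np.argmin(distances))
    if distances[best] <= threshold:
        return known_ids[best]  # photo matched to a known individual
    return None                 # likely a previously unseen giraffe

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
refs = rng.normal(size=(5, 128))
refs /= np.linalg.norm(refs, axis=1, keepdims=True)
query = refs[2] + 0.05 * rng.normal(size=128)
query /= np.linalg.norm(query)
print(match_individual(query, refs, ["G01", "G02", "G03", "G04", "G05"]))  # -> "G03"
```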

Artificial intelligence to better understand animal behavior

The Earth Species Project (ESP) is an open-source platform dedicated to decoding animal communication, born of the 2007 meeting between Britt Selvitelle and Aza Raskin. The two discussed technology and how its use could change the way we think about the world.

A few years later, in 2013, the project took shape through their research on a new machine learning model that could learn a geometric representation of an entire language. In 2017, their system was refined: it uses neural networks to analyze large numbers of sentences and to deduce general principles of grammar and usage. These models are then applied to translate sentences the system has never seen.
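To illustrate what a “geometric representation” of a language can look like on a toy scale, the sketch below turns words into points in a vector space so that words used in similar contexts end up close together. It is a hypothetical illustration using a co-occurrence matrix and SVD, not ESP’s neural-network models.

```python
# Toy "geometric representation" of a language: words become vectors, and words
# that appear in similar contexts end up close together in the space.
import numpy as np

corpus = [
    "the dolphin swims near the boat",
    "the whale swims near the shore",
    "a dolphin calls to another dolphin",
    "a whale calls to another whale",
]

# Build the vocabulary and a symmetric co-occurrence count within a +/-2 word window.
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({word for sentence in tokens for word in sentence})
index = {word: i for i, word in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))
for sentence in tokens:
    for i, word in enumerate(sentence):
        for j in range(max(0, i - 2), min(len(sentence), i + 3)):
            if i != j:
                counts[index[word], index[sentence[j]]] += 1

# Low-rank factorization: each row of `vectors` gives a word's coordinates.
U, S, _ = np.linalg.svd(counts, full_matrices=False)
vectors = U[:, :3] * S[:3]

def similarity(a, b):
    va, vb = vectors[index[a]], vectors[index[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# "dolphin" and "whale" occur in near-identical contexts, so their vectors should
# be far closer to each other than "dolphin" is to "boat".
print(similarity("dolphin", "whale"), similarity("dolphin", "boat"))
```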

The system has been tested on cetaceans and should, in the long term, be applied to other mammals and primates, among others. The model relies on the same principle as machine translation systems such as Google Translate: these algorithms are not based on bilingual dictionaries, but on multidimensional parameters that capture the context of a sentence. The same logic is applied to the analysis of the barks, cries, sounds and calls that animals frequently repeat.
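The idea of translating through geometry rather than dictionaries can be sketched in the same spirit: if two embedding spaces have a similar shape, an orthogonal map learned from a few anchor pairs aligns them, and any point can then be translated by nearest neighbor. The example below is a simplified, hypothetical illustration; real systems bootstrap the anchors automatically and operate at a far larger scale.

```python
# Toy translation by aligning two embedding spaces with an orthogonal (Procrustes) map.
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "source" and "target" spaces: the target is a rotated, slightly noisy
# copy of the source, playing the role of two languages with similar structure.
d, n = 16, 50
src = rng.normal(size=(n, d))
true_rotation, _ = np.linalg.qr(rng.normal(size=(d, d)))
tgt = src @ true_rotation + 0.01 * rng.normal(size=(n, d))

# Learn the alignment from a handful of anchor pairs (indices 0..9).
anchors = slice(0, 10)
U, _, Vt = np.linalg.svd(src[anchors].T @ tgt[anchors])
W = U @ Vt  # best orthogonal map from the source space to the target space

# "Translate" an unseen source word (index 30) by nearest neighbor in the target space.
query = src[30] @ W
nearest = int(np.argmin(np.linalg.norm(tgt - query, axis=1)))
print(nearest)  # should recover index 30
```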

Understanding the language of dolphins thanks to an artificial intelligence platform

The Swedish start-up Gavagai AB, which specializes in human language analysis software, is collaborating with the KTH Royal Institute of Technology on the Wild Dolphin Project. Using algorithms running on underwater computers, the project members are trying to decode dolphin communication. The goal is eventually to compile a dictionary of all the terms the dolphins use.

The Wild Dolphin Project is led by Dr. Denise L. Herzing, who has been studying Atlantic spotted dolphins since 1985. She began collaborating with Thad Starner, an artificial intelligence researcher at the Georgia Institute of Technology in Atlanta. Together, they started the Cetacean Hearing and Telemetry (CHAT) project, an offshoot of the Wild Dolphin Project, to create an artificial language built from sounds sharing the characteristics of those the dolphins use to communicate with each other.

A prototype computer the size of a smartphone was designed, fitted with two hydrophones capable of detecting a wide range of frequencies, up to 200 kHz. The device was tested to find out whether the ultrasounds emitted by the dolphins could correspond to particular events. The experiment revolved around the following question: if one dolphin uses a given frequency while swimming around or coming across a piece of seaweed, will another dolphin use the same frequency to refer to that context or object?
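A highly simplified version of that question can be put into code: detect the dominant frequency of a recorded snippet and check whether it matches a frequency already associated with an object. The sample rate, frequencies, labels and tolerance below are invented for illustration and are not values from the CHAT project.

```python
# Toy matching of a whistle's dominant frequency to an object it may refer to.
import numpy as np

SAMPLE_RATE = 400_000  # Hz, chosen so frequencies up to 200 kHz are representable

# Hypothetical lookup table: dominant whistle frequency (Hz) -> associated object.
KNOWN_WHISTLES = {40_000.0: "sargassum", 55_000.0: "scarf", 70_000.0: "rope"}

def dominant_frequency(signal, sample_rate=SAMPLE_RATE):
    """Return the strongest frequency component of a recorded snippet."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def label_whistle(signal, tolerance_hz=300.0):
    """Match a snippet to a known object if its peak frequency is close enough."""
    peak = dominant_frequency(signal)
    for reference, label in KNOWN_WHISTLES.items():
        if abs(peak - reference) <= tolerance_hz:
            return label
    return None  # unknown whistle: a candidate for a new association

# Toy usage: a synthetic 40 kHz tone should be labeled as the "sargassum" whistle.
t = np.arange(0, 0.05, 1.0 / SAMPLE_RATE)
print(label_whistle(np.sin(2 * np.pi * 40_000 * t)))
```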

Once the system can recognize the dolphins’ mimicked calls, it should classify this information to identify several “fundamental units” of their language.
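One plausible way to derive such units, sketched below under assumed features and an assumed number of clusters, is to group recordings by simple acoustic descriptors and treat each recurring cluster as a candidate unit; the project’s own methodology is not detailed in this article.

```python
# Toy clustering of whistle recordings into recurring "fundamental units".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Stand-in feature vectors (e.g. peak frequency in Hz, duration in s, frequency
# sweep in Hz) drawn around three artificial whistle types.
centers = np.array([[40_000, 0.2, 5_000], [55_000, 0.5, -2_000], [70_000, 0.1, 8_000]])
features = np.vstack(
    [c + rng.normal(scale=[500, 0.02, 200], size=(30, 3)) for c in centers]
)

# Each cluster label becomes a candidate "fundamental unit" of the repertoire.
units = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(units))  # roughly 30 recordings per unit
```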

Translated from Focus sur trois projets de recherches pour mieux comprendre le comportement animal