
Ears and eyes for robots




From a different perspective, the European Embodied Audition for Robots (EARS) project is working to develop human-machine interaction through sight and sound. The goal is to create a robot capable of intelligent listening: in other words, the ability to distinguish human speech from surrounding noise and to identify who is speaking. Researchers are currently focused on developing an interface that will connect sound input (microphones) to analytical and processing capabilities. The project is led by the Inria Perception team in close collaboration with Gipsa-lab and the Laboratoire Jean Kuntzmann (LJK).

The Vision and Hearing in Action (VHIA) project provides another example of work towards social interaction between humans and robots. The project aims to create a mathematical representation of audiovisual objects, and in particular a representation of the human face. "Following this phase, we will have to ensure that the analysis carried out by the machine results in an appropriate reaction," explains Laurent Girin, a Gipsa-lab researcher currently working at Inria. The project builds on a NAO robot from Aldebaran Robotics, and its roadmap earned Radu Horaud's team an ERC grant.
