
Kinesthetic feedback to guide the visually impaired


The originality of the A-Eye team’s device lies in its kinaesthetic guidance, which is more intuitive than the audio and/or vibration feedback offered by commercial solutions and so reduces the cognitive load needed to follow the guidance information. The device, the fine-tuning of the feedback it provides and all the other solutions for meeting the various Cybathlon* challenges were co-designed with our ‘pilot’. The A-Eye team places co-creation at the heart of the process, working with associations, accessibility experts and its blind pilot. The aim is to create a solution that is ergonomic, accessible, easy to use and adapted to everyday life.

The A-Eye team’s device incorporates cutting-edge technologies, combining intuitive kinaesthetic feedback with computer vision and artificial intelligence functionalities. The wearable device, in the form of a harness/backpack, offers precise navigation, mimicking the interaction with a human guide and providing an intuitive experience.

*The Cybathlon is an event held every four years, organised by the Swiss Federal Institute of Technology in Zurich (ETH Zurich). It challenges teams from all over the world across eight events, with the aim of demonstrating technological advances in assisting people with disabilities with everyday tasks.

The context

It is well known that the least tiring guidance in a new environment is that provided by a person trained in guiding techniques. This guidance is much more intuitive and requires much less concentration than walking with a conventional assistance device such as a white cane.

A new environment involves obstacles that are difficult to predict (particularly those at head height) and a trajectory that has to be planned in advance, which imposes a significant cognitive load. In this situation, a digital assistance system capable of sensing the environment, computing a trajectory and providing intuitive information on the direction to follow (positive information) would be much easier to take in than a solution that only signals obstacles (negative information), as white canes do, even so-called ‘intelligent’ ones. This intuitive information could mimic the forces and movements exchanged between a guide and the person being guided; we call this type of information kinaesthetic information.

Our expertise in this area takes several forms.

Our objectives

Our ambition was to design a device that is as intuitive as possible. The A-Eye team’s device was therefore built around a series of converging objectives.

The device takes the form of a harness/chest plate to which a kinaesthetic feedback system (a pantograph) and a 3D camera are attached. It also carries a powerful computer that analyses and maps the environment, proposes a trajectory towards the desired position, and updates it as new obstacles appear. The kinaesthetic feedback applies forces along two axes (left/right and forward/backward), giving an intuitive indication of the direction to follow.
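As an illustration, the guidance loop described above can be sketched as a simple control mapping: the heading error and remaining distance to the next waypoint of the planned trajectory are turned into two force commands for the feedback system. This is a hypothetical sketch, not the A-Eye implementation — the function name, gains and units are illustrative assumptions:

```python
import math

def guidance_forces(heading_error_rad, distance_m, k_lateral=1.0, k_forward=0.5):
    """Map a planned trajectory into a two-axis force command.

    heading_error_rad: angle between the user's heading and the planned path
                       (positive = path lies to the left).
    distance_m: remaining distance to the next waypoint.
    Returns (lateral, forward) force commands in arbitrary units,
    each clipped to [-1, 1]. Gains are illustrative, not calibrated values.
    """
    # Pull left/right in proportion to the heading error.
    lateral = max(-1.0, min(1.0, k_lateral * heading_error_rad))
    # Pull forward while the waypoint is still ahead; no pull once reached.
    forward = max(-1.0, min(1.0, k_forward * distance_m))
    return lateral, forward

# Path 20 degrees to the left, waypoint 2 m away:
# a gentle leftward pull combined with a forward pull.
print(guidance_forces(math.radians(20), 2.0))
```

In a real system the heading error and distance would come from the on-board mapping and path-planning pipeline, and the two outputs would drive the pantograph’s actuators; the clipping keeps the commanded forces within the actuators’ safe range.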

This device represents an innovative solution at the frontier of current technologies and software developed in robotics, image processing, artificial intelligence and haptic/kinaesthetic communication.

The results

The device developed at ISIR has reached a sufficient level of maturity, enabling it to be used autonomously by the pilot, with the ability to change mode according to the challenges encountered.

Weekly training sessions with the pilot, Salomé Nashed, who has been visually impaired since birth and is a biology researcher, enable the guidance system to be fine-tuned and personalised to suit the various Cybathlon events. Although she describes the feedback as ‘intuitive’, it is essential to test it with a wider range of users in order to continue to improve the solution and enhance its customisability. At the same time, presenting the system will raise awareness of current technological advances, demonstrating both their potential and their limitations. The partnership with the INJA (Institut National des Jeunes Aveugles) Louis Braille will provide an opportunity to present the system to students and mobility instructors.

The main task will be to make the existing system open-source.

As well as opening up the current system, the project also aims to evaluate and develop solutions that are more closely connected with the people concerned. Contacts have therefore been made with the INJA Louis Braille to give them the opportunity to work on participative innovations. This project is also being carried out with Sorbonne University’s FabLab. The project will also provide an opportunity to organise discussion groups and brainstorming sessions between researchers, Sorbonne University students and young blind and visually impaired people. These activities will be supported by the project to develop technological building blocks adapted to the people concerned.

Partnerships and collaborations

The A-Eye team was founded by Ludovic Saint-Bauzel and Fabien Vérité, researchers at ISIR and lecturers at Sorbonne University. The engineer in charge of the project is Axel Lansiaux, with the help of Aline Baudry and Samuel Hadjes, engineers at ISIR. A number of students from Sorbonne University’s master’s programmes, and from the Main and ROB specialisms at Polytech Sorbonne, took part in the project as part of their end-of-study projects. The project also involves other ISIR colleagues, such as Nicolas Baskiotis and Olivier S. (machine learning) and Nizar Ouarti (perception), who are interested in contributing their expertise to this open-source project.


Link to the A-Eye team website: https://a-eye.isir.upmc.fr

Project members

Ludovic Saint-Bauzel
Maître de conférences
Fabien Vérité
Maître de conférences
Nicolas Baskiotis
Maître de conférences
Nizar Ouarti
Maître de conférences