Open A-Eye project
The originality of the A-Eye team’s device is that it provides kinaesthetic guidance, which is more intuitive than the audio and/or vibrating feedback offered by commercial solutions and reduces the cognitive load required to follow the guidance. The device, the fine-tuning of the feedback it provides and all the other solutions for meeting the various Cybathlon* challenges were co-designed with the help of our ‘pilot’. The A-Eye team places co-creation at the heart of the process, working with associations, accessibility experts and the blind A-Eye pilot. The aim is to create a solution that is ergonomic, accessible, easy to use and suited to everyday life.
The A-Eye team’s device incorporates cutting-edge technologies, combining intuitive kinaesthetic feedback with computer vision and artificial intelligence functionalities. The wearable device, in the form of a harness/backpack, offers precise navigation, mimicking the interaction with a human guide and providing an intuitive experience.
*The Cybathlon is an event that takes place every 4 years, organised by the Swiss Federal Institute of Technology in Zurich (ETH Zurich). It challenges teams from all over the world in 8 events, with the aim of demonstrating the technological advances made in assisting people with disabilities to carry out everyday tasks.
The context
It is well known that the least tiring guidance in a new environment is that provided by a person trained in guiding techniques. This guidance is much more intuitive and requires much less concentration than walking with a conventional assistance device such as a white cane.
A new environment involves obstacles that are difficult to anticipate (particularly those at head height) and a trajectory that has to be planned in advance, which imposes a significant cognitive load. In this situation, a digital assistance system capable of sensing the environment, calculating a trajectory and providing intuitive information on the direction to follow (positive information) would be much easier to adopt than a solution that only signals obstacles (negative information), as is the case with white canes, even so-called ‘intelligent’ ones. This intuitive information could mimic the forces and movements exchanged between the guide and the guided person. We call this type of information kinaesthetic information.
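To make the distinction concrete, here is a minimal sketch in Python (the function names and gain value are hypothetical, not taken from the A-Eye code) contrasting the two kinds of information: an obstacle alert that only says something is in the way, and a kinaesthetic cue that expresses the direction to follow as a two-axis force, the way a guide’s arm would.

```python
# Illustrative only: "negative" vs "positive" information for guidance.
import math

def obstacle_alert(distance_m: float, threshold_m: float = 1.0) -> bool:
    """Negative information: only signals that something is in the way."""
    return distance_m < threshold_m

def kinaesthetic_cue(heading_error_rad: float, gain: float = 5.0) -> tuple[float, float]:
    """Positive information: a two-axis effort (lateral, sagittal) that
    pulls the user toward the planned direction, like a guide's arm."""
    lateral = gain * math.sin(heading_error_rad)   # left/right component
    sagittal = gain * math.cos(heading_error_rad)  # forward/backward component
    return lateral, sagittal

# Example: the planned path lies 20 degrees to the user's left.
print(obstacle_alert(0.8))                  # True -> "something is close"
print(kinaesthetic_cue(math.radians(-20)))  # approx (-1.71, 4.70) -> "go this way"
```

The point of the contrast is that the second signal can be followed almost reflexively, whereas the first still leaves the user to work out a detour on their own.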
Our expertise in this area takes several forms.
- As roboticists, we have observed that the technologies used for obstacle detection and trajectory planning in autonomous robots could be put to good use in the development of an assistance device;
- In addition, ISIR’s activities are specifically concerned with good practice in, and the development of, new devices for augmenting or substituting sensory information in different application contexts (assistance/rehabilitation/surgery).
Our objectives
Our ambition was to design a device that was as intuitive as possible. The A-Eye team’s device was therefore created with a series of converging objectives:
- to exploit the context offered by the international Cybathlon competition to validate the effectiveness of the sensory feedback developed within our teams,
- and to draw on the skills of the students on our courses, showcasing their full potential.
The device takes the form of a harness/chest plate to which a kinaesthetic feedback system (a pantograph) and a 3D camera are attached. It also carries a powerful computer that analyses and maps the environment, proposes a trajectory for reaching the desired position, and updates that trajectory as new obstacles appear. The kinaesthetic feedback applies forces along two axes (left/right and forward/backward), giving an intuitive indication of the direction to follow.
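As an illustration of how these components could fit together, the following sketch walks through one cycle of a perception-planning-feedback loop. It is a set of assumptions, not the A-Eye source: the function names are invented, the occupancy map is reduced to a set of cells, and the planner itself is omitted.

```python
# Hypothetical sketch of one cycle: camera -> map -> (replan) -> pantograph.
import math

def update_occupancy_grid(grid: set, detections: list) -> bool:
    """Insert newly detected obstacle cells; report whether the map changed."""
    before = len(grid)
    grid.update(detections)
    return len(grid) != before

def pantograph_command(pose_xy, heading_rad, waypoint, gain=5.0):
    """Convert the bearing to the next waypoint into (lateral, sagittal) effort."""
    bearing = math.atan2(waypoint[1] - pose_xy[1], waypoint[0] - pose_xy[0])
    error = bearing - heading_rad
    return gain * math.sin(error), gain * math.cos(error)

# One cycle: a new obstacle cell arrives from the 3D camera, so the path
# would be replanned (e.g. with A*; the planner is omitted for brevity),
# then the cue toward the next waypoint is rendered on the pantograph.
grid = set()
if update_occupancy_grid(grid, detections=[(3, 1)]):
    pass  # replan_path(grid)  <- hypothetical planner call
lateral, sagittal = pantograph_command((0.0, 0.0), 0.0, waypoint=(2.0, 1.0))
print(f"pantograph command: lateral={lateral:.2f}, sagittal={sagittal:.2f}")
```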
This device represents an innovative solution at the frontier of current technologies and software developed in robotics, image processing, artificial intelligence and haptic/kinaesthetic communication.
The results
The device developed at ISIR has reached a sufficient level of maturity for the pilot to use it autonomously, changing mode according to the challenges encountered.
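The mode-switching idea can be pictured as a small lookup from the active mode to the feedback gains. The sketch below is purely illustrative: the actual modes and gain values of the A-Eye system are not documented here.

```python
# Hypothetical guidance modes; names and gains are invented for illustration.
from enum import Enum, auto

class GuidanceMode(Enum):
    FREE_WALK = auto()       # follow the planned trajectory at walking pace
    NARROW_PASSAGE = auto()  # tighter lateral corrections, slower advance
    TARGET_REACH = auto()    # home in on a detected goal object

def feedback_gains(mode: GuidanceMode) -> tuple:
    """Return illustrative (lateral_gain, forward_gain) pairs per mode."""
    return {
        GuidanceMode.FREE_WALK: (5.0, 5.0),
        GuidanceMode.NARROW_PASSAGE: (8.0, 2.0),
        GuidanceMode.TARGET_REACH: (6.0, 4.0),
    }[mode]

print(feedback_gains(GuidanceMode.NARROW_PASSAGE))  # (8.0, 2.0)
```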
Weekly training sessions with the pilot, Salomé Nashed, who has been visually impaired since birth and is a biology researcher, enable the guidance system to be fine-tuned and personalised to suit the various Cybathlon events. Although she describes the feedback as ‘intuitive’, it is essential to test it with a wider range of users in order to continue to improve the solution and enhance its customisability. At the same time, presenting the system will raise awareness of current technological advances, demonstrating both their potential and their limitations. The partnership with the INJA (Institut National des Jeunes Aveugles) Louis Braille will provide an opportunity to present the system to students and mobility instructors.
The main task will be to make the existing system open-source. We detail here the approach envisaged to achieve this objective:
- Documenting and maintaining an open-source Git repository,
- Assessing and optimising the choice of materials and hardware,
- Drawing up plans for 3D printing and laser cutting,
- Creating video tutorials,
- Throughout the development process, gathering feedback from ‘technical’ users:
  - feedback from a group of student users from the Sorbonne University FabLab, who will have to reproduce the device from the available documentation,
  - feedback from the INJA Louis Braille locomotion instructors, who will be the first to test the device as a newly formed technical team. Their role will be to get to grips with the device and propose customisations adapted to users, in order to improve its ergonomics and better respond to the specific needs and difficulties of each user.
As well as opening up the current system, the project aims to evaluate and develop solutions that are more closely connected with the people concerned. Contacts have therefore been made with INJA Louis Braille to give them the opportunity to work on participative innovations, and the project is being carried out with Sorbonne University’s FabLab. It will also provide an opportunity to organise discussion groups and brainstorming sessions between researchers, Sorbonne University students and young blind and visually impaired people. These activities will be supported by the project, with the aim of developing technological building blocks adapted to the people concerned.
Partnerships and collaborations
The A-Eye team was founded by Ludovic Saint-Bauzel and Fabien Vérité, researchers at ISIR and lecturers at Sorbonne University. The engineer in charge of the project is Axel Lansiaux, with the help of Aline Baudry and Samuel Hadjes, engineers at ISIR. A number of students from Sorbonne University’s master’s programmes, and from the Main and ROB specialisms at Polytech Sorbonne, took part in the project as part of their end-of-study projects. The project also involves other ISIR colleagues, such as Nicolas Baskiotis and Olivier S. (Machine Learning) and Nizar Ouarti (Perception), who are interested in contributing their expertise to this open-source project.
Link to the A-Eye team website: https://a-eye.isir.upmc.fr