Federative projects bring together the skills of several teams around shared scientific objectives. These strategic initiatives tackle interdisciplinary issues, combining expertise in robotics, human interaction and advanced technologies to address societal challenges. Here are three examples of this collaborative approach.
Robotic learning for mobile manipulation and social interaction
This project explores the ability of autonomous robots to meet complex needs in real environments, such as physical and social interaction. Challenges include adaptability to unexpected situations, seamless collaboration with humans and navigation in varied environments. These issues are essential in sectors such as logistics, domestic services and agriculture.
ISIR’s internal project aims to achieve a high level of autonomy for robots in complex environments by meeting the following objectives:
- Use large language models (LLMs) for robotic planning, affordance identification and object grasping, enabling better understanding of and interaction with the real world,
- Develop an integrated system combining advanced perception models, particularly vision-based ones, with advanced control methods.
These approaches have enabled the integration of the QD-grasp stack on the Tiago robot, including advanced functionalities such as grasp generation, object detection, segmentation and identification. The system also integrates planning based on language models (LLMs), enabling the robot to understand and execute tasks expressed in natural language by human users.
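To illustrate the kind of pipeline described above, here is a minimal, hypothetical sketch: a perception stage detects graspable objects, and a planning stage, standing in for the LLM, turns a natural-language instruction into an ordered list of primitive robot actions. None of the names below come from the actual QD-grasp or Tiago software; they are illustrative placeholders only.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """One object found by the (stubbed) perception stage."""
    label: str
    grasp_score: float  # confidence that a stable grasp exists

def detect_objects(scene: list[str]) -> list[DetectedObject]:
    """Stand-in for detection/segmentation/identification."""
    return [DetectedObject(label=name, grasp_score=0.9) for name in scene]

def llm_plan(instruction: str, objects: list[DetectedObject]) -> list[str]:
    """Stand-in for LLM-based planning: map a natural-language
    instruction to primitive actions on a detected object."""
    target = next((o.label for o in objects if o.label in instruction), None)
    if target is None:
        return []  # nothing in the scene matches the request
    return [f"navigate_to({target})", f"grasp({target})", f"hand_over({target})"]

plan = llm_plan("bring me the mug", detect_objects(["mug", "book"]))
```

In a real system the keyword match in `llm_plan` would be replaced by a call to a language model, and each returned action would be dispatched to the robot's navigation and manipulation controllers.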
Find out more about this project: https://www.isir.upmc.fr/projects/robotic-learning-for-mobile-manipulation-and-social-interaction/?lang=en
Open A-Eye: kinaesthetic assistance for the visually impaired
For visually impaired people, navigating in an unfamiliar environment involves a significant cognitive overload, particularly with conventional devices such as the white cane. This project is developing an intuitive digital assistance system capable of interpreting the environment and providing kinaesthetic feedback in the form of forces and movements, mimicking human guidance.
The device developed at ISIR, a harness/chest plate to which a kinaesthetic feedback system (a pantograph) fitted with a 3D camera is attached, meets several objectives:
- Test the sensory feedback developed at ISIR in competitions such as the Cybathlon,
- Mobilise students' skills to demonstrate the technology's potential,
- Make the solution open-source and work with users to ensure a more inclusive co-design.
The system has reached a level of maturity sufficient for autonomous use by the pilot, who can switch modes depending on the challenges encountered.
Find out more about this project: https://www.isir.upmc.fr/projects/kinesthetic-feedback-to-guide-the-visually-impaired/?lang=en
Multi-sensory integration to maintain balance
Maintaining balance requires the integration of sensory information from several sources: visual, vestibular, proprioceptive and haptic. These different senses are typically studied one by one, which leaves open the question of their integration.
The aim is to combine expertise in postural control, haptics and visuo-motor adaptation to study multi-sensory integration during balance disturbances, by combining perturbations that are:
- mechanical (via a perturbation platform),
- visual (in virtual reality),
- and haptic (with a light-touch device and haptic stimulation devices).
Find out more about this project: https://www.isir.upmc.fr/projects/multi-sensory-integration-to-maintain-balance/?lang=en
Published on 18/12/2024.