
Opportunities

Internship offers

Subject of internship: Augmented reality for a surgical cockpit

Abstract: Minimally invasive surgery is known for its benefits to the patient, which result mainly from the smaller incisions it requires compared to open surgery. These incisions allow the insertion of an endoscope and instruments into the patient’s body through trocars. Despite its advantages, minimally invasive surgery presents several challenges for the surgeon, and these challenges are growing as minimally invasive access spreads to almost all specialties (catheterization in vascular surgery, flexible endoscopy in gastroenterology, fibroscopy in urology, etc.).

Objectives: With this project, we propose to study the contribution of augmented reality in the operating room. The specificity of the project is to take advantage of the possible complementarity between comanipulator robotic arms held by the surgeon and an augmented reality headset worn by the surgeon (HoloLens/Varjo XR-1).

The main objective of this internship will be to integrate the information from the tools present in the surgical cockpit project (manipulator arms, endoscopic cameras, tactile interfaces, etc.) into a Unity 3D environment. The environment will have to be manipulable via the different interaction tools provided by the head-mounted display and the devices of the cockpit. It will thus be possible to manipulate the position of several virtual “screens” through gaze, voice, or the movements of the tip of the laparoscopic instruments held by the comanipulator arms.

The goals are:

  • Creation of a 3D scene in Unity allowing the display and manipulation of 3D objects,
  • Streaming of an external video feed (endoscope) into the 3D scene (WebRTC),
  • Retrieval of the manipulator arms’ pose (orientation/position) and its display in 3D,
  • Use of the robot information to allow manipulation of 3D objects,
  • Creation of a widget to display the status of additional sensors that could be added,
  • Study of the usability of the developed platform in the framework of an experimental protocol; the two technologies used (HoloLens/Varjo) will also be compared.

All developments will be carried out with the Unity XR SDK.
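To give an idea of the kind of integration involved (the actual development will be done in C# under Unity), the sketch below shows a hypothetical robot-side bridge that streams the manipulator arm pose to the Unity scene as JSON over UDP. The host, port, message format and the `read_arm_pose` helper are illustrative assumptions, not part of the existing cockpit software.

```python
import json
import math
import socket
import time

# Hypothetical endpoint of the Unity scene's pose listener (illustrative values).
UNITY_HOST, UNITY_PORT = "127.0.0.1", 9750
SEND_RATE_HZ = 60.0

def read_arm_pose(t: float) -> dict:
    """Placeholder for the comanipulator arm API: returns a fake pose.

    In the real setup this would query the robot controller; here a small
    circular motion is synthesised so the script runs stand-alone.
    """
    return {
        "position": [0.1 * math.cos(t), 0.1 * math.sin(t), 0.3],      # metres
        "orientation": [0.0, 0.0, math.sin(t / 2), math.cos(t / 2)],  # quaternion xyzw
    }

def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    t0 = time.time()
    while True:
        pose = read_arm_pose(time.time() - t0)
        # One JSON datagram per cycle; the Unity side would deserialise it
        # and update the corresponding 3D object in the scene.
        sock.sendto(json.dumps(pose).encode("utf-8"), (UNITY_HOST, UNITY_PORT))
        time.sleep(1.0 / SEND_RATE_HZ)

if __name__ == "__main__":
    main()
```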

Candidate Profile: Master’s student in Computer Science or Engineering, or an engineering-school student, wishing to explore a subject involving the use of information manipulation technologies in augmented reality. The internship requires strong programming skills (specifically C# and/or Unity’s visual programming language) and information processing skills in order to extract the user’s intention from voice, gesture and eye-tracking sensors.

  • Supervisor: Fabien Vérité
  • Duration: 6 months starting 01 March 2021
  • Location: Isir (Institut des Systèmes Intelligents et de Robotique), 4 Place Jussieu 75005, Paris
  • Contact: Fabien Vérité, verite@isir.upmc.fr ; Send your application by email, with [Augmented reality for a surgical cockpit] in the subject line, along with a CV and a motivation letter.

Download the offer

Subject of internship: Machine Learning for Social Behaviour Generation

Abstract: Part of the success of human-agent interactions relies on the ability of social agents to perform behaviors that are easily understood by humans. This means that the human is able to infer the meaning and/or the intention of such behaviors, which can take several forms: pointing, gaze, head movements, etc. To increase the understanding of agent intentions by humans, the notion of legibility is often considered in robotics. Legibility is defined as the ability to anticipate the goal of an action. This notion has been differentiated from predictability, which is defined as the ability to predict the trajectory for a given goal. To model such notions, mathematical approaches show the need to explicitly integrate human observers: models able to generate such actions/behaviors take into account how a human observer will perceive them [Wallkotter et al. 2020].

Objectives: We aim to develop machine learning algorithms able to generate behaviors, mainly movements, that explicitly take human observers into account. Recent work on movement generation for artificial agents has shown the relevance of generative models such as Variational Auto-Encoders (VAE). The main intuition is that the latent representation and its regularization allow controlling the generation of data. In (Marmpena et al., 2019), a VAE-based approach was proposed to generate various body-language animations; interestingly, modulation of motion is made possible through affect-related spaces.

Here, we propose to address a similar behavior generation problem while focusing on communicative demonstrations, which are naturally employed by humans when teaching (Ho et al. 2018). Communicative demonstrations are intelligent modifications of the demonstrator’s actions and/or behaviors with the aim of influencing the mental representation that an observer ascribes to the demonstration.

We target situations in which an agent is demonstrating a series of actions such as writing or reaching objects. The idea will be to control the generation in a communicative space ranging from instrumental to pedagogical intentions. We will first develop a VAE model able to learn a representation of communicative actions and analyze its latent space. We will then develop specific regularization techniques to control the communicative intention.

The main steps are:

  • Development of a first generative model (see the sketch below),
  • Analysis of the latent space,
  • Development of a regularization technique able to control the communicative intention,
  • If possible, evaluation with real humans.
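As an illustration of the first step, the sketch below is a minimal VAE over fixed-length action trajectories in PyTorch. It assumes demonstrations are available as arrays of shape (N, T, D) — N demonstrations, T time steps, D degrees of freedom; the architecture, dimensions and variable names are illustrative, not the model to be developed during the internship.

```python
import torch
import torch.nn as nn

class TrajectoryVAE(nn.Module):
    """Minimal VAE over flattened action trajectories (illustrative only)."""

    def __init__(self, t_steps: int = 50, dof: int = 7, latent_dim: int = 8):
        super().__init__()
        in_dim = t_steps * dof
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, in_dim)
        )

    def forward(self, x):
        h = self.encoder(x.flatten(1))
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation
        return self.decoder(z).view_as(x), mu, logvar

def vae_loss(x, x_rec, mu, logvar, beta: float = 1.0):
    # Reconstruction term plus beta-weighted KL regularisation of the latent space.
    rec = ((x - x_rec) ** 2).mean()
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + beta * kl

if __name__ == "__main__":
    x = torch.randn(32, 50, 7)          # fake batch of 32 demonstrations
    model = TrajectoryVAE()
    x_rec, mu, logvar = model(x)
    print(vae_loss(x, x_rec, mu, logvar).item())
```

Controlling the communicative intention would then amount to shaping this latent space, for example with an additional regularization term, which corresponds to the third step above.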

Skills: Python, Machine learning

  • Supervisor: Mohamed CHETOUANI
  • Duration: 5/6 months
  • Location: Isir (Institut des Systèmes Intelligents et de Robotique), 4 Place Jussieu 75005, Paris
  • Contact: Mohamed CHETOUANI ; Mohamed.Chetouani@sorbonne-universite.fr ; Send your application by email, with [Machine Learning for Social Behaviour Generation] in the subject line, along with a CV and a motivation letter.

Download the offer

PhD offers

Thesis topic: Microfluidic And Robotic Devices For In-Vitro Analysis, Manipulation And Injection On Biological Samples

Joint PhD Project between ISIR / Sorbonne Université (Paris) & Uni. di Brescia

Context: The project concerns the robotic manipulation, characterization and analysis of biological samples such as isolated single cells or small animal eggs (e.g. zebrafish eggs). The aim is to develop a novel instrument for experimental biology to facilitate drug research using microfluidic and robotic technologies.

The proposed case study is the microinjection of a biological reagent into eggs and cells to investigate the role of certain genes and their involvement in several human diseases. Usually, microinjection is performed manually under a microscope, directly by the operator. The eggs are transferred and injected through a capillary tube. Accuracy, concentration and determination on the part of the operator are essential, and the procedure must be carried out quickly. As a result, the manual process often fails and its efficiency is very low. An essential element for automation is the measurement and control of the interaction force.

Scientific objective: A first objective is the integration of a force sensor into the injector. It will be based on the principle of “position compensation”: the force is not estimated passively from a measured deformation, but actively, from the force needed to prevent any displacement. The same device can be used for both the μN and the 100 mN ranges through purely electronic matching and control.
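For intuition only, the sketch below simulates this force-balance principle: a controller drives the actuator so that the probe tip stays at its rest position, and the actuation effort required to hold it there is read out as the estimate of the external force. The dynamics, gains and numerical values are illustrative assumptions, not the design of the actual sensor.

```python
# Minimal simulation of "position compensation" force sensing: the actuator
# nulls the probe displacement, and the holding force equals the external load.
dt = 1e-4          # time step [s]
m, c = 1e-6, 1e-3  # probe mass [kg] and damping [N.s/m] (illustrative values)
kp, ki = 5.0, 2e3  # PI gains of the position-holding controller (illustrative)

x = v = integral = 0.0     # displacement, velocity, integral of the position error
f_external = 2e-6          # unknown external force to be measured [N] (2 µN)

for step in range(200_000):
    error = 0.0 - x                            # keep the tip at its rest position
    integral += error * dt
    f_actuator = kp * error + ki * integral    # compensation force applied by the actuator
    a = (f_external + f_actuator - c * v) / m  # probe dynamics
    v += a * dt
    x += v * dt

# At equilibrium the actuator exactly balances the load, so -f_actuator ~ f_external.
print(f"estimated force: {-f_actuator * 1e6:.2f} µN (true: {f_external * 1e6:.2f} µN)")
```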

The second objective concerns techniques for individual handling and mass conveyance of samples. A robotic system will be developed, consisting of microfluidic means for sample transport and the associated mechanical effectors. Control schemes covering the transfer and the tele-operated or automatic handling phases will be developed, using vision-based control as well as the active tools mentioned above.

The third objective concerns the human/machine interface, through which an operator can intuitively control the system, act on cells individually or in groups, and program repetitive actions via a touch-screen display that also provides real-time information on the manipulation (chemical concentration, sample measurements, counting, etc.). For the injection phase, coupling and comanipulation methods will be established with force feedback to the operator, who will detect contact and penetration through a haptic interface. The operator’s gesture will then be used by the system as a reference for the automatic processing of other samples.

Required profile: Master’s degree / general engineering / robotics / control / EEA or applied physics. Previous experience in microrobotics or biology will be highly appreciated.

Required skills: Autonomy / Fluent English / Communication

The candidate will be jointly supervised by both labs and will obtain a double PhD degree from both universities. They must spend about half of the time in Brescia and half in Paris.

Download the offer

Thesis topic: Optical microrobots for interactive manipulation of biological samples

Abstract: This thesis aims at developing a new scientific instrument for applications in experimental biology, in particular for the manipulation, characterization and analysis of objects such as isolated cells, neurons, or intracellular organelles. Using the principle of optical tweezers, laser beams are controlled to act directly on samples, or to actuate remote-controlled microrobots. These microrobots will be able to integrate analysis capabilities and bio-active sensors allowing quick feedback to the operator. This is a new technology capable of supporting and considerably accelerating several lines of study in biology. Collaborations have been initiated with teams from Institut Curie and Institut Pasteur on cancer and intracellular mechanisms.

General description of the project: Optical tweezers are a technique for manipulating microscopic objects with a focused laser beam, allowing non-contact action on samples in solution. ISIR has developed a robotic laser-trap system able to manipulate samples in three dimensions while measuring the interaction forces in real time. Nevertheless, the difficulty of handling these devices remains an important hurdle, especially for objects outside the image plane.

The current performance of the system shows that it is possible to trap and move several particles simultaneously with a force resolution close to 10 pN (Fig. A and B). Using these principles, optical microrobots have been built. Actuated by lasers, these ‘optobots’ (Fig. C), a few micrometers in size, will be used to perform operations on biological samples, such as mechanical characterization, interaction measurement, genetic injection and electrical analysis. However, achieving such high performance has come at the expense of simplicity of use, mainly because of the design of the optical path and the complex control laws involved.
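As background on how such piconewton forces are obtained, an optical trap is commonly modelled as a linear spring: the force on a trapped bead is F = -k·Δx, where the stiffness k (typically quoted in pN/µm) can be calibrated, for example, from the bead’s thermal position fluctuations via the equipartition theorem, k = k_B·T / ⟨Δx²⟩. The sketch below illustrates this on synthetic data; it is a generic textbook calculation, not the lab’s actual calibration pipeline.

```python
import numpy as np

KB = 1.380649e-23   # Boltzmann constant [J/K]
T = 295.0           # temperature [K]

# Synthetic bead positions [m]: thermal fluctuations in a trap of stiffness k_true.
rng = np.random.default_rng(0)
k_true = 2e-5                                   # 20 pN/µm expressed in N/m
x = rng.normal(0.0, np.sqrt(KB * T / k_true), 100_000)

# Equipartition calibration: k = kB*T / <x^2>, x measured about the trap centre.
k_est = KB * T / np.var(x)
print(f"estimated stiffness: {k_est * 1e6:.1f} pN/µm")

# A measured displacement of the bead from the trap centre then converts to force.
dx = 0.5e-6                                     # 0.5 µm displacement
force = k_est * dx                              # restoring force magnitude [N]
print(f"force on the bead: {force * 1e12:.1f} pN")
```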

The objective of this project is first to develop applications in experimental biology to demonstrate the advantages of this system and to establish it as a new scientific instrument. A collaboration is established with teams from the Pasteur Institute and the Curie Institute to exploit these possibilities in cancer research and studies on intracellular mechanisms. In this context, it is necessary to optimize interactivity so that the user is able to plan complex trajectories to trap and move objects, automate operations and collect results. We will also be interested in the modalities of human/machine interaction: dedicated haptic interfaces, notably among the lab’s previous achievements such as the ‘FishTank’, are promising candidates for developing a cross-scale, multi-modal interaction chain.

Scientific theme: The main scientific theme is microrobotics, with strong support from physics and optics. The problems of object positioning and control in 6D with microscopic resolution and precision (nanometers and piconewtons) are at the heart of the work. From an interaction point of view, existing solutions are generally difficult for the user to grasp, and HMI approaches are an original way to address this. The user’s immersion is indeed an asset that frees them from complex control laws and planning systems. In the same way, the use of high-performance integrated sensors is an asset for the final precision reached by the system.

Expected results, challenges and perspectives: ISIR’s experience in manipulation systems and in human-machine interaction allows us to envisage very promising perspectives and spin-offs. Such an achievement has never been realized before, and we are confident that it would be a major contribution to the use of optical tweezers. By the end of the project, applications in biology such as the manipulation of intracellular organelles are targeted; these will be made possible through collaboration with research teams in experimental biology.

This thesis is part of an industrial maturation process aiming to create an innovative instrument in the field of life sciences, supported by the SATT and the Île-de-France region. The perspective is to exploit the generated knowledge to accelerate research in biology. The creation of a start-up is also envisaged to bring the results to market.

  • Thesis director: Sinan Haliyo
  • Possible co-supervision: Stéphane Régnier
  • Collaborations within the framework of the thesis: Institut Pasteur, Institut Curie
  • Location: Isir (Institut des Systèmes Intelligents et de Robotique), 4 Place Jussieu 75005, Paris, the Multi-Scale Interactions team
  • Contact: Sinan Haliyo ; sinan.haliyo@isir.upmc.fr ; Send your application by email, with [Thesis: Optical microrobots for interactive manipulation of biological samples] in the subject line, a CV and a cover letter.

Download the offer

Post-doctoral offers

Job offers

Project framework:

The project aims, in the medium term, to equip a robotic cane developed in the ANR i-Gait project with an EtherCAT fieldbus solution.

Mission:

As a prerequisite to this integration, the recruited person’s mission will be to develop a software suite allowing a standard computer running Linux to communicate with a network of EtherCAT modules: analog and digital input/output modules, and an axis control card.
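As a rough illustration of the structure such a suite typically has (initialisation, PDO mapping, then a fixed-period process-data exchange), the sketch below mocks the master so it can run stand-alone. The class and method names are illustrative stand-ins, not the API of any particular EtherCAT master library; open-source masters such as SOEM or the IgH EtherCAT Master for Linux are common starting points, and the real development is expected in C/C++.

```python
import time

class MockEtherCATMaster:
    """Illustrative stand-in for an EtherCAT master binding (not a real API)."""

    def open(self, interface: str) -> None:
        print(f"(mock) opening raw socket on {interface}")

    def configure_slaves(self) -> int:
        print("(mock) scanning bus, mapping PDOs, requesting OPERATIONAL state")
        return 3          # pretend three modules were found (I/O modules + axis card)

    def exchange_process_data(self, outputs: bytes) -> bytes:
        # A real master would send one frame and return the inputs read on the bus.
        return bytes(len(outputs))

def main() -> None:
    cycle_time = 0.001                       # 1 ms cycle, typical for EtherCAT
    master = MockEtherCATMaster()
    master.open("eth0")
    n_slaves = master.configure_slaves()
    print(f"{n_slaves} slaves configured, entering cyclic exchange")

    outputs = bytes(8)                       # placeholder process-data output image
    next_cycle = time.monotonic()
    for _ in range(1000):                    # run one second of cycles in this demo
        inputs = master.exchange_process_data(outputs)
        # ... application logic: read inputs, compute new outputs ...
        next_cycle += cycle_time
        time.sleep(max(0.0, next_cycle - time.monotonic()))

if __name__ == "__main__":
    main()
```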

Desired profile:

A recent graduate of an engineering school or holder of a Master 2 degree.

Required skills:

  • C/C++ programming skills
  • Knowledge of robotic systems

Previous experience with EtherCAT communication will be appreciated.

  • Duration of the contract: 6 months
  • Expected date of employment: September 2021
  • Location of the activity: In the ISIR laboratory (Institute of Intelligent Systems and Robotics), on the Pierre and Marie Curie Campus of Sorbonne University, in Paris.
  • Contact: Send a CV including the names of two referees to Wael BACHTA at the following address: wael.bachta@sorbonne-universite.fr

Download the offer

Project framework:

As part of the ANR i-Gait project, two components have been developed independently:

  • A tactile feedback system to actively reduce a person’s postural sway. This feedback has been implemented with a 3D-printed handle containing a vibrotactile actuator driven by bulky amplifiers.
  • A robotic cane controlled by a BeagleBone board, whose handle is not yet equipped with the tactile feedback system.

Mission:

The hired person’s mission will be to:

  • Redesign a 3D-printed cane handle that withstands the forces applied by the user,
  • Integrate a commercial vibrotactile actuator into this handle,
  • Replace the bulky amplifiers with amplifiers whose size is compatible with the cane’s electronics,
  • Implement on the BeagleBone controller the code needed to drive the vibrotactile actuator (see the sketch after this list).
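For the last point, one common way to drive a vibrotactile actuator from a BeagleBone running Linux is through the kernel’s sysfs PWM interface. The sketch below follows that generic interface; the PWM chip/channel numbers and the 200 Hz / 50% duty settings are assumptions that depend on the board’s pin configuration and the chosen actuator, and the real driver on the cane would likely be written in C/C++ alongside the existing controller code.

```python
import time
from pathlib import Path

# Assumed PWM chip/channel; the actual numbers depend on the BeagleBone pin muxing.
PWM_CHIP = Path("/sys/class/pwm/pwmchip0")
CHANNEL = 0

def write(path: Path, value: str) -> None:
    path.write_text(value)

def vibrate(duration_s: float, freq_hz: float = 200.0, duty: float = 0.5) -> None:
    """Drive the actuator with a PWM burst via the Linux sysfs PWM interface."""
    channel_dir = PWM_CHIP / f"pwm{CHANNEL}"
    if not channel_dir.exists():
        write(PWM_CHIP / "export", str(CHANNEL))   # expose the channel under the chip
        time.sleep(0.1)                            # give the kernel time to create it

    period_ns = int(1e9 / freq_hz)
    write(channel_dir / "period", str(period_ns))
    write(channel_dir / "duty_cycle", str(int(period_ns * duty)))
    write(channel_dir / "enable", "1")
    time.sleep(duration_s)
    write(channel_dir / "enable", "0")

if __name__ == "__main__":
    vibrate(0.5)    # 500 ms vibrotactile burst
```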

Desired profile:

A recent graduate of an engineering school or holder of a Master 2 degree in robotics.

Required skills:

  • C/C++ programming skills
  • Skills in mechanical design and 3D printing
  • A strong taste for experimental work.


  • Duration of the contract: 6 months
  • Expected date of employment: as soon as possible
  • Location of the activity: In the ISIR laboratory (Institute of Intelligent Systems and Robotics), on the Pierre and Marie Curie Campus of Sorbonne University, in Paris.
  • Contact: Send a CV including the names of two referees to Wael BACHTA at the following address: wael.bachta@sorbonne-universite.fr

Download the offer