

Announcement

5 May 2020

URGENT: PhD candidate in Neuromorphic Visual Odometry


Category: PhD student


Dear colleagues,

 
We are seeking a PhD candidate in Neuromorphic Visual Odometry, starting in autumn 2020 at the I3S lab, Université Côte d’Azur, France.
 
Applicants should hold a Master's degree in Computer Science or in Signal and Image Processing, with a background in Statistics and Applied Mathematics, skills in Python, MATLAB, and C/C++, and fluency in written and oral scientific English.
 
The application deadline is May 10. Please send a CV, a cover letter, all available Master's grades and rankings, the Master's thesis if available, and any recommendation letters.
 
More information:
- http://edstic.unice.fr/edsticTheses2020//propo/an2020propo/UCA-EDSTIC-EUR-DS4H-SujetThese2020-242J04.05.2020-15h21m58s.pdf
- https://i3s.unice.fr/jmartinet/sites/default/files/u47/phdsubject.pdf
 
Supervisors:
Prof. Jean Martinet and Dr. Andrew Comport
 
Summary:
This thesis aims to exploit biologically devised 'short cuts' used by insects, whose small brains and relatively simple nervous systems let them see and perceive their world in real time. The objective is to develop a biologically inspired omnidirectional event-camera model that performs real-time ego-motion estimation and environment mapping. Spiking neural network (SNN) approaches, which are particularly well suited to biologically inspired event cameras, will be developed to exploit asynchronous events in real time for autonomous navigation.

A novel panoramic stereo event-camera system will be developed, arranged in a spherical configuration inspired by insects. Algorithms will be devised to exploit the 360-degree field of view and high-frequency event streaming in order to perform visual odometry using landmarks. Bio-inspired models of binocular vision have also been used to solve the event-based stereo correspondence problem, and spiking neural networks are a natural match for event-based cameras due to their asynchronous operating principle.

The main applied goal of this work is to exploit the high temporal resolution, high dynamic range, and low power consumption of event cameras for high-speed robotics applications such as drone navigation. The major challenge will be to exploit the full high-speed potential of event cameras by redefining classic real-time spherical RGB-D SLAM approaches within an asynchronous framework. This can be divided into three goals: develop mathematical models for asynchronous multi-event cameras; develop SNN approaches for spatio-temporal stereo from events; and develop high-speed visual odometry techniques for real-time navigation and mapping. The long-term goal of this project is to study new paradigms and concepts for real-time spatial intelligence under extreme lighting conditions, exploiting this new type of visual sensor.
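For readers unfamiliar with event cameras: instead of frames, they emit a sparse asynchronous stream of events, one per pixel brightness change. As a rough illustration of how SNN-style processing can consume such a stream, here is a minimal sketch (the event tuple layout, the per-pixel leaky integrate-and-fire model, and all parameter values are illustrative assumptions, not part of the project): each event (x, y, t, polarity) is integrated by a leaky unit at its pixel, which spikes once a threshold is crossed.

```python
import math

def lif_process(events, tau=0.05, threshold=2.5):
    """Feed asynchronous events (x, y, t, polarity) into per-pixel
    leaky integrate-and-fire (LIF) units and return emitted spikes.
    The membrane potential decays exponentially with time constant
    tau between events; crossing the threshold emits a spike and
    resets the potential to zero."""
    potential = {}  # (x, y) -> membrane potential
    last_t = {}     # (x, y) -> time of the last event at that pixel
    spikes = []
    for x, y, t, polarity in events:
        key = (x, y)
        v = potential.get(key, 0.0)
        dt = t - last_t.get(key, t)
        v *= math.exp(-dt / tau)        # leak since the last event
        v += 1.0 if polarity else -1.0  # integrate the new event
        if v >= threshold:
            spikes.append((x, y, t))
            v = 0.0                     # reset after spiking
        potential[key] = v
        last_t[key] = t
    return spikes

# Three rapid ON events at pixel (10, 20) accumulate faster than the
# leak and trigger a spike on the third event.
events = [(10, 20, 0.000, 1), (10, 20, 0.001, 1), (10, 20, 0.002, 1)]
print(lif_process(events))  # -> [(10, 20, 0.002)]
```

The key property this sketch mirrors is that computation is driven entirely by incoming events, with no fixed frame rate, which is what makes SNNs attractive for the high-temporal-resolution sensors described above.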
 
 

(c) GdR 720 ISIS - CNRS - 2011-2020.