
Stage M2 UPJV/CS Group

20 October 2022

Category: Intern

Detection, localization and tracking of aerial vehicles using event cameras.

General objectives

The number of small autonomous or piloted aerial drones is constantly increasing, which raises the risk of disruption caused by these devices. Whether a pilot has malicious intent or is merely careless, drones can impact aviation safety. Some of these devices fly in prohibited areas (sensitive sites, sports or cultural events, etc.) and do not respect the regulations (registration of the device, prohibited or restricted flight areas).

In the context of securing the sky, we wish to detect, locate and pursue an enemy UAV flying in a prohibited area with the help of one or more security UAV(s). The detection and the localization of the enemy UAV is done with the help of event cameras embarked on the friendly UAV(s). Event-driven cameras are bio-inspired vision sensors that report changes in brightness at the pixel level [1]. They offer significant advantages over standard cameras for robotic applications, namely a very high dynamic range, no blurring due to fast movements and a latency in the microsecond range. These cameras can easily be mounted on aerial drones, which makes them very well suited for detecting and locating highly dynamic targets in complex environments subject to sudden changes in illumination. However, because the output of these cameras is a sequence of asynchronous events rather than true intensity images, traditional vision algorithms cannot be applied directly.
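To make the last point concrete, the sketch below illustrates one common (assumed) representation of an event stream as tuples (x, y, t, polarity), and the simplest bridge back to frame-based vision: accumulating signed events into a 2D image. The field names and the accumulation scheme are illustrative assumptions, not the method the internship will necessarily use.

```python
import numpy as np

# Illustrative event stream: each event is (x, y, timestamp, polarity),
# where polarity is +1 (brightness increase) or -1 (decrease).
events = np.array(
    [(12, 5, 0.001, 1), (12, 6, 0.002, -1), (40, 20, 0.003, 1)],
    dtype=[("x", "u2"), ("y", "u2"), ("t", "f8"), ("p", "i1")],
)

def accumulate(events, width, height):
    """Accumulate signed events into a 2D frame so that conventional
    image-based algorithms (e.g. from OpenCV) can be applied to it."""
    frame = np.zeros((height, width), dtype=np.int32)
    # np.add.at handles repeated (y, x) coordinates correctly.
    np.add.at(frame, (events["y"], events["x"]), events["p"])
    return frame

frame = accumulate(events, width=64, height=32)
```

The accumulation window (here, all events) would in practice be a short time slice, traded off against motion blur and event density.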

In this master thesis, the aim is to study and adapt algorithms recently developed for event-driven cameras to the detection and localization of aerial drones. Recent works dealing with related topics, such as car detection, human detection, etc. [2,3], will be considered. Concerning the tracking of enemy UAVs by friendly UAVs, visual servoing [4] and advanced control techniques such as MPC [5] seem well adapted. The goal is therefore to develop efficient algorithms for fast or even very fast dynamics. Data-driven techniques (deep learning), model-based techniques, as well as couplings of these two approaches could be considered in this master thesis.

This project can lead to a Ph.D. position to start in September/October 2023.



The work will be decomposed into incremental steps as follows:

1. Bibliography of detection, localization and tracking of moving objects using event cameras;

2. Bibliography of UAV control based on event cameras;

3. Comparison, implementation and performance evaluation using simulator and real data;

4. Writing of reports and the master's thesis.



The candidate is expected to have a background in Robotics, as well as solid skills in Computer Vision, Control theory and software development (Python, C/C++, Linux, Git, OpenCV, ROS). A good level of written/spoken English is also important.

Research team

The MIS laboratory (Modélisation, Information & Systèmes) brings together teacher-researchers from the UPJV in Computer Science, Automation, Robotics and Computer Vision. The scientific objectives of the laboratory are in line with the themes of Information and Communication Sciences and Techniques (ICST). The research work developed there has many applications: Vehicles, Cybersecurity, Energy, Robotics, Music, Heritage, Health...

The project of the PR team is focused on improving the autonomy of mobile machines (on the ground, in space, in virtual environments) through artificial perception. Its approach consists in designing original sensors and methods to exploit them (image processing, vision, localization, navigation algorithms).

CS Group is a leader in Counter-Unmanned Aircraft Systems and will contribute to the project with use-case specification, technical tools and trial areas, in order to evaluate the algorithms on real data and in real situations.

The CS Group Research Laboratory will contribute data fusion algorithms to track multiple objects with event-driven cameras.

This project will be done in close collaboration with Cédric Demonceaux from Université de Bourgogne and Guillaume Allibert from Université Côte d'Azur.



Contact: