Announcement
Final-year internship or Master 2 - Deep Reinforcement Learning-based Tracker and Control for Drones
24 October 2023
Category: Internship
Deep Reinforcement Learning-based Tracker and Control for Drones
General information
Position: Master's internship or final-year engineering internship
Duration: 4 to 6 months, starting in February 2024 (flexible)
Location: IMT Atlantique, Brest (France)
Affiliation: RAMBO team, Lab-STICC
Supervisors: Hajer Fradi and Panagiotis Papadakis
Context
This project will be conducted in the context of the LEASARD project [1], which aims to increase the navigation autonomy of drones, i.e., Unmanned Aerial Vehicles (UAVs), in search and rescue scenarios. Towards this goal, LEASARD is dedicated to enhancing sensing and processing capabilities through the integration of event cameras and the use of appropriate deep neural networks to process the data they produce.
Description and objectives
This project aims to propose a deep reinforcement learning (RL) based visual tracker for drone (aerial) images. Our primary motivation is to use deep RL to learn a control policy, thereby enhancing the performance of long-term tracking. Object tracking in aerial images is challenging [2,3], particularly due to the small size of the target objects, variations in scale, orientation, and viewpoint, and the movement of both the drone's camera and the target. To address these challenges, event cameras, also known as Dynamic Vision Sensors (DVS) [4], are proposed as a substitute for conventional RGB cameras. These cameras offer unique properties, including low latency, immunity to motion blur, lower energy requirements, and a high dynamic range, making them more suitable for processing drone videos.
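To make this formulation concrete, the sketch below casts tracking as a sequential decision problem in the style of a Gymnasium environment: the state is an observation around the current box estimate, the actions are discrete box adjustments, and the reward is the overlap with the ground-truth box. This is only an illustrative starting point under our own assumptions; the class, the action set, and the reward are hypothetical rather than a prescribed design (in particular, a drone-control agent could output camera motions instead of box adjustments).

import numpy as np
import gymnasium as gym
from gymnasium import spaces

def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

class TrackingEnv(gym.Env):
    """One episode = one video; the agent moves/rescales the predicted box (hypothetical sketch)."""
    # Discrete box adjustments: translate, rescale, or stay put.
    ACTIONS = [(-5, 0, 0), (5, 0, 0), (0, -5, 0), (0, 5, 0), (0, 0, -2), (0, 0, 2), (0, 0, 0)]

    def __init__(self, frames, gt_boxes, crop_size=64):
        self.frames, self.gt_boxes, self.crop_size = frames, gt_boxes, crop_size
        self.action_space = spaces.Discrete(len(self.ACTIONS))
        self.observation_space = spaces.Box(0, 255, (crop_size, crop_size, 3), np.uint8)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.t, self.box = 0, np.array(self.gt_boxes[0], dtype=float)
        return self._crop(), {}

    def step(self, action):
        dx, dy, ds = self.ACTIONS[action]
        self.box += np.array([dx, dy, ds, ds], dtype=float)  # translate / rescale the box
        self.t += 1
        reward = iou(self.box, self.gt_boxes[self.t])        # overlap-based reward
        done = self.t >= len(self.frames) - 1 or reward == 0.0  # video ended or target lost
        return self._crop(), reward, done, False, {}

    def _crop(self):
        # Placeholder: a real implementation crops frame t around self.box.
        return np.zeros((self.crop_size, self.crop_size, 3), dtype=np.uint8)

An episode ends when the video ends or the overlap drops to zero, which reflects the long-term tracking objective: the cumulative reward is high only if the target is kept in the box over the whole sequence.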
The choice of the most effective solution for visual tracking by drones equipped with an event camera will be the subject of a separate project. The current project focuses on the subsequent drone actions derived from the deep reinforcement learning stage, which involves an online adaptation process to track the target object. While only a few RL-based trackers have been proposed so far [5], the challenge in this project is to show how RL-based tracking can be adapted to aerial videos and event-based capture.
To cope with the unconventional output of event cameras, the candidate will potentially introduce new action types for the drone. For instance, when the target remains stationary, the event camera output may not suffice for target localization, necessitating the definition of new drone actions. The performance of the proposed deep reinforcement learning-based visual object tracker for drone videos will be assessed on the aerial dataset VisDrone [6], with suitable preprocessing (e.g., the V2E toolbox) to replicate event-based inputs. We may also explore event simulators such as AirSim or ESIM as part of our assessment.
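As a rough illustration of what such preprocessing amounts to, the sketch below approximates a signed event frame from two consecutive RGB frames by thresholding log-intensity differences. The function name and threshold are our own illustrative choices, and dedicated tools such as V2E or ESIM model the sensor far more faithfully (frame interpolation, noise, per-pixel contrast thresholds).

import numpy as np

def frames_to_event_frame(prev_rgb, curr_rgb, threshold=0.2):
    """Signed event frame: +1 ON events, -1 OFF events, 0 otherwise (illustrative only)."""
    eps = 1e-3  # avoid log(0) on black pixels
    prev_log = np.log(prev_rgb.mean(axis=-1) / 255.0 + eps)  # grayscale log intensity
    curr_log = np.log(curr_rgb.mean(axis=-1) / 255.0 + eps)
    diff = curr_log - prev_log
    events = np.zeros_like(diff, dtype=np.int8)
    events[diff > threshold] = 1    # brightness increased past the contrast threshold
    events[diff < -threshold] = -1  # brightness decreased past the contrast threshold
    return events

Note that when neither the drone nor the target moves, the log-intensity difference is near zero everywhere and the event frame is empty; this is exactly the stationary-target case that motivates defining new drone actions above.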
Candidate profile
- The candidate should be in the final year of a Master's or engineering degree. The balance between research and development will be determined based on the candidate's profile.
- A strong level of Python programming is required.
- An interest in deep learning frameworks, and in deep reinforcement learning in particular, is also required.
- Experience with ROS would be a plus.
- Good oral and written communication skills in English.
How to apply
Interested candidates are encouraged to send their applications (detailed CV, transcripts, and diplomas) as soon as possible to the following address:
hajer.fradi@imt-atlantique.fr
References
[1] LEASARD project, https://project.inria.fr/leasard/
[2] Li, S., & Yeung, D. Y. (2017, February). Visual object tracking for unmanned aerial vehicles: A benchmark and new motion models. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 31, No. 1).
[3] Hamdi, A., Salim, F., & Kim, D. Y. (2020, July). Drotrack: High-speed drone-based object tracking under uncertainty. In 2020 IEEE international conference on fuzzy systems (FUZZ-IEEE) (pp. 1-8). IEEE.
[4] Gallego, G., Delbrück, T., Orchard, G., Bartolozzi, C., Taba, B., Censi, A., & Scaramuzza, D. (2020). Event- based vision: A survey. IEEE transactions on pattern analysis and machine intelligence, 44(1), 154-180.
[5] Ma, B., Liu, Z., Zhao, W., Yuan, J., Long, H., Wang, X., & Yuan, Z. (2023). Target Tracking Control of UAV Through Deep Reinforcement Learning. IEEE Transactions on Intelligent Transportation Systems.
[6] Zhu, P., Wen, L., Du, D., Bian, X., Fan, H., Hu, Q., & Ling, H. (2021). Detection and tracking meet drones challenge. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(11), 7380-7399.