Announcement
Automated Human Action Data Acquisition and Synchronization Tool for Digital Twin Systems
14 November 2023
Category: Intern
Skills:
• A Master's level in computer science, with a specialty in Virtual Reality
• Basic knowledge of deep learning
• Skills in Unity (C#), Virtual Reality, and Python
• Knowledge of ROS or RTMaps would be appreciated
• Soft skills
o Good interpersonal skills
o English writing ability
Contacts:
• Vincent HAVARD, vhavard@cesi.fr, Lecturer, CESI LINEACT Rouen.
• Rim SLAMA SALMI, rsalmi@cesi.fr, Lecturer, CESI LINEACT Lyon.
• Vincent VAUCHEY, vvauchey@cesi.fr, Senior Research Engineer.
How to apply:
Submit your application to Vincent Havard (vhavard@cesi.fr) and Rim SLAMA SALMI (rsalmi@cesi.fr).
Please use the following email subject: “[Internship] VR-HAR-Dataset”
The application must contain:
• a CV;
• a cover letter for the subject;
• the results of the current Master's degree;
• recommendation letters, if available.
Please send the application as a single archive named LASTNAME FirstName.zip.
Contract: internship of 5 to 6 months, starting in February 2024.
Location:
CESI Rouen
80 Avenue Edmund Halley
Rouen Madrillet Innovation
CS 10123
76808 Saint-Etienne-du-Rouvray.
Context:
As part of Industry 5.0, the manufacturing process is centered around the human factor. Meticulous attention is paid to operator actions and motions, while ensuring their holistic well-being. Previous work on human motion analysis has been carried out at CESI LINEACT (Slama et al., 2023; Dallel et al., 2022). Within this framework, acquiring a comprehensive dataset for action recognition is of paramount importance, given its multifaceted applications in improving human ergonomics and manufacturing efficiency. Such a dataset has already been acquired at CESI LINEACT (Dallel et al., 2020), but the process can be time-consuming.
In parallel, digital twins and virtual reality (VR) are technologies that can address several industrial issues, such as the design, simulation, and optimization of industrial systems. Moreover, they provide tools for acquiring datasets while controlling specific parameters (Dallel et al., 2023). In this context, using VR to acquire labelled datasets of operators performing their activities becomes a very attractive solution. Indeed, it makes it possible not only to acquire data and label actions instantaneously, but also to simulate different lighting conditions and camera viewpoints.
Work:
During this internship, the focus will be on developing an automated tool with the primary objectives of:
- Acquiring and synchronizing various types of data, including dynamic human skeleton poses, discrete events such as 'grasping a tool', 'walking', and 'assembling', and object positions (a minimal data-record sketch is given after this list).
- Utilizing the necessary sensors to track operator movements, such as a mocap suit, hand mocap gloves, VR/real camera views, an egocentric view, VR/real IR (depth) cameras, and VR/real point clouds.
- Replaying the acquired data using a suitable tool to generate data with diverse acquisition parameters, including various camera settings.
- Establishing an acquisition protocol for a dataset specifically designed for the assembly system, using the flexible manufacturing system (FMS) provided by the LINEACT laboratory.
- Organizing the acquired dataset and implementing pre-processing techniques while validating the setup through a visualization code.
- Presenting and publishing both the developed tool and the dataset in either an indexed conference or a reputable scientific journal.
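To give an idea of the kind of synchronized record the tool could produce, here is a minimal C# sketch of a per-frame data structure. It is only a sketch under assumed requirements: all type and field names (AcquisitionFrame, JointPose, ObjectPose, etc.) are hypothetical, and the actual schema will be defined during the internship.

using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical per-frame record for the synchronized dataset.
// All type and field names are illustrative only.
[Serializable]
public class AcquisitionFrame
{
    public double timestamp;          // seconds on a shared acquisition clock
    public List<JointPose> skeleton;  // dynamic human skeleton pose
    public List<string> events;       // active discrete events, e.g. "grasping a tool"
    public List<ObjectPose> objects;  // tracked object positions
}

[Serializable]
public class JointPose
{
    public string jointName;     // e.g. "RightHand"
    public Vector3 position;     // world-space position
    public Quaternion rotation;  // world-space orientation
}

[Serializable]
public class ObjectPose
{
    public string objectId;      // e.g. a tracked tool or part
    public Vector3 position;
}

public static class FrameSerializer
{
    // One JSON line per frame: cheap to append to disk during acquisition,
    // and streams from different sensors can be merged offline by timestamp.
    public static string ToJsonLine(AcquisitionFrame frame)
    {
        return JsonUtility.ToJson(frame);
    }
}

Writing one such JSON line per frame keeps disk writes cheap during acquisition and lets streams coming from different sensors be merged offline by sorting on the shared timestamp.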
Technically, the main features to be developed in the acquisition tool are:
- Develop connectors to send data to the acquisition tool.
- Develop the virtual acquisition environment in Unity, replicating an existing real environment.
- Integrate JENII's 3D models with grab/teleport functionality.
- Manage VR interactions to send discrete events (grab, release); see the sketch after this list.
- Develop a virtual camera for acquiring images.
- Save data with synchronization using the appropriate process.
- Define the data acquisition protocol.
Constraints:
- Acquisition with ordered tasks (well-defined by the activity's protocol).
- Acquisition with unordered tasks (based on the operator's perspective on a given activity).
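As an illustration of how the VR interactions could emit discrete events, the following Unity C# sketch timestamps grab and release events on a grabbable object. It assumes Unity's XR Interaction Toolkit is used; DiscreteEventEmitter is a hypothetical component name, and the commented-out EventLogger sink does not exist and stands in for whatever connector the acquisition tool will expose.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch only: emits timestamped "grab"/"release" events
// from an XR Interaction Toolkit grabbable object.
[RequireComponent(typeof(XRGrabInteractable))]
public class DiscreteEventEmitter : MonoBehaviour
{
    private XRGrabInteractable grabbable;

    private void Awake()
    {
        grabbable = GetComponent<XRGrabInteractable>();
        grabbable.selectEntered.AddListener(_ => Emit("grab"));
        grabbable.selectExited.AddListener(_ => Emit("release"));
    }

    private void Emit(string eventName)
    {
        // A single high-resolution clock (Unity 2020.2+) shared by all
        // components in the process eases synchronization across streams.
        double t = Time.realtimeSinceStartupAsDouble;
        Debug.Log($"[{t:F4}] {eventName}: {gameObject.name}");
        // EventLogger.Record(t, eventName, gameObject.name); // hypothetical sink
    }
}

Such timestamped events can then go through the same serialization path as the skeleton frames sketched above, so that all acquired streams share a single clock.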
Bibliography
Dallel, M., Havard, V., Baudry, D., & Savatier, X. (2020). InHARD - Industrial Human Action Recognition Dataset in the Context of Industrial Collaborative Robotics. 2020 IEEE International Conference on Human-Machine Systems (ICHMS), 1–6. https://doi.org/10.1109/ICHMS49158.2020.9209531
Dallel, M., Havard, V., Dupuis, Y., & Baudry, D. (2022). A Sliding Window Based Approach With Majority Voting for Online Human Action Recognition using Spatial Temporal Graph Convolutional Neural Networks. 2022 7th International Conference on Machine Learning Technologies (ICMLT), 155–163. https://doi.org/10.1145/3529399.3529425
Dallel, M., Havard, V., Dupuis, Y., & Baudry, D. (2023). Digital twin of an industrial workstation: A novel method of an auto-labeled data generator using virtual reality for human action recognition in the context of human–robot collaboration. Engineering Applications of Artificial Intelligence, 118, 105655. https://doi.org/10.1016/j.engappai.2022.105655