The Centre for Robotics of MINES ParisTech is involved in several research projects on human motion pattern recognition applied to the Factory of the Future, the Creative and Cultural Industries, and Autonomous Vehicles. The main objective of these projects is to develop novel methodologies and technological paradigms that improve machine perception and allow for natural body interactions in human-machine partnerships.
MINES ParisTech is opening an Internship position on Computer Vision and Deep Learning for Body Tracking in Professional Environments in the context of H2020 collaborative projects. Whether in manufacturing or the creative industries, professional gestures are executed by workers or craftsmen while they manipulate tools and materials in order to assemble or create an object. Motor skills are of great value in such activities. For example, simple tasks, such as marking and cutting leather pieces or sewing the main panels of a product, are common in basic leather crafting procedures, and they can be either executed entirely by hand or assisted by a collaborative robot. Preserving, understanding and transmitting the motor skills of these professions, using motion capture and gesture recognition technologies, constitutes an important challenge for researchers, instructors, ergonomists and human resources departments. Moreover, there is an increasing need for machine learning in professional contexts where humans perform gestures while interacting with objects, tools and devices on their workbench.

The team of the Centre for Robotics has developed machine learning algorithms (e.g. Random Decision Forests, geodesic distances) that extract meaningful features from optical, depth and/or RGB-D sensing and segment the scene into objects, workbench, body poses, etc. The team is also currently working on gesture/action recognition based on time-series of 3D coordinates, rotations and other motion descriptors. Early recognition and prediction techniques have also been developed.
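To illustrate the kind of preprocessing involved in gesture recognition from time-series of 3D coordinates, the sketch below cuts a stream of per-frame joint features into fixed-length sliding windows that a classifier can consume. This is a generic, hypothetical illustration, not the team's actual pipeline; the window length and stride are arbitrary example values.

```python
# Hypothetical sketch: sliding-window segmentation of a motion time-series,
# a common first step before feeding windows to a gesture classifier.

def sliding_windows(frames, window=30, stride=10):
    """Split a sequence of per-frame feature vectors into overlapping windows."""
    return [frames[i:i + window]
            for i in range(0, len(frames) - window + 1, stride)]

# 100 frames of a single 3D joint (x, y, z), here just dummy values:
stream = [(0.0, 0.0, float(t)) for t in range(100)]
windows = sliding_windows(stream)
print(len(windows), len(windows[0]))  # 8 windows of 30 frames each
```

Each window would then be flattened or summarised (e.g. into rotation or velocity descriptors) before classification.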
The Intern will actively contribute to specific research and innovation tasks of H2020 projects. More precisely, the Intern will set up a professional workbench with a number of tools and objects on it. S/he will organise a series of recording sessions, inviting users to manipulate the objects and also to move the whole upper part of the body (including fingers) freely over the workbench. The scene and the movements will be recorded with an RGB and/or depth camera. A sufficient amount of data should be recorded for training Deep Learning models. The Intern will use the OpenPose framework (or an equivalent) to export a skeletal model. Potential extensions of OpenPose (such as tracking fingers, using depth images or detecting objects) will also be investigated. The results will be compared to those of other existing algorithms for body tracking and scene analysis, such as Random Decision Forests (RDFs).
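As a concrete starting point, OpenPose writes one JSON file per frame, with a "people" list whose "pose_keypoints_2d" field is a flat array of (x, y, confidence) triples, one per joint. The minimal sketch below parses that layout into per-person skeletons; the sample frame is synthetic (two joints only, whereas a real BODY_25 frame has 25).

```python
# Minimal sketch: parsing OpenPose per-frame JSON output into per-person
# lists of (x, y, confidence) joint triples. The sample frame below is
# synthetic, not real capture data.

def parse_openpose_frame(frame):
    """Return one skeleton per detected person.

    Each skeleton is a list of (x, y, confidence) triples, one per joint.
    """
    skeletons = []
    for person in frame.get("people", []):
        flat = person["pose_keypoints_2d"]  # flat [x0, y0, c0, x1, y1, c1, ...]
        joints = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
        skeletons.append(joints)
    return skeletons

# Synthetic two-joint example:
sample = {"people": [{"pose_keypoints_2d": [320.0, 240.0, 0.92,
                                            330.0, 300.0, 0.88]}]}
skels = parse_openpose_frame(sample)
print(len(skels), len(skels[0]))  # 1 person, 2 joints
```

In practice each frame's dict would come from `json.load` on an OpenPose output file, and the resulting joint sequences would feed the time-series models described above.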
This position gives the candidate the possibility to work with other European researchers, both within the project and in the wider academic community, as well as opportunities to work directly with industrial partners. Moreover, the candidate will acquire transferable skills that will enhance future employability through leading and contributing to highly interactive and collaborative work. Finally, the candidate will work autonomously and remain focused on his/her research.
The Intern will have a 5- or 6-month contract. The gross monthly salary will be €1,498.47. Activities complementary to research, such as preparing reports and deliverables, are included in the salary.
We are looking for a motivated and talented student who has completed 4 years of studies at university level, with concrete experience in one of the following domains:
The Intern should have skills in:
Moreover, the Intern must be fluent in both written and spoken English and possess good presentation and communication skills.
APPLICATIONS: Please send your CV and a cover letter to email@example.com