
Photo-Realistic, Physics-Aware 4D Human Modeling for Real-World Human Cognition by Care-Robots for Aged People

5 October 2021

Category: Postdoctoral researcher

Within the framework of a bi-national, tri-institutional project[1], we aim to push the current limits of robot vision for human cognition by care robots in in-home settings. Our specific goal is to make the vision intelligence robust to large variations (in body shape, motion, etc.) and to occlusions (clothing, furniture, walls, etc.), and capable of understanding human-object interaction, by developing a photo-realistic, physics-aware 4D human model.


[1] Real-World Human Cognition by Care Robots (Oct. 2021 – Oct. 2024), a bi-national project (with South Korea) involving three institutional partners: CNRS, INRIA, and ETRI (Electronics and Telecommunications Research Institute).


* Main responsibilities

Research and development in one or more of the following topics, with possible supervision of a Master 2 internship student and/or a PhD student:

− Photo-realistic human modeling: extension of the geometric human body model with color and illumination.

− Action-conditioned motion prediction/generation models via deep learning over annotated motion datasets.

− End-to-end 4D human model reconstruction from 2D/3D video input via optimal model fitting.

− Physics-aware human-object interaction: integration of the 4D human model with a physics-based model (PBM) and development of DNN-based interaction motion controllers.

* Candidate profile

− PhD in Computer Science, Electrical & Electronic Engineering, or Applied Mathematics (obtained in 2019 or later).

− Skills in efficient programming, communication, and algorithm design.

− Solid knowledge and experience in deep learning.

− Experience in numerical simulation is a plus.

* Collaborators/Supervisors: Hyewon Seo (ICube, Univ. Strasbourg), Frederic Cordier (Univ. Haute-Alsace), and Stephane Cotin (INRIA Strasbourg).

* Application: Send your CV, a link to your PhD thesis, your PhD committee reports, and your academic transcripts (Bachelor's and Master's) to

* More information is available at: