
Announcement

5 May 2020

Post-doctoral Fellowship: Physically-driven learning as dynamical systems


Category: Post-doctoral researcher


Lab

Research at IMT Atlantique involves nearly 800 people, including 290 teachers and researchers and 300 PhD students, and focuses on digital technology, energy and the environment. It spans all disciplines, from the physical sciences to the humanities and social sciences, including information and knowledge sciences, and covers all fields of information and communication science and technology.

This post-doctoral fellowship will take place at IMT Atlantique (LaTIM and LabSTICC labs), on the Brest campus. The project will be carried out in close collaboration with IRISA (Vannes) and LETG (Rennes); travel between the three sites is planned.

 

Starting date: October 2020

Funding: Labex CominLabs (DynaLearn project)

 

Description

 

Artificial intelligence has experienced particularly strong growth in recent years and is revolutionizing the way data processing problems are addressed. Machine learning (more specifically, deep learning) has entered a new era that opens up many opportunities. Thanks to the development of new algorithms, the proliferation of available datasets and a tenfold increase in computing power, the number of applications is soaring, enabling machine learning to tackle tasks that seemed unthinkable a few years ago (such as computer vision, natural language processing or generative modeling).

 

However, deep learning methods are in many cases used on a purely empirical basis, without a deep mathematical understanding of their behavior. To go beyond the "performance" aspect, a solid mathematical framework is needed to study the properties and limitations of these approaches. Studies on adversarial examples, in particular, have shown that there are significant obstacles to using these methods in applications requiring a high level of safety, such as autonomous driving or the analysis of medical data. Establishing connections between deep learning and well-established mathematical concepts can lead to major new insights into these modern approaches. The challenge is therefore to restate these new techniques within a well-defined mathematical framework that allows a thorough analysis of the phenomena at stake.

 

While most current work focuses on the raw performance of deep learning approaches and their application to various fields of study, there is a clear lack of understanding of these approaches and of the mathematical tools needed to study their behavior in real-world applications.

 

The dynamical-systems approach makes it possible to formalize the evolution of the flow in deep architectures, and to precisely define the evolution equation and its associated properties. The theoretical background considered here builds upon the relationship between fluid dynamics, information geometry and machine learning. Broadly speaking, geodesic flows can be regarded as solutions of the Euler equations of fluid dynamics [Arnold1966]. As such, deep learning architectures can be viewed as numerical schemes for geodesic flow estimation. ResNet architectures are a typical example, as they relate to ODE solvers [Rousseau2019]. This also connects to the optimal transport problem, which amounts to computing geodesic flows under volume-preserving constraints [Brenier1989]. These approaches could also be used to enforce desired regularity properties in the learned models, relating to current efforts in the community to constrain the Lipschitz constants or Sobolev norms of neural networks [Zhou2019].
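
To make the ResNet/ODE-solver connection above concrete, here is a minimal PyTorch sketch (the architecture, dimensions and step size are illustrative assumptions, not part of the project) of a residual block whose forward pass is exactly one explicit Euler step of dx/dt = f(x):

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A residual block read as one explicit Euler step of dx/dt = f(x)."""
    def __init__(self, dim, h=0.1):
        super().__init__()
        self.h = h  # step size of the underlying scheme (illustrative value)
        self.f = nn.Sequential(  # learnable velocity field f
            nn.Linear(dim, dim),
            nn.Tanh(),
            nn.Linear(dim, dim),
        )

    def forward(self, x):
        # x_{t+1} = x_t + h * f(x_t): forward Euler update
        return x + self.h * self.f(x)

# Stacking T blocks integrates the flow over a time horizon T * h.
depth, dim = 10, 2
net = nn.Sequential(*[ResidualBlock(dim) for _ in range(depth)])
x0 = torch.randn(8, dim)  # batch of initial conditions
xT = net(x0)              # approximate flow map evaluated at the inputs

Replacing the Euler update by another integrator (e.g. a higher-order Runge-Kutta step) yields a different architecture for the same underlying flow.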

 

The overall objective of this postdoc is to introduce physically-driven knowledge into the learning process, and more specifically to develop a mathematical framework enabling an analogy between the learning process and the estimation of a flow resulting from a physical equation with well-controlled properties. Linking well-grounded fluid dynamics-based formulations with the learning process will lead to the development of new numerical tools to study neural networks. More specifically, we will focus on: 1) dynamical systems described by known physical equations with the associated numerical scheme corresponding to a specific network architecture, 2) characterization of the learning process from a dynamical system point of view, 3) the flow of information expressed as a flow between metric spaces (Sliced Gromov-Wasserstein [Vayer2019]).
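
As a concrete illustration of point 2), the sketch below (a toy quadratic loss and step size chosen purely for illustration) shows that plain gradient descent is the explicit Euler discretization of the gradient flow d(theta)/dt = -grad L(theta), which is the kind of dynamical-system reading of the learning process considered here:

import numpy as np

# Toy quadratic loss L(theta) = 0.5 * theta^T A theta (illustrative assumption)
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])

def grad_L(theta):
    return A @ theta

h = 0.1                        # learning rate = Euler step size
theta = np.array([1.0, -2.0])  # initial parameters
for _ in range(50):
    # One gradient descent update = one explicit Euler step of
    # d(theta)/dt = -grad L(theta)
    theta = theta - h * grad_L(theta)

# The continuous flow converges to the minimizer theta = 0;
# the discrete iterates track it up to a discretization error controlled by h.
print(theta)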

 

As applications, the performance of physically-driven models will first be assessed on low-dimensional datasets; standard learning benchmarks will also be considered. In a second step, we will focus on image processing applications, such as super-resolution and image synthesis, which are currently addressed in the community through Generative Adversarial Networks (GANs) and the cycle-GAN framework in unsupervised settings.
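
For reference, the adversarial game underlying the GAN-based applications above can be written in a few lines; the following PyTorch sketch on a toy 2-D distribution is purely illustrative (network sizes, optimizers and data are assumptions, not the project's actual setup):

import torch
import torch.nn as nn

def sample_real(n):
    # Toy "real" data: a Gaussian shifted to (2, 2) (illustrative assumption)
    return torch.randn(n, 2) + torch.tensor([2.0, 2.0])

G = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))  # generator
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))  # discriminator (logits)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = sample_real(64)
    fake = G(torch.randn(64, 2))

    # Discriminator step: push real samples towards label 1, fakes towards 0
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator step: make the discriminator label fakes as real
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

The cycle-GAN setting adds a second generator/discriminator pair and a cycle-consistency loss on the composed mappings, which is what enables the unsupervised use mentioned above.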

 

Profile

 

Contact

François Rousseau and Lucas Drumetz

Email: firstname.lastname@imt-atlantique.fr

 

Nicolas Courty

Email: nicolas.courty@univ-ubs.fr

 

How to apply

Candidates are invited to email a motivation letter and a CV detailing their full academic background.

Bibliography

V. Arnold. Sur la géométrie différentielle des groupes de Lie de dimension infinie et ses applications à l’hydrodynamique des fluides parfaits. In Annales de l’institut Fourier, volume 16, pages 319–361, 1966.

Y. Brenier. The least action principle and the related concept of generalized flows for incompressible perfect fluids. Journal of the American Mathematical Society, 2(2):225–255, 1989.

F. Rousseau, L. Drumetz, R. Fablet. Residual Networks as Flows of Diffeomorphisms. Journal of Mathematical Imaging and Vision (JMIV), 2019.

T. Vayer, R. Flamary, R. Tavenard, L. Chapel, N. Courty. Sliced Gromov-Wasserstein. NeurIPS, 2019.

Z. Zhou et al. Lipschitz Generative Adversarial Nets. arXiv, 2019.

 
