Decentralized learning in the presence of heterogeneous system devices
15 November 2023
Category: Intern
Research topic:
The research project falls within the broad theme of decentralized learning over graphs. In recent years, engineering systems have gained an increasing ability to collect data in a decentralized and streamed manner, and the focus here will be on designing a decentralized approach for devices that collect data continuously. The project also recognizes that modern machine learning applications, where large volumes of training data are generated continuously by a massive number of heterogeneous devices, have several key properties that differentiate them from standard distributed inference applications. A particular focus will be placed on developing and studying an approach for decentralized learning in statistically heterogeneous settings in the presence of heterogeneous system devices. The emphasis will be on illustrating the benefits of the proposed approach in machine learning frameworks using publicly available datasets.
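To fix ideas, the sketch below illustrates one well-known decentralized strategy for streaming data, diffusion LMS with adapt-then-combine cooperation, which underlies much of the multitask-learning-over-graphs literature. It is purely illustrative and is not the project's actual method; the ring topology, step size, and noise level are assumptions made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: K devices on a ring graph, each streaming noisy
# linear measurements d_k(i) = u_k(i)^T w0 + noise of a common model w0.
K, M = 10, 5
w0 = rng.standard_normal(M)

# Doubly stochastic combination matrix for a ring (uniform weights over
# each device's closed neighborhood {k-1, k, k+1}).
A = np.zeros((K, K))
for k in range(K):
    A[k, k] = 1 / 3
    A[k, (k - 1) % K] = 1 / 3
    A[k, (k + 1) % K] = 1 / 3

w = np.zeros((K, M))   # current estimate at each device
mu = 0.05              # step size (assumed value)

# Diffusion LMS, adapt-then-combine (ATC): each device takes a local
# stochastic-gradient step on its newly streamed sample ("adapt"), then
# averages the intermediate estimates of its neighbors ("combine").
for i in range(2000):
    u = rng.standard_normal((K, M))              # streamed regressors
    d = u @ w0 + 0.1 * rng.standard_normal(K)    # streamed measurements
    psi = w + mu * (d - np.sum(u * w, axis=1))[:, None] * u  # adapt
    w = A @ psi                                  # combine

mse = np.mean((w - w0) ** 2)
print(f"mean-square deviation after streaming: {mse:.5f}")
```

In the statistically heterogeneous settings targeted by the project, the devices would not share a single model w0, which is precisely what makes plain consensus-style combining inadequate and motivates multitask formulations.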
Missions and responsibilities:
The successful candidate will be expected to perform, or assist with, the following tasks:
• conduct a literature review on decentralized and federated learning in statistically heterogeneous settings;
• develop an approach for decentralized learning in statistically heterogeneous settings in the presence of heterogeneous system devices;
• apply the approach to synthetic and real machine learning data;
• compare the approach with state-of-the-art methods;
• write the final report or the master's thesis.
Research work extension:
The expertise developed within this project will make it possible to extend this work and propose new methods for decentralized learning within a PhD that will start in September 2024.
Candidate profile:
• must be a Master 2 student or an engineering student in the final year;
• must have a strong background in machine learning as well as good knowledge of signal processing, linear algebra, inverse problems (regularization), and convex optimization;
• must have good programming experience (MATLAB or Python);
• must have a high level of written and spoken English.
Applicants are invited to send their application by email to email@example.com. The application must include a detailed CV, a cover letter, and grade transcripts.
References:
[1] R. Nassif, S. Vlaski, C. Richard, J. Chen, and A. H. Sayed, "Multitask learning over graphs: An approach for distributed, streaming machine learning," IEEE Signal Process. Mag., vol. 37, no. 3, pp. 14-25, 2020.
[2] B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y Arcas, "Communication-efficient learning of deep networks from decentralized data," in Proc. Int. Conf. Artif. Intell. Stat., vol. 54, pp. 1273-1282, 2017.
[3] T. Li, A. K. Sahu, A. S. Talwalkar, and V. Smith, "Federated learning: Challenges, methods, and future directions," IEEE Signal Process. Mag., vol. 37, pp. 50-60, May 2020.
[4] V. Smith, C.-K. Chiang, M. Sanjabi, and A. S. Talwalkar, "Federated multi-task learning," in Proc. Adv. Neural Inf. Process. Syst., vol. 30, Long Beach, CA, USA, Dec. 2017.
[5] P. Kairouz, H. B. McMahan, B. Avent, A. Bellet, M. Bennis, A. N. Bhagoji, et al., "Advances and open problems in federated learning," Found. Trends Mach. Learn., vol. 14, no. 1-2, pp. 1-210, 2021.