[URGENT Huawei Call for Projects] Stability of neural networks
24 May 2018
Category: Calls for projects
[URGENT Huawei Call for Projects] Deadline end of May 2018: Research on new algorithms for robust deep neural networks. 12-month project. Study of the inference stability of neural networks in the presence of noise in the input or in the internal parameters.
Contact email@example.com by Monday 28 May 2018 at the latest.
Recently, convolutional neural networks (CNNs) have made great breakthroughs in the visual, speech, and language domains, such as image classification, speech recognition, and natural language translation. However, most existing CNNs are not robust enough to noise during inference. For example, Ian Goodfellow has shown the issue of output instability of CNNs: small perturbations of the visual input can significantly distort the feature embeddings and the output of a CNN. Such instability affects many deep architectures with state-of-the-art performance on a wide range of computer vision tasks. This issue could be even worse if noisy analog computation (e.g. a memristor crossbar) is used to accelerate the execution of CNNs.
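The instability described above can be demonstrated on a toy model. The sketch below (our illustration, not part of the call) uses a one-neuron logistic "network" and a fast-gradient-sign (FGSM-style) perturbation, the construction popularized by Goodfellow: a small, max-norm-bounded change to the input aligned with the loss gradient visibly shifts the output. All names and values here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=8)            # toy model weights (assumed)
x = rng.normal(size=8)            # clean input (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x):
    # A minimal one-layer "network": logistic regression.
    return sigmoid(w @ x)

y = 0.0                           # assumed true label
p = predict(x)
# Gradient of the cross-entropy loss w.r.t. the input is (p - y) * w.
grad_x = (p - y) * w
eps = 0.25                        # perturbation budget (assumption)
x_adv = x + eps * np.sign(grad_x) # FGSM step: tiny max-norm-bounded noise

print(f"clean output:     {predict(x):.3f}")
print(f"perturbed output: {predict(x_adv):.3f}")
```

Even though each input coordinate moves by at most 0.25, the perturbation is chosen to push every coordinate in the loss-increasing direction, so the output shifts much more than random noise of the same magnitude would cause.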
We aim to accelerate CNNs by means of analog computation in a memristor crossbar, but noise in the analog computation may be a serious issue. We are therefore seeking methods to improve the robustness of CNNs: the improved model should achieve stable performance when noise is present in the input, in the feature maps, or even in the model weights.

Scope

1) Investigate the state of the art in robust deep neural networks, including existing methods, an analysis of their merits and drawbacks, trends, challenges, etc.
2) Propose novel algorithms that make CNNs robust to noise in the input, in the feature maps, or even in the model weights.

Expected Outcome and Deliverables

- At least one novel algorithm that improves the robustness of CNNs.
- 1-2 papers published in top-tier (CCF-A or B) journals and conferences.
- 1-2 patents, which must pass Huawei's review.
- Regular meetings and notes (once per month).
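To make the weight-noise scenario concrete, here is a hedged sketch (an assumed experimental setup, not a deliverable of this call) that measures how much a tiny MLP's output drifts when multiplicative Gaussian noise is injected into its weights, mimicking noisy analog inference such as a memristor crossbar. The network shape, noise model, and noise level are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(16, 8))     # hypothetical first-layer weights
W2 = rng.normal(size=(4, 16))     # hypothetical second-layer weights

def forward(x, W1, W2):
    h = np.maximum(0.0, W1 @ x)   # ReLU hidden layer
    return W2 @ h

x = rng.normal(size=8)            # fixed test input (assumed)
clean = forward(x, W1, W2)

sigma = 0.05                      # relative weight-noise level (assumption)
drifts = []
for _ in range(100):
    # Multiplicative noise: each stored weight deviates by ~sigma relative.
    n1 = W1 * (1.0 + sigma * rng.normal(size=W1.shape))
    n2 = W2 * (1.0 + sigma * rng.normal(size=W2.shape))
    noisy = forward(x, n1, n2)
    drifts.append(np.linalg.norm(noisy - clean) / np.linalg.norm(clean))

print(f"mean relative output drift at sigma={sigma}: {np.mean(drifts):.3f}")
```

A robustness algorithm of the kind sought here would aim to keep this drift metric (or the resulting accuracy loss) small as sigma grows; the same harness applies unchanged to noise injected into the input or the feature maps instead of the weights.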