Preprint, working paper

Label noise (stochastic) gradient descent implicitly solves the Lasso for quadratic parametrisation

Abstract: Understanding the implicit bias of training algorithms is of crucial importance in order to explain the success of overparametrised neural networks. In this paper, we study the role of label noise in the training dynamics of a quadratically parametrised model through its continuous-time version. We explicitly characterise the solution chosen by the stochastic flow and prove that it implicitly solves a Lasso program. To complete our analysis, we provide non-asymptotic convergence guarantees for the dynamics as well as conditions for support recovery. We also give experimental results which support our theoretical claims. Our findings highlight the fact that structured noise can induce better generalisation and help explain the superior performance of stochastic dynamics, as observed in practice.
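The mechanism described in the abstract can be illustrated with a minimal NumPy simulation. This is an assumption-laden toy sketch, not the paper's exact continuous-time setting: the problem size, step size `lr`, noise level `sigma`, and iteration count are illustrative choices, and discrete full-batch gradient descent with fresh label noise stands in for the stochastic flow analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse ground truth: 2 active coordinates out of d = 20 (toy sizes).
n, d = 50, 20
w_star = np.zeros(d)
w_star[:2] = 1.0
X = rng.standard_normal((n, d))
y = X @ w_star

# Quadratic parametrisation: the regression vector is w = u * u (elementwise).
# Train by gradient descent on labels corrupted with fresh Gaussian noise at
# every step; all hyperparameters below are illustrative assumptions.
u = np.full(d, 0.5)                  # non-zero init so gradients can flow
lr, sigma, steps = 5e-3, 0.5, 20_000
for _ in range(steps):
    y_noisy = y + sigma * rng.standard_normal(n)   # fresh label noise
    residual = X @ (u * u) - y_noisy
    grad_u = 2.0 * u * (X.T @ residual) / n        # chain rule through w = u**2
    u -= lr * grad_u

w_hat = u * u
print(np.round(w_hat, 3))  # mass concentrates on the first two coordinates
```

In this sketch the off-support coordinates of `w_hat` decay towards zero while the true support is recovered, mimicking the sparsity-inducing effect of the ℓ1 penalty in the Lasso program that, per the paper's analysis, the label-noise dynamics implicitly solve.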

https://hal-enpc.archives-ouvertes.fr/hal-03701409
Contributor: Julien Reygner
Submitted on: Wednesday, 22 June 2022 - 10:03:50
Last modified on: Friday, 24 June 2022 - 03:55:59


Identifiers

  • HAL Id : hal-03701409, version 1
  • ARXIV : 2206.09841


Citation

Loucas Pillaud-Vivien, Julien Reygner, Nicolas Flammarion. Label noise (stochastic) gradient descent implicitly solves the Lasso for quadratic parametrisation. 2022. ⟨hal-03701409⟩

