Journal article in Proceedings of Machine Learning Research, 2019

Fast Algorithms for Sparse Reduced-Rank Regression

Abstract

We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduced-Rank Regression (SRRR) as a non-convex, non-differentiable function of a single one of the two matrices usually introduced to parametrize low-rank matrix learning problems. We study the behavior of proximal gradient algorithms for the minimization of this objective. In particular, based on an analysis of the geometry of the problem, we establish that a proximal Polyak-Łojasiewicz inequality is satisfied in a neighborhood of the set of optima, under a condition on the regularization parameter. We consequently derive linear convergence rates for proximal gradient descent with line search, and for related algorithms, in a neighborhood of the optima. Our experiments show that our formulation leads to much faster learning algorithms for RRR and especially for SRRR.
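The abstract describes minimizing the RRR/SRRR loss over one of the two low-rank factors in closed form, leaving a non-convex, non-differentiable function of the remaining matrix that is then handled with proximal gradient steps. The sketch below illustrates that recipe in NumPy; it is not the authors' code. It profiles out the second factor V in closed form, uses the envelope (Danskin) theorem for the gradient of the smooth part, applies a row-wise group-lasso prox for the sparsity penalty, and backtracks the step size on the usual quadratic upper-bound condition. The function names, the initialization, and the choice of which factor carries the penalty are assumptions made for illustration.

```python
import numpy as np


def smooth_value_and_grad(U, X, Y):
    """Value and gradient of f(U) = min_V 0.5 * ||Y - X U V^T||_F^2.

    The inner minimizer V(U) has a closed form; by the envelope
    (Danskin) theorem, the gradient of the profiled objective is the
    partial derivative in U evaluated at V(U).
    """
    XU = X @ U
    G = XU.T @ XU                              # r x r Gram matrix
    # V(U) = Y^T X U (U^T X^T X U)^{-1}; lstsq tolerates rank deficiency.
    V = np.linalg.lstsq(G, XU.T @ Y, rcond=None)[0].T
    R = Y - XU @ V.T                           # residual at the inner optimum
    return 0.5 * np.sum(R ** 2), -X.T @ R @ V


def prox_row_group_lasso(U, thresh):
    """Block soft-thresholding of the rows of U (group-lasso prox)."""
    norms = np.linalg.norm(U, axis=1, keepdims=True)
    return np.maximum(0.0, 1.0 - thresh / np.maximum(norms, 1e-12)) * U


def srrr_proximal_gradient(X, Y, rank, lam, n_iter=500, t=1.0, beta=0.5):
    """Proximal gradient with backtracking line search on
    f(U) + lam * sum_j ||row_j(U)||_2 (illustrative sketch only)."""
    rng = np.random.default_rng(0)
    U = rng.standard_normal((X.shape[1], rank)) / np.sqrt(X.shape[1])
    for _ in range(n_iter):
        f, grad = smooth_value_and_grad(U, X, Y)
        while True:  # shrink the step until the quadratic upper bound holds
            U_new = prox_row_group_lasso(U - t * grad, t * lam)
            D = U_new - U
            f_new, _ = smooth_value_and_grad(U_new, X, Y)
            if f_new <= f + np.sum(grad * D) + np.sum(D ** 2) / (2 * t):
                break
            t *= beta
        U = U_new
        if np.linalg.norm(D) < 1e-8:
            break
    return U
```

With lam = 0 the prox is the identity and the loop reduces to line-searched gradient descent on the profiled objective, i.e. plain RRR; larger lam zeroes out entire rows of U, which is the row-sparsity pattern SRRR targets.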
Main file
865.pdf (1.02 MB) Download
865-supp.pdf (719.37 KB) Download
Origin: Files produced by the author(s)

Dates and versions

hal-02075623, version 1 (22-03-2019)

Identifiers

  • HAL Id: hal-02075623, version 1

Cite

Benjamin Dubois, Jean-François Delmas, Guillaume Obozinski. Fast Algorithms for Sparse Reduced-Rank Regression. Proceedings of Machine Learning Research, 2019, Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, 89, pp.2415-2424. ⟨hal-02075623⟩
