Proceedings of the 30th International Conference on Machine Learning, 2013

Learning Heteroscedastic Models by Convex Programming under Group Sparsity

Abstract

Popular sparse estimation methods based on $\ell_1$-relaxation, such as the Lasso and the Dantzig selector, require the knowledge of the variance of the noise in order to properly tune the regularization parameter. This constitutes a major obstacle in applying these methods in several frameworks---such as time series, random fields, inverse problems---for which the noise is rarely homoscedastic and its level is hard to know in advance. In this paper, we propose a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-) regression setting. An attractive feature of the proposed estimator is that it is efficiently computable even for very large scale problems by solving a second-order cone program (SOCP). We present theoretical analysis and numerical results assessing the performance of the proposed procedure.
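The estimator itself is defined in the paper; as a loose illustration of how a group-sparse convex program of this kind can be cast as a second-order cone program and handed to an off-the-shelf solver, the following minimal Python sketch uses cvxpy. The data X and y, the group structure, the square-root loss, and the penalty weight lam are assumptions made for illustration only and do not reproduce the estimator proposed in the paper.

import numpy as np
import cvxpy as cp

# Synthetic data for illustration: 20 features in 4 groups of 5,
# only the first group is active.
rng = np.random.default_rng(0)
n, p, gsize = 200, 20, 5
beta_true = np.zeros(p)
beta_true[:gsize] = rng.standard_normal(gsize)
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.5 * rng.standard_normal(n)

groups = [list(range(k, k + gsize)) for k in range(0, p, gsize)]
lam = 0.5  # illustrative penalty weight, not a tuned value

beta = cp.Variable(p)
# Square-root loss plus a sum of group Euclidean norms: both terms are
# representable with second-order cone constraints, so the whole problem
# is an SOCP that generic conic solvers handle.
loss = cp.norm(y - X @ beta, 2)
penalty = sum(cp.norm(beta[g], 2) for g in groups)
prob = cp.Problem(cp.Minimize(loss + lam * penalty))
prob.solve()
print(np.round(beta.value, 3))

The square-root loss is used here because, like the approach discussed in the abstract, it avoids requiring the noise level for tuning; the paper's joint mean/variance estimator is a different, richer formulation.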
Main file: Var_adap_ICML2013.pdf (342.42 KB). Origin: files produced by the author(s).

Dates and versions

hal-00813908, version 1 (16-04-2013)

Cite

Arnak S. Dalalyan, Mohamed Hebiri, Katia Meziani, Joseph Salmon. Learning Heteroscedastic Models by Convex Programming under Group Sparsity. Proceedings of the 30th International Conference on Machine Learning, 2013. http://icml.cc/2013/?page_id=43. ⟨hal-00813908⟩