Competing against the Best Nearest Neighbor Filter in Regression - École des Ponts ParisTech
Conference Paper. Year: 2011


Abstract

Designing statistical procedures that are provably almost as accurate as the best one in a given family is one of the central topics in statistics and learning theory. Oracle inequalities then offer a convenient theoretical framework for evaluating different strategies, which can be roughly classified into two classes: selection and aggregation strategies. The ultimate goal is to design strategies satisfying oracle inequalities with leading constant one and a rate-optimal residual term. In many recent papers, this problem is addressed in the case where the aim is to beat the best procedure from a given family of linear smoothers. However, the theory developed so far either does not cover the important case of nearest-neighbor smoothers or provides a suboptimal oracle inequality with a leading constant considerably larger than one. In this paper, we prove a new oracle inequality with leading constant one that is valid under a general assumption on linear smoothers, allowing one, for instance, to compete against the best nearest-neighbor filters.
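To make the setting concrete, the following is a minimal, illustrative sketch (not the paper's actual procedure) of aggregating k-nearest-neighbor filters, each of which is a linear smoother, by exponentially weighted aggregation. The toy data, the leave-one-out risk proxy, and the temperature `beta` are all hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = f(x) + noise (illustrative, not from the paper)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
f = np.sin(2 * np.pi * x)
sigma = 0.3
y = f + sigma * rng.standard_normal(n)

def knn_smoother_matrix(x, k):
    """Smoother matrix A of the k-NN filter: (A y)_i averages the responses
    of the k design points closest to x_i, so the fit is linear in y."""
    d = np.abs(x[:, None] - x[None, :])
    idx = np.argsort(d, axis=1)[:, :k]          # k nearest neighbors per point
    A = np.zeros((len(x), len(x)))
    rows = np.repeat(np.arange(len(x)), k)
    A[rows, idx.ravel()] = 1.0 / k
    return A

ks = [2, 5, 10, 25, 50]
fits = np.array([knn_smoother_matrix(x, k) @ y for k in ks])   # (len(ks), n)

# Leave-one-out risk proxy for each k: drop the diagonal (the point itself)
# and renormalize each row, then measure the residual error.
loo_risk = []
for k in ks:
    A = knn_smoother_matrix(x, k)
    np.fill_diagonal(A, 0.0)
    A /= A.sum(axis=1, keepdims=True)
    loo_risk.append(np.mean((y - A @ y) ** 2))
loo_risk = np.array(loo_risk)

# Exponentially weighted aggregation over the candidate filters;
# beta ~ 4*sigma^2 is a hypothetical temperature choice.
beta = 4.0 * sigma ** 2
w = np.exp(-n * (loo_risk - loo_risk.min()) / beta)
w /= w.sum()
f_hat = w @ fits    # aggregated estimate, a convex combination of the k-NN fits
```

The aggregated estimate `f_hat` is a convex combination of the candidate fits, with weights concentrating on the filters whose estimated risk is smallest; an oracle inequality of the kind discussed in the abstract would bound its risk by the risk of the best single k-NN filter plus a residual term.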

Domains

Other [stat.ML]

Dates and versions

hal-00654276, version 1 (21-12-2011)

Identifiers

Cite

Arnak S. Dalalyan, Joseph Salmon. Competing against the Best Nearest Neighbor Filter in Regression. ALT 2011 - 22nd International Conference on Algorithmic Learning Theory, Oct 2011, Espoo, Finland. pp.129-143, ⟨10.1007/978-3-642-24412-4_13⟩. ⟨hal-00654276⟩