Journal article, ESAIM: Probability and Statistics, 2015

Iterative Isotonic Regression

Abstract

This article introduces a new nonparametric method for estimating a univariate regression function of bounded variation. The method exploits the Jordan decomposition, which states that a function of bounded variation can be written as the sum of a non-decreasing function and a non-increasing function. This suggests combining the backfitting algorithm for estimating additive functions with isotonic regression for estimating monotone functions. The resulting iterative algorithm is called Iterative Isotonic Regression (I.I.R.). The main technical result of the paper is the consistency of the proposed estimator when the number of iterations $k_n$ grows appropriately with the sample size $n$. The proof requires two auxiliary results that are of interest in their own right: first, we generalize the well-known consistency property of isotonic regression to the framework of a non-monotone regression function, and second, we relate the backfitting algorithm to von Neumann's algorithm in convex analysis.
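The following is a minimal sketch of the backfitting-style iteration the abstract describes: alternately fitting a non-decreasing (isotonic) and a non-increasing (antitonic) component to partial residuals and summing the two. It is an illustrative assumption, not the authors' implementation; the function name `iterative_isotonic_regression`, the zero initialization, the fixed `n_iter` parameter, and the use of scikit-learn's `IsotonicRegression` are all choices made here for the sake of a runnable example.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def iterative_isotonic_regression(x, y, n_iter=10):
    """Sketch of a backfitting iteration based on the Jordan decomposition:
    alternate an isotonic (non-decreasing) fit and an antitonic
    (non-increasing) fit on partial residuals, then return their sum."""
    up = np.zeros_like(y, dtype=float)    # non-decreasing component
    down = np.zeros_like(y, dtype=float)  # non-increasing component
    iso_up = IsotonicRegression(increasing=True)
    iso_down = IsotonicRegression(increasing=False)
    for _ in range(n_iter):
        up = iso_up.fit_transform(x, y - down)      # refit increasing part
        down = iso_down.fit_transform(x, y - up)    # refit decreasing part
    return up + down

# Toy usage: a noisy bump, which has bounded variation but is not monotone.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=x.size)
fit = iterative_isotonic_regression(x, y, n_iter=20)
```

In this sketch `n_iter` plays the role of $k_n$ in the abstract: the paper's consistency result lets the number of iterations grow with the sample size, so here it is simply left as a user-chosen parameter.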

Dates and versions

hal-00832863, version 1 (11-06-2013)


Cite

Arnaud Guyader, Nick Hengartner, Nicolas Jégou, Eric Matzner-Løber. Iterative Isotonic Regression. ESAIM: Probability and Statistics, 2015, 19, pp.1-23. ⟨10.1051/ps/2014012⟩. ⟨hal-00832863⟩