Bias and variance reduction procedures in non-parametric regression

  • Marike Cockeran, North-West University, Potchefstroom, South Africa
  • Cornelia J. Swanepoel, North-West University, Potchefstroom, South Africa
Keywords: Bagging, Bandwidth, Boosting, Bragging, Cross-validation, Kernel estimators, Nonparametric, Regression

Abstract

The purpose of this study is to determine the effect of three improvement methods on nonparametric kernel regression estimators. The improvement methods are applied to the Nadaraya-Watson estimator with cross-validation bandwidth selection, the Nadaraya-Watson estimator with plug-in bandwidth selection, the local linear estimator with plug-in bandwidth selection, and a bias-corrected nonparametric estimator proposed by Yao (2012), based on cross-validation bandwidth selection. The performance of the resulting estimators is evaluated by empirically calculating their mean integrated squared error (MISE), a global discrepancy measure. The first two improvement methods proposed in this study are based on bootstrap bagging and bootstrap bragging procedures, which were originally introduced and studied by Swanepoel (1988, 1990) and subsequently applied, e.g., by Breiman (1996) in machine learning. Bagging and bragging are primarily variance reduction tools. The third improvement method, referred to as boosting, aims to reduce the bias of an estimator and is based on a procedure originally proposed by Tukey (1977). The classical Nadaraya-Watson estimator with plug-in bandwidth selection turns out to be a recommendable nonparametric regression estimator, since it is not only as precise and accurate as any of the other estimators, but also computationally much faster than any other nonparametric regression estimator considered in this study.
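
For readers who want to experiment with the estimators discussed in the abstract, the sketch below shows a minimal Nadaraya-Watson estimator with least-squares leave-one-out cross-validation bandwidth selection in Python. This is an illustrative sketch, not the authors' implementation: the Gaussian kernel, the bandwidth grid, and the function names (nw_estimate, cv_bandwidth) are assumptions made for this example. The empirical MISE used in the study can be approximated by averaging the integrated squared error of such fits over many simulated samples.

```python
import numpy as np

def nw_estimate(x, X, Y, h):
    """Nadaraya-Watson estimate of m(x) = E[Y | X = x] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - X) / h) ** 2)  # kernel weights K((x - X_i) / h)
    return np.sum(w * Y) / np.sum(w)

def cv_bandwidth(X, Y, grid):
    """Least-squares leave-one-out cross-validation over a bandwidth grid."""
    n = len(X)
    scores = []
    for h in grid:
        sq_err = [
            (Y[i] - nw_estimate(X[i], np.delete(X, i), np.delete(Y, i), h)) ** 2
            for i in range(n)  # predict each point from the other n - 1
        ]
        scores.append(np.mean(sq_err))
    return grid[int(np.argmin(scores))]

# Toy usage: data from m(x) = sin(2*pi*x) with Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 100)
Y = np.sin(2 * np.pi * X) + rng.normal(0, 0.3, 100)
h_cv = cv_bandwidth(X, Y, np.linspace(0.02, 0.3, 15))
m_hat = np.array([nw_estimate(x, X, Y, h_cv) for x in np.linspace(0, 1, 50)])
```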

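The three improvement schemes themselves are also easy to prototype. In the hedged sketch below, bagging averages the estimator over bootstrap resamples of the (X, Y) pairs, bragging replaces the average by the more robust median, and boosting follows Tukey's (1977) twicing idea of re-fitting the residuals with the same smoother and adding the correction back. The number of resamples B, the pairs-bootstrap scheme, and the single twicing step are assumptions made for illustration; nw_estimate is the kernel estimator from the previous sketch, repeated so this block runs on its own.

```python
import numpy as np

def nw_estimate(x, X, Y, h):
    """Nadaraya-Watson estimate at x with a Gaussian kernel (as above)."""
    w = np.exp(-0.5 * ((x - X) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

def bagged_nw(x, X, Y, h, B=200, agg=np.mean, seed=None):
    """Bagging: aggregate the estimator over B bootstrap resamples of the
    (X, Y) pairs. Bragging is the same scheme with agg=np.median."""
    rng = np.random.default_rng(seed)
    n = len(X)
    ests = [nw_estimate(x, X[idx], Y[idx], h)
            for idx in (rng.integers(0, n, n) for _ in range(B))]
    return agg(ests)

def twiced_nw(x, X, Y, h):
    """Boosting via Tukey's twicing: fit once, re-fit the residuals, and
    add the correction back to reduce the bias of the initial fit."""
    fitted = np.array([nw_estimate(Xi, X, Y, h) for Xi in X])
    resid = Y - fitted
    return nw_estimate(x, X, Y, h) + nw_estimate(x, X, resid, h)
```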
Published
2016-03-31
Section
Research Articles