Nonparametric Kernel Regression Subject to Monotonicity
Authors
Abstract
We suggest a biased-bootstrap method for monotonising general linear, kernel-type estimators, for example local linear estimators and Nadaraya-Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, that it is applicable to a particularly wide range of estimator types, and that it can be employed after the smoothing step has been implemented. Therefore, an experimenter may use his or her favourite kernel estimator, and their favourite bandwidth selector, to construct the basic nonparametric smoother, and then use our technique to render it monotone in a smooth way. Since our method is based on maximising fidelity to the conventional empirical approach, subject to monotonicity, it leaves the original kernel smoother unchanged if that smoother is already monotone. More generally, we adjust the smoother by adjoining weights to data values so as to make the least possible change, in the sense of a distance measure, subject to imposing the constraint of monotonicity.
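The weight-adjustment idea described above can be sketched numerically. The toy below is an illustrative reading of the approach, not the authors' implementation: the Gaussian kernel, the hand-picked bandwidth `h`, the evaluation grid, and the squared-error distance from uniform weights are all our assumptions. It fits a weighted Nadaraya-Watson estimator and searches for data weights `p` closest to the uniform weights `1/n` such that the fitted curve is nondecreasing on the grid.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 30
x = np.sort(rng.uniform(0.0, 1.0, n))
y = x ** 2 + rng.normal(0.0, 0.15, n)  # monotone trend plus noise

h = 0.15  # bandwidth: hand-picked for this sketch, not a selector


def nw(p, grid):
    """Weighted Nadaraya-Watson estimate at the grid points.

    Each data pair (x_i, y_i) carries a weight p_i; with p_i = 1/n
    this reduces to the ordinary Nadaraya-Watson estimator.
    """
    K = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    W = K * p[None, :]
    return (W @ y) / W.sum(axis=1)


grid = np.linspace(0.05, 0.95, 25)
p0 = np.full(n, 1.0 / n)  # uniform weights = the unconstrained smoother


def distance(p):
    # Squared-error distance from the uniform weights: the "least
    # possible change" criterion in this sketch.
    return np.sum((p - 1.0 / n) ** 2)


constraints = [
    # Weights form a probability vector.
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    # Fitted curve must be nondecreasing across the grid.
    {"type": "ineq", "fun": lambda p: np.diff(nw(p, grid))},
]

res = minimize(distance, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * n, constraints=constraints)
p_hat = res.x
m_hat = nw(p_hat, grid)  # monotonised estimate on the grid
```

Note that if the unconstrained smoother is already monotone, `p0` satisfies the constraints at zero distance, so the weights are left at `1/n` and the estimate is unmodified, matching the property claimed in the abstract.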
Related references
A Berry-Esseen Type Bound for a Smoothed Version of Grenander Estimator
In various statistical models, such as density estimation and estimation of regression curves or hazard rates, monotonicity constraints can arise naturally. A frequently encountered problem in nonparametric statistics is to estimate a monotone density function f on a compact interval. A known estimator for the density function f, under the restriction that f is decreasing, is the Grenander estimator, ...
NONPARAMETRIC KERNEL REGRESSION SUBJECT TO MONOTONICITY CONSTRAINTS By Peter Hall and Li-Shan Huang, Australian National University and CSIRO, and Australian National University
We suggest a method for monotonizing general kernel-type estimators, for example local linear estimators and Nadaraya–Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, indeed with the same smoothness as the unconstrained estimate. The method is applicable to a particularly wide range of estimator types, it can be trivially modified to render an es...
Nonparametric Kernel Regression with Multiple Predictors and Multiple Shape Constraints
Nonparametric smoothing under shape constraints has recently received much well-deserved attention. Powerful methods have been proposed for imposing a single shape constraint such as monotonicity and concavity on univariate functions. In this paper, we extend the monotone kernel regression method in Hall and Huang (2001) to the multivariate and multi-constraint setting. We impose equality and/o...
Nonparametric Estimation of Hazard Rate under the Constraint of Monotonicity
We show how to smoothly 'monotonise' standard kernel estimators of hazard rate, using bootstrap weights. Our method takes a variety of forms, depending on choice of kernel estimator and on the distance function used to define a certain constrained optimisation problem. We confine attention to a particularly simple kernel approach, and explore a range of distance functions. It is straightforward to...
Constrained Nonparametric Kernel Regression: Estimation and Inference
Abstract. Restricted kernel regression methods have recently received much well-deserved attention. Powerful methods have been proposed for imposing monotonicity on the resulting estimate, a condition often dictated by theoretical concerns; see Hall, Huang, Gifford & Gijbels (2001) and Hall & Huang (2001), among others. However, to the best of our knowledge, there does not exist a simple yet ge...