Publication details
Kernel estimation of regression function gradient
Authors | |
---|---|
Year of publication | 2020 |
Type | Article in Periodical |
Magazine / Source | Communications in Statistics - Theory and Methods |
MU Faculty or unit | |
Citation | |
Web | Full Text |
DOI | http://dx.doi.org/10.1080/03610926.2018.1532518 |
Keywords | multivariate kernel regression; constrained bandwidth matrix; kernel smoothing |
Attached files | |
Description | The present paper focuses on kernel estimation of the gradient of a multivariate regression function. Despite the importance of estimating partial derivatives of multivariate regression functions, progress in this area has been rather slow. Our aim is to construct a gradient estimator using the idea of the local linear estimator of the regression function. The quality of this estimator is expressed in terms of the Mean Integrated Squared Error. We then address a crucial problem in kernel gradient estimation: the choice of the bandwidth matrix. We present several data-driven methods for its selection and develop a new approach based on Newton's iterative process. The performance of the presented methods is illustrated by a simulation study and a real data example. (A minimal illustrative sketch of the local linear construction is given below.) |
Related projects:
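As a rough illustration of the local linear construction mentioned in the description, the sketch below fits a weighted local linear model around an evaluation point and reads the gradient estimate off the slope coefficients. This is a generic textbook-style construction under assumptions made here, not the paper's exact estimator or its bandwidth selectors: the Gaussian kernel, the weighting convention based on the quadratic form u'H⁻¹u for the bandwidth matrix H, and the function name `local_linear_gradient` are all illustrative choices, and the bandwidth in the usage example is ad hoc rather than data-driven.

```python
import numpy as np


def local_linear_gradient(x0, X, y, H):
    """Weighted local linear fit at x0; the slope part estimates the gradient.

    Generic sketch (not the paper's exact estimator):
      x0 : (d,)    evaluation point
      X  : (n, d)  design points
      y  : (n,)    responses
      H  : (d, d)  symmetric positive definite bandwidth matrix
                   (convention assumed here: weights ~ exp(-0.5 * u' H^-1 u))
    """
    n, _ = X.shape
    U = X - x0                                   # centered design points
    Hinv = np.linalg.inv(H)
    # Gaussian kernel weights; the normalizing constant cancels in the fit.
    w = np.exp(-0.5 * np.einsum('ij,jk,ik->i', U, Hinv, U))
    Z = np.hstack([np.ones((n, 1)), U])          # rows [1, (X_i - x0)']
    sw = np.sqrt(w)
    # Weighted least squares: minimize sum_i w_i (y_i - a - b'(X_i - x0))^2;
    # the intercept a estimates m(x0), the slope b estimates its gradient.
    beta, *_ = np.linalg.lstsq(Z * sw[:, None], y * sw, rcond=None)
    return beta[0], beta[1:]


# Toy usage: m(x) = x1^2 + x2 has gradient (2*x1, 1); bandwidth chosen ad hoc.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 2))
y = X[:, 0] ** 2 + X[:, 1] + rng.normal(scale=0.1, size=400)
H = np.diag([0.05, 0.05])
m_hat, grad_hat = local_linear_gradient(np.array([0.5, 0.0]), X, y, H)
print(m_hat, grad_hat)                           # gradient should be near (1.0, 1.0)
```

In this sketch H acts as a squared-bandwidth (covariance-like) matrix, and its choice dominates the quality of the resulting gradient estimate; the MISE-based, data-driven selection of H, including the Newton-type iteration mentioned in the description, is the subject of the paper and is not reproduced here.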