
6.19 ( ) Another viewpoint on kernel regression comes from a consideration of regression problems in which the input variables as well as the target variables are corrupted with additive noise. Suppose each target value tn is generated as usual by taking a function y(zn) evaluated at a point zn, and adding Gaussian noise. The value of zn is not directly observed, however; only a noise-corrupted version xn = zn + ξn is observed, where each random variable ξn is governed by some distribution g(ξ).

Consider a set of observations {xn, tn}, where n = 1, . . . , N, together with a corresponding sum-of-squares error function defined by averaging over the distribution of input noise to give

E = (1/2) Σ_{n=1}^{N} ∫ {y(xn − ξn) − tn}² g(ξn) dξn.   (6.99)

By minimizing E with respect to the function y(z) using the calculus of variations (Appendix D), show that the optimal solution for y(x) is given by a Nadaraya-Watson kernel regression solution of the form (6.45) with a kernel of the form (6.46).
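The resulting Nadaraya-Watson estimator has the form y(x) = Σn tn g(xn − x) / Σm g(xm − x), i.e. a weighted average of the targets in which the weights are normalized evaluations of the noise density g centred on the query point. Below is a minimal sketch of this estimator, assuming a Gaussian input-noise density g with a chosen width sigma (the function name `nadaraya_watson` and the toy data are illustrative, not from the text):

```python
import numpy as np

def nadaraya_watson(x, x_train, t_train, sigma=0.1):
    """Nadaraya-Watson estimate y(x) = sum_n t_n g(x_n - x) / sum_m g(x_m - x),
    here assuming a Gaussian noise density g; the normalising constant of g
    cancels between numerator and denominator."""
    g = np.exp(-0.5 * ((x_train - x) / sigma) ** 2)  # unnormalised g(x_n - x)
    return np.sum(g * t_train) / np.sum(g)

# Toy data: targets from sin(2*pi*z) plus target noise, inputs corrupted
# by additive input noise x_n = z_n + xi_n, matching the problem setup.
rng = np.random.default_rng(0)
z = np.linspace(0.0, 1.0, 50)
t = np.sin(2 * np.pi * z) + 0.1 * rng.standard_normal(50)
x = z + 0.05 * rng.standard_normal(50)

y_hat = np.array([nadaraya_watson(xq, x, t) for xq in np.linspace(0.0, 1.0, 5)])
```

Because the weights are non-negative and sum to one, every prediction is a convex combination of the observed targets, which is one way to see that the estimator is well behaved regardless of the choice of g.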
