
Question:

Writing 

$\hat\theta = (A^{\mathsf T}A)^{-1}A^{\mathsf T}x, \qquad \hat\theta_k = (A^{\mathsf T}A + kI)^{-1}A^{\mathsf T}x$

for the least-squares and ridge regression estimators of the regression coefficients $\theta$, show that

$\hat\theta_k = (A^{\mathsf T}A + kI)^{-1}A^{\mathsf T}A\,\hat\theta$

and that the bias of $\hat\theta_k$ is

$b(k) = E(\hat\theta_k) - \theta = \{(A^{\mathsf T}A + kI)^{-1}A^{\mathsf T}A - I\}\theta = -k(A^{\mathsf T}A + kI)^{-1}\theta,$

while its variance-covariance matrix is 

$V_k = \sigma^2 (A^{\mathsf T}A + kI)^{-1} A^{\mathsf T}A\, (A^{\mathsf T}A + kI)^{-1}.$
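The two identities asked for above can be checked numerically before attempting the algebra. A minimal sketch with numpy; the design matrix, the observation vector, and the values of $k$ and $\sigma^2$ are arbitrary choices, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k, sigma2 = 30, 4, 0.7, 2.0

A = rng.standard_normal((n, p))
x = rng.standard_normal(n)  # any observation vector works for the first identity

AtA = A.T @ A
theta_ls = np.linalg.solve(AtA, A.T @ x)                  # (A'A)^{-1} A'x
theta_k = np.linalg.solve(AtA + k * np.eye(p), A.T @ x)   # (A'A + kI)^{-1} A'x

# First identity: theta_k = (A'A + kI)^{-1} A'A theta_ls,
# which follows because A'x = A'A theta_ls (the normal equations).
rhs = np.linalg.solve(AtA + k * np.eye(p), AtA @ theta_ls)
assert np.allclose(theta_k, rhs)

# Variance formula: with M = (A'A + kI)^{-1} and Var(x) = sigma^2 I,
# Var(theta_k) = M A' (sigma^2 I) A M', which simplifies to sigma^2 M A'A M.
M = np.linalg.inv(AtA + k * np.eye(p))
V_direct = M @ A.T @ (sigma2 * np.eye(n)) @ A @ M.T
V_formula = sigma2 * M @ AtA @ M
assert np.allclose(V_direct, V_formula)
print("identities verified")
```

The `solve` calls are preferred to forming explicit inverses where possible; the explicit `inv` is kept only to mirror the matrix $V_k$ as written.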

Deduce expressions for the sum $G(k)$ of the squares of the biases and for the sum $F(k)$ of the variances of the regression coefficients, and hence show that the mean square error is

$\mathrm{MSE}(k) = E\{(\hat\theta_k - \theta)^{\mathsf T}(\hat\theta_k - \theta)\} = F(k) + G(k).$
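This decomposition is the usual bias–variance split; a sketch of the step, writing $b(k) = E(\hat\theta_k) - \theta$ for the bias vector:

```latex
\begin{aligned}
\mathrm{MSE}(k)
  &= E\{(\hat\theta_k-\theta)^{\mathsf T}(\hat\theta_k-\theta)\}\\
  &= E\{(\hat\theta_k-E\hat\theta_k)^{\mathsf T}(\hat\theta_k-E\hat\theta_k)\}
     + (E\hat\theta_k-\theta)^{\mathsf T}(E\hat\theta_k-\theta)\\
  &= \operatorname{tr} V_k + b(k)^{\mathsf T} b(k) = F(k) + G(k),
\end{aligned}
```

where the cross term vanishes because $E(\hat\theta_k - E\hat\theta_k) = 0$.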

Assuming that $F(k)$ is continuous and monotonic decreasing with $F'(0) < 0$, and that $G(k)$ is continuous and monotonic increasing with $G(0) = G'(0) = 0$, deduce that there always exists a $k > 0$ such that $\mathrm{MSE}(k) < \mathrm{MSE}(0)$ (Theobald, 1974).
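The existence claim can be illustrated on a concrete problem: $G$ starts flat at zero while $F$ falls, so the total drops below its least-squares value for small $k$. A numerical sketch with numpy, assuming an arbitrary $30 \times 4$ Gaussian design, $\theta$ a vector of ones, and a grid of $k$ values chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma2 = 30, 4, 4.0
A = rng.standard_normal((n, p))
theta = np.ones(p)
AtA = A.T @ A
I = np.eye(p)

def F(k):
    # sum of the variances: sigma^2 tr{(A'A+kI)^{-1} A'A (A'A+kI)^{-1}}
    M = np.linalg.inv(AtA + k * I)
    return sigma2 * np.trace(M @ AtA @ M)

def G(k):
    # sum of squared biases, using b(k) = -k (A'A+kI)^{-1} theta
    b = -k * np.linalg.solve(AtA + k * I, theta)
    return float(b @ b)

def mse(k):
    return F(k) + G(k)

# G(0) = 0 exactly, and F decreases in k, so the MSE initially decreases:
assert G(0.0) == 0.0
assert F(1.0) < F(0.0)
ks = np.linspace(0.0, 2.0, 201)
assert min(mse(k) for k in ks) < mse(0.0)  # some k > 0 beats least squares
print("ridge improves on least squares for some k > 0")
```

Since $\mathrm{MSE}'(0) = F'(0) + G'(0) = F'(0) < 0$ under the stated assumptions, the improvement at small $k$ is guaranteed, not an accident of this simulated example.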
