
Problem 3. Multi-task regression (by Andrew Ng)
Thus far, we only considered regression with scalar-valued responses. In some applications, the response is itself a vector: $y_i \in \mathbb{R}^{m \times 1}$. We posit that the relationship between the features/predictors ($x_i \in \mathbb{R}^{d \times 1}$) and the vector-valued response $y_i$ is linear:

$$y_i^\top = x_i^\top B^* + \text{error}, \qquad i = 1, \ldots, n,$$

where $B^* \in \mathbb{R}^{d \times m}$ is a matrix of regression coefficients. Note that for the linear regression model in class, the dimension of the response variable $y_i$ is $m = 1$.

1. Express the sum of squared residuals (also called the residual sum of squares, RSS) in matrix notation (i.e. without using any summations). Analogously to the linear regression model, the RSS is defined as

$$\mathrm{RSS}(B) = \sum_{i=1}^{n} \bigl(y_i^\top - x_i^\top B\bigr)\bigl(y_i^\top - x_i^\top B\bigr)^\top.$$

Hint: work out how to express the RSS in terms of the data matrices

$$X = \begin{pmatrix} x_1^\top \\ \vdots \\ x_n^\top \end{pmatrix} \in \mathbb{R}^{n \times d}, \qquad Y = \begin{pmatrix} y_1^\top \\ \vdots \\ y_n^\top \end{pmatrix} \in \mathbb{R}^{n \times m}.$$

Also note that for a matrix $A = (a_{ij})_{n \times m}$ whose $i$th row is $a_i \in \mathbb{R}^{1 \times m}$, we have

$$\operatorname{tr}(AA^\top) = \sum_{i=1}^{n} a_i a_i^\top = \sum_{1 \le i \le n,\; 1 \le j \le m} a_{ij}^2.$$

2. Find the matrix of regression coefficients that minimizes the RSS.

3. Instead of minimizing the RSS, we break the problem up into $m$ regression problems with scalar-valued responses. That is, we fit $m$ linear models of the form $(y_i)_k = x_i^\top \beta_k + \text{error}$, where $(y_i)_k$ denotes the $k$th element of the vector $y_i$ and $\beta_k \in \mathbb{R}^d$. How do the regression coefficients from the $m$ separate regressions compare to the matrix of regression coefficients that minimizes the RSS in question (2)?
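For parts (1)–(3), the standard least-squares derivation can be sketched as follows (this is a sketch under the usual assumption that $X$ has full column rank, not an official solution):

```latex
% Part 1: the residual rows y_i^T - x_i^T B stack into Y - XB, so by the
% trace identity in the hint,
\mathrm{RSS}(B)
  = \sum_{i=1}^{n} (y_i^\top - x_i^\top B)(y_i^\top - x_i^\top B)^\top
  = \operatorname{tr}\!\bigl((Y - XB)(Y - XB)^\top\bigr).

% Part 2: expanding the trace and differentiating with respect to B,
\nabla_B\,\mathrm{RSS}(B) = -2X^\top Y + 2X^\top X B = 0
\quad\Longrightarrow\quad
\widehat{B} = (X^\top X)^{-1} X^\top Y
\quad\text{(assuming $X^\top X$ is invertible).}

% Part 3: the RSS splits across the m response coordinates,
\operatorname{tr}\!\bigl((Y - XB)(Y - XB)^\top\bigr)
  = \sum_{k=1}^{m} \bigl\lVert Y_{\cdot k} - X B_{\cdot k} \bigr\rVert_2^2,
% so minimizing jointly over B is the same as minimizing each column
% separately: \hat{\beta}_k = (X^\top X)^{-1} X^\top Y_{\cdot k} is exactly
% the k-th column of \widehat{B}, and the m separate scalar regressions
% recover the joint minimizer.
```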

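The conclusion of part (3) can also be verified numerically. A minimal sketch with NumPy on synthetic data (the dimensions and noise level here are my own illustration, not part of the problem):

```python
import numpy as np

# Check that the joint multi-task solution B_hat = (X^T X)^{-1} X^T Y
# matches m separate scalar regressions done column by column.
rng = np.random.default_rng(0)
n, d, m = 50, 4, 3
X = rng.normal(size=(n, d))
B_star = rng.normal(size=(d, m))
Y = X @ B_star + 0.1 * rng.normal(size=(n, m))

# Joint solution minimizing tr((Y - XB)(Y - XB)^T).
B_joint = np.linalg.solve(X.T @ X, X.T @ Y)

# m separate regressions, one per response coordinate.
B_sep = np.column_stack(
    [np.linalg.solve(X.T @ X, X.T @ Y[:, k]) for k in range(m)]
)

print(np.allclose(B_joint, B_sep))  # → True: the two solutions coincide
```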