Question: 1.1 Linear regression

(a) Assume that you record a scalar input $x$ and a scalar output $y$. First, you record $x_1 = 2$, $y_1 = -1$, and thereafter $x_2 = 3$, $y_2 = 1$. Assume a linear regression model $y = \theta_0 + \theta_1 x + \varepsilon$ and learn the parameters $\widehat{\boldsymbol{\theta}}$ with maximum likelihood under the assumption $\varepsilon \sim \mathcal{N}(0, \sigma_\varepsilon^2)$. Use the model to predict the output for the test input $x_\star = 4$, and add the model to the plot below:

[Figure: empty plot with axis "input x" running from 0 to 6 and axis "output y" running from -1 to 3; the two recorded observations are marked "Data".]

(b) Now assume that you have made a third observation, $y_3 = 2$ for $x_3 = 4$ (is that what you predicted in (a)?). Update the parameters $\widehat{\boldsymbol{\theta}}$ using all 3 data samples, add the new model to the plot (together with the new data point), and find the prediction for $x_\star = 5$.

(c) Repeat (b), but this time using a model without an intercept term, i.e., $y = \theta_1 x + \varepsilon$.

(d) Repeat (b), but this time using ridge regression with $\gamma = 1$ instead.

(e) You realize that there are actually two output variables in the problem you are studying. In total, you have made the following observations:

sample   input x   first output y1   second output y2
  1         2            -1                  0
  2         3             1                  2
  3         4             2                 -1

You want to model this as a linear regression with multidimensional outputs (without regularization), i.e.,

$$y_1 = \theta_{01} + \theta_{11} x + \varepsilon, \tag{1.1}$$
$$y_2 = \theta_{02} + \theta_{12} x + \varepsilon. \tag{1.2}$$

By introducing, for the general case of $p$ inputs and $q$ outputs, the matrices

$$\underbrace{\begin{bmatrix} y_{11} & \cdots & y_{1q} \\ y_{21} & \cdots & y_{2q} \\ \vdots & & \vdots \\ y_{n1} & \cdots & y_{nq} \end{bmatrix}}_{\mathbf{Y}} = \underbrace{\begin{bmatrix} 1 & x_{11} & \cdots & x_{1p} \\ 1 & x_{21} & \cdots & x_{2p} \\ \vdots & & & \vdots \\ 1 & x_{n1} & \cdots & x_{np} \end{bmatrix}}_{\mathbf{X}} \underbrace{\begin{bmatrix} \theta_{01} & \theta_{02} & \cdots & \theta_{0q} \\ \theta_{11} & \theta_{12} & \cdots & \theta_{1q} \\ \vdots & & & \vdots \\ \theta_{p1} & \theta_{p2} & \cdots & \theta_{pq} \end{bmatrix}}_{\boldsymbol{\Theta}} + \mathbf{E}, \tag{1.3}$$

try to make an educated guess of how the normal equations can be generalized to the multidimensional-output case. (A more thorough derivation is found in problem 1.5.) Use your findings to compute the least-squares solution $\widehat{\boldsymbol{\Theta}}$ to the problem, now including both the first output $y_1$ and the second output $y_2$.
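A minimal numerical sketch of parts (a)-(d) in Python/NumPy, assuming the data values reconstructed above; it is illustrative, not part of the original question. Note also that conventions differ on whether ridge regression penalizes the intercept; here both parameters are penalized.

```python
import numpy as np

# Reconstructed observations (assumed): (x1, y1) = (2, -1), (x2, y2) = (3, 1), (x3, y3) = (4, 2).
x = np.array([2.0, 3.0, 4.0])
y = np.array([-1.0, 1.0, 2.0])

def design(x):
    """Design matrix with a leading column of ones for the intercept."""
    return np.column_stack([np.ones_like(x), x])

# (a) With Gaussian noise, maximum likelihood = least squares; two points give an exact fit.
X2 = design(x[:2])
theta_a = np.linalg.solve(X2.T @ X2, X2.T @ y[:2])        # [-5, 2]
print("(a)", theta_a, "y(4) =", theta_a @ [1, 4])         # y(4) = 3.0

# (b) Refit using all three observations.
X3 = design(x)
theta_b = np.linalg.solve(X3.T @ X3, X3.T @ y)            # [-23/6, 3/2]
print("(b)", theta_b, "y(5) =", theta_b @ [1, 5])         # y(5) = 11/3 ~ 3.67

# (c) Without intercept: theta1 = sum(x*y) / sum(x^2) = 9/29.
theta_c = (x @ y) / (x @ x)
print("(c)", theta_c, "y(5) =", 5 * theta_c)              # y(5) = 45/29 ~ 1.55

# (d) Ridge regression, gamma = 1: theta = (X'X + gamma*I)^{-1} X'y.
gamma = 1.0
theta_d = np.linalg.solve(X3.T @ X3 + gamma * np.eye(2), X3.T @ y)  # [-7/13, 6/13]
print("(d)", theta_d, "y(5) =", theta_d @ [1, 5])         # y(5) = 23/13 ~ 1.77
```

In particular, the two-point model in (a) predicts $y = 3$ at $x_\star = 4$, whereas the third observation turns out to be $y_3 = 2$.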
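For part (e), a natural guess is that the normal equations generalize column-wise to $\widehat{\boldsymbol{\Theta}} = (\mathbf{X}^\top \mathbf{X})^{-1} \mathbf{X}^\top \mathbf{Y}$, i.e., each output column of $\mathbf{Y}$ poses an independent least-squares problem sharing the same $\mathbf{X}$. A sketch under that assumption; the $y_2$ column is taken from the reconstructed table above.

```python
import numpy as np

# Design matrix with intercept column, and both outputs stacked as columns of Y.
# The y2 values are reconstructed from the garbled source table (an assumption).
X = np.array([[1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
Y = np.array([[-1.0,  0.0],
              [ 1.0,  2.0],
              [ 2.0, -1.0]])

# Multi-output normal equations: Theta = (X'X)^{-1} X'Y, solved for all columns in one call.
Theta = np.linalg.solve(X.T @ X, X.T @ Y)
print(Theta)   # [[-23/6, 11/6], [3/2, -1/2]]
```

As a consistency check, the first column of $\widehat{\boldsymbol{\Theta}}$ reproduces the single-output fit from (b), $(-23/6,\ 3/2)$, exactly as the column-wise reading of (1.3) predicts.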
