Question: 6.11 (Requires calculus) Consider the regression model $Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$ for $i = 1, \dots, n$. (Notice that there is no constant term in the regression.)
Following analysis like that used in Appendix 4.2:
a. Specify the least squares function that is minimized by OLS.
b. Compute the partial derivatives of the objective function with respect to $b_1$ and $b_2$.
c. Suppose that $\sum_{i=1}^{n} X_{1i} X_{2i} = 0$. Show that $\hat{\beta}_1 = \sum_{i=1}^{n} X_{1i} Y_i \big/ \sum_{i=1}^{n} X_{1i}^2$.
d. Suppose that $\sum_{i=1}^{n} X_{1i} X_{2i} \neq 0$. Derive an expression for $\hat{\beta}_1$ as a function of the data $(Y_i, X_{1i}, X_{2i})$, $i = 1, \dots, n$.
e. Suppose that the model includes an intercept: $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$. Show that the least squares estimators satisfy $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}_1 - \hat{\beta}_2 \bar{X}_2$.
f. As in (e), suppose that the model contains an intercept. Also suppose that $\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(X_{2i} - \bar{X}_2) = 0$. Show that $\hat{\beta}_1 = \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(Y_i - \bar{Y}) \big/ \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2$. How does this compare to the OLS estimator of $\beta_1$ from the regression that omits $X_2$?
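One way to sketch parts (a) through (c), assuming the usual sum-of-squared-residuals objective (the function name $S$ is just a label introduced here, not the textbook's notation):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% (a) Sum-of-squared-residuals objective for the no-intercept model
\begin{align*}
S(b_1, b_2) = \sum_{i=1}^{n} \bigl(Y_i - b_1 X_{1i} - b_2 X_{2i}\bigr)^2 .
\end{align*}

% (b) First-order conditions: set both partial derivatives to zero
\begin{align*}
\frac{\partial S}{\partial b_1} &= -2 \sum_{i=1}^{n} X_{1i}\bigl(Y_i - b_1 X_{1i} - b_2 X_{2i}\bigr) = 0 , \\
\frac{\partial S}{\partial b_2} &= -2 \sum_{i=1}^{n} X_{2i}\bigl(Y_i - b_1 X_{1i} - b_2 X_{2i}\bigr) = 0 .
\end{align*}

% (c) When \sum_i X_{1i} X_{2i} = 0, the cross term drops out of the first
%     condition, which then solves directly for the estimator
\begin{align*}
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} X_{1i} Y_i}{\sum_{i=1}^{n} X_{1i}^{2}} .
\end{align*}

\end{document}
```

Part (d) proceeds from the same two first-order conditions, solved simultaneously when the cross-product sum is nonzero.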
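For the intercept case in parts (e) and (f), a similar sketch under the same assumptions, now minimizing over $(b_0, b_1, b_2)$:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% (e) With an intercept, the first-order condition in b_0 is
%     -2 \sum_i (Y_i - b_0 - b_1 X_{1i} - b_2 X_{2i}) = 0, which rearranges to
\begin{align*}
\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}_1 - \hat{\beta}_2 \bar{X}_2 .
\end{align*}

% (f) Substituting this \hat{\beta}_0 back in puts the problem in deviations
%     from means; when \sum_i (X_{1i}-\bar{X}_1)(X_{2i}-\bar{X}_2) = 0, the
%     condition in b_1 reduces to
\begin{align*}
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(Y_i - \bar{Y})}
                     {\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^{2}} ,
\end{align*}
% which is the same formula as the OLS slope from a regression of Y on X_1 alone,
% i.e. the regression that omits X_2.

\end{document}
```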
