Question: a-c. Consider the accompanying computer results, which describe regression analyses involving two independent variables X1 and X2 and a dependent variable Y (using a different ...).
d. What does this example illustrate about using a test of the hypothesis H0: β2 = 0 to assess confounding?
Regression of Y on X1 and X2

CORRELATION
            X1        X2         Y
  X1    1.0000    0.0000    0.2649
  X2    0.0000    1.0000    0.9272
  Y     0.2649    0.9272    1.0000

ANALYSIS OF VARIANCE
  Source             DF   Sum of Squares   Mean Square   F Value   Pr > F
  Model               2        106.00000      53.00000     33.13   0.0013
  Error               5          8.00000       1.60000
  Corrected Total     7        114.00000

  [Portion of output omitted]

PARAMETER ESTIMATES
  Variable    DF   Parameter Estimate   Standard Error   t Value   Pr > |t|   Squared Partial Corr Type I   Squared Partial Corr Type II
  Intercept    1              5.00000          0.77460      6.45     0.0013
  X1           1              2.00000          0.89443      2.24     0.0756                       0.07018                        0.50000
  X2           1              7.00000          0.89443      7.83     0.0005                       0.92453                        0.92453

Regression of Y on X1

ANALYSIS OF VARIANCE
  Source             DF   Sum of Squares   Mean Square   F Value   Pr > F
  Model               1          8.00000       8.00000      0.45   0.5260
  Error               6        106.00000      17.66667
  Corrected Total     7        114.00000

  [Portion of output omitted]

PARAMETER ESTIMATES
  Variable    DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
  Intercept    1              8.50000          2.10159      4.04     0.0068
  X1           1              2.00000          2.97209      0.67     0.5260
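A minimal sketch of how this comparison could be reproduced in Python with statsmodels (not part of the original output, which appears to be SAS); the arrays y, x1, and x2 are placeholders for the raw data, which is not shown here:

```python
# Hypothetical sketch: compare the X1 coefficient with and without X2 in the model.
# y, x1, x2 are placeholder arrays; the actual data set is not given in the problem.
import numpy as np
import statsmodels.api as sm

def fit_ols(y, *predictors):
    """Ordinary least-squares fit of y on the given predictors, with an intercept."""
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(y, X).fit()

def confounding_check(y, x1, x2):
    full = fit_ols(y, x1, x2)    # regression of Y on X1 and X2
    reduced = fit_ols(y, x1)     # regression of Y on X1 only
    b1_full, b1_reduced = full.params[1], reduced.params[1]
    # Confounding is judged by how much the X1 coefficient changes when X2
    # is dropped, not by the significance test for X2.
    pct_change = 100 * abs(b1_full - b1_reduced) / abs(b1_full)
    print(f"beta1 with X2 in the model:    {b1_full:.5f}")
    print(f"beta1 without X2 in the model: {b1_reduced:.5f}")
    print(f"percent change in beta1:       {pct_change:.1f}%")
```

With the data behind the printed output, both fits would return 2.00000 for the X1 coefficient, so the percent change would be zero.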
Step by Step Solution
a. There is no confounding due to X2, because the estimated coefficient of X1 (β̂1 = 2.00000) does not change when X2 is removed from the model. b. ...
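As a quick numerical check against the printed output (a worked restatement, not part of the original solution):

\[
\hat{\beta}_1 = 2.00000 \text{ in both fits}, \qquad
t_{X_2} = \frac{7.00000}{0.89443} \approx 7.83 \;(p = 0.0005), \qquad
t_{X_1}^{\text{reduced}} = \frac{2.00000}{2.97209} \approx 0.67 \;(p = 0.5260).
\]

This contrast is what part d is probing: X2 is a highly significant predictor, yet the X1 coefficient is identical with or without X2 in the model, so a test of H0: β2 = 0 by itself says nothing about whether X2 confounds the X1-Y relationship.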
