Econometric Analysis, 7th Edition, by William H. Greene - Solutions
Prove that the least squares intercept estimator in the classical regression model is the minimum variance linear unbiased estimator.
As a profit-maximizing monopolist, you face the demand curve Q = α + βP + ε. In the past, you have set the following prices and sold the accompanying quantities: Suppose that your marginal cost is 10. Based on the least squares regression, compute a 95 percent confidence interval for the expected
Suppose that the regression model is yi = α + βxi + εi, where the disturbances εi have f(εi) = (1/λ) exp(−εi/λ), εi ≥ 0. This model is rather peculiar in that all the disturbances are assumed to be nonnegative. The disturbances have E[εi | xi] = λ and Var[εi | xi] = λ². Show that
Suppose that the classical regression model applies but that the true value of the constant is zero. Compare the variance of the least squares slope estimator computed without a constant term with that of the estimator computed with an unnecessary constant term.
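For the exercise above, a worked comparison under the stated assumptions (true constant equal to zero, homoskedastic disturbances with variance σ²) can be sketched as follows:

```latex
% Slope estimator computed without a constant term:
\operatorname{Var}[b_{\text{no const}}] = \frac{\sigma^2}{\sum_{i=1}^{n} x_i^2}
% Slope estimator computed with an (unnecessary) constant term:
\operatorname{Var}[b_{\text{const}}] = \frac{\sigma^2}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
% Since \sum_i x_i^2 = \sum_i (x_i - \bar{x})^2 + n\bar{x}^2 \ge \sum_i (x_i - \bar{x})^2,
% the estimator without the constant has (weakly) smaller variance.
```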
Consider the simple regression yi = βxi + εi where E[ε | x] = 0 and E[ε² | x] = σ². a. What is the minimum mean squared error linear estimator of β? b. For the estimator in part a, show that the ratio of the mean squared error of β to that of the ordinary least squares estimator b is …, where τ is the
Suppose that you have two independent unbiased estimators of the same parameter θ, say θ̂1 and θ̂2, with different variances v1 and v2. What linear combination θ̂ = c1θ̂1 + c2θ̂2 is the minimum variance unbiased estimator of θ?
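A sketch of the standard argument: unbiasedness forces the weights to sum to one, and independence means the variance of the combination has no covariance term, so minimizing it gives inverse-variance weights.

```latex
\hat{\theta} = c_1\hat{\theta}_1 + c_2\hat{\theta}_2,\qquad c_1 + c_2 = 1 \ \text{(unbiasedness)}
\operatorname{Var}[\hat{\theta}] = c_1^2 v_1 + (1 - c_1)^2 v_2
\frac{d}{dc_1}\operatorname{Var}[\hat{\theta}] = 2c_1 v_1 - 2(1 - c_1)v_2 = 0
\;\Rightarrow\; c_1 = \frac{v_2}{v_1 + v_2},\qquad c_2 = \frac{v_1}{v_1 + v_2}
```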
In the December 1969 American Economic Review (pp. 886–896), Nathaniel Leff reports the following least squares regression results for a cross-section study of the effect of age composition on savings in 74 countries in 1964: ln(S/Y) = 7.3439 + 0.1596 ln(Y/N) + 0.0254 ln G − 1.3520 ln D1
Using the matrices of sums of squares and cross products immediately preceding Section 3.2.3, compute the coefficients in the multiple regression of real investment on a constant, real GNP, and the interest rate. Compute R².
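The book's actual sums-of-squares matrices are not reproduced in this preview; the sketch below only shows the computation being asked for, b = (X'X)⁻¹X'y and the corresponding R², on placeholder data. The variable names and numbers are illustrative assumptions, not the text's figures.

```python
import numpy as np

# Placeholder data standing in for real investment (y), real GNP, and the
# interest rate; the actual figures from Greene's table are not reproduced here.
gnp = np.array([2.5, 2.7, 2.9, 3.1, 3.4, 3.6])
interest = np.array([5.2, 5.8, 6.1, 6.0, 5.5, 5.9])
invest = np.array([0.30, 0.33, 0.36, 0.38, 0.41, 0.44])

X = np.column_stack([np.ones_like(gnp), gnp, interest])  # constant, GNP, rate
y = invest

b = np.linalg.solve(X.T @ X, X.T @ y)      # b = (X'X)^{-1} X'y
e = y - X @ b                              # least squares residuals
r2 = 1 - (e @ e) / np.sum((y - y.mean()) ** 2)
print("coefficients:", b, "R^2:", r2)
```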
Three variables, N, D, and Y, all have zero means and unit variances. A fourth variable is C = N + D. In the regression of C on Y, the slope is 0.8. In the regression of C on N, the slope is 0.5. In the regression of D on Y, the slope is 0.4. What is the sum of squared residuals in the regression of C
Suppose that you estimate a multiple regression first with, then without, a constant. Whether the R² is higher in the second case than the first will depend in part on how it is computed. Using the (relatively) standard method R² = 1 − e'e/y'M⁰y, which regression will have a higher R²?
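A quick numerical illustration of the point (synthetic data; names and numbers are assumptions): with R² computed as 1 − e'e/y'M⁰y, the regression that includes the constant can never have the larger residual sum of squares, so its R² is at least as high, and the no-constant fit can even produce a negative value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.5 + 0.8 * x + rng.normal(size=n)     # nonzero intercept in the data

def resid_ss(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    return e @ e

tss = np.sum((y - y.mean()) ** 2)          # y'M0y
ee_with = resid_ss(np.column_stack([np.ones(n), x]), y)
ee_without = resid_ss(x.reshape(-1, 1), y)

print("R^2 with constant   :", 1 - ee_with / tss)
print("R^2 without constant:", 1 - ee_without / tss)   # smaller, possibly negative
```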
A data set consists of n observations on Xn and yn. The least squares estimator based on these n observations is bn = (X'nXn)⁻¹X'nyn. Another observation, xs and ys, becomes available. Prove that the least squares estimator computed using this additional observation is … The last term is es, the
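The formula referred to (omitted in this preview) is the standard rank-one updating result, one form of which is b = bn + (X'X after adding the observation)⁻¹ xs es with es = ys − x's bn; equivalent forms follow from the Sherman–Morrison identity. A small numerical check on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 30, 3
beta = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, k))
y = X @ beta + rng.normal(size=n)
xs = rng.normal(size=k)                    # the additional observation
ys = xs @ beta + rng.normal()

b_n = np.linalg.solve(X.T @ X, X.T @ y)    # estimator from the first n observations
es = ys - xs @ b_n                         # residual of the new observation at b_n

XtX_new = X.T @ X + np.outer(xs, xs)       # X'X after adding (xs, ys)
b_update = b_n + np.linalg.solve(XtX_new, xs) * es

X_full = np.vstack([X, xs])
y_full = np.append(y, ys)
b_full = np.linalg.solve(X_full.T @ X_full, X_full.T @ y_full)

print(np.allclose(b_update, b_full))       # True: the update reproduces the full refit
```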
Prove that the adjusted R2 in (3-30) rises (falls) when variable xk is deleted from the regression if the square of the t ratio on xk in the multiple regression is less (greater) than 1.
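For reference, the degrees-of-freedom-adjusted R² (presumably the definition in (3-30)), in the usual notation with n observations and K regressors including the constant, is:

```latex
\bar{R}^2 = 1 - \frac{n-1}{n-K}\,(1 - R^2)
          = 1 - \frac{e'e/(n-K)}{y'M^0 y/(n-1)}
```

The stated result then follows by comparing e'e/(n − K) across the regressions with and without xk.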
Let Y denote total expenditure on consumer durables, nondurables, and services, and let Ed, En, and Es be the expenditures on the three categories. As defined, Y = Ed + En + Es. Now, consider the expenditure system Ed = αd + βdY + γddPd + γdnPn + γdsPs + εd, En = αn + βnY + γndPd + γnnPn +
A common strategy for handling a case in which an observation is missing data for one or more variables is to fill those missing variables with 0s and add a variable to the model that takes the value 1 for that one observation and 0 for all other observations. Show that this “strategy” is
What is the result of the matrix product M1M where M1 is defined in (3-19) and M is defined in (3-14)?
In the least squares regression of y on a constant and X, to compute the regression coefficients on X, we can first transform y to deviations from the mean y̅ and, likewise, transform each column of X to deviations from the respective column mean; second, regress the transformed y on the
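A numerical illustration of the equivalence described above (synthetic data; variable names are illustrative): regressing the demeaned y on the demeaned columns of X reproduces the slope coefficients from the regression that includes the constant.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 40, 2
X = rng.normal(size=(n, k))
y = 3.0 + X @ np.array([1.2, -0.7]) + rng.normal(size=n)

# Regression of y on a constant and X
Xc = np.column_stack([np.ones(n), X])
b_full = np.linalg.lstsq(Xc, y, rcond=None)[0]

# Regression of demeaned y on demeaned X (no constant)
Xd = X - X.mean(axis=0)
yd = y - y.mean()
b_dev = np.linalg.lstsq(Xd, yd, rcond=None)[0]

print(np.allclose(b_full[1:], b_dev))      # True: the slope coefficients match
```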
Suppose that b is the least squares coefficient vector in the regression of y on X and that c is any other K × 1 vector. Prove that the difference in the two sums of squared residuals is (y − Xc)'(y − Xc) − (y − Xb)'(y − Xb) = (c − b)'X'X(c − b). Prove that this difference is
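A sketch of the argument: write y − Xc = (y − Xb) + X(b − c) and use the orthogonality condition X'(y − Xb) = 0 from the normal equations.

```latex
(y - Xc)'(y - Xc)
  = (y - Xb)'(y - Xb) + 2(b - c)'X'(y - Xb) + (b - c)'X'X(b - c)
  = (y - Xb)'(y - Xb) + (c - b)'X'X(c - b)
```

The difference is a quadratic form in X'X, which is positive definite when X has full column rank, so it is nonnegative and zero only when c = b.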
For the regression model y = α + βx + ε, a. Show that the least squares normal equations imply Σi ei = 0 and Σi xi ei = 0. b. Show that the solution for the constant term is a = y̅ − bx̅. c. Show that the solution for b is b = [Σi (xi − x̅)(yi − y̅)]/[Σi (xi − x̅)²]. d. Prove
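A sketch of parts a–c under the usual least squares setup (minimize the sum of squared residuals over a and b, with ei = yi − a − bxi):

```latex
\frac{\partial}{\partial a}\sum_i (y_i - a - b x_i)^2 = -2\sum_i e_i = 0
  \;\Rightarrow\; \textstyle\sum_i e_i = 0
\frac{\partial}{\partial b}\sum_i (y_i - a - b x_i)^2 = -2\sum_i x_i e_i = 0
  \;\Rightarrow\; \textstyle\sum_i x_i e_i = 0
\sum_i e_i = 0 \;\Rightarrow\; a = \bar{y} - b\bar{x}
\sum_i x_i\bigl(y_i - \bar{y} - b(x_i - \bar{x})\bigr) = 0
  \;\Rightarrow\; b = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}
```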