Question: Maximum-likelihood estimation of the simple-regression model: Deriving the maximum-likelihood estimators of $\alpha$ and $\beta$ in simple regression is straightforward. Under the assumptions of the model, the $Y_i$ are independently and normally distributed random variables with expectations $\alpha + \beta x_i$ and common variance $\sigma_\varepsilon^2$. Show that if these assumptions hold, then the least-squares coefficients $A$ and $B$ are the maximum-likelihood estimators of $\alpha$ and $\beta$, and that $\hat{\sigma}_\varepsilon^2 = \sum E_i^2 / n$ is the maximum-likelihood estimator of $\sigma_\varepsilon^2$. Note that the MLE of the error variance is biased. (Hints: Because of the assumption of independence, the joint probability density for the $Y_i$ is the product of their marginal probability densities

$$p(y_i) = \frac{1}{\sqrt{2\pi\sigma_\varepsilon^2}} \exp\left[ -\frac{(y_i - \alpha - \beta x_i)^2}{2\sigma_\varepsilon^2} \right]$$

Find the log-likelihood function; take the partial derivatives of the log likelihood with respect to the parameters $\alpha$, $\beta$, and $\sigma_\varepsilon^2$; set these partial derivatives to 0; and solve for the maximum-likelihood estimators.) A more general result is proved in Section 9.3.3.
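Following the hints, a minimal sketch of the derivation (intermediate algebra omitted): the log likelihood is

$$\log_e L(\alpha, \beta, \sigma_\varepsilon^2) = \sum_{i=1}^{n} \log_e p(y_i) = -\frac{n}{2}\log_e 2\pi - \frac{n}{2}\log_e \sigma_\varepsilon^2 - \frac{1}{2\sigma_\varepsilon^2}\sum_{i=1}^{n}(y_i - \alpha - \beta x_i)^2$$

Setting its partial derivatives to 0 gives

$$\frac{\partial \log_e L}{\partial \alpha} = \frac{1}{\sigma_\varepsilon^2}\sum(y_i - \alpha - \beta x_i) = 0, \qquad \frac{\partial \log_e L}{\partial \beta} = \frac{1}{\sigma_\varepsilon^2}\sum x_i(y_i - \alpha - \beta x_i) = 0$$

$$\frac{\partial \log_e L}{\partial \sigma_\varepsilon^2} = -\frac{n}{2\sigma_\varepsilon^2} + \frac{1}{2\sigma_\varepsilon^4}\sum(y_i - \alpha - \beta x_i)^2 = 0$$

The first two equations are the least-squares normal equations, so the maximum-likelihood estimators of $\alpha$ and $\beta$ coincide with the least-squares coefficients $A$ and $B$; substituting these into the third equation yields $\hat{\sigma}_\varepsilon^2 = \sum E_i^2 / n$, which divides the residual sum of squares by $n$ rather than $n - 2$ and is therefore biased.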
