Probability and Statistics for Economists, 1st Edition, Bruce Hansen - Solutions
Take the function $f(x, y) = -x^3 y + \frac{1}{2} y^2 x^2 + x^4 - 2x$. You want to find the joint minimizer $(x_0, y_0)$ over $x \ge 0$, $y \ge 0$. (a) Try nested minimization. Given $x$, find the minimizer of $f(x, y)$ over $y$. Write this solution as $y(x)$. (b) Substitute $y(x)$ into $f(x, y)$. Find the minimizer $x_0$. (c) Find $y_0$.
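The nested-minimization recipe in this exercise can be sketched numerically. The following is my own illustration (not the textbook's solution), using a plain golden-section search as the one-dimensional minimizer for both the inner and outer steps:

```python
# A minimal numeric sketch of nested minimization. The inner step minimizes
# f(x, y) over y for fixed x; the outer step minimizes the profile over x.

def f(x, y):
    return -x**3 * y + 0.5 * y**2 * x**2 + x**4 - 2 * x

def golden_section(g, a, b, tol=1e-8):
    """Minimize a unimodal function g on [a, b] by golden-section search."""
    r = (5 ** 0.5 - 1) / 2          # inverse golden ratio ~ 0.618
    c, d = b - r * (b - a), a + r * (b - a)
    while b - a > tol:
        if g(c) < g(d):
            b, d = d, c
            c = b - r * (b - a)
        else:
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

def profile(x):
    # inner minimization: y(x) = argmin_y f(x, y)
    return golden_section(lambda y: f(x, y), 0.0, 10.0)

x0 = golden_section(lambda x: f(x, profile(x)), 1e-6, 10.0)
y0 = profile(x0)
print(x0, y0)
```

The search brackets $[0, 10]$ are my own arbitrary choice; the analytic solution of the inner step is $y(x) = x$, so the profile is $\frac{1}{2}x^4 - 2x$.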
A parameter p lies in the interval [0, 1]. If you use Golden Section search to find the minimum of the log-likelihood function with 0.01 accuracy, how many search iterations are required?
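A quick sketch of the iteration count asked for here, under one common convention (each golden-section iteration shrinks the bracket by the inverse golden ratio; conventions differ slightly on how the initial evaluations are counted):

```python
# Number of golden-section iterations to localize a minimizer on [0, 1]
# to within 0.01. Each iteration shrinks the bracket by the factor
# r = (sqrt(5) - 1) / 2 ~ 0.618, so we need the smallest n with r**n <= 0.01.
import math

r = (math.sqrt(5) - 1) / 2
width, target = 1.0, 0.01
n = math.ceil(math.log(target / width) / math.log(r))
print(n)
```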
Take the equation $f(x) = x - 2x^2 + \frac{1}{4} x^4$. Consider the problem of finding the minimum over $x \ge 1$. (a) For what values of $x$ is $f(x)$ convex? (b) Find the Newton iteration rule $x_i \to x_{i+1}$. (c) Using the starting value $x_1 = 1$ calculate the Newton iteration $x_2$. (d) Consider Golden Section search.
Take the equation $f(x) = x^2 + x^3 - 1$. Consider the problem of finding the root in $[0, 1]$. (a) Start with the Newton method. Find the derivative $f'(x)$ and the iteration rule $x_i \to x_{i+1}$. (b) Starting with $x_1 = 1$, apply the Newton iteration to find $x_2$. (c) Make a second Newton step to find $x_3$. (d) Now
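The Newton steps in parts (a)-(c) can be checked with a short script (my own illustration, not the book's worked answer):

```python
# Newton root-finding iteration for f(x) = x^2 + x^3 - 1,
# x_{i+1} = x_i - f(x_i) / f'(x_i), starting from x_1 = 1 as in the exercise.

def f(x):
    return x**2 + x**3 - 1

def fprime(x):
    return 2 * x + 3 * x**2

x = 1.0                       # x_1
iterates = [x]
for _ in range(4):
    x = x - f(x) / fprime(x)  # Newton step
    iterates.append(x)

x2, x3 = iterates[1], iterates[2]
print(x2, x3)
```

The first step gives $x_2 = 1 - 1/5 = 0.8$; the second gives $x_3 = 0.8 - 0.152/3.52$.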
Prove Theorem 11.5.
Use Theorem 11.9 to find the asymptotic distribution of the sample median. (a) Find the asymptotic distribution for general density $f(x)$. (b) Find the asymptotic distribution when $f(x) \sim N(\mu, \sigma^2)$. (c) Find the asymptotic distribution when $f(x)$ is double exponential. Write your answer in terms of
You believe $X$ is distributed Pareto$(\alpha, 1)$. You want to estimate $\alpha$. (a) Use the Pareto model to calculate $P[X \le x]$. (b) You are given the EDF estimate $F_n(5) = 0.9$ from a sample of size $n = 100$. (That is, $F_n(x) = 0.9$ for $x = 5$.) Calculate a standard error for this estimate. (c) Use the above
You believe $X$ is distributed exponentially and want to estimate $\theta = P[X > 5]$. You have the estimate $\bar{X}_n = 4.3$ with standard error $s(\bar{X}_n) = 1$. Find an estimate of $\theta$ and a standard error of estimation.
You want to estimate the percentage of a population which has a wage below $15 an hour. You consider three approaches. (a) Estimate a normal distribution by MLE. Use the percentage of the fitted normal distribution below $15. (b) Estimate a log-normal distribution by MLE. Use the percentage of the
You travel to the planet Estimation as part of an economics expedition. Families are organized as pairs of individuals identified as alphas and betas. You are interested in their wage distributions. Let $X_a, X_b$ denote the wages of a family pair. Let $\mu_a$, $\mu_b$, $\sigma_a^2$, $\sigma_b^2$ and $\sigma_{ab}$ denote the means,
Propose a moment estimator $(\hat{p}, \hat{r})$ for the parameters $(p, r)$ of the Negative Binomial distribution (Section 3.7).
Propose a moment estimator $\hat{\lambda}$ for the parameter $\lambda$ of a Poisson distribution (Section 3.6).
A Bernoulli random variable $X$ satisfies $P[X = 0] = 1 - p$ and $P[X = 1] = p$. (a) Propose a moment estimator $\hat{p}$ for $p$. (b) Find the variance of the asymptotic distribution of $\sqrt{n}(\hat{p} - p)$. (c) Use the knowledge of the Bernoulli distribution to simplify the asymptotic variance. (d) Propose an estimator of the
The skewness of a random variable is $\mathrm{skew} = \mu_3 / \sigma^3$ where $\mu_3$ is the third central moment. (a) Propose a plug-in moment estimator $\widehat{\mathrm{skew}}$ for skew. (b) Find the variance of the asymptotic distribution of $\sqrt{n}\,(\widehat{\mathrm{skew}} - \mathrm{skew})$. (c) Propose an estimator of the asymptotic variance of $\widehat{\mathrm{skew}}$.
The coefficient of variation of $X$ is $\mathrm{cv} = 100 \times \sigma / \mu$ where $\sigma^2 = \mathrm{var}[X]$ and $\mu = E[X]$. (a) Propose a plug-in moment estimator $\widehat{\mathrm{cv}}$ for cv. (b) Find the variance of the asymptotic distribution of $\sqrt{n}\,(\widehat{\mathrm{cv}} - \mathrm{cv})$. (c) Propose an estimator of the asymptotic variance of $\widehat{\mathrm{cv}}$.
Let $g(x)$ be a density function of a random variable with mean $\mu$ and variance $\sigma^2$. Let $X$ be a random variable with density function $f(x \mid \theta) = g(x)\left(1 + \theta(x - \mu)\right)$. Assume $g(x)$, $\mu$ and $\sigma^2$ are known. The unknown parameter is $\theta$. Assume that $X$ has bounded support so that $f(x \mid \theta) \ge 0$ for
Take the beta density function. Assume $\beta$ is known and equals $\beta = 1$. (a) Find $f(x \mid \alpha) = f(x \mid \alpha, 1)$. (b) Find the log-likelihood function $\ell_n(\alpha)$ for $\alpha$ from a random sample $\{X_1, \ldots, X_n\}$. (c) Find the MLE $\hat{\alpha}$.
Take the gamma density function. Assume $\alpha$ is known. (a) Find the MLE $\hat{\beta}$ for $\beta$ based on a random sample $\{X_1, \ldots, X_n\}$. (b) Find the asymptotic distribution for $\sqrt{n}(\hat{\beta} - \beta)$.
Suppose $X$ has density $f(x)$ and $Y = \mu + \sigma X$. You have a random sample $\{Y_1, \ldots, Y_n\}$ from the distribution of $Y$. (a) Find an expression for the density $f_Y(y)$ of $Y$. (b) Suppose $f(x) = C \exp(-a(x))$ for some known differentiable function $a(x)$ and some $C$. Find $C$. (Since $a(x)$ is not specified you
Take the Bernoulli model. (a) Find the asymptotic variance of the MLE. (Hint: see Exercise 10.8.) (b) Propose an estimator of the asymptotic variance $V$. (c) Show that this estimator is consistent for $V$ as $n \to \infty$. (d) Propose a standard error $s(\hat{p})$ for the MLE $\hat{p}$. (Recall that the standard error is
Take the Gamma model $f(x \mid \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x}$, $x > 0$. Assume $\beta$ is known, so the only parameter to estimate is $\alpha$. Let $g(\alpha) = \log \Gamma(\alpha)$. Write your answers in terms of the derivatives $g'(\alpha)$ and $g''(\alpha)$. (You need not get the closed form solution for these derivatives.) (a)
Take the model $f(x) = \theta \exp(-\theta x)$, $x \ge 0$, $\theta > 0$. (a) Find the Cramér-Rao lower bound for $\theta$. (b) Recall the MLE $\hat{\theta}$ for $\theta$ from above. Notice that this is a function of the sample mean. Use this formula and the delta method to find the asymptotic distribution for $\hat{\theta}$. (c) Find the asymptotic
Take the Pareto model. Recall the MLE $\hat{\alpha}$ for $\alpha$ from Exercise 10.3. Show that $\hat{\alpha} \to_p \alpha$ by using the WLLN and continuous mapping theorem.
Find the Cramér-Rao lower bound for $p$ in the Bernoulli model (use the results from Exercise 10.6). In Section 10.3 we derived that the MLE for $p$ is $\hat{p} = \bar{X}_n$. Compute $\mathrm{var}[\hat{p}]$. Compare $\mathrm{var}[\hat{p}]$ with the CRLB.
Take the Pareto model $f(x) = \alpha x^{-1-\alpha}$, $x \ge 1$. Calculate the information for $\alpha$ using the second derivative.
Let $X$ be Bernoulli, $\pi(X \mid p) = p^X (1-p)^{1-X}$. (a) Calculate the information for $p$ by taking the variance of the score. (b) Calculate the information for $p$ by taking the expectation of (minus) the second derivative. Did you obtain the same answer?
Let $X$ be distributed double exponential with density $f(x) = \frac{1}{2}\exp(-|x - \theta|)$ for $x \in \mathbb{R}$. (a) Find the log-likelihood function $\ell_n(\theta)$. (b) Extra challenge: Find the MLE $\hat{\theta}$ for $\theta$. This is challenging as it is not simply solving the FOC due to the nondifferentiability of the density function.
Let $X$ be distributed Cauchy with density $f(x) = \frac{1}{\pi(1 + (x - \theta)^2)}$ for $x \in \mathbb{R}$. (a) Find the log-likelihood function $\ell_n(\theta)$. (b) Find the first-order condition for the MLE $\hat{\theta}$ for $\theta$. You will not be able to solve for $\hat{\theta}$.
Let $X$ be distributed Pareto with density $f(x) = \frac{\alpha}{x^{1+\alpha}}$ for $x \ge 1$ and $\alpha > 0$. (a) Find the log-likelihood function $\ell_n(\alpha)$. (b) Find the MLE $\hat{\alpha}$ for $\alpha$.
Let $X$ be distributed $N(\mu, \sigma^2)$. The unknown parameters are $\mu$ and $\sigma^2$. (a) Find the log-likelihood function $\ell_n(\mu, \sigma^2)$. (b) Take the first-order condition with respect to $\mu$ and show that the solution for $\hat{\mu}$ does not depend on $\sigma^2$. (c) Define the concentrated log-likelihood function $\ell_n(\hat{\mu}, \sigma^2)$.
Let $X$ be distributed Poisson: $\pi(k) = \frac{\exp(-\theta)\theta^k}{k!}$ for nonnegative integer $k$ and $\theta > 0$. (a) Find the log-likelihood function $\ell_n(\theta)$. (b) Find the MLE $\hat{\theta}$ for $\theta$.
Let $X \sim \mathrm{exponential}(1)$ and $M_n = \max_{i \le n} X_i$. Derive the asymptotic distribution of $Z_n = M_n - \log n$ using similar steps as in Exercise 8.10.
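The limit in this exercise can be previewed numerically. The sketch below (my own illustration) uses the exact finite-$n$ CDF $P[M_n - \log n \le x] = (1 - e^{-x}/n)^n$ and compares it with the Gumbel CDF $\exp(-e^{-x})$, which is the limit the derivation should produce:

```python
# For X ~ exponential(1):
#   P[M_n - log n <= x] = (1 - exp(-(x + log n)))^n = (1 - exp(-x)/n)^n
# which converges to the Gumbel CDF exp(-exp(-x)).
import math

def cdf_Zn(x, n):
    return (1 - math.exp(-x) / n) ** n   # exact CDF of M_n - log n

def gumbel(x):
    return math.exp(-math.exp(-x))       # limiting CDF

# maximum discrepancy over a grid of x values in [-2, 5] at n = 10,000
gap = max(abs(cdf_Zn(x / 10, 10_000) - gumbel(x / 10)) for x in range(-20, 51))
print(gap)
```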
Let $X \sim U[0, b]$ and $M_n = \max_{i \le n} X_i$. Derive the asymptotic distribution using the following steps. (a) Calculate the distribution $F(x)$ of $U[0, b]$. (b) Show $Z_n = n(M_n - b) = n\left(\max_{1 \le i \le n} X_i - b\right) = \max_{1 \le i \le n} n(X_i - b)$. (c) Show that the CDF of $Z_n$ is $G_n(x) = P[Z_n \le x] = \left(F\left(b + \frac{x}{n}\right)\right)^n$.
Suppose $\sqrt{n}(\hat{\theta} - \theta) \to_d N(0, v^2)$ and set $\beta = \theta^2$ and $\hat{\beta} = \hat{\theta}^2$. (a) Use the Delta Method to obtain an asymptotic distribution for $\sqrt{n}(\hat{\beta} - \beta)$. (b) Now suppose $\theta = 0$. Describe what happens to the asymptotic distribution from the previous part. (c) Improve on the previous answer. Under the
Assume $\sqrt{n}\,(\hat{\theta}_1 - \theta_1,\ \hat{\theta}_2 - \theta_2)' \to_d N(0, \Sigma)$. Use the Delta Method to find the asymptotic distribution of the following statistics: (a) $\hat{\theta}_1 \hat{\theta}_2$. (b) $\exp(\hat{\theta}_1 + \hat{\theta}_2)$. (c) If $\theta_2 \ne 0$, $\hat{\theta}_1 / \hat{\theta}_2^2$. (d) $\hat{\theta}_1^3 + \hat{\theta}_1 \hat{\theta}_2^2$.
Assume $\sqrt{n}(\hat{\theta} - \theta) \to_d N(0, v^2)$. Use the Delta Method to find the asymptotic distribution of the following statistics: (a) $\hat{\theta}^2$. (b) $\hat{\theta}^4$. (c) $\hat{\theta}^k$. (d) $\hat{\theta}^2 + \hat{\theta}^3$. (e) $\frac{1}{1 + \hat{\theta}^2}$. (f) $\frac{1}{1 + \exp(-\hat{\theta})}$.
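For intuition, the Delta Method prediction for case (a) can be checked by simulation. This is my own Monte Carlo sketch with arbitrary parameter values, not part of the exercise:

```python
# Monte Carlo check of the Delta Method for g(theta) = theta^2:
# if sqrt(n)(theta_hat - theta) ->d N(0, v^2), then
# sqrt(n)(theta_hat^2 - theta^2) ->d N(0, 4 * theta^2 * v^2).
import math
import random

random.seed(1)
theta, v, n, reps = 2.0, 1.5, 100, 4_000

draws = []
for _ in range(reps):
    theta_hat = sum(random.gauss(theta, v) for _ in range(n)) / n
    draws.append(math.sqrt(n) * (theta_hat ** 2 - theta ** 2))

mean_draw = sum(draws) / reps
mc_var = sum((z - mean_draw) ** 2 for z in draws) / reps
asy_var = 4 * theta ** 2 * v ** 2    # Delta Method prediction
print(mc_var, asy_var)
```

The simulated variance should land close to the predicted $4\theta^2 v^2 = 36$, up to Monte Carlo noise and a small finite-$n$ correction.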
Let $m_k = \left(E|X|^k\right)^{1/k}$ for some integer $k \ge 1$. (a) Write down an estimator $\hat{m}_k$ of $m_k$. (b) Find the asymptotic distribution of $\sqrt{n}(\hat{m}_k - m_k)$ as $n \to \infty$.
Let $X$ have an exponential distribution with $\lambda = 1$. (a) Find the first four cumulants of the distribution of $X$. (b) Use the expressions in Section 8.4 to calculate the third and fourth moments of $Z_n = \sqrt{n}(\bar{X}_n - \mu)$ for $n = 10$, $n = 100$, $n = 1000$. (c) How large does $n$ need to be in order for the third
Let $\mu'_k = E[X^k]$ for some integer $k \ge 1$. Assume $E[X^{2k}] < \infty$. (a) Write down the moment estimator $\hat{\mu}'_k$ of $\mu'_k$. (b) Find the asymptotic distribution of $\sqrt{n}(\hat{\mu}'_k - \mu'_k)$ as $n \to \infty$.
Find the moment estimator $\hat{\mu}'_3$ of $\mu'_3 = E[X^3]$ and show that $\sqrt{n}(\hat{\mu}'_3 - \mu'_3) \to_d N(0, v^2)$ for some $v^2$. Write $v^2$ as a function of the moments of $X$.
Find the moment estimator $\hat{\mu}'_2$ of $\mu'_2 = E[X^2]$. Show that $\sqrt{n}(\hat{\mu}'_2 - \mu'_2) \to_d N(0, v^2)$ for some $v^2$. Write $v^2$ as a function of the moments of $X$.
Let $X$ be distributed Bernoulli with $P[X = 1] = p$ and $P[X = 0] = 1 - p$. (a) Show that $p = E[X]$. (b) Write down the moment estimator $\hat{p}$ of $p$. (c) Find $\mathrm{var}[\hat{p}]$. (d) Find the asymptotic distribution of $\sqrt{n}(\hat{p} - p)$ as $n \to \infty$.
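Part (c) has the well-known answer $\mathrm{var}[\hat{p}] = p(1-p)/n$, which can be verified exactly by enumeration (my own illustration, with arbitrary values of $p$ and $n$):

```python
# Exact check that the moment estimator p_hat = X_bar_n for Bernoulli data
# has var[p_hat] = p(1 - p)/n. Since n * p_hat ~ Binomial(n, p), we can
# enumerate the pmf exactly rather than simulate.
from math import comb

def var_p_hat(p, n):
    pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
    mean = sum(pk * k / n for k, pk in enumerate(pmf))
    return sum(pk * (k / n - mean) ** 2 for k, pk in enumerate(pmf))

p, n = 0.3, 25
print(var_p_hat(p, n), p * (1 - p) / n)
```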
What does the WLLN imply about sample statistics (sample means and variances) calculated on very large samples, such as an administrative database or a census?
Suppose $Z_n \to_p c$ as $n \to \infty$. Show that $Z_n^2 \to_p c^2$ as $n \to \infty$ using the definition of convergence in probability but not appealing to the CMT.
Take a random variable $Z$ such that $E[Z] = 0$ and $\mathrm{var}[Z] = 1$. Use Chebyshev's inequality (Theorem 7.1) to find a $\delta$ such that $P[|Z| > \delta] \le 0.05$. Contrast this with the exact $\delta$ which solves $P[|Z| > \delta] = 0.05$ when $Z \sim N(0, 1)$. Comment on the difference.
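The two values of $\delta$ being contrasted here can be computed directly (my own sketch of the comparison, not the book's solution):

```python
# Chebyshev: P[|Z| > d] <= var[Z]/d^2 = 1/d^2, so d = sqrt(1/0.05) suffices.
# Exact: the two-sided normal quantile solves P[|Z| > d] = 0.05 for Z ~ N(0,1).
from math import sqrt
from statistics import NormalDist

d_chebyshev = sqrt(1 / 0.05)                  # ~ 4.47, distribution-free bound
d_exact = NormalDist().inv_cdf(1 - 0.05 / 2)  # ~ 1.96 for the normal case
print(d_chebyshev, d_exact)
```

The large gap illustrates how conservative Chebyshev's inequality is relative to the exact normal tail.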
Take a random sample $\{X_1, \ldots, X_n\}$ where $X > 0$ and $E|\log X| < \infty$. Consider the sample geometric mean $\hat{\mu} = \left(\prod_{i=1}^n X_i\right)^{1/n}$ and population geometric mean $\mu = \exp\left(E[\log X]\right)$. Show that $\hat{\mu} \to_p \mu$ as $n \to \infty$.
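The consistency claim can be previewed by simulation. This is my own illustration with a log-normal choice of $X$, for which the population geometric mean is known:

```python
# With X = exp(Z), Z ~ N(0,1), the population geometric mean is
# mu = exp(E[log X]) = exp(0) = 1. The sample geometric mean is
# exp of the sample mean of the logs, which the WLLN drives to exp(0).
import math
import random

random.seed(7)
n = 200_000
log_xs = [random.gauss(0, 1) for _ in range(n)]   # log X_i
mu_hat = math.exp(sum(log_xs) / n)                # (prod X_i)^(1/n)
print(mu_hat)
```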
Find the probability limit for the standard error $s(\bar{X}_n) = s/\sqrt{n}$ for the sample mean.
Show that the bias-corrected variance estimator $s^2$ is consistent for the population variance $\sigma^2$.
Take a random sample $\{X_1, \ldots, X_n\}$ and randomly split the sample into two equal sub-samples 1 and 2. (For simplicity assume $n$ is even.) Let $\bar{X}_{1n}$ and $\bar{X}_{2n}$ be the sub-sample averages. Are $\bar{X}_{1n}$ and $\bar{X}_{2n}$ consistent for the mean $\mu = E[X]$? (Show this.)
A weighted sample mean takes the form $\bar{X}^*_n = \frac{1}{n}\sum_{i=1}^n w_i X_i$ for some non-negative constants $w_i$ satisfying $\frac{1}{n}\sum_{i=1}^n w_i = 1$. Assume $X_i$ is i.i.d. (a) Show that $\bar{X}^*_n$ is unbiased for $\mu = E[X]$. (b) Calculate $\mathrm{var}[\bar{X}^*_n]$. (c) Show that a sufficient condition for $\bar{X}^*_n \to_p \mu$ is that $n^{-2}\sum_{i=1}^n w_i^2 \to 0$. (d)
Take a random sample $\{X_1, \ldots, X_n\}$. Which of the following statistics converge in probability by the weak law of large numbers and continuous mapping theorem? For each, which moments are needed to exist? (c) $\max_{i \le n} X_i$. (e) … assuming $E[X] > 0$. (f) $1\{X > 0\}$. (g) $\frac{1}{n}\sum_{i=1}^n X_i Y_i$.
Find the probability limits (if they exist) of the following sequences of random variables: (a) $Z_n \sim U[0, 1/n]$. (b) $Z_n \sim \mathrm{Bernoulli}(p_n)$ with $p_n = 1 - 1/n$. (c) $Z_n \sim \mathrm{Poisson}(\lambda_n)$ with $\lambda_n = 1/n$. (d) $Z_n \sim \mathrm{exponential}(\lambda_n)$ with $\lambda_n = 1/n$. (e) $Z_n \sim \mathrm{Pareto}(\alpha_n, \beta)$ with $\beta = 1$ and $\alpha_n = n$. (f) $Z_n$
Consider a random variable $Z_n$ with the probability distribution $P[Z_n = -n] = 1/n$, $P[Z_n = 0] = 1 - 2/n$, $P[Z_n = n] = 1/n$. (a) Does $Z_n \to_p 0$ as $n \to \infty$? (b) Calculate $E[Z_n]$. (c) Calculate $\mathrm{var}[Z_n]$. (d) Now suppose the distribution is … Calculate $E[Z_n]$. (e) Conclude that $Z_n \to_p 0$ as $n \to \infty$ and $E[Z_n] \to 0$ are unrelated.
Assume that $a_n \to 0$. Show that (a) $a_n^{1/2} \to 0$ (assume $a_n \ge 0$). (b) $a_n^2 \to 0$.
Does the sequence $a_n = \sin\left(\frac{\pi}{2} n\right)$ converge?
For the following sequences show $a_n \to 0$ as $n \to \infty$: (a) $a_n = 1/n^2$. (b) $a_n = \frac{1}{n^2}\sin\left(\frac{\pi}{8} n\right)$. (c) $a_n = \sigma^2/n$.
Suppose that $X_i \sim N(\mu_X, \sigma_X^2)$, $i = 1, \ldots, n_1$ and $Y_i \sim N(\mu_Y, \sigma_Y^2)$, $i = 1, \ldots, n_2$ are mutually independent. Set $\bar{X}_{n_1} = n_1^{-1}\sum_{i=1}^{n_1} X_i$ and $\bar{Y}_{n_2} = n_2^{-1}\sum_{i=1}^{n_2} Y_i$. (a) Find $E[\bar{X}_{n_1} - \bar{Y}_{n_2}]$. (b) Find $\mathrm{var}[\bar{X}_{n_1} - \bar{Y}_{n_2}]$. (c) Find the distribution of $\bar{X}_{n_1} - \bar{Y}_{n_2}$. (d) Propose an estimator of
Suppose that $X_i$ are i.n.i.d. (independent but not necessarily identically distributed) with $E[X_i] = \mu_i$ and $\mathrm{var}[X_i] = \sigma_i^2$. (a) Find $E[\bar{X}_n]$. (b) Find $\mathrm{var}[\bar{X}_n]$.
This exercise shows that the zero correlation between the numerator and the denominator of the t-ratio does not always hold when the random sample is not from a normal distribution.
Find the covariance of $\hat{\sigma}^2$ and $\bar{X}_n$. Under what condition is this zero? Hint: Use the form obtained in
We know that $\mathrm{var}[\bar{X}_n] = \sigma^2/n$. (a) Find the standard deviation of $\bar{X}_n$. (b) Suppose we know $\sigma^2$ and want our estimator to have a standard deviation smaller than a tolerance $\tau$. How large does $n$ need to be to make this happen?
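The sample-size calculation in part (b) can be sketched directly: $\sigma/\sqrt{n} < \tau$ requires $n > \sigma^2/\tau^2$. The values of $\sigma^2$ and $\tau$ below are arbitrary illustrations:

```python
# Smallest integer n with sd(X_bar_n) = sigma/sqrt(n) strictly below tau,
# i.e. n > sigma^2 / tau^2.
import math

def min_sample_size(sigma2, tau):
    return math.floor(sigma2 / tau**2) + 1

n = min_sample_size(sigma2=9.0, tau=0.5)
print(n)
```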
Let $p$ be the unknown probability that a given basketball player makes a free throw attempt. The player takes $n$ random free throws, of which she makes $X$ of the attempts. (a) Find an unbiased estimator $\hat{p}$ of $p$. (b) Find $\mathrm{var}[\hat{p}]$.
Let $\theta = \mu^2$. (a) Propose a plug-in estimator $\hat{\theta}$ for $\theta$. (b) Calculate $E[\hat{\theta}]$. (c) Propose an unbiased estimator for $\theta$.
Propose estimators for (a) $\theta = \exp(E[X])$. (b) $\theta = \log\left(E[\exp(X)]\right)$. (c) $\theta = \sqrt{E[X^4]}$. (d) $\theta = \mathrm{var}[X^2]$.
Show algebraically that $\hat{\sigma}^2 = n^{-1}\sum_{i=1}^n (X_i - \mu)^2 - (\bar{X}_n - \mu)^2$.
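Before proving the identity algebraically, it can be spot-checked numerically (my own illustration with arbitrary data and an arbitrary constant $\mu$):

```python
# Numeric spot-check of the identity
#   sigma_hat^2 = n^{-1} sum_i (X_i - mu)^2 - (X_bar_n - mu)^2,
# where sigma_hat^2 = n^{-1} sum_i (X_i - X_bar_n)^2.
xs = [1.2, -0.7, 3.4, 0.1, 2.8]
mu = 0.9                      # any fixed constant works in the identity
n = len(xs)
xbar = sum(xs) / n

lhs = sum((x - xbar) ** 2 for x in xs) / n
rhs = sum((x - mu) ** 2 for x in xs) / n - (xbar - mu) ** 2
print(lhs, rhs)
```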
Calculate $E[(\bar{X}_n - \mu)^3]$, the skewness of $\bar{X}_n$. Under what condition is it zero?
Show that $E[s] \le \sigma$ where $s = \sqrt{s^2}$. Hint: Use Jensen's inequality (Theorem 2.9).
Propose an estimator of $\mathrm{var}[\hat{\mu}'_k]$. Does an unbiased version exist?
Calculate the variance $\mathrm{var}[\hat{\mu}'_k]$ of the estimator $\hat{\mu}'_k$ that you proposed in
Consider the central moment $\mu_k = E[(X - \mu)^k]$. Construct an estimator $\hat{\mu}_k$ for $\mu_k$ without assuming known mean $\mu$. In general, do you expect $\hat{\mu}_k$ to be biased or unbiased?
For some integer $k$, set $\mu'_k = E[X^k]$. Construct an unbiased estimator $\hat{\mu}'_k$ for $\mu'_k$. (Show that it is unbiased.)
Suppose that another observation $X_{n+1}$ becomes available. Show that (a) $\bar{X}_{n+1} = (n\bar{X}_n + X_{n+1})/(n+1)$. (b) $s^2_{n+1} = \left((n-1)s^2_n + \frac{n}{n+1}(X_{n+1} - \bar{X}_n)^2\right)/n$.
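The two update formulas can be verified numerically against direct recomputation on the enlarged sample (my own illustration with arbitrary data):

```python
# Check of the running-update formulas:
#   X_bar_{n+1} = (n * X_bar_n + X_{n+1}) / (n + 1)
#   s^2_{n+1}   = ((n-1) s^2_n + (n/(n+1)) (X_{n+1} - X_bar_n)^2) / n
from statistics import mean, variance   # variance = bias-corrected s^2

xs = [1.0, 3.0, 2.5, -0.5]
x_new = 5.0
n = len(xs)

xbar_n, s2_n = mean(xs), variance(xs)
xbar_upd = (n * xbar_n + x_new) / (n + 1)
s2_upd = ((n - 1) * s2_n + (n / (n + 1)) * (x_new - xbar_n) ** 2) / n

print(xbar_upd, mean(xs + [x_new]))
print(s2_upd, variance(xs + [x_new]))
```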
Show that if $e \sim N(0, \Sigma)$ and $\Sigma = AA'$ then $u = A^{-1}e \sim N(0, I_n)$.
Show that if $e \sim N(0, I_n \sigma^2)$ and $H'H = I_n$ then $u = H'e \sim N(0, I_n \sigma^2)$.
Suppose $X_i$ are independent $N(\mu_i, \sigma_i^2)$. Find the distribution of the weighted sum $\sum_{i=1}^n w_i X_i$.
Show that if $Q \sim \chi^2_k(\lambda)$, then $E[Q] = k + \lambda$.
Let $X = \sum_{i=1}^n a_i e_i^2$ where $a_i$ are constants and $e_i$ are independent $N(0, 1)$. Find the following: (a) $E[X]$. (b) $\mathrm{var}[X]$.
A random variable is $Y = \sqrt{Q}\,e$ where $e \sim N(0, 1)$, $Q \sim \chi^2_1$, and $e$ and $Q$ are independent. It will be helpful to know that $E[e] = 0$, $E[e^2] = 1$, $E[e^3] = 0$, $E[e^4] = 3$, $E[Q] = 1$, and $\mathrm{var}[Q] = 2$. Find (a) $E[Y]$. (b) $E[Y^2]$. (c) $E[Y^3]$. (d) $E[Y^4]$. (e) Compare these four moments with those of
Show that the characteristic function of $X \sim N(\mu, \Sigma) \in \mathbb{R}^m$ is $C(t) = E\left[\exp(i t'X)\right] = \exp\left(i t'\mu - \frac{1}{2} t'\Sigma t\right)$ for $t \in \mathbb{R}^m$. Hint: Start with $m = 1$. Establish $E\left[\exp(i t Z)\right] = \exp\left(-\frac{1}{2} t^2\right)$ by integration. Then generalize to $X \sim N(\mu, \Sigma)$ for $t \in \mathbb{R}^m$ using the same steps as in Exercises 5.11 and 5.12.
Show that the MGF of $X \sim N(\mu, \Sigma) \in \mathbb{R}^m$ is $M(t) = E\left[\exp(t'X)\right] = \exp\left(t'\mu + \frac{1}{2} t'\Sigma t\right)$. Hint: Write $X = \mu + \Sigma^{1/2} Z$.
Show that the MGF of $Z \in \mathbb{R}^m$ is $E\left[\exp(t'Z)\right] = \exp\left(\frac{1}{2} t't\right)$ for $t \in \mathbb{R}^m$. Hint: Use Exercise 5.5 and the fact that the elements of $Z$ are independent.
Write the multivariate $N(0, I_k)$ density as the product of $N(0, 1)$ density functions. That is, show that $\frac{1}{(2\pi)^{k/2}}\exp\left(-\frac{x'x}{2}\right) = \phi(x_1)\cdots\phi(x_k)$.
Show that if $T$ is distributed student $t$ with $r > 2$ degrees of freedom then $\mathrm{var}[T] = \frac{r}{r-2}$. Hint: Use Theorems 3.3 and 5.20.
Find the convolution of the normal density $\phi(x)$ with itself, $\int_{-\infty}^{\infty} \phi(x)\phi(y - x)\,dx$. Show that it can be written as a normal density.
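The expected answer, the $N(0, 2)$ density, can be previewed numerically before doing the integral by hand (my own illustration using a simple quadrature):

```python
# The convolution of phi with itself evaluated at y should match the
# N(0, 2) density, exp(-y^2/4) / sqrt(4*pi).
import math

def phi(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def convolution(y, lo=-12.0, hi=12.0, steps=4000):
    # trapezoid rule for the integral of phi(x) * phi(y - x) over x
    h = (hi - lo) / steps
    total = 0.5 * (phi(lo) * phi(y - lo) + phi(hi) * phi(y - hi))
    total += sum(phi(lo + j * h) * phi(y - lo - j * h) for j in range(1, steps))
    return total * h

y = 0.7
print(convolution(y), math.exp(-y * y / 4) / math.sqrt(4 * math.pi))
```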
Use the MGF from Exercise 5.5 to verify that $E[Z^2] = m''(0) = 1$ and $E[Z^4] = m^{(4)}(0) = 3$.
Show that the MGF of $X \sim N(\mu, \sigma^2)$ is $m(t) = E\left[\exp(tX)\right] = \exp\left(t\mu + t^2\sigma^2/2\right)$. Hint: Write $X = \mu + \sigma Z$.
Show that the moment generating function (MGF) of $Z \sim N(0, 1)$ is $m(t) = E\left[\exp(tZ)\right] = \exp\left(t^2/2\right)$.
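The claimed MGF can be sanity-checked by quadrature before doing the completing-the-square proof (my own illustration, with an arbitrary test value of $t$):

```python
# Numeric check that E[exp(tZ)] = exp(t^2/2) for Z ~ N(0,1):
# integrate exp(t*z) * phi(z) over a wide interval with the trapezoid rule.
import math

def phi(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def mgf_numeric(t, lo=-15.0, hi=15.0, steps=6000):
    h = (hi - lo) / steps
    total = 0.5 * (math.exp(t * lo) * phi(lo) + math.exp(t * hi) * phi(hi))
    total += sum(math.exp(t * (lo + j * h)) * phi(lo + j * h)
                 for j in range(1, steps))
    return total * h

t = 1.3
print(mgf_numeric(t), math.exp(t * t / 2))
```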
Use Exercises 5.2 and 5.3 plus integration by parts to show that $E[Z^4] = 3$ for $Z \sim N(0, 1)$.
Use Exercise 5.2 and integration by parts to show that $E[Z^2] = 1$ for $Z \sim N(0, 1)$.
For the standard normal density $\phi(x)$, show that $\phi'(x) = -x\phi(x)$.
Verify that $\int_{-\infty}^{\infty} \phi(z)\,dz = 1$. Use a change-of-variables and the Gaussian integral (Theorem A.27).
Extend Minkowski's inequality (Theorem 4.16) to show that if $p \ge 1$ then $\left(E\left|\sum_{i=1}^n X_i\right|^p\right)^{1/p} \le \sum_{i=1}^n \left(E|X_i|^p\right)^{1/p}$.
Use Hölder's inequality (Theorem 4.15) to show the following. (a) $E|X^3 Y| \le \left(E[X^4]\right)^{3/4}\left(E[Y^4]\right)^{1/4}$. (b) $E|X^a Y^b| \le \left(E|X|^{a+b}\right)^{a/(a+b)}\left(E|Y|^{a+b}\right)^{b/(a+b)}$.
Let $X$ be a random variable with finite variance. Find the correlation between (a) $X$ and $X$. (b) $X$ and $-X$.
If two random variables are independent, are they necessarily uncorrelated? Find an example of random variables which are independent yet not uncorrelated. Hint: Take a careful look at Theorem 4.8.
Find the covariance and correlation between $a + bX$ and $c + dY$.
Consider the hierarchical distribution $X \mid N \sim \chi^2_{2N}$, $N \sim \mathrm{Poisson}(\lambda)$. Find (a) $E[X]$. (b) $\mathrm{var}[X]$.
Consider the hierarchical distribution $X \mid Y \sim N(Y, \sigma^2)$, $Y \sim \mathrm{gamma}(\alpha, \beta)$. Find (a) $E[X]$. Hint: Use the law of iterated expectations (Theorem 4.13). (b) $\mathrm{var}[X]$. Hint: Use Theorem 4.14.
Suppose that $Y$ conditional on $X$ is $N(X, X)$, $E[X] = \mu$ and $\mathrm{var}[X] = \sigma^2$. Find $E[Y]$ and $\mathrm{var}[Y]$.