Question:

Maximum-likelihood for mixtures: Let \(\psi_{0}(\cdot), \ldots, \psi_{k}(\cdot)\) be given probability density functions on \(\mathbb{R}\), and let

\[
m_{\theta}(y)=\theta_{0} \psi_{0}(y)+\cdots+\theta_{k} \psi_{k}(y)
\]

be a \((k+1)\)-component mixture with non-negative weights summing to one. Suppose that \(Y_{1}, \ldots, Y_{n}\) are independent and identically distributed with density \(m_{\theta}\), which is strictly positive whenever \(\theta\) is strictly positive. Under what conditions is the mixture model with independent and identically distributed observations identifiable? Show that the maximum-likelihood estimator satisfies the condition

\[
\sum_{i=1}^{n} \frac{\psi_{r}\left(y_{i}\right)}{\hat{m}\left(y_{i}\right)} \leq n
\]

with equality for every \(r\) such that \(\hat{\theta}_{r}>0\). Discuss the 'almost-true' claim that \(\hat{m}\) exists and is unique for every \(n \geq 1\) and every \(y \in \mathbb{R}^{n}\), even if the model is not identifiable.
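As a numerical illustration (a sketch of my own, not part of the original exercise): when the component densities \(\psi_r\) are fixed, the weights \(\theta\) can be fitted by EM, and the stationarity condition \(\sum_i \psi_r(y_i)/\hat{m}(y_i) = n\) for active components can be checked directly. The two Gaussian components, the true weights \((0.3, 0.7)\), and the sample size are arbitrary choices made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def psi(y):
    """Fixed component densities: standard normals centred at 0 and 3.
    Returns an array of shape (k+1, n) with entries psi_r(y_i)."""
    means = np.array([0.0, 3.0])
    return np.exp(-0.5 * (y[None, :] - means[:, None]) ** 2) / np.sqrt(2 * np.pi)

# Simulate n observations from the mixture with true weights (0.3, 0.7).
n = 2000
z = rng.random(n) < 0.3
y = np.where(z, rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n))

# EM over the weights only (components fixed); this maximizes the likelihood
# over the simplex of weight vectors.
P = psi(y)                        # (k+1, n) matrix of psi_r(y_i)
theta = np.array([0.5, 0.5])
for _ in range(500):
    resp = theta[:, None] * P     # unnormalized responsibilities
    resp /= resp.sum(axis=0, keepdims=True)
    theta = resp.mean(axis=1)     # M-step: updated weights

m_hat = theta @ P                 # fitted mixture density at each y_i
scores = (P / m_hat).sum(axis=1)  # sum_i psi_r(y_i) / m_hat(y_i), one per r
print("theta_hat:", theta)
print("scores (should equal n for active components):", scores)
```

At the EM fixed point, each active weight satisfies \(\theta_r = n^{-1}\sum_i \theta_r \psi_r(y_i)/\hat{m}(y_i)\), so dividing by \(\theta_r > 0\) gives the equality case of the displayed condition; the printed scores should therefore be very close to \(n = 2000\).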
