Question: You will be required to run Expectation Maximization for estimating the parameters of a Gaussian mixture model in this example. Refer to the lecture slides (Lecture E.2) for the exact formulation. You can use Python or do the computations by hand. Recall that you need to use the pdf formulation for computing p(x | theta_k).

Consider the following one-dimensional data set:

2.3 3.2 3.1 1.6 1.9 11.5 10.2 12.3 8.6 10.9

Assume that we are interested in learning a mixture model with two components (k = 2). Let pi_k denote the probability P(z_i = k) for any i. Let (mu_1, sigma_1) be the parameters of the first Gaussian component of the mixture and (mu_2, sigma_2) be the parameters of the second. Given the initialization pi_1 = pi_2 = 0.5, mu_1 = mu_2 = 0, and sigma_1 = sigma_2 = 1, answer the following:
a) After first M step, mu_1 = mu_2 = 6.56
b) After second M step, sigma_1 = 2, sigma_2 = 1
c) After first M step, mu_1 = 2, mu_2 = 10
d) After second M step, sigma_1 = 1, sigma_2 = 2
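Since the question allows using Python, one way to check the options is to run the EM updates directly. The sketch below is a plain-Python implementation of the standard EM updates for a 1-D Gaussian mixture (the function names are my own, not from the lecture slides). Note that because both components are initialized with identical parameters, every responsibility in the E step is exactly 0.5, so the first M step sets both means to the overall sample mean.

```python
import math

# 1-D data set from the question
data = [2.3, 3.2, 3.1, 1.6, 1.9, 11.5, 10.2, 12.3, 8.6, 10.9]

def gaussian_pdf(x, mu, sigma):
    """Normal density N(x | mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def em_step(data, pi, mu, sigma):
    """One E step followed by one M step; returns updated (pi, mu, sigma)."""
    n, k = len(data), len(pi)
    # E step: responsibilities gamma[i][j] = P(z_i = j | x_i, current params)
    gamma = []
    for x in data:
        weights = [pi[j] * gaussian_pdf(x, mu[j], sigma[j]) for j in range(k)]
        total = sum(weights)
        gamma.append([w / total for w in weights])
    # M step: re-estimate pi, mu, sigma from the responsibilities
    n_k = [sum(gamma[i][j] for i in range(n)) for j in range(k)]
    pi = [n_k[j] / n for j in range(k)]
    mu = [sum(gamma[i][j] * data[i] for i in range(n)) / n_k[j] for j in range(k)]
    sigma = [math.sqrt(sum(gamma[i][j] * (data[i] - mu[j]) ** 2 for i in range(n)) / n_k[j])
             for j in range(k)]
    return pi, mu, sigma

# Initialization given in the question
pi, mu, sigma = [0.5, 0.5], [0.0, 0.0], [1.0, 1.0]
pi, mu, sigma = em_step(data, pi, mu, sigma)
print(mu)  # both means equal the overall sample mean, 6.56
```

With symmetric initialization the two components receive identical updates at every iteration, so mu_1 = mu_2 = 6.56 after the first M step (option a), and the components remain identical after the second M step as well, ruling out options b, c, and d.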
