Question: Recall that when training VAEs, we minimize the negative ELBO, an upper bound on the negative log likelihood. Show that the negative log likelihood, $-\mathbb{E}_{x \sim p_{\text{data}}(x)}[\log p_\theta(x)]$, can be written as a KL divergence plus an additional term that is constant with respect to $\theta$. We are asking whether this KL divergence is equal to $L_G$, so once you have found the expression you will be able to deduce that. Note that the constant term, being constant with respect to $\theta$, can itself be another expectation. Does this mean that a VAE decoder trained with the ELBO and a GAN generator trained with the $L_G$ defined in the previous part 3c are implicitly learning the same objective? Explain.
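For reference, here is a sketch of the standard identity the question is pointing at, written in the question's notation; the entropy term $H(p_{\text{data}})$ is introduced here only for illustration and is not part of the original question:

$-\mathbb{E}_{x \sim p_{\text{data}}(x)}[\log p_\theta(x)] = \mathbb{E}_{x \sim p_{\text{data}}(x)}\!\left[\log \frac{p_{\text{data}}(x)}{p_\theta(x)}\right] - \mathbb{E}_{x \sim p_{\text{data}}(x)}[\log p_{\text{data}}(x)] = D_{\mathrm{KL}}\!\left(p_{\text{data}}(x)\,\|\,p_\theta(x)\right) + H(p_{\text{data}}),$

where $H(p_{\text{data}}) = -\mathbb{E}_{x \sim p_{\text{data}}(x)}[\log p_{\text{data}}(x)]$ is the entropy of the data distribution, which does not depend on $\theta$. Whether this KL divergence coincides with the $L_G$ from part 3c depends on how $L_G$ was defined there.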
