Question: (2 pt) The relative entropy between two probability distributions $x, y \in \mathbf{R}^n_{++}$ is defined as
$$\sum_{k=1}^{n} x_k \log\left(\frac{x_k}{y_k}\right),$$
which is a convex function, jointly in $x$ and $y$.
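As a quick numerical check of this definition, a minimal NumPy sketch (the example vectors below are made up for illustration):

```python
import numpy as np

def rel_entropy(x, y):
    """sum_k x_k * log(x_k / y_k), for x, y in R^n with all entries > 0."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sum(x * np.log(x / y)))

# Hypothetical example distributions on n = 4 outcomes.
x = np.array([0.25, 0.25, 0.25, 0.25])
y = np.array([0.10, 0.20, 0.30, 0.40])

print(rel_entropy(x, y))  # strictly positive, since x != y
print(rel_entropy(y, y))  # 0.0: relative entropy vanishes when x == y
```

The second call illustrates the standard fact that the relative entropy is nonnegative and equals zero exactly when the two distributions coincide.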
Given a probability distribution $y = (y_1, \dots, y_n)$, we want to find a distribution $x = (x_1, \dots, x_n)$ that minimizes the relative entropy with $y$, subject to equality constraints on $x$:
$$\min_{x \in \mathbf{R}^n} \quad \sum_{k=1}^{n} x_k \log\left(\frac{x_k}{y_k}\right)$$
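The explicit form of the equality constraints is cut off in the question, so as an illustration only, here is a hedged SciPy sketch that minimizes the relative entropy subject to the single equality constraint $\mathbf{1}^T x = 1$ (i.e., over the probability simplex), with made-up data. With only this constraint, the minimizer should be $x = y$, which gives a built-in sanity check:

```python
import numpy as np
from scipy.optimize import minimize

def rel_entropy(x, y):
    # Objective: sum_k x_k * log(x_k / y_k); assumes x, y strictly positive.
    return float(np.sum(x * np.log(x / y)))

# Hypothetical reference distribution y on n = 4 outcomes.
y = np.array([0.10, 0.20, 0.30, 0.40])
n = y.size

# One illustrative equality constraint: the entries of x must sum to 1.
constraints = [{"type": "eq", "fun": lambda x: np.sum(x) - 1.0}]
bounds = [(1e-9, 1.0)] * n  # keep x in R^n_++ numerically

res = minimize(rel_entropy, x0=np.full(n, 1.0 / n), args=(y,),
               method="SLSQP", bounds=bounds, constraints=constraints)

# With only the simplex constraint, the optimum is x = y (objective 0).
print(res.x)
```

Any additional linear equality constraints from the original problem would be appended to the `constraints` list in the same `{"type": "eq", ...}` form.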