Question: Recall the softmax function for N classes:

$$\sigma(x_j) = \frac{e^{x_j}}{\sum_{i=1}^{N} e^{x_i}}$$
Computing the cross-entropy loss for this function using the formula

$$H(p, q) = -\sum_{x} p(x) \log q(x)$$

yields terms of the form

$$\log \sigma(x_j) = \log\left(\frac{e^{x_j}}{\sum_{i=1}^{N} e^{x_i}}\right) = \log\left(e^{x_j}\right) - \log\left(\sum_{i=1}^{N} e^{x_i}\right) = x_j - \log\left(\sum_{i=1}^{N} e^{x_i}\right)$$
Consider calculating $\log \sigma(x_j)$ for $x = [10000, 10000, 10000]$ and $x = [-10000, -10000, -10000]$.
(a) Analytically calculate $\log \sigma(x_j)$ for these two cases.
(b) Do you see any problem in numerically computing these values in a computer? (Hint: You can
try it out yourself in Python.)
(c) If yes, then how can you overcome these problems?
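Following the hint in (b), a minimal NumPy sketch (function names are illustrative, not part of the question) that reproduces the numerical failure and one common fix, the log-sum-exp trick of subtracting $\max(x)$ before exponentiating:

```python
import numpy as np

def log_softmax_naive(x, j):
    # Direct evaluation of log σ(x_j) = x_j - log Σ_i e^{x_i}.
    return x[j] - np.log(np.sum(np.exp(x)))

def log_softmax_stable(x, j):
    # Log-sum-exp trick: subtracting m = max(x) makes the largest
    # exponent e^0 = 1, so nothing overflows, and at least one term
    # of the sum equals 1, so the log never sees 0.
    m = np.max(x)
    return (x[j] - m) - np.log(np.sum(np.exp(x - m)))

x_pos = np.array([10000.0, 10000.0, 10000.0])
x_neg = np.array([-10000.0, -10000.0, -10000.0])

# Analytically, both answers are -log(3) ≈ -1.0986.
with np.errstate(over="ignore", divide="ignore"):
    print(log_softmax_naive(x_pos, 0))  # -inf: e^10000 overflows to inf
    print(log_softmax_naive(x_neg, 0))  # +inf: the sum underflows to 0, log(0) = -inf
print(log_softmax_stable(x_pos, 0))     # ≈ -1.0986
print(log_softmax_stable(x_neg, 0))     # ≈ -1.0986
```

Because $\log \sigma(x_j)$ is shift-invariant (adding a constant to every $x_i$ cancels between the two terms), the stable version returns the same mathematical value while keeping every intermediate quantity representable in float64.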