Question: 2 Logistic Regression

We learned about binary logistic regression in the lecture this week. However, in practice there can be more than two categories. Multinomial logistic regression is the multi-class extension of binary logistic regression. Suppose we have a classification dataset of $N$ data points $\{(x_i, y_i)\}$, where $x_i \in \mathbb{R}^d$ and $y_i \in \{1, 2, \ldots, K\}$ for $i = 1, 2, \ldots, N$. Here $K \geq 2$ is the total number of classes. Multinomial logistic regression models the probability of $x$ being class $k$ using the softmax function,

$$P(y = k \mid x; \theta) = \frac{\exp(\theta_k^\top x)}{\sum_{j=1}^{K} \exp(\theta_j^\top x)},$$

where $\theta_j \in \mathbb{R}^d$ is the parameter vector for the $j$-th class.

(a) Show that the negative log likelihood loss for multinomial logistic regression has the following form:

$$\ell(\theta) = -\mathcal{L}\big(\theta \mid \{(x_i, y_i)\}\big) = -\sum_{i=1}^{N} \sum_{k=1}^{K} \mathbb{1}[y_i = k] \log \frac{\exp(\theta_k^\top x_i)}{\sum_{j=1}^{K} \exp(\theta_j^\top x_i)}.$$

Here $\mathbb{1}[y_i = k] = 1$ if and only if $y_i = k$, otherwise $\mathbb{1}[y_i = k] = 0$.

(b) Show that the gradient of $\ell(\theta)$ with respect to $\theta_c$ has the following form:

$$\frac{\partial \ell(\theta)}{\partial \theta_c} = -\sum_{i=1}^{N} x_i \left( \mathbb{1}[y_i = c] - \frac{\exp(\theta_c^\top x_i)}{\sum_{j=1}^{K} \exp(\theta_j^\top x_i)} \right).$$
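The closed forms above are easy to sanity-check numerically. Below is a minimal NumPy sketch (not part of the original question or its official solution) that implements the softmax probabilities, the negative log likelihood $\ell(\theta)$, and the gradient from part (b), then compares the analytic gradient against a central finite-difference estimate. The function names (`softmax`, `nll`, `grad_nll`), the $(K, d)$ layout of `theta`, and the 0-based class labels are assumptions made for this illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis: p_ik = exp(z_ik) / sum_j exp(z_ij)."""
    z = logits - logits.max(axis=-1, keepdims=True)
    expz = np.exp(z)
    return expz / expz.sum(axis=-1, keepdims=True)

def nll(theta, X, y):
    """l(theta) = -sum_i sum_k 1[y_i = k] log p_ik  (labels y are 0-based here)."""
    probs = softmax(X @ theta.T)                      # (N, K); row i is P(y = . | x_i; theta)
    return -np.log(probs[np.arange(X.shape[0]), y]).sum()

def grad_nll(theta, X, y):
    """Gradient from part (b): d l / d theta_c = -sum_i x_i (1[y_i = c] - p_ic), stacked as (K, d)."""
    K = theta.shape[0]
    probs = softmax(X @ theta.T)                      # (N, K)
    onehot = np.eye(K)[y]                             # (N, K); entry (i, c) is 1[y_i = c]
    return -(onehot - probs).T @ X                    # (K, d)

# Tiny synthetic check of the analytic gradient against central finite differences.
rng = np.random.default_rng(0)
N, d, K = 6, 3, 4
X = rng.normal(size=(N, d))
y = rng.integers(0, K, size=N)
theta = rng.normal(size=(K, d))

eps = 1e-6
g_num = np.zeros_like(theta)
for c in range(K):
    for j in range(d):
        e = np.zeros_like(theta)
        e[c, j] = eps
        g_num[c, j] = (nll(theta + e, X, y) - nll(theta - e, X, y)) / (2 * eps)

print(np.max(np.abs(grad_nll(theta, X, y) - g_num)))  # expected: on the order of 1e-9 or smaller
```

A near-zero maximum discrepancy between the two gradients is consistent with the expression in part (b).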
