Question: Write down the per-example cross-entropy loss $\ell(y, p_T)$ for the classification task. Here $y \in \{0, 1\}^k$ is a one-hot vector of the label and $p_T$ is the class probability vector with $p_T[i] = p(y[i] = 1 \mid S)$ for $i = 1, \dots, k$. ($[i]$ denotes the $i$-th entry of the corresponding vector.)
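For reference, a sketch of the answer assuming the standard definition of cross-entropy (the symbol $\ell$ and the class index $c$ are notation introduced here, not taken from the original problem statement): the per-example loss is the negative log-likelihood of the true class under $p_T$,
$$
\ell(y, p_T) \;=\; -\sum_{i=1}^{k} y[i]\,\log p_T[i] \;=\; -\log p_T[c], \qquad \text{where } c \text{ is the index with } y[c] = 1 .
$$
Because $y$ is one-hot, every term in the sum vanishes except the one for the true class, which is why the loss reduces to $-\log p_T[c]$.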
