Softmax Regression from Scratch

http://d2l.ai/chapter_linear-networks/softmax-regression-scratch.html

According to the book, the cross-entropy loss is given by the following formula:

[image: the cross-entropy loss, $l(\mathbf{y}, \hat{\mathbf{y}}) = -\sum_{j} y_j \log \hat{y}_j$]

But is the code below doing the same thing?

def cross_entropy(y_hat, y):
    # For each row i, pick the predicted probability y_hat[i, y[i]] assigned
    # to the true class, then take the negative log.
    return - np.log(y_hat[range(len(y_hat)), y])

Am I missing something?

@gpk2000
You can test it yourself next time…
Just try plugging in some numbers to see the outputs…
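
For instance, a quick check with made-up numbers (plain NumPy stands in here for the book's np module; the values are just an illustration):

import numpy as np

def cross_entropy(y_hat, y):
    return - np.log(y_hat[range(len(y_hat)), y])

# Two examples, three classes; each row of y_hat is a predicted distribution.
y_hat = np.array([[0.1, 0.3, 0.6],
                  [0.3, 0.2, 0.5]])
y = np.array([0, 2])  # true class indices

print(cross_entropy(y_hat, y))  # [-log 0.1, -log 0.5] ~ [2.3026, 0.6931]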


And use the search button when possible:
http://preview.d2l.ai/d2l-en/master/chapter_appendix-mathematics-for-deep-learning/information-theory.html?highlight=cross%20entropy%20loss


Is .mean() applied to an n×1 matrix (the vector of per-example losses)?
@goldpiggy

This snippet of code does exactly the same thing as the formula above. It uses the index of the true label y to fetch the corresponding predicted value from y_hat, then takes the negative log of those predicted values for every example in the minibatch.
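
To make that concrete, here is a minimal sketch with made-up numbers (plain NumPy is assumed in place of the book's np module), which also shows where .mean() comes in:

import numpy as np

y_hat = np.array([[0.1, 0.3, 0.6],
                  [0.3, 0.2, 0.5]])
y = np.array([0, 2])

# The true label's index fetches the predicted probability for each example...
picked = y_hat[range(len(y_hat)), y]   # array([0.1, 0.5])
# ...and the negative log gives one loss value per example in the minibatch.
losses = -np.log(picked)               # array([2.3026, 0.6931])
# .mean() then reduces this length-n loss vector to a single scalar.
print(losses.mean())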

I am also a bit confused about that. I get that we select the y_hat entry for each y, and that in train_epoch_ch3 we sum over the cross-entropy loss, but where do we multiply each y with its corresponding y_hat as per the equation?

Hi @katduecker, great question! I guess you were referring to the “loss” function. Here the function cross_entropy uses the cross-entropy loss rather than literally multiplying y and y_hat. Since y is a one-hot vector, the sum $-\sum_{j} y_j \log \hat{y}_j$ keeps only the term at the true label’s index, so indexing y_hat with the label is equivalent to that multiplication.
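
A small check of that equivalence, assuming a one-hot encoding of y and made-up numbers:

import numpy as np

y_hat = np.array([[0.1, 0.3, 0.6],
                  [0.3, 0.2, 0.5]])
y = np.array([0, 2])

# One-hot version of y: y_onehot[i, j] = 1 only where j is the true class of example i.
y_onehot = np.eye(y_hat.shape[1])[y]

# -sum_j y_j * log(y_hat_j): the multiplication zeroes out every term except
# the one at the true label, so it equals picking that entry directly.
loss_multiply = -(y_onehot * np.log(y_hat)).sum(axis=1)
loss_index = -np.log(y_hat[range(len(y_hat)), y])
print(loss_multiply)  # [2.3026  0.6931]
print(loss_index)     # same values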