Cross entropy formula:

H(y, ŷ) = -Σ_i y_i · log(ŷ_i)
But why does the following code give loss = 0.7437 instead of loss = 0 (since 1 * log(1) = 0)?
import torch
import torch.nn as nn

# one sample with four class scores; class 3 has the highest score
output = torch.tensor([[0.0, 0.0, 0.0, 1.0]])
target = torch.tensor([3])
criterion = nn.CrossEntropyLoss()
loss = criterion(output, target)
print(loss)  # tensor(0.7437)
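
The value comes from the fact that nn.CrossEntropyLoss expects raw scores (logits), not probabilities, and applies log_softmax to them internally. A minimal sketch of the equivalent manual computation (variable names are mine):

import torch
import torch.nn.functional as F

logits = torch.tensor([[0.0, 0.0, 0.0, 1.0]])
target = torch.tensor([3])

# CrossEntropyLoss is NLLLoss applied to log_softmax of the logits
log_probs = F.log_softmax(logits, dim=1)
manual_loss = -log_probs[0, target[0]]
print(manual_loss)  # tensor(0.7437), matching nn.CrossEntropyLoss

Here softmax([0, 0, 0, 1])[3] = e / (3 + e) ≈ 0.4754, and -log(0.4754) ≈ 0.7437.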
Use output = torch.tensor([[0.0, 0.0, 0.0, 100.0]]) and you get your 0. – Ephemerid
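
To see why the comment's larger logit drives the loss to (numerically) zero, compare the softmax probabilities for the two score vectors; a small sketch under the same assumptions:

import torch
import torch.nn.functional as F

probs_small = F.softmax(torch.tensor([[0.0, 0.0, 0.0, 1.0]]), dim=1)
print(probs_small)  # ≈ [0.1749, 0.1749, 0.1749, 0.4754]; -log(0.4754) ≈ 0.7437

probs_large = F.softmax(torch.tensor([[0.0, 0.0, 0.0, 100.0]]), dim=1)
print(probs_large)  # ≈ [0, 0, 0, 1] numerically, so -log(p[3]) ≈ 0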