The difference between these two functions is described in this PyTorch forum post: What is the difference between log_softmax and softmax?
Softmax is: exp(x_i) / exp(x).sum()
and log softmax is: log(exp(x_i) / exp(x).sum()).
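As a quick sanity check, both formulas can be evaluated in plain Python (a minimal sketch using the same two input values as the snippet below; note that `math.log` is the natural logarithm):

```python
import math

x = [0.6229, 0.3771]

# softmax: exp(x_i) / exp(x).sum()
denom = sum(math.exp(v) for v in x)
softmax = [math.exp(v) / denom for v in x]

# log softmax: log(exp(x_i) / exp(x).sum()), using the natural log
log_softmax = [math.log(s) for s in softmax]

print(softmax)      # ≈ [0.5611, 0.4389]
print(log_softmax)  # ≈ [-0.5778, -0.8236]
```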
But why does the PyTorch code below produce different output?
>>> it = autograd.Variable(torch.FloatTensor([0.6229,0.3771]))
>>> op = autograd.Variable(torch.LongTensor([0]))
>>> m = nn.Softmax()
>>> log = nn.LogSoftmax()
>>> m(it)
Variable containing:
 0.5611  0.4389
[torch.FloatTensor of size 1x2]
>>> log(it)
Variable containing:
-0.5778 -0.8236
[torch.FloatTensor of size 1x2]
However, log(0.5611) is -0.25095973129 and log(0.4389) is -0.35763441915.
Why is there such a discrepancy?