What is the difference between softmax and log-softmax?

The difference between these two functions, as described in this PyTorch forum post, "What is the difference between log_softmax and softmax?", is that softmax is exp(x_i) / exp(x).sum() and log-softmax is log(exp(x_i) / exp(x).sum()).
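
For concreteness, here is how I read those two formulas in plain tensor ops (a rough sketch; the input values here are just made up):

import torch

x = torch.tensor([1.0, 2.0, 3.0])   # example logits

# softmax: exp(x_i) / exp(x).sum()
softmax = torch.exp(x) / torch.exp(x).sum()

# log-softmax: log(exp(x_i) / exp(x).sum())
log_softmax = torch.log(softmax)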

But for the PyTorch code below, why am I getting a different output?

>>> it = autograd.Variable(torch.FloatTensor([0.6229,0.3771]))
>>> op = autograd.Variable(torch.LongTensor([0]))
>>> m  = nn.Softmax()
>>> log = nn.LogSoftmax()
>>> m(it)
Variable containing:
0.5611  0.4389
[torch.FloatTensor of size 1x2]
>>> log(it)
Variable containing:
-0.5778 -0.8236
[torch.FloatTensor of size 1x2]

However, the value of log(0.5611) is -0.25095973129 and that of log(0.4389) is -0.35763441915.

Why is there such discrepancy?

Electroencephalogram answered 12/3, 2018 at 13:34

By default, torch.log provides the natural logarithm of the input, so the output of PyTorch is correct:

ln([0.5611, 0.4389]) = [-0.5778, -0.8236]

Your last results were obtained using the base-10 logarithm.
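
You can check this quickly (a small sketch; the tensor values are copied from the question, and torch.log10 is the base-10 counterpart of torch.log):

import torch

probs = torch.tensor([0.5611, 0.4389])   # the nn.Softmax output from the question

print(torch.log(probs))    # natural log: matches the nn.LogSoftmax output above (up to rounding)
print(torch.log10(probs))  # base-10 log: [-0.2510, -0.3576], the values computed in the question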

Parturifacient answered 12/3, 2018 at 14:20

Not just by default: torch.log is always the natural log, while torch.log10 is the base-10 log.
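
For example (a small sketch with an arbitrary value):

import math
import torch

t = torch.tensor([0.5])
print(torch.log(t))     # tensor([-0.6931]), natural log
print(math.log(0.5))    # -0.6931..., same as torch.log
print(torch.log10(t))   # tensor([-0.3010]), base-10 log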

Kleptomania answered 8/7, 2019 at 23:24
