UserWarning: Implicit dimension choice for log_softmax has been deprecated
Asked Answered
I'm using Mac OS X El Capitan and I am trying to follow the quick-start tutorial for the PyTorch version of OpenNMT. In the training step I get the following warning message:

OpenNMT-py/onmt/modules/GlobalAttention.py:177: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument. 

align_vectors = self.sm(align.view(batch*targetL, sourceL))
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/torch/nn/modules/container.py:67: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  input = module(input)

Step 1: Preprocess data (works as expected)

python preprocess.py -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo

Step 2: Train model (produces the warning message)

python train.py -data data/demo -save_model demo-model

Has anyone come across this warning or have any pointers to solve it?

Thickhead answered 27/2, 2018 at 10:45 Comment(3)
Maybe this applies here. – Waldheim
I think the warning doesn't stop execution, it just takes a long time for each epoch. – Thickhead
I got this output recently, right after the warning: Epoch 1, 50/ 157; acc: 4.21; ppl: 167145.54; 37 src tok/s; 36 tgt tok/s; 2049 s elapsed Epoch 1, 100/ 157; acc: 5.52; ppl: 8718.19; 37 src tok/s; 37 tgt tok/s; 3901 s elapsed – Thickhead
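The pattern from the traceback can be reproduced and fixed in isolation. This is a minimal sketch using the names from the warning (`align`, `batch`, `targetL`, `sourceL`, `self.sm`); the surrounding OpenNMT-py model code is assumed, not reproduced:

```python
import torch
from torch import nn

# Hypothetical attention scores with the shapes named in the traceback.
batch, targetL, sourceL = 2, 3, 4
align = torch.randn(batch, targetL, sourceL)

# nn.Softmax() with no dim argument triggers the deprecation warning;
# constructing it with an explicit dim silences it.
sm = nn.Softmax(dim=-1)
align_vectors = sm(align.view(batch * targetL, sourceL))

# Each row is now a probability distribution over source positions.
print(align_vectors.sum(dim=-1))  # every row sums to ~1.0
```

The same applies to the `log_softmax` call inside `container.py`: the fix is in whichever module was constructed without a `dim`.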
You will almost always need the last dimension when you compute the cross-entropy, so your line may look like:

torch.nn.functional.log_softmax(x, dim=-1)
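To see why `dim=-1` is the usual choice: for a hypothetical `(batch, classes)` logits tensor, the last dimension is the class axis, so normalizing over it gives one distribution per example:

```python
import torch
import torch.nn.functional as F

# Hypothetical logits: 2 examples, 5 classes each.
logits = torch.randn(2, 5)

# dim=-1 normalizes over the class axis, which is what cross-entropy needs.
log_probs = F.log_softmax(logits, dim=-1)

# Exponentiating recovers probabilities: each row sums to ~1.0.
print(log_probs.exp().sum(dim=-1))
```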
Anchorage answered 8/7, 2019 at 23:28 Comment(0)
From the warning it's pretty clear that you have to explicitly mention the dimension since implicit dimension choice for softmax has been deprecated.

In my case, I'm using log_softmax, and I changed the line of code below to include the dimension.

torch.nn.functional.log_softmax(x) # This throws a warning.

is changed to

torch.nn.functional.log_softmax(x, dim=1) # This doesn't throw a warning.
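The choice of `dim` matters, which is why PyTorch stopped guessing it. A small sketch with a hypothetical 3x4 tensor, showing that `dim=1` and `dim=0` normalize different axes:

```python
import torch
import torch.nn.functional as F

x = torch.randn(3, 4)  # hypothetical (batch, classes) tensor

# dim=1: normalize along each row (the usual choice for class scores).
rows = F.log_softmax(x, dim=1)
# dim=0: normalize down each column instead - a different result.
cols = F.log_softmax(x, dim=0)

print(rows.exp().sum(dim=1))  # each of the 3 rows sums to ~1.0
print(cols.exp().sum(dim=0))  # each of the 4 columns sums to ~1.0
```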
Breadfruit answered 5/3, 2019 at 8:45 Comment(0)
I got a similar warning:

Implicit dimension choice for softmax has been deprecated.

when I ran Softmax() without the dim argument, as shown below:

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., -5.])

softmax = nn.Softmax() # No `dim` argument
softmax(input=my_tensor) # Warning

So I set the dim argument for Softmax(), and then I got the result shown below. *dim must be set when using Softmax():

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., -5.])

softmax = nn.Softmax(dim=0) # Here
softmax(input=my_tensor)
# tensor([9.9965e-01, 1.6696e-05, 3.3534e-04, 2.2595e-06])
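For a 1-D tensor like the one above, `dim=0` and `dim=-1` name the same (and only) axis, so either value silences the warning and gives identical results. A quick check:

```python
import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., -5.])

# On a 1-D input, dim=0 and dim=-1 refer to the same axis.
a = nn.Softmax(dim=0)(my_tensor)
b = nn.Softmax(dim=-1)(my_tensor)

print(torch.allclose(a, b))  # True
```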
Adsorbate answered 18/8, 2024 at 15:18 Comment(0)

© 2022 - 2025 — McMap. All rights reserved.