PyTorch autograd: what does the runtime error "grad can be implicitly created only for scalar outputs" mean?

I am trying to understand PyTorch autograd in depth; I would like to observe the gradient of a simple tensor after it goes through a sigmoid function, as below:

import torch
from torch import autograd 

D = torch.arange(-8, 8, 0.1, requires_grad=True)

with autograd.set_grad_enabled(True):
    S = D.sigmoid()
S.backward()  # this call raises the error below

My goal is to read D.grad, but even before I can, the call to S.backward() raises the runtime error:

RuntimeError: grad can be implicitly created only for scalar outputs

I have seen another post with a similar question, but the answer there does not apply to my question. Thanks.

Cassicassia asked 22/10/2019 at 18:21

The error means you can only call .backward() with no arguments on a scalar tensor, i.e. a tensor with a single element. For a non-scalar output, .backward() needs to be told what gradient to propagate back, which you can supply explicitly.
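A minimal sketch of that explicit-gradient route (this variant is an illustration of standard PyTorch behavior, not part of the original answer; passing a tensor of ones is equivalent to summing the output first):

import torch

D = torch.arange(-8, 8, 0.1, requires_grad=True)
S = D.sigmoid()

# S has many elements, so S.backward() with no arguments would raise
# "grad can be implicitly created only for scalar outputs".
# Supplying a gradient of ones backpropagates d(sum(S))/dD instead:
S.backward(torch.ones_like(S))
print(D.grad)  # element-wise sigmoid derivative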

Alternatively, you could reduce S to a scalar first:

T = torch.sum(S)  # reduce S to a single element
T.backward()      # no gradient argument needed now

since T would be a scalar output. After this call, D.grad holds the gradient.
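Putting this together with the question's snippet, a runnable sketch (the check against the analytic sigmoid derivative is an illustrative addition, not from the original answer):

import torch

D = torch.arange(-8, 8, 0.1, requires_grad=True)
S = D.sigmoid()

T = torch.sum(S)  # scalar output
T.backward()

# The gradient of sum(sigmoid(D)) w.r.t. D is the element-wise
# sigmoid derivative: sigmoid(D) * (1 - sigmoid(D))
expected = (S * (1 - S)).detach()
print(torch.allclose(D.grad, expected))  # True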

I posted some more information on using PyTorch to compute derivatives of tensors in this answer.

Uintathere answered 22/10/2019 at 21:51 Comment(1)
@jodag: Thanks for the answer. (Cassicassia)
