Why not super().__init__(Model,self) in Pytorch

For torch.nn.Module:

According to the official documentation: "Base class for all neural network modules. Your models should also subclass this class. Modules can also contain other Modules, allowing to nest them in a tree structure. You can assign the submodules as regular attributes."

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

It uses super(Model, self).__init__(). Why not super().__init__(Model, self)?

Referent asked 18/4, 2020 at 11:19

Comments:
- Because the second one would call an __init__ that takes two parameters. – Ensoul
- Is super(Model, self).__init__() actually the same as super().__init__()? – Referent
- Your question, however, asks about super().__init__(Model, self), which is different. And the bare super() only works in Python 3, so presumably the docs want to stay backwards compatible with Python 2. – Ensoul
- Makes sense, thanks! – Referent
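The mismatch the comments describe can be shown with plain Python classes (a minimal stand-in for nn.Module, so no torch install is needed; the class names are illustrative):

```python
# Plain-Python stand-in for nn.Module, used only to show the
# super() mechanics; no torch required.
class Module:
    def __init__(self):
        self.training = True

class Model(Module):
    def __init__(self):
        # super(Model, self) returns a proxy already bound to the
        # instance, so __init__ is called with no extra arguments.
        super(Model, self).__init__()

class BadModel(Module):
    def __init__(self):
        # The spelling from the question passes two *extra* positional
        # arguments to Module.__init__, which accepts none.
        super().__init__(BadModel, self)

assert Model().training is True

try:
    BadModel()
except TypeError as e:
    print("TypeError:", e)
```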

This construct:

super().__init__()

is valid only in Python 3.x, whereas the following construct

super(Model, self).__init__()

works in both Python 2.x and Python 3.x. The PyTorch developers didn't want to break all the code written in Python 2.x by enforcing the Python 3.x-only form of super(). Both constructs do the same thing in this case, namely running nn.Module.__init__, which initializes the following attributes:

    self.training = True
    self._parameters = OrderedDict()
    self._buffers = OrderedDict()
    self._backward_hooks = OrderedDict()
    self._forward_hooks = OrderedDict()
    self._forward_pre_hooks = OrderedDict()
    self._state_dict_hooks = OrderedDict()
    self._load_state_dict_pre_hooks = OrderedDict()
    self._modules = OrderedDict()
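The equivalence of the two spellings can be sketched with plain classes (a stand-in for nn.Module with just one of the attributes above, so no torch is needed):

```python
class Base:
    def __init__(self):
        # Stand-in for the attribute setup nn.Module.__init__ performs.
        self.training = True

class A(Base):
    def __init__(self):
        super(A, self).__init__()   # Python 2- and 3-compatible spelling

class B(Base):
    def __init__(self):
        super().__init__()          # Python 3-only spelling

# Both spellings end up running exactly the same Base.__init__.
assert A().training and B().training
```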

For details, see the relevant discussion on the PyTorch forum: "Is there a reason why people use super(Class, self).__init__() instead of super().__init__()?"

Blackfoot answered 18/4, 2020 at 12:07

There is another approach that works in both Python 2.x and 3.x. It avoids the super() machinery entirely, its meaning is clear, and it is not potentially misleading when two superclasses are involved: you simply call the superclass's constructor directly:

import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        nn.Module.__init__(self)
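The point about two superclasses can be illustrated with plain classes (hypothetical names, no torch needed): explicit calls state exactly which base initializers run.

```python
class A:
    def __init__(self):
        self.a_ready = True

class B:
    def __init__(self):
        self.b_ready = True

class C(A, B):
    def __init__(self):
        # Explicit calls: both base initializers run, and the reader
        # sees exactly which ones. A single super().__init__() here
        # would only reach A.__init__ (which never calls super()
        # itself), so b_ready would be left unset.
        A.__init__(self)
        B.__init__(self)

c = C()
assert c.a_ready and c.b_ready
```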
Unblinking answered 20/11, 2020 at 6:49

The following three approaches are equivalent and all work in PyTorch:

  1. Old-school approach

    import torch.nn as nn
    class Net(nn.Module):
        def __init__(self):
            nn.Module.__init__(self)
    
  2. Python 2 style (also valid in Python 3)

    import torch.nn as nn
    class Net(nn.Module):
        def __init__(self):
            super(Net, self).__init__()
    
  3. Python 3 only

    import torch.nn as nn
    class Net(nn.Module):
        def __init__(self):
            super().__init__()
    
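That the three spellings behave identically can be checked with a plain-Python stand-in for nn.Module (an illustrative Base class, so no torch is needed):

```python
class Base:
    def __init__(self):
        self.ready = True  # stand-in for nn.Module's attribute setup

class Net1(Base):
    def __init__(self):
        Base.__init__(self)           # 1. explicit base-class call

class Net2(Base):
    def __init__(self):
        super(Net2, self).__init__()  # 2. Python 2-compatible

class Net3(Base):
    def __init__(self):
        super().__init__()            # 3. Python 3 only

# All three end up running the same Base.__init__.
assert all(cls().ready for cls in (Net1, Net2, Net3))
```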
Kassi answered 20/8, 2024 at 15:59
