Enforce pad_sequence to a certain length
I have a set of tensors that I'm padding with pad_sequence, but I need to guarantee a fixed length for them. I can't do that right now because pad_sequence only extends the shorter tensors up to the longest one, so if that longest tensor doesn't reach the length I want, I'm stuck. I thought a solution could be to add zeros to one of the tensors until it reaches the desired length; the result of padding the whole set would then have my desired length. I don't know how to do that, though.

So let's say I have a tensor of shape torch.Size([44]) and a desired length of 50: how can I add zeros to it to reach a shape of torch.Size([50])? This needs to work regardless of the initial tensor's shape.
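For a single tensor, right-padding with zeros can be done with torch.nn.functional.pad; here is a minimal sketch (the tensor name and lengths are placeholders for the question's example):

```python
import torch
import torch.nn.functional as F

t = torch.rand(44)  # example 1-D tensor of length 44
max_len = 50

# (left_pad, right_pad) for the last dimension; fills with zeros by default
padded = F.pad(t, (0, max_len - t.shape[0]))
print(padded.shape)  # torch.Size([50])
```

The pad widths are given per dimension from the last dimension backwards, so (0, 6) here means "add nothing on the left, six zeros on the right".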

V2 asked 3/6, 2021 at 10:34
You can achieve your logic like so:

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence

# Desired max length
max_len = 50

# 100 sequences of variable length (< max_len)
seq_lens = torch.randint(low=10, high=44, size=(100,))
seqs = [torch.rand(n) for n in seq_lens]

# Pad the first sequence up to the desired length
seqs[0] = nn.ConstantPad1d((0, max_len - seqs[0].shape[0]), 0)(seqs[0])

# pad_sequence now pads everything up to the (forced) longest sequence,
# giving a tensor of shape (max_len, batch) = (50, 100)
seqs = pad_sequence(seqs)
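Equivalently, every sequence can be right-padded to max_len directly with torch.nn.functional.pad and stacked, which avoids relying on the first element; a sketch under the same setup:

```python
import torch
import torch.nn.functional as F

max_len = 50
seq_lens = torch.randint(low=10, high=44, size=(100,))
seqs = [torch.rand(n) for n in seq_lens]

# Right-pad each 1-D sequence with zeros to max_len, then stack into a batch
padded = torch.stack([F.pad(s, (0, max_len - s.shape[0])) for s in seqs])
print(padded.shape)  # torch.Size([100, 50])
```

Note the resulting layout is batch-first here, whereas pad_sequence defaults to (seq_len, batch) unless batch_first=True is passed.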
Soulless answered 3/6, 2021 at 11:14

© 2022 - 2024 — McMap. All rights reserved.