I am trying to train a pretrained RoBERTa model where each training example consists of 3 input tensors, 3 input_mask tensors, and a label tensor.
I do this using the following code:
from torch.utils.data import TensorDataset, DataLoader, RandomSampler, SequentialSampler
from transformers import Trainer, TrainingArguments
batch_size = 32
# Create the DataLoader for our training set.
train_data = TensorDataset(train_AT, train_BT, train_CT, train_maskAT, train_maskBT, train_maskCT, labels_trainT)
train_dataloader = DataLoader(train_data, batch_size=batch_size)
# Create the DataLoader for our validation set.
validation_data = TensorDataset(val_AT, val_BT, val_CT, val_maskAT, val_maskBT, val_maskCT, labels_valT)
val_dataloader = DataLoader(validation_data, batch_size=batch_size)
# Pytorch Training
training_args = TrainingArguments(
    output_dir='C:/Users/samvd/Documents/Master/AppliedMachineLearning/FinalProject/results',  # output directory
    num_train_epochs=1,              # total # of training epochs
    per_device_train_batch_size=32,  # batch size per device during training
    per_device_eval_batch_size=32,   # batch size for evaluation
    warmup_steps=500,                # number of warmup steps for learning rate scheduler
    weight_decay=0.01,               # strength of weight decay
    logging_dir='C:/Users/samvd/Documents/Master/AppliedMachineLearning/FinalProject/logs',  # directory for storing logs
)
trainer = Trainer(
    model=model,                   # the instantiated 🤗 Transformers model to be trained
    args=training_args,            # training arguments, defined above
    train_dataset=train_data,      # training dataset
    eval_dataset=validation_data,  # evaluation dataset
)
trainer.train()
However this gives me the following error:
TypeError: vars() argument must have __dict__ attribute
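If I read the traceback correctly, the error comes from the Trainer's default data collator calling vars() on every item of the dataset, and the tuples returned by TensorDataset have no __dict__. This tiny snippet reproduces the same message:
import torch

item = (torch.tensor([1, 2]), torch.tensor([0]))  # a plain tuple, like TensorDataset yields
vars(item)  # TypeError: vars() argument must have __dict__ attribute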
I have found out that it is probably because I don't pass a collate_fn to the DataLoader, but I can't find a source that shows how to define one so that the Trainer understands the different tensors I pass in.
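Is a custom data collator along these lines the right idea? This is only a sketch of what I have been trying; collate_to_dict and the dictionary keys are my own guesses and would have to match the keyword arguments of my model's forward():
import torch

def collate_to_dict(batch):
    # batch is a list of 7-tuples from the TensorDataset:
    # (input_A, input_B, input_C, mask_A, mask_B, mask_C, label)
    inputs_A, inputs_B, inputs_C, masks_A, masks_B, masks_C, labels = map(torch.stack, zip(*batch))
    # Placeholder keys -- they need to match the argument names of my
    # model's forward(), otherwise the Trainer can't pass them through.
    return {
        "input_ids_a": inputs_A,
        "input_ids_b": inputs_B,
        "input_ids_c": inputs_C,
        "attention_mask_a": masks_A,
        "attention_mask_b": masks_B,
        "attention_mask_c": masks_C,
        "labels": labels,
    }

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_data,
    eval_dataset=validation_data,
    data_collator=collate_to_dict,  # replaces the default collator that calls vars()
)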
Can anyone point me in the right direction?