I have been trying to fine-tune one of Hugging Face's conversational models: Blenderbot. I followed the conventional method from the official Hugging Face tutorial, which fine-tunes with trainer.train(), and I have also tried the Keras route with model.compile() and model.fit(). I attempted both the PyTorch and the TensorFlow approach on my dataset, and both fail with an error saying that the Blenderbot model has no compile or train method. I have searched everywhere online for a way to fine-tune Blenderbot on custom data, but nothing I found runs without throwing an error. I have gone through YouTube tutorials, blogs, and Stack Overflow posts, but none of them answer this question. I am hoping someone here can help me out. I am also open to fine-tuning other Hugging Face conversational models instead.
Here are the links I am using to fine-tune the Blenderbot model.
Fine-tuning methods: https://huggingface.co/docs/transformers/training
Blenderbot: https://huggingface.co/docs/transformers/model_doc/blenderbot
from transformers import BlenderbotTokenizer, BlenderbotForConditionalGeneration
mname = "facebook/blenderbot-400M-distill"
model = BlenderbotForConditionalGeneration.from_pretrained(mname)
tokenizer = BlenderbotTokenizer.from_pretrained(mname)
#FOR TRAINING (the Trainer route from the fine-tuning tutorial):
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(output_dir="test_trainer")

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=small_train_dataset,
    eval_dataset=small_eval_dataset,
    compute_metrics=compute_metrics,
)
trainer.train()
#OR THE KERAS / TENSORFLOW ROUTE:
import tensorflow as tf

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=5e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=tf.metrics.SparseCategoricalAccuracy(),
)
model.fit(tf_train_dataset, validation_data=tf_validation_dataset, epochs=3)
Neither approach works.
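For context, here is a minimal sketch of the kind of setup I am trying to end up with, pieced together from the seq2seq examples in the Transformers docs. It uses Seq2SeqTrainer rather than the plain Trainer; the my_dataset object and its input_text/target_text columns are placeholders for my own data, and the text_target argument assumes a reasonably recent transformers version, so parts of this may be wrong, which is partly why I am asking.

from datasets import Dataset
from transformers import (
    BlenderbotTokenizer,
    BlenderbotForConditionalGeneration,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

mname = "facebook/blenderbot-400M-distill"
tokenizer = BlenderbotTokenizer.from_pretrained(mname)
model = BlenderbotForConditionalGeneration.from_pretrained(mname)

# Placeholder data: each row pairs a dialogue context with the reply the model should learn.
my_dataset = Dataset.from_dict({
    "input_text": ["Hello, how are you?", "What is your favourite food?"],
    "target_text": ["I am doing well, thanks for asking!", "I really enjoy pizza."],
})

def preprocess(batch):
    # Tokenize the dialogue context as encoder input and the reply as the labels.
    model_inputs = tokenizer(batch["input_text"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["target_text"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = my_dataset.map(preprocess, batched=True, remove_columns=my_dataset.column_names)

training_args = Seq2SeqTrainingArguments(
    output_dir="blenderbot-finetuned",
    per_device_train_batch_size=2,
    num_train_epochs=3,
    learning_rate=5e-5,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

Is something along these lines the right way to fine-tune Blenderbot, or am I missing a step?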