Python Hugging Face warning

My code is working, but I am getting this warning. How can I avoid it?

All model checkpoint layers were used when initializing TFRobertaForSequenceClassification.

All the layers of TFRobertaForSequenceClassification were initialized from the model checkpoint at arpanghoshal/EmoRoBERTa.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFRobertaForSequenceClassification for predictions without further training.
Nebiim answered 3/8, 2022 at 11:54 Comment(1)
Please provide enough code so others can better understand or reproduce the problem. – Windpipe

You can manage the warnings with the logging utility that transformers introduced in version 3.1.0:

from transformers import logging

logging.set_verbosity_warning()  # show messages at WARNING level and above
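
For context, a minimal sketch against the checkpoint from the question. As the comment below points out, the message is emitted at warning level, so hiding it takes set_verbosity_error() rather than set_verbosity_warning():

from transformers import TFRobertaForSequenceClassification, logging

logging.set_verbosity_error()  # hide everything below ERROR, including the checkpoint messages

# Loading the checkpoint from the question no longer prints the messages.
model = TFRobertaForSequenceClassification.from_pretrained("arpanghoshal/EmoRoBERTa")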
Araucania answered 3/8, 2022 at 12:10 Comment(2)
Still getting that message. – Nebiim
You actually want to set the log level to error: logging.set_verbosity_error(). @Nebiim – Costmary

The code below suppresses warning messages:

from transformers.utils import logging
logging.set_verbosity_error()  # show only ERROR-level messages and above
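
Note that transformers.utils.logging is the same module the earlier answer imports as the top-level transformers.logging; the difference here is the call, set_verbosity_error() instead of set_verbosity_warning().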
Shipwright answered 20/3 at 15:40 Comment(0)

For me, the import-based solutions were not working, as the warning message was logged during the import itself. What helped me was setting the environment variable before the import:

import os
os.environ["TRANSFORMERS_VERBOSITY"] = "error"
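
A minimal end-to-end sketch of that ordering, reusing the model class and checkpoint from the question:

import os

# Must be set before transformers is imported; the library reads
# TRANSFORMERS_VERBOSITY while setting up its logger at import time.
os.environ["TRANSFORMERS_VERBOSITY"] = "error"

from transformers import TFRobertaForSequenceClassification

model = TFRobertaForSequenceClassification.from_pretrained("arpanghoshal/EmoRoBERTa")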
Compassionate answered 6/5 at 11:52 Comment(0)
