OSError for huggingface model
I am trying to use a Hugging Face model (CAMeLBERT), but I get an error when loading the tokenizer.

Code:

from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-ca")
model = AutoModelForMaskedLM.from_pretrained("CAMeL-Lab/bert-base-arabic-camelbert-ca")

Error:

OSError: Can't load config for 'CAMeL-Lab/bert-base-arabic-camelbert-ca'. Make sure that:

- 'CAMeL-Lab/bert-base-arabic-camelbert-ca' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'CAMeL-Lab/bert-base-arabic-camelbert-ca' is the correct path to a directory containing a config.json file

I couldn't run the model because of this error.

Coagulum asked 15/3, 2022 at 11:47 Comment(4)
What is your installed transformers version? For me, with version 4.15, it works just fine. – Cursive
Thank you, @dennlinger! I changed the transformers version and it works now. The old version was 3.1.0. – Coagulum
@Cursive I have version 4.18.0 and am facing the same problem. – Wootten
@user1, I suggest you open a new question with more details to get the best shot at an answer. – Cursive

The model_id from Hugging Face is valid and should work. What can cause a problem is having a local folder named CAMeL-Lab/bert-base-arabic-camelbert-ca in your project: in that case, the library prioritizes it over the online version, tries to load it, and fails if it is not a fully trained model (e.g., an empty or partially written folder).
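
If you suspect this, a quick way to check is to test for the folder before loading (a minimal diagnostic sketch, using the model id from the question as the path):

import os

# True means a local folder with the same name is shadowing the hub model id
print(os.path.isdir("CAMeL-Lab/bert-base-arabic-camelbert-ca"))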

If this is the problem in your case, avoid using the exact model_id as the output_dir in the training arguments, because if you then cancel training before the model is fully trained and do not manually delete the folder, it will cause this issue.
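
For example, a minimal sketch assuming you fine-tune with the Trainer API (the output_dir value here is hypothetical):

from transformers import TrainingArguments

# Choose an output_dir that cannot collide with the hub model id
training_args = TrainingArguments(output_dir="./outputs/camelbert-ca-finetuned")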

If this is not the problem, it might be a bug, and updating your transformers version as @dennlinger suggested is probably your best shot.

Rotogravure answered 15/3, 2022 at 16:7 Comment(2)
Thank you, @ewz93! I have changed the transformers version and it works now. – Coagulum
Worked for me. Sometimes one goes into deep analysis but misses the obvious message. I didn't initially notice that I was saving my model using the model name as the output directory. – Concomitance

Running pip install -U huggingface_hub fixed this problem for me.
It seems like the Hugging Face Hub changed some logic on the backend side, so the old client doesn't work anymore.
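
To confirm the upgrade took effect, you can print the installed versions (a simple sanity check; the source does not state which minimum versions are required):

import huggingface_hub
import transformers

# Both libraries should be recent enough to talk to the current Hub backend
print(huggingface_hub.__version__)
print(transformers.__version__)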

Transitory answered 11/4, 2023 at 15:57 Comment(0)

I had the exact same problem with the model "msperka/aleph_bert_gimmel-finetuned-ner", which is also on Hugging Face.

I made sure I didn't have a local directory with the same name. I installed huggingface_hub as suggested, and still it was not working. The problem was simply a wrong version of the transformers and tokenizers packages. I installed the required versions as stated on the model page on Hugging Face, and it works great!
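
For example (the version numbers below are placeholders; use the ones stated on the model card):

pip install "transformers==4.26.0" "tokenizers==0.13.2"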

Thorsten answered 26/2 at 12:26 Comment(0)