huggingface-transformers Questions

2

from transformers import CTRLTokenizer, TFCTRLLMHeadModel tokenizer_ctrl = CTRLTokenizer.from_pretrained('ctrl', cache_dir='./cache', local_files_only=True) model_ctrl = TFCTRLLMHeadModel.from_pret...
Analog asked 28/9, 2021 at 7:59
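A minimal sketch of the pattern the truncated snippet starts, assuming 'ctrl' was already downloaded into ./cache during an earlier online run:

```python
from transformers import CTRLTokenizer, TFCTRLLMHeadModel

# local_files_only=True skips all network calls, so this raises unless
# the 'ctrl' files already sit in ./cache from a previous online run.
tokenizer_ctrl = CTRLTokenizer.from_pretrained(
    "ctrl", cache_dir="./cache", local_files_only=True
)
model_ctrl = TFCTRLLMHeadModel.from_pretrained(
    "ctrl", cache_dir="./cache", local_files_only=True
)
```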

2

I've been experimenting with stacking language models recently and noticed something interesting: the output embeddings of BERT and XLNet are not the same as the input embeddings. For example, this...
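One way to see the difference: the output embeddings pass through every transformer layer, so they no longer match the static input embeddings. A minimal sketch with BERT (the checkpoint choice is illustrative):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

enc = tok("hello world", return_tensors="pt")
with torch.no_grad():
    contextual = model(**enc).last_hidden_state          # output embeddings
static = model.get_input_embeddings()(enc["input_ids"])  # input embeddings

# False: the inputs are transformed by position/segment embeddings,
# LayerNorm, and every encoder layer before they come out the other end.
print(torch.allclose(contextual, static))
```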

4

I have tried different approaches to sentence similarity, namely: spaCy models: en_core_web_md and en_core_web_lg. Transformers: using the packages sentence-similarity and sentence-transformers, ...
Pericarditis asked 29/9, 2021 at 10:3
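For comparison with the spaCy scores, a minimal sentence-transformers sketch (the model name is one common choice, not the asker's):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode(
    ["How old are you?", "What is your age?"], convert_to_tensor=True
)
# Cosine similarity in [-1, 1]; higher means more similar.
print(util.cos_sim(emb[0], emb[1]))
```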

5

Solved

I'm following the transformers pretrained-model xlm-roberta-large-xnli example: from transformers import pipeline classifier = pipeline("zero-shot-classification", model="joeddav/xl...
Featherstone asked 23/12, 2020 at 22:44
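A runnable version of the example the question starts from, assuming the truncated snippet continues as on the model card:

```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification", model="joeddav/xlm-roberta-large-xnli"
)
result = classifier(
    "The new phone has an amazing camera.",
    candidate_labels=["technology", "politics", "sports"],
)
print(result["labels"][0], result["scores"][0])  # best label and its score
```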

4

I’m trying to do fine-tuning without an evaluation dataset. For that, I’m using the following code: training_args = TrainingArguments( output_dir=resume_from_checkpoint, evaluation_strategy="...
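A sketch of training with no eval set, assuming the truncated argument was meant to disable evaluation; `model` and `train_ds` are placeholders for the asker's objects (very recent releases spell the argument eval_strategy):

```python
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="out",
    evaluation_strategy="no",  # never evaluate, so no eval dataset is needed
    save_strategy="epoch",
)
trainer = Trainer(
    model=model,               # placeholder: your model
    args=training_args,
    train_dataset=train_ds,    # placeholder: your training dataset
)
trainer.train()
```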

3

I am trying to use a huggingface model (CamelBERT), but I am getting an error when loading the tokenizer: Code: from transformers import AutoTokenizer, AutoModelForMaskedLM tokenizer = AutoTokenize...
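Assuming the error is the usual one (a bare model name that does not resolve on the hub), the fix is the full owner/name id; the id below is one of the published CAMeLBERT variants:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Full hub id, not just "CamelBERT"
name = "CAMeL-Lab/bert-base-arabic-camelbert-ca"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name)
```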

2

I'm very new to generative AI. I have 64GB RAM and a 20GB GPU. I used an open-source model from Hugging Face and used Python to simply prompt the out-of-the-box model and display the result. I dow...
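For a 20GB GPU, a common pattern is half precision plus device_map="auto" (requires the accelerate package), which spills overflow layers into CPU RAM; the model id below is just an example, not the asker's:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mistralai/Mistral-7B-Instruct-v0.2"  # example model, not the asker's
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.float16,  # roughly halves GPU memory vs float32
    device_map="auto",          # spills layers to CPU RAM if the GPU fills up
)
inputs = tok("Tell me a joke.", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=50)
print(tok.decode(out[0], skip_special_tokens=True))
```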

2

Solved

I want a summary of a PyTorch model downloaded from huggingface. Am I doing something wrong here? from torchinfo import summary from transformers import AutoModelForSequenceClassification model = ...
Junina asked 29/7, 2021 at 13:51
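The usual trap: torchinfo feeds float inputs to a model whose first layer is an integer embedding. Passing dtypes fixes it; a sketch with an illustrative checkpoint:

```python
import torch
from torchinfo import summary
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased"
)
# Token ids must be integers; without dtypes, torchinfo sends floats
# into the embedding layer and the forward pass fails.
summary(model, input_size=(1, 128), dtypes=[torch.long])
```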

5

I would like to remove TensorFlow and Hugging Face models from my laptop. I did find one link https://github.com/huggingface/transformers/issues/861 but is there no command that can remove them be...
Turin asked 27/11, 2020 at 12:27
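There is no dedicated transformers command for this (newer huggingface_hub versions do ship an interactive `huggingface-cli delete-cache`); otherwise, deleting the cache directory works. A sketch, assuming the default cache location:

```python
import shutil
from pathlib import Path

# Default location; HF_HOME or TRANSFORMERS_CACHE may point elsewhere.
cache = Path.home() / ".cache" / "huggingface"
print(f"About to delete: {cache}")  # inspect before deleting
shutil.rmtree(cache, ignore_errors=True)
```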

2

conda by default installs transformers 2.x; however, pip installs 4.x by default, which is what I want, but via conda. If I install by specifying the latest distribution file from conda-forge… conda ...
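A sketch of the usual workaround, assuming the stale 2.x build comes from the default channels while conda-forge carries a current one:

```
conda install -c conda-forge transformers
python -c "import transformers; print(transformers.__version__)"
```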

6

I am facing the issue below while loading the pretrained BERT model from HuggingFace, due to an SSL certificate error. Error: SSLError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries ex...
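If the failure comes from a corporate proxy re-signing TLS traffic (a common cause), one workaround is pointing requests at the proxy's CA bundle before loading anything; the path below is hypothetical:

```python
import os

# Hypothetical path to your organisation's CA bundle
os.environ["REQUESTS_CA_BUNDLE"] = "/path/to/corporate-ca.pem"

from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
```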

2

I want to keep multiple checkpoints during training to analyse them later but the Trainer also saves other files to resume training. Is there a way to only save the model to save space and writing ...
Nerval asked 20/10, 2021 at 19:15
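Recent transformers releases (roughly 4.36+, an assumption worth checking) add a TrainingArguments flag for exactly this; note such checkpoints cannot be used to resume training:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",
    save_strategy="epoch",
    save_only_model=True,  # drop optimizer/scheduler/RNG state per checkpoint
)
```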

1

https://colab.research.google.com/drive/11u6leEKvqE0CCbvDHHKmCxmW5GxyjlBm?usp=sharing The setup.py file is in the transformers folder (root directory), but this error occurs when I run !git clone https://gi...
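Assuming the goal is an editable install of transformers from source in Colab, the usual cell sequence is:

```python
!git clone https://github.com/huggingface/transformers.git
%cd transformers
!pip install -e .  # picks up setup.py in the repo root
```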

3

I am trying to use the Helsinki-NLP models from huggingface, but I cannot find any instructions on how to do it. The README files are computer-generated and do not contain explanations. Can someone...
Putrescible asked 20/11, 2021 at 5:27
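The Helsinki-NLP checkpoints are MarianMT translation models; a minimal sketch with the English-to-German pair (pick the opus-mt-<src>-<tgt> pair you need):

```python
from transformers import MarianMTModel, MarianTokenizer

name = "Helsinki-NLP/opus-mt-en-de"  # English -> German
tokenizer = MarianTokenizer.from_pretrained(name)
model = MarianMTModel.from_pretrained(name)

batch = tokenizer(["I love programming."], return_tensors="pt", padding=True)
out = model.generate(**batch)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```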

6

I am facing the issue below while loading the pretrained model from HuggingFace. HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /roberta-base/resolve/main/config.j...
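If the files are already cached, forcing offline mode avoids the connection attempt entirely; a sketch:

```python
import os

os.environ["TRANSFORMERS_OFFLINE"] = "1"  # set before importing transformers

from transformers import AutoModel

# Succeeds only if roberta-base is already in the local cache.
model = AutoModel.from_pretrained("roberta-base")
```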

4

I am trying to explore T5. This is the code: !pip install transformers from transformers import T5Tokenizer, T5ForConditionalGeneration qa_input = """question: What is the capital of S...
Coucher asked 25/12, 2020 at 5:54
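A runnable T5 sketch, assuming the truncated prompt follows the question/context format the t5 checkpoints expect:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

qa_input = (
    "question: What is the capital of France? "
    "context: The capital of France is Paris."
)
ids = tokenizer(qa_input, return_tensors="pt").input_ids
out = model.generate(ids)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```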

1

I've fine-tuned llama2-chat using this dataset: celsowm/guanaco-llama2-1k1 It's basically a fork with an additional question: <s>[INST] Who is Mosantos? [/INST] Mosantos is vilar do teles' ...
Remy asked 20/12, 2023 at 20:44
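A common cause of a fine-tune "forgetting" its new examples is prompting without the exact [INST] template used during training; a sketch, where the checkpoint name is hypothetical:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "your-user/llama2-guanaco-ft"  # hypothetical fine-tuned checkpoint
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name, device_map="auto")

# Match the llama-2 chat template the dataset used.
prompt = "<s>[INST] Who is Mosantos? [/INST]"
ids = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**ids, max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))
```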

5

I'm using AutoModelForCausalLM and AutoTokenizer to generate text output with DialoGPT. For whatever reason, even when using the provided examples from huggingface I get this warning: A decoder-on...
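That warning is about padding side: decoder-only models like DialoGPT want left padding during generation. A sketch of the usual fix:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained(
    "microsoft/DialoGPT-medium", padding_side="left"  # silences the warning
)
tok.pad_token = tok.eos_token  # GPT-2-family tokenizers ship without one

model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")
ids = tok("Hello, how are you?" + tok.eos_token, return_tensors="pt")
out = model.generate(**ids, max_new_tokens=50, pad_token_id=tok.eos_token_id)
# Decode only the newly generated tokens after the prompt.
print(tok.decode(out[0][ids["input_ids"].shape[-1]:], skip_special_tokens=True))
```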

9

Solved

Running the below code downloads a model - does anyone know what folder it downloads it to? !pip install -q transformers from transformers import pipeline model = pipeline('fill-mask')
Swop asked 14/5, 2020 at 13:27
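The cache directory can be read from the library rather than guessed; a sketch (the constant's import path has moved across versions, so treat this as version-dependent):

```python
from transformers.utils import TRANSFORMERS_CACHE

# Typically ~/.cache/huggingface/hub on recent releases,
# ~/.cache/huggingface/transformers on older ones.
print(TRANSFORMERS_CACHE)
```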

3

Say I have the following model (from this script): from transformers import AutoTokenizer, GPT2LMHeadModel, AutoConfig config = AutoConfig.from_pretrained( "gpt2", vocab_size=len(token...
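The pattern in that script builds a fresh, untrained GPT-2 whose config is resized to a custom tokenizer; a completed sketch using the stock gpt2 tokenizer as a stand-in:

```python
from transformers import AutoConfig, AutoTokenizer, GPT2LMHeadModel

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in tokenizer
config = AutoConfig.from_pretrained(
    "gpt2",
    vocab_size=len(tokenizer),           # match the tokenizer's vocabulary
    bos_token_id=tokenizer.bos_token_id,
    eos_token_id=tokenizer.eos_token_id,
)
model = GPT2LMHeadModel(config)  # fresh random weights, no checkpoint loaded
print(f"{model.num_parameters():,} parameters")
```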

2

Solved

We can create a model with the AutoModel (TFAutoModel) function: from transformers import AutoModel model = AutoModel.from_pretrained('distilbert-base-uncased') On the other hand, a model is created by Aut...
Aeolic asked 10/11, 2021 at 3:33
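The difference in one sketch: AutoModel gives the bare encoder (hidden states only), while the task-specific classes add a head on top of the same backbone:

```python
from transformers import AutoModel, AutoModelForSequenceClassification

base = AutoModel.from_pretrained("distilbert-base-uncased")
clf = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

print(type(base).__name__)  # DistilBertModel: outputs hidden states
print(type(clf).__name__)   # DistilBertForSequenceClassification: adds a
                            # (randomly initialised) classification head
```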

2

Solved

I fine-tuned a pretrained BERT model in PyTorch using the Hugging Face transformers library. All the training/validation is done on a GPU in the cloud. At the end of training, I save the model and tokenizer like...
Huxham asked 16/10, 2019 at 15:57
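The standard round trip, assuming `model` and `tokenizer` are the fine-tuned objects from training; from_pretrained maps weights to CPU by default, so a GPU-trained checkpoint loads fine on a CPU-only machine:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# On the cloud GPU box, after training:
model.save_pretrained("./my-finetuned-bert")      # assumes `model` exists
tokenizer.save_pretrained("./my-finetuned-bert")  # assumes `tokenizer` exists

# Later, anywhere (CPU is fine):
model = AutoModelForSequenceClassification.from_pretrained("./my-finetuned-bert")
tokenizer = AutoTokenizer.from_pretrained("./my-finetuned-bert")
```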

4

Solved

I currently use a huggingface pipeline for sentiment-analysis like so: from transformers import pipeline classifier = pipeline('sentiment-analysis', device=0) The problem is that when I pass texts...
Crichton asked 5/6, 2021 at 12:56
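Pipelines forward tokenizer arguments, so over-length inputs can simply be truncated to the model's 512-token limit; a sketch where `long_texts` stands in for the asker's inputs:

```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis", device=0)

long_texts = ["some very long review ..."]  # placeholder inputs
# truncation/max_length are passed through to the tokenizer.
results = classifier(long_texts, truncation=True, max_length=512)
print(results)
```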

2

Solved

I'm trying to understand how to save a fine-tuned model locally, instead of pushing it to the hub. I've done some tutorials, and the last step of fine-tuning a model is running trainer.train(). ...
Warrantable asked 4/5, 2022 at 6:51
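A sketch of the local-save step after trainer.train(), assuming `trainer` and `tokenizer` come from the tutorial:

```python
# After trainer.train() finishes:
trainer.save_model("./my-model")         # writes weights + config to disk
tokenizer.save_pretrained("./my-model")  # tokenizer files are saved separately

# Load it back from the local folder instead of the hub:
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("./my-model")
```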

2

Solved

I'm relatively new to Python and facing some performance issues while using Hugging Face Transformers for sentiment analysis on a relatively large dataset. I've created a DataFrame with 6000 rows o...
Dionysian asked 22/9, 2023 at 15:57
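Looping over 6000 rows one call at a time is the usual bottleneck; passing the whole list with a batch_size lets the pipeline batch on the GPU. A sketch where `df` stands in for the asker's DataFrame:

```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis", device=0)  # assumes a GPU

texts = df["text"].tolist()  # placeholder: your DataFrame's text column
# One call, batched internally, instead of 6000 separate forward passes.
results = classifier(texts, batch_size=32, truncation=True)
```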
