How to fix the "no token found" error when downloading a Hugging Face model?

I am trying to test Hugging Face's prithivida/parrot_paraphraser_on_T5 model, but I am getting a "no token found" error.

from parrot import Parrot
import torch
import warnings
warnings.filterwarnings("ignore")
parrot = Parrot(model_tag="prithivida/parrot_paraphraser_on_T5", use_gpu=False)

The error I am getting:

OSError                                   Traceback (most recent call last)
Cell In [10], line 2
      1 #Init models (make sure you init ONLY once if you integrate this to your code)
----> 2 parrot = Parrot(model_tag="prithivida/parrot_paraphraser_on_T5", use_gpu=False)

File ~/.local/lib/python3.10/site-packages/parrot/parrot.py:10, in Parrot.__init__(self, model_tag, use_gpu)
      8 from parrot.filters import Fluency
      9 from parrot.filters import Diversity
---> 10 self.tokenizer = AutoTokenizer.from_pretrained(model_tag, use_auth_token=True)
     11 self.model     = AutoModelForSeq2SeqLM.from_pretrained(model_tag, use_auth_token=True)
     12 self.adequacy_score = Adequacy()

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:560, in AutoTokenizer.from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
    557     return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
    559 # Next, let's try to use the tokenizer_config file to get the tokenizer class.
--> 560 tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
    561 if "_commit_hash" in tokenizer_config:
    562     kwargs["_commit_hash"] = tokenizer_config["_commit_hash"]

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:412, in get_tokenizer_config(pretrained_model_name_or_path, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, **kwargs)
    353 """
    354 Loads the tokenizer configuration from a pretrained model tokenizer configuration.
    355 
   (...)
    409 tokenizer_config = get_tokenizer_config("tokenizer-test")
    410 ```"""
    411 commit_hash = kwargs.get("_commit_hash", None)
--> 412 resolved_config_file = cached_file(
    413     pretrained_model_name_or_path,
    414     TOKENIZER_CONFIG_FILE,
    415     cache_dir=cache_dir,
    416     force_download=force_download,
    417     resume_download=resume_download,
    418     proxies=proxies,
    419     use_auth_token=use_auth_token,
    420     revision=revision,
    421     local_files_only=local_files_only,
    422     _raise_exceptions_for_missing_entries=False,
    423     _raise_exceptions_for_connection_errors=False,
    424     _commit_hash=commit_hash,
    425 )
    426 if resolved_config_file is None:
    427     logger.info("Could not locate the tokenizer configuration file, will try to use the model config instead.")

File ~/.local/lib/python3.10/site-packages/transformers/utils/hub.py:409, in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    406 user_agent = http_user_agent(user_agent)
    407 try:
    408     # Load from URL or cache if already cached
--> 409     resolved_file = hf_hub_download(
    410         path_or_repo_id,
    411         filename,
    412         subfolder=None if len(subfolder) == 0 else subfolder,
    413         revision=revision,
    414         cache_dir=cache_dir,
    415         user_agent=user_agent,
    416         force_download=force_download,
    417         proxies=proxies,
    418         resume_download=resume_download,
    419         use_auth_token=use_auth_token,
    420         local_files_only=local_files_only,
    421     )
    423 except RepositoryNotFoundError:
    424     raise EnvironmentError(
    425         f"{path_or_repo_id} is not a local folder and is not a valid model identifier "
    426         "listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to "
    427         "pass a token having permission to this repo with `use_auth_token` or log in with "
    428         "`huggingface-cli login` and pass `use_auth_token=True`."
    429     )

File ~/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py:124, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
    119 if check_use_auth_token:
    120     kwargs = smoothly_deprecate_use_auth_token(
    121         fn_name=fn.__name__, has_token=has_token, kwargs=kwargs
    122     )
--> 124 return fn(*args, **kwargs)

File ~/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py:1052, in hf_hub_download(repo_id, filename, subfolder, repo_type, revision, library_name, library_version, cache_dir, user_agent, force_download, force_filename, proxies, etag_timeout, resume_download, token, local_files_only, legacy_cache_layout)
   1048         return pointer_path
   1050 url = hf_hub_url(repo_id, filename, repo_type=repo_type, revision=revision)
-> 1052 headers = build_hf_headers(
   1053     token=token,
   1054     library_name=library_name,
   1055     library_version=library_version,
   1056     user_agent=user_agent,
   1057 )
   1059 url_to_download = url
   1060 etag = None

File ~/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py:124, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
    119 if check_use_auth_token:
    120     kwargs = smoothly_deprecate_use_auth_token(
    121         fn_name=fn.__name__, has_token=has_token, kwargs=kwargs
    122     )
--> 124 return fn(*args, **kwargs)

File ~/.local/lib/python3.10/site-packages/huggingface_hub/utils/_headers.py:117, in build_hf_headers(token, is_write_action, library_name, library_version, user_agent)
     44 """
     45 Build headers dictionary to send in a HF Hub call.
     46 
   (...)
    114         If `token=True` but token is not saved locally.
    115 """
    116 # Get auth token to send
--> 117 token_to_send = get_token_to_send(token)
    118 _validate_token_to_send(token_to_send, is_write_action=is_write_action)
    120 # Combine headers

File ~/.local/lib/python3.10/site-packages/huggingface_hub/utils/_headers.py:149, in get_token_to_send(token)
    147 if token is True:
    148     if cached_token is None:
--> 149         raise EnvironmentError(
    150             "Token is required (`token=True`), but no token found. You"
    151             " need to provide a token or be logged in to Hugging Face with"
    152             " `huggingface-cli login` or `huggingface_hub.login`. See"
    153             " https://huggingface.co/settings/tokens."
    154         )
    155     return cached_token
    157 # Case implicit use of the token is forbidden by env variable

OSError: Token is required (`token=True`), but no token found. You need to provide a token or be logged in to Hugging Face with `huggingface-cli login` or `huggingface_hub.login`. See https://huggingface.co/settings/tokens.

I have generated a secret token, but I am not sure where and how to pass it.

Here is the stack trace after updating the token inside the Parrot class in ~/.local/lib/python3.10/site-packages/parrot/parrot.py:

Traceback (most recent call last):
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/pp.py", line 8, in <module>
    parrot = Parrot(model_tag="prithivida/parrot_paraphraser_on_T5", use_gpu=False)
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/parrot/parrot.py", line 10, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(model_tag, use_auth_token=True)
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 560, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 412, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/transformers/utils/hub.py", line 409, in cached_file
    resolved_file = hf_hub_download(
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1052, in hf_hub_download
    headers = build_hf_headers(
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/huggingface_hub/utils/_headers.py", line 117, in build_hf_headers
    token_to_send = get_token_to_send(token)
  File "/media/chinmay/New Volume/myWorks/GIT_Hub/project_parrot_nlp/vnv/lib/python3.10/site-packages/huggingface_hub/utils/_headers.py", line 149, in get_token_to_send
    raise EnvironmentError(
OSError: Token is required (`token=True`), but no token found. You need to provide a token or be logged in to Hugging Face with `huggingface-cli login` or `huggingface_hub.login`. See https://huggingface.co/settings/tokens.
Reduplicate answered 27/11, 2022 at 20:36

Generate a token at https://huggingface.co/settings/tokens and save it locally as follows.

Install the huggingface_hub Python library:

pip install huggingface_hub


python -c "from huggingface_hub.hf_api import HfFolder; HfFolder.save_token('YOUR_TOKEN_HERE')"
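For context, `HfFolder.save_token` essentially writes the token to a small cache file that later Hub calls read back; the "no token found" error means neither that file nor a token argument was available. A pure-stdlib sketch of that behavior (the default path below is illustrative only; the real location varies by huggingface_hub version):

```python
from pathlib import Path

# Illustrative cache location; huggingface_hub's actual path varies by version.
DEFAULT_TOKEN_PATH = Path.home() / ".cache" / "huggingface" / "token"

def save_token(token, path=DEFAULT_TOKEN_PATH):
    """Write the token to a cache file, creating parent directories as needed."""
    path = Path(path)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(token)

def get_token(path=DEFAULT_TOKEN_PATH):
    """Return the cached token, or None if no token file exists."""
    path = Path(path)
    return path.read_text().strip() if path.exists() else None
```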

If you are using a notebook:

from huggingface_hub import notebook_login
notebook_login()

and paste your generated token when prompted.

Contrariwise answered 1/12, 2022 at 18:20
The above solution worked for me. – Reduplicate
This is a great find. It would be useful to know what the reference link for the info above was; I couldn't find it anywhere. – Flagellum

This code snippet logs into the Hugging Face Hub using an access token stored in an environment variable.

  1. from huggingface_hub import login: Imports the login function from the huggingface_hub library.
  2. from dotenv import load_dotenv: Imports the load_dotenv function from the dotenv library.
  3. load_dotenv(): Loads environment variables from a .env file into the system's environment variables.
  4. token = os.environ['ACCESS_TOKEN']: Retrieves the ACCESS_TOKEN environment variable.
  5. login(token): Logs into the Hugging Face Hub using the retrieved access token, resolving the "no token found" error.

This setup helps in securely managing and using API tokens for authentication.

import os
from huggingface_hub import login
from dotenv import load_dotenv

load_dotenv()                       # reads the .env file in the working directory
token = os.environ['ACCESS_TOKEN']  # the token stored in .env
login(token)
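For anyone curious what load_dotenv is doing under the hood, a minimal stdlib-only equivalent looks like this (the .env file name and the ACCESS_TOKEN variable name are assumptions carried over from the snippet above):

```python
import os

def load_env_file(path=".env"):
    """Minimal .env parser: KEY=VALUE lines; blanks and '#' comments are skipped."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Existing environment variables win, matching python-dotenv's default.
            os.environ.setdefault(key.strip(), value.strip())
```

A matching .env file would contain a single line such as `ACCESS_TOKEN=hf_xxxxxxxx`; keep it out of version control.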
Maffick answered 5/7 at 7:6
Can you add an explanation to your answer? An explanation would make it far more valuable.Passionless
@Passionless I added the explanation :) – Maffick

Run the following command and follow the prompts:

huggingface-cli login
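If an interactive prompt is inconvenient (scripts, CI), the token can also be supplied via an environment variable; recent huggingface_hub releases read HF_TOKEN (older releases used HUGGING_FACE_HUB_TOKEN), so check which name your installed version supports. A sketch:

```shell
# Non-interactive alternative (hf_xxxxxxxx is a placeholder token):
export HF_TOKEN="hf_xxxxxxxx"
python -c "import os; print('token set' if os.environ.get('HF_TOKEN') else 'missing')"
```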
Jimjams answered 26/1 at 23:48
