OSError: meta-llama/Llama-2-7b-chat-hf is not a local folder

I'm trying to replicate the code from this Hugging Face blog. First I installed transformers and created a token to log in to the Hugging Face Hub:

pip install transformers
huggingface-cli login

After that, the blog says to pass use_auth_token=True once you have set a token. Unfortunately, after running the code I get an error:

from transformers import AutoTokenizer
import transformers
import torch

model = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model, use_auth_token=True)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

sequences = pipeline(
    'I liked "Breaking Bad" and "Band of Brothers". Do you have any recommendations of other shows I might like?\n',
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
    max_length=200,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")

Error:

OSError: meta-llama/Llama-2-7b-chat-hf is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.

It says that the model cannot be found, but it is listed among the models on Hugging Face here.

This is the version of the transformers package I'm using:

> pip show transformers

Name: transformers
Version: 4.33.0.dev0
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: [email protected]
License: Apache 2.0 License
Location: /Users/quinten/opt/miniconda3/lib/python3.9/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: spacy-transformers

Does anyone know how to fix this error?

Shellback answered 30/8, 2023 at 9:34
You did all of the steps mentioned here, right? – Stationary
def from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs):

The pretrained_model_name_or_path argument may be either a model repo ID on the Hub or a path to a local model folder; both forms are sketched below.

In your case the repo ID "meta-llama/Llama-2-7b-chat-hf" is correct.
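For illustration, a minimal sketch of the two forms (the local path is hypothetical):

from transformers import AutoTokenizer

# form 1: a repo ID on the Hub (downloaded and cached from huggingface.co)
tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Llama-2-7b-chat-hf", use_auth_token=True
)

# form 2: a local folder that already contains the model files (hypothetical path)
tokenizer = AutoTokenizer.from_pretrained("/path/to/llama-2-7b-chat-hf")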

However, according to https://huggingface.co/meta-llama/Llama-2-7b-chat-hf/tree/main this is a gated repository: you must agree to the terms and conditions at that link before your account can access the model.
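As a quick sanity check, you can verify whether the token saved by huggingface-cli login actually has access to the gated repo. This is a sketch using huggingface_hub, which is already installed as a transformers dependency:

from huggingface_hub import HfApi
from huggingface_hub.utils import GatedRepoError, RepositoryNotFoundError

api = HfApi()
try:
    # token=True reuses the token saved by `huggingface-cli login`
    api.model_info("meta-llama/Llama-2-7b-chat-hf", token=True)
    print("Access granted")
except GatedRepoError:
    # the token is valid, but the license terms have not been accepted yet
    print("Gated repo: accept the terms on the model page first")
except RepositoryNotFoundError:
    # wrong repo ID, or the token cannot see this repo at all
    print("Repo not found or the token lacks permission")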

Makell answered 30/8, 2023 at 9:50

You should also check what kind of token you are using. In the case of a fine-grained token, you have to explicitly select this repo in the token's permissions once the author has granted you access.
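If the cached login is the problem, you can also pass the token explicitly. A minimal sketch; the "hf_..." string is a placeholder for your own token (newer transformers versions accept token= in place of use_auth_token=):

from transformers import AutoTokenizer

# placeholder token; use a token whose permissions include this repo
tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Llama-2-7b-chat-hf",
    use_auth_token="hf_xxxxxxxxxxxxxxxx",
)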

Aggrade answered 4/10 at 9:20
