ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location)

I am getting an error while importing Ollama from 'llama_index.llms' on Linux.


from llama_index.llms import Ollama

llm = Ollama(model="mistral")  # instantiate the LLM before calling it
response = llm.complete("What is code?")
print(response)

I installed Ollama with:

curl -fsSL https://ollama.com/install.sh | sh

and installed the Python packages with:

pip3 install llama-index qdrant_client torch transformers

Both installed successfully.

On running the script, I get:

ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location)

Marmoreal answered 5/3 at 5:37 Comment(0)

Install the packages below in a virtual environment:

pip install llama-index qdrant_client torch transformers
pip install llama-index-llms-ollama

Sample code:

# Just runs .complete to make sure the LLM is listening
from llama_index.llms.ollama import Ollama
from llama_index.core import Settings


llm = Ollama(model="mistral")

response = llm.complete("Who is Laurie Voss? write in 10 words")
print(response)
Barrault answered 31/3 at 13:25 Comment(2)
I already had llama-index installed. For me pip install llama-index-llms-ollama was sufficient. – Stonecutter
Yes @hafiz031, it was enough. Thanks! – Checkmate
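Note that the sample code above assumes a local Ollama server is already running (the install script usually starts one). A minimal stdlib check, assuming the default port 11434:

```python
# Sketch: probe whether a local Ollama server is reachable before
# calling the LLM. The URL and default port are assumptions.
from urllib.request import urlopen
from urllib.error import URLError

def ollama_running(url="http://localhost:11434"):
    try:
        with urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False

print("Ollama server reachable:", ollama_running())
```

If this prints False, start the server (or check the port) before running the llama-index snippet.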

I have the same issue, but according to https://docs.llamaindex.ai/en/stable/understanding/using_llms/using_llms.html, the import should be:

from llama_index.llms.ollama import Ollama
from llama_index.core import Settings

Settings.llm = Ollama(model="llama2", request_timeout=60.0)

Still, it doesn't work for me, and I suspect there is a specific module to install, but I don't know which one...

EDIT: Found it! You have to install llama-index-llms-ollama.

Country answered 9/3 at 14:24 Comment(0)

Do:

pip install llama-index-llms-ollama 

It looks like the LLMs have moved to a legacy namespace.

The following syntax worked for me:

from llama_index.legacy.llms.ollama import Ollama
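Since the two answers above use different import paths, a defensive sketch can try the modular package first and fall back to the legacy namespace, returning None when neither is installed:

```python
def import_ollama():
    """Resolve the Ollama class from whichever package layout exists (sketch)."""
    try:
        # available after: pip install llama-index-llms-ollama
        from llama_index.llms.ollama import Ollama
        return Ollama
    except ImportError:
        pass
    try:
        # legacy namespace bundled with some llama-index releases
        from llama_index.legacy.llms.ollama import Ollama
        return Ollama
    except ImportError:
        return None  # neither import path is available

Ollama = import_ollama()
print("Ollama class resolved:", Ollama is not None)
```

This only papers over the packaging difference; installing llama-index-llms-ollama is still the proper fix.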
Aldin answered 19/5 at 15:4 Comment(0)

You are looking for Ollama in the wrong place. Try the ollama-python client instead:

pip install ollama

Follow the GitHub link below for the documentation on using ollama:

https://github.com/ollama/ollama-python

Nice answered 5/3 at 14:2 Comment(2)
Thanks, is there a way to get a global model variable, like model = ollama.Model(model='mistral'), instead of passing the model on every call, e.g. ollama.generate(model='mistral', prompt='Why is the sky blue?')? – Hudnut
I have the same issue, but your answer relates to Ollama on its own, not to Ollama in llama-index. – Country
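Regarding the global-model question in the comments: the ollama-python client takes the model name on each call, but you can bind it once with functools.partial. A sketch using a stand-in for ollama.generate, since the real call needs a running server:

```python
from functools import partial

def generate(model, prompt):
    # stand-in for ollama.generate(model=..., prompt=...); the real
    # function sends the prompt to the local Ollama server
    return f"[{model}] response to: {prompt}"

# bind the model once, then reuse the bound callable everywhere
mistral = partial(generate, model="mistral")

print(mistral(prompt="Why is the sky blue?"))
```

The same partial-binding trick works on the real ollama.generate, so the model name lives in one place.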

Before you import Ollama, install:

pip install llama-index-llms-ollama

After that, you can import Ollama with:

from llama_index.llms.ollama import Ollama

Attalanta answered 16/4 at 17:22 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.