ImportError: cannot import name 'VectorStoreIndex' from 'llama_index' (unknown location)
I ran into this problem when trying to run the imports below; it gives the error "ImportError: cannot import name 'VectorStoreIndex' from 'llama_index' (unknown location)".

I ran this exact same code in the morning and it worked perfectly.

I did !pip install llama_index

from llama_index import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts.prompts import SimpleInputPrompt

I tried commenting out the first line and hit the same issue for the HuggingFaceLLM module. For SimpleInputPrompt I got the error "ModuleNotFoundError: No module named 'llama_index.prompts'".

I first faced the problem in a SageMaker notebook, so I thought the issue was with that notebook and spun up a clean new one, but I got the same error. I then tried the code in my local Jupyter notebook, a Google Colab notebook, and a SageMaker Studio Lab notebook, and got the same error everywhere.

Ziegfeld answered 12/2 at 22:56 Comment(0)
22

The llama-index library was recently updated, so I was able to solve the issue by updating the imports according to the documentation:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext, PromptTemplate
from llama_index.llms.huggingface import HuggingFaceLLM

https://docs.llamaindex.ai/en/stable/examples/customization/llms/SimpleIndexDemo-Huggingface_stablelm.html
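Because the correct import path depends on which llama-index release is installed, a small stdlib-only probe can tell you which layout you have before you commit to an import. This is a hypothetical helper, not part of llama-index:

```python
import importlib.util

def first_importable(candidates):
    """Return the first module path in `candidates` that is importable,
    or None if none are. Hypothetical helper, not part of llama-index."""
    for name in candidates:
        try:
            if importlib.util.find_spec(name) is not None:
                return name
        except ModuleNotFoundError:
            # The parent package is missing entirely; try the next candidate.
            continue
    return None

# VectorStoreIndex has lived in all three of these; probe newest layout first.
home = first_importable(["llama_index.core", "llama_index.legacy", "llama_index"])
print(home)  # e.g. "llama_index.core" on llama-index >= 0.10, None if not installed
```

If the probe returns `llama_index` but not `llama_index.core`, you are on a pre-0.10 release and the old-style imports from the question still apply.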

Ziegfeld answered 13/2 at 4:11 Comment(1)
If you have Python 3.10.0 and llama-index 0.10.43, the import from llama_index.core import VectorStoreIndex will throw a TypeError: Plain typing.TypeAlias is not valid as type argument. In that case, use the legacy module instead of core: from llama_index.legacy import VectorStoreIndex.Hog
4

LlamaIndex frequently reorganizes its module directories.

The module you are searching for is:

from llama_index.core import VectorStoreIndex

And for a specific vector store, using ChromaDB as an example, you need to install:

pip install llama-index-vector-stores-chroma

and import it as follows:

from llama_index.vector_stores.chroma import ChromaVectorStore

Source: https://docs.llamaindex.ai/en/stable/examples/vector_stores/chroma_metadata_filter.html

Squirt answered 13/2 at 23:39 Comment(0)
1

from llama_index.core.indices.vector_store.base import VectorStoreIndex worked for me

Source: https://docs.llamaindex.ai/en/stable/api_reference/indices/vector_store.html

Jacquettajacquette answered 11/3 at 15:18 Comment(0)
1

As mentioned by other users, the library was recently updated. While it is still possible to use the suggestion from Lat, it is deprecated.

Instead, you should use the Settings import as described here. If your old code looks like this:

from llama_index import ServiceContext, set_global_service_context

service_context = ServiceContext.from_defaults(
  llm=llm, embed_model=embed_model, chunk_size=512
)
set_global_service_context(service_context)

You should update it to be the following:

from llama_index.core import Settings

Settings.llm = llm
Settings.embed_model = embed_model
Settings.chunk_size = 512
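If your code has to run against both old and new llama-index releases, you can branch on the installed version before choosing between ServiceContext and Settings. A minimal sketch, assuming the distribution is published as "llama-index" on PyPI (both helper names are hypothetical):

```python
from importlib import metadata

def at_least(version, target=(0, 10)):
    """True if a dotted version string is >= target. Hypothetical helper."""
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts >= target

def uses_settings_api(dist="llama-index"):
    """None if the package is not installed; True if the 0.10+ Settings
    API applies; False if the old ServiceContext API should be used."""
    try:
        return at_least(metadata.version(dist))
    except metadata.PackageNotFoundError:
        return None
```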
Remake answered 18/4 at 13:56 Comment(0)
0

I think this error occurs because of the pip installation.
I encountered the same error and found it could be fixed simply by reinstalling via pip.

Could you check whether the same error occurs after removing all llama-* related packages and re-installing llama-index?

pip cache purge

This command ensures pip does not reuse a previously installed cached package.

If you still encounter the same import error, refer to another installation method, such as installing from git.
This document may be helpful:
https://docs.llamaindex.ai/en/stable/getting_started/installation.html
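The full clean-reinstall sequence might look like this (the package names beyond llama-index itself are assumptions; list whatever llama-* packages are actually installed in your environment):

```shell
# Remove the meta-package and the split-out packages (names assumed;
# check `pip list | grep llama` to see what is actually installed).
pip uninstall -y llama-index llama-index-core llama-index-legacy
pip cache purge          # drop cached wheels so old versions aren't reused
pip install llama-index  # reinstall the current release
```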

Got answered 13/2 at 1:22 Comment(1)
I tried this and I am still getting the same errorZiegfeld
0

As Lat mentioned above, llama_index was updated and requires importing from llama_index.core to solve the (unknown location) error. I ran into this while trying to import PromptTemplate.

Actiniform answered 23/2 at 23:2 Comment(0)
0

The library changed; import as follows:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.core.prompts.prompts import SimpleInputPrompt
Coastland answered 4/4 at 2:42 Comment(0)
0

Adding the .legacy worked for me:

from llama_index.legacy import VectorStoreIndex
Mcleod answered 18/7 at 11:16 Comment(0)
-1
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext, PromptTemplate
from llama_index.core.prompts.prompts import SimpleInputPrompt
from llama_index.llms.huggingface import HuggingFaceLLM

I suggest referring to the documentation at https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/ for guidance. The provided code snippet imports necessary modules from llama_index and works well.

Turgescent answered 1/4 at 12:30 Comment(2)
Thank you for contributing to the Stack Overflow community. This may be a correct answer, but it’d be really useful to provide additional explanation of your code so developers can understand your reasoning. This is especially useful for new developers who aren’t as familiar with the syntax or struggling to understand the concepts. Would you kindly edit your answer to include additional details for the benefit of the community?Frans
Isn't this just a copy of the top-voted answer (complete with weird lack of spaces between the imports) and an unmodified import from the question?Pip
-1

First, you need to install:

!pip install llama_index
!pip install llama-index-llms-huggingface

Then, as mentioned by others, write the import statements:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.core.prompts.prompts import SimpleInputPrompt
Czarevitch answered 6/4 at 6:48 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.