OllamaEmbeddings Fails with 404 Error After Migration to langchain_ollama #29571
Labels
- 🤖:bug: Related to a bug, vulnerability, unexpected error with an existing feature
- investigate: Flagged for investigation
Example Code
Error Message and Stack Trace (if applicable)
Traceback (most recent call last):
  File "/app/langchain_Conversation_Retrieval_Faiss.py", line 21, in <module>
    vector = FAISS.from_texts(["You are an AI assistant named Weibert's Angel."], embeddings)
  File "/usr/local/lib/python3.10/dist-packages/langchain_community/vectorstores/faiss.py", line 1041, in from_texts
    embeddings = embedding.embed_documents(texts)
  File "/usr/local/lib/python3.10/dist-packages/langchain_ollama/embeddings.py", line 237, in embed_documents
    embedded_docs = self._client.embed(
  File "/usr/local/lib/python3.10/dist-packages/ollama/_client.py", line 356, in embed
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/ollama/_client.py", line 177, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
  File "/usr/local/lib/python3.10/dist-packages/ollama/_client.py", line 122, in _request_raw
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: 404 page not found
Description
Previously, I was using OllamaEmbeddings from langchain_community.embeddings and everything worked fine. After migrating to OllamaEmbeddings from langchain_ollama, however, the same code fails with a 404 page not found error. Could you confirm whether this is a bug or whether additional configuration is required?
Thanks!
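For anyone triaging this: one commonly reported cause of this exact 404 is a base_url that already ends in /api, since the new ollama client appends its own endpoint path on top of it (another is an Ollama server too old to serve the /api/embed route that the traceback shows the new client calling, where langchain_community used /api/embeddings). Neither is confirmed for this setup, so treat the sketch below as a guess; it only illustrates how a trailing /api in the base URL doubles the path:

```python
# Illustration only: a client that appends a fixed endpoint to a configured
# base URL requests a non-existent route when the base URL already contains
# the path prefix. The base URLs here are hypothetical examples.
ENDPOINT = "/api/embed"  # the route the traceback shows the client hitting

def request_url(base_url: str) -> str:
    # Typical join: strip any trailing slash, then append the endpoint.
    return base_url.rstrip("/") + ENDPOINT

good = request_url("http://localhost:11434")
bad = request_url("http://localhost:11434/api")

print(good)  # http://localhost:11434/api/embed
print(bad)   # http://localhost:11434/api/api/embed  -> "404 page not found"
```

If this matches your configuration, dropping the /api suffix from base_url would be the thing to try first.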
System Info