[Bug]: Error with set_global_handler("arize_phoenix") #10602
Comments
🤖 Hey @austinmw, fancy seeing you here again! Let's crack this new puzzle together 🕵️♂️ Based on the issues you're encountering, here are some potential solutions:

from llama_index.callbacks.global_handlers import set_global_handler

You can check which Python environment you're running in with: which python
And you can check where 'arize_phoenix_callback_handler' is installed with: pip show arize_phoenix_callback_handler
If they're not in the same environment, you'll need to install 'arize_phoenix_callback_handler' in the correct environment.

For your reference, the 'set_global_handler' function does exist in the 'llama_index/callbacks/global_handlers.py' file in the LlamaIndex codebase. Here is the function:

def set_global_handler(eval_mode: str, **eval_params: Any) -> None:
    """Set global eval handlers."""
    import llama_index

    llama_index.global_handler = create_global_handler(eval_mode, **eval_params)

You can find more details in the llama_index/callbacks/global_handlers.py file. Please try these solutions and let me know if you need further assistance.
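To make the pattern the bot quotes concrete, here is a minimal, stdlib-only sketch of a "global handler": a module-level slot that a factory fills in, mirroring the shape of set_global_handler above. The registry, the PrintHandler class, and create_global_handler's internals here are illustrative assumptions, not the actual llama-index implementation.

```python
from typing import Any, Callable

# Module-level slot, analogous to llama_index.global_handler (sketch only).
global_handler: Any = None


class PrintHandler:
    """Toy callback handler that just records event names (hypothetical)."""

    def __init__(self, **params: Any) -> None:
        self.params = params
        self.events: list[str] = []

    def on_event(self, name: str) -> None:
        self.events.append(name)


# Registry mapping eval-mode names to handler factories (hypothetical).
_HANDLERS: dict[str, Callable[..., Any]] = {"print": PrintHandler}


def create_global_handler(eval_mode: str, **eval_params: Any) -> Any:
    """Look up a factory for eval_mode and build the handler."""
    try:
        factory = _HANDLERS[eval_mode]
    except KeyError:
        raise ValueError(f"Eval mode {eval_mode!r} not supported.")
    return factory(**eval_params)


def set_global_handler(eval_mode: str, **eval_params: Any) -> None:
    """Set the process-wide handler, mirroring the function quoted above."""
    global global_handler
    global_handler = create_global_handler(eval_mode, **eval_params)


set_global_handler("print", verbose=True)
global_handler.on_event("query_start")
print(global_handler.events)  # -> ['query_start']
```

The point of the pattern is that any library code can reach the handler through the module attribute without threading it through every call, which is also why a stale or mismatched install of the handler package breaks it at lookup time.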
Hi @austinmw are you using a fresh virtualenv? With v0.10.xx we're now using Python namespaces, whereas v0.9.xx and earlier did not, so any remnants of legacy llama-index in the virtualenv will break things. So essentially, the recommendation is to start from a fresh virtualenv.
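To diagnose the stale-environment problem described above, it helps to confirm which interpreter is actually running and whether a given package resolves from it. The snippet below uses only the standard library; the `in_this_env` helper is mine, not part of llama-index, and the placeholder package name is deliberately fake.

```python
import importlib.util
import sys

# Which interpreter and environment prefix are actually in use?
print("interpreter:", sys.executable)
print("prefix:", sys.prefix)


def in_this_env(package: str) -> bool:
    """Return True if `package` is importable from the current environment."""
    return importlib.util.find_spec(package) is not None


# 'json' ships with Python, so it always resolves; a missing integration
# package (e.g. the arize handler) would return False here instead.
print(in_this_env("json"))                        # True
print(in_this_env("definitely_not_installed_x"))  # False
```

If `sys.executable` points somewhere other than your fresh virtualenv, or the integration package returns False, you are installing into one environment and running from another, which matches the symptom in this thread.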
In this case, for arize, you'll also want to install the corresponding integration package. A full package registry is here: https://pretty-sodium-5e0.notion.site/ce81b247649a44e4b6b35dfb24af28a6?v=53b3c2ced7bb4c9996b81b83c9f01139
@logan-markewich Hey, I actually did install that as well. I think the install error is a false alarm.
@austinmw ahhh I see now. I think arize is just incompatible with v0.10.0 at the moment (just got off a call with some of their team, they are working on it)
This was driving me crazy until I realized this is an issue with the 0.10.x changes. e.g. from llama_index.llms.types import ChatMessage, ChatResponse should be pointing to the new module location. Some of the files impacted:
Depending on what you are using, this may surface in more locations, but primarily the issue is due to the movement of core classes and the outdated references in calling libraries.
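For code that has to survive this kind of module reshuffle, one defensive option is a small helper that tries the new import path first and falls back to the old one. This is a generic sketch, not llama-index API; the real llama-index paths in the comment are the commenter's example plus an assumed v0.10.x location, and the runnable demo uses stdlib modules so it works anywhere.

```python
from importlib import import_module


def import_from_first(attr: str, *module_paths: str):
    """Return `attr` from the first module path that provides it."""
    for path in module_paths:
        try:
            module = import_module(path)
            return getattr(module, attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{attr} not found in any of {module_paths}")


# Hypothetical llama-index usage across the 0.9.x -> 0.10.x split
# (module paths are assumptions, shown for shape only):
#   ChatMessage = import_from_first(
#       "ChatMessage", "llama_index.core.llms", "llama_index.llms.types")

# Stdlib demo: OrderedDict lives in collections, not in the fake module.
OrderedDict = import_from_first("OrderedDict", "no_such_module", "collections")
print(OrderedDict)  # <class 'collections.OrderedDict'>
```

This keeps a single call site working across both layouts, at the cost of hiding which version is actually installed, so it is best treated as a migration shim rather than a permanent fix.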
@rit-github @austinmw This is now fixed!
If anyone is using LlamaIndex 0.9.xx but still wants to use Phoenix, they can; they just have to downgrade the instrumentation
Any chance you could update the Colab that was shared in that YT video? https://www.youtube.com/watch?v=CeDS1yvw9E4&t=211s
After installing the above packages, the problem is still not fixed with llama-index 0.10.34
Bug Description
The query_pipeline.ipynb example shows the following:

This results in:
I tried modifying it to:
However, I get:
Even though this library is already installed.
Version
0.10.1
Steps to Reproduce
Added steps in description
Relevant Logs/Tracebacks
No response