
[Bug]: Error with set_global_handler("arize_phoenix") #10602

Closed
austinmw opened this issue Feb 12, 2024 · 10 comments
Labels
bug (Something isn't working), triage (Issue needs to be triaged/prioritized)

Comments

@austinmw
Contributor

Bug Description

The query_pipeline.ipynb example shows the following:

# setup Arize Phoenix for logging/observability
import phoenix as px

px.launch_app()
import llama_index

llama_index.set_global_handler("arize_phoenix")

This results in:

AttributeError: module 'llama_index' has no attribute 'set_global_handler'

I tried modifying it to:

# setup Arize Phoenix for logging/observability
import phoenix as px

px.launch_app()
from llama_index.core import set_global_handler

set_global_handler("arize_phoenix")

However I get:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/llama_index/callbacks/arize_phoenix/base.py:8, in arize_phoenix_callback_handler(**kwargs)
      7 try:
----> 8     from phoenix.trace.llama_index import OpenInferenceTraceCallbackHandler
      9 except ImportError:

File /opt/conda/lib/python3.10/site-packages/phoenix/trace/llama_index/__init__.py:1
----> 1 from .callback import OpenInferenceTraceCallbackHandler
      2 from .debug_callback import DebugCallbackHandler

File /opt/conda/lib/python3.10/site-packages/phoenix/trace/llama_index/callback.py:8
      4 from typing import (
      5     Any,
      6 )
----> 8 from openinference.instrumentation.llama_index._callback import (
      9     OpenInferenceTraceCallbackHandler as _OpenInferenceTraceCallbackHandler,
     10 )
     11 from openinference.instrumentation.llama_index.version import (
     12     __version__,
     13 )

File /opt/conda/lib/python3.10/site-packages/openinference/instrumentation/llama_index/__init__.py:5
      3 from typing import TYPE_CHECKING, Any, Collection
----> 5 from openinference.instrumentation.llama_index._callback import OpenInferenceTraceCallbackHandler
      6 from openinference.instrumentation.llama_index.package import _instruments

File /opt/conda/lib/python3.10/site-packages/openinference/instrumentation/llama_index/_callback.py:40
     38 from wrapt import ObjectProxy
---> 40 from llama_index.callbacks.base_handler import BaseCallbackHandler
     41 from llama_index.callbacks.schema import (
     42     BASE_TRACE_EVENT,
     43     CBEventType,
     44     EventPayload,
     45 )

ModuleNotFoundError: No module named 'llama_index.callbacks.base_handler'

During handling of the above exception, another exception occurred:

ImportError                               Traceback (most recent call last)
Cell In[3], line 7
      4 px.launch_app()
      5 from llama_index.core import set_global_handler
----> 7 set_global_handler("arize_phoenix")

File /opt/conda/lib/python3.10/site-packages/llama_index/core/callbacks/global_handlers.py:11, in set_global_handler(eval_mode, **eval_params)
      8 """Set global eval handlers."""
      9 import llama_index.core
---> 11 llama_index.core.global_handler = create_global_handler(eval_mode, **eval_params)

File /opt/conda/lib/python3.10/site-packages/llama_index/core/callbacks/global_handlers.py:51, in create_global_handler(eval_mode, **eval_params)
     45     except ImportError:
     46         raise ImportError(
     47             "ArizePhoenixCallbackHandler is not installed. "
     48             "Please install it using `pip install llama-index-callbacks-arize-phoenix`"
     49         )
---> 51     handler = arize_phoenix_callback_handler(**eval_params)
     52 elif eval_mode == "honeyhive":
     53     try:

File /opt/conda/lib/python3.10/site-packages/llama_index/callbacks/arize_phoenix/base.py:10, in arize_phoenix_callback_handler(**kwargs)
      8     from phoenix.trace.llama_index import OpenInferenceTraceCallbackHandler
      9 except ImportError:
---> 10     raise ImportError(
     11         "Please install Arize Phoenix with `pip install -q arize-phoenix`"
     12     )
     13 return OpenInferenceTraceCallbackHandler(**kwargs)

ImportError: Please install Arize Phoenix with `pip install -q arize-phoenix`

Even though this library is already installed.

Version

0.10.1

Steps to Reproduce

Added steps in description

Relevant Logs/Tracebacks

No response

@austinmw austinmw added bug Something isn't working triage Issue needs to be triaged/prioritized labels Feb 12, 2024

dosubot bot commented Feb 12, 2024

🤖

Hey @austinmw, fancy seeing you here again! Let's crack this new puzzle together 🕵️‍♂️

Based on the issues you're encountering, here are some potential solutions:

  1. AttributeError with 'set_global_handler': This function is defined in the 'llama_index/callbacks/global_handlers.py' file. If you're trying to access it from the 'llama_index' module, you might need to import it correctly. Use the following import statement:
from llama_index.callbacks.global_handlers import set_global_handler
  2. ModuleNotFoundError for 'llama_index.callbacks.base_handler': This error suggests that Python can't find the 'base_handler' module in the 'llama_index.callbacks' package. Make sure that the 'base_handler.py' file exists in the 'callbacks' directory of the 'llama_index' package. If it does, check your PYTHONPATH environment variable to ensure it includes the path to the 'llama_index' package.

  3. ImportError suggesting to install Arize Phoenix: This error might occur if the 'arize_phoenix_callback_handler' is not properly installed or not in the PYTHONPATH. Even though you have Arize Phoenix installed, it might not be in a location that Python can find. Check your Python environment and ensure that Arize Phoenix is installed in the same environment where you're running your LlamaIndex application. If it's installed in a different environment, Python won't be able to find it. You can check your current Python environment with the following command:

which python

And you can check where the 'arize_phoenix_callback_handler' is installed with:

pip show arize_phoenix_callback_handler

If they're not in the same environment, you'll need to install 'arize_phoenix_callback_handler' in the correct environment.

For your reference, the 'set_global_handler' function does exist in the 'llama_index/callbacks/global_handlers.py' file in the LlamaIndex codebase. Here is the function:

def set_global_handler(eval_mode: str, **eval_params: Any) -> None:
    """Set global eval handlers."""
    import llama_index

    llama_index.global_handler = create_global_handler(eval_mode, **eval_params)

You can find more details in the llama_index/callbacks/global_handlers.py file.

Please try these solutions and let me know if you need further assistance.


@nerdai
Contributor

nerdai commented Feb 12, 2024

Hi @austinmw are you using a fresh virtualenv? With v0.10.xx we're now using Python namespaces, whereas v0.9.xx and earlier did not. So any remnants of legacy llama-index in the virtualenv will break things.

So essentially, the recommendation is to:

  1. use a fresh virtualenv
  2. install either llama-index (or llama-index-core)
  3. install the other necessary llama-index packages for your script/app
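Those steps might look like the following in a POSIX shell (a sketch only; the environment name is arbitrary, and the exact set of integration packages depends on your script — the Arize Phoenix callback package is the one relevant to this issue):

```shell
# 1. Start from a fresh virtual environment so no v0.9.xx files linger
python -m venv fresh-venv
source fresh-venv/bin/activate

# 2. Install llama-index (or llama-index-core for the minimal core)
pip install llama-index

# 3. Install the other packages your script/app needs
pip install llama-index-callbacks-arize-phoenix arize-phoenix
```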

@logan-markewich
Collaborator

logan-markewich commented Feb 12, 2024

In this case, for Arize, you'll also want to run pip install llama-index-callbacks-arize-phoenix

A full package registry is here: https://pretty-sodium-5e0.notion.site/ce81b247649a44e4b6b35dfb24af28a6?v=53b3c2ced7bb4c9996b81b83c9f01139

@austinmw
Contributor Author

@logan-markewich Hey, I actually did install that as well. I think the install error message is a red herring.

@logan-markewich
Collaborator

@austinmw ahhh I see now.

I think arize is just incompatible with v0.10.0 at the moment (just got off a call with some of their team, they are working on it)

@rit-github

This was driving me crazy until I realized this is an issue with the 0.10.x changes.
There are still several references that should have changed to llama_index.core.xxxx, but .core is missing in the old files.

e.g.
from llama_index.callbacks should be from llama_index.core.callbacks
from llama_index.response.schema has moved to from llama_index.core.base.response.schema

from llama_index.llms.types import ChatMessage, ChatResponse should now point to:
from llama_index.core.llms import ChatMessage, ChatResponse

Some of the files impacted :
site-packages/phoenix/trace/llama_index/

  • callback.py
  • streaming.py
  • debug_callback.py

Depending upon what you are using, this may surface in more locations, but primarily the issue is due to the movement of core classes and the outdated references in calling libraries.
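While waiting for the calling libraries to catch up, one way to paper over moved classes is a small resolver that tries the new v0.10.x module path first and falls back to the legacy path. This is a hypothetical helper, not part of LlamaIndex; the demo at the bottom uses stdlib modules so the sketch runs anywhere:

```python
import importlib


def resolve(new_path: str, old_path: str, name: str):
    """Return `name` from the first module path that imports cleanly.

    Tries the v0.10.x location first, then the legacy location.
    """
    for path in (new_path, old_path):
        try:
            return getattr(importlib.import_module(path), name)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{name} not found in {new_path!r} or {old_path!r}")


# Intended use (requires llama-index installed, shown for illustration):
# ChatMessage = resolve("llama_index.core.llms", "llama_index.llms.types", "ChatMessage")

# Stdlib demonstration so this sketch is runnable as-is:
dumps = resolve("json", "json", "dumps")
print(dumps({"ok": True}))  # → {"ok": true}
```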

@logan-markewich
Collaborator

@rit-github @austinmw This is now fixed!

pip install -U "llama-index-callbacks-arize-phoenix>=0.1.2" "openinference-instrumentation-llama-index>=1.0.0" "arize-phoenix>=3.0.3"
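To confirm the upgrade actually landed in the environment you're running from (the original report hit exactly this kind of mismatch), you could check the installed distributions with the stdlib's importlib.metadata. A sketch, using the minimum versions from the comment above:

```python
from importlib.metadata import PackageNotFoundError, version


def check(pkg: str, minimum: str) -> str:
    """Report the installed version of a distribution, or flag it as missing."""
    try:
        return f"{pkg}: {version(pkg)} (need >= {minimum})"
    except PackageNotFoundError:
        return f"{pkg}: NOT INSTALLED (need >= {minimum})"


for pkg, minimum in [
    ("llama-index-callbacks-arize-phoenix", "0.1.2"),
    ("openinference-instrumentation-llama-index", "1.0.0"),
    ("arize-phoenix", "3.0.3"),
]:
    print(check(pkg, minimum))
```

Run this from the same kernel/interpreter that raises the error; if any line says NOT INSTALLED there, the packages went into a different environment.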

@logan-markewich
Collaborator

If anyone is using LlamaIndex 0.9.xx but still wants to use Phoenix, they can; they just have to downgrade the instrumentation package: openinference-instrumentation-llama-index<1.0.0

@babinux

babinux commented Feb 16, 2024

Any chance you could update the Colab that was shared in that YT video? https://www.youtube.com/watch?v=CeDS1yvw9E4&t=211s

@thanhtheman

After installing the above packages, the problem is still not fixed with llama-index 0.10.34.

Development

No branches or pull requests

6 participants