
fix(langchain): metrics support #2154

Merged
merged 1 commit into from
Oct 16, 2024

Conversation

nirga
Member

@nirga nirga commented Oct 16, 2024

Fixes #1912

  • I have added tests that cover my changes.
  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.
  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ....
  • (If applicable) I have updated the documentation accordingly.

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. python Pull requests that update Python code testing labels Oct 16, 2024
@nirga nirga force-pushed the langchain-metrics branch 2 times, most recently from 0b1b1b3 to 5c87a34 Compare October 16, 2024 15:14
@nirga nirga requested a review from galkleinman October 16, 2024 15:14
@nirga nirga force-pushed the langchain-metrics branch from 5c87a34 to 0365b9f Compare October 16, 2024 15:17
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Oct 16, 2024
@nirga nirga force-pushed the langchain-metrics branch from 0365b9f to f3dde89 Compare October 16, 2024 15:25
@nirga nirga merged commit f5542f5 into main Oct 16, 2024
9 checks passed
@nirga nirga deleted the langchain-metrics branch October 16, 2024 15:29
@nitin302

@nirga

  1. I see that we are adding custom attributes to metrics. However, TRACELOOP_ENTITY_NAME (traceloop.entity.name) is a very high-cardinality attribute - https://github.com/traceloop/openllmetry/pull/2154/files#diff-caff857eafc9121a585e40cdac1c0815d97f474cd3d0bf1a23755572e3b38105R473
  • Most destinations impose limits on metric attributes with very high cardinality.
  • This kind of information is better conveyed through traces/spans; at the least, there should be some control to enable/disable this metric or limit its attributes.
  2. I'm seeing this error for gen_ai.response.model when the model is invoked via an agent:
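One way to keep a high-cardinality attribute such as traceloop.entity.name off metric data points (while still recording it on spans) is to filter the attribute set before each metric record call. This is a pure-Python sketch of an allow-list filter; the key set and helper name are illustrative, not part of the library (the OpenTelemetry SDK's View mechanism offers a comparable, configuration-level control):

```python
# Illustrative allow-list of low-cardinality metric attribute keys.
LOW_CARDINALITY_KEYS = {"gen_ai.system", "gen_ai.response.model", "server.address"}

def limit_metric_attributes(attributes, allowed=LOW_CARDINALITY_KEYS):
    """Drop high-cardinality attributes (e.g. traceloop.entity.name)
    before recording a metric data point."""
    return {k: v for k, v in attributes.items() if k in allowed}
```

Applied before `histogram.record(...)`, this would keep entity names out of metric streams without touching the span attributes.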
Failed to encode key gen_ai.response.model: Invalid type <class 'NoneType'> of value None
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/common/_internal/__init__.py", line 113, in _encode_attributes
    pb2_attributes.append(_encode_key_value(key, value))
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/common/_internal/__init__.py", line 94, in _encode_key_value
    return PB2KeyValue(key=key, value=_encode_value(value))
                                      ^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/common/_internal/__init__.py", line 90, in _encode_value
    raise Exception(f"Invalid type {type(value)} of value {value}")
Exception: Invalid type <class 'NoneType'> of value None
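The traceback shows the OTLP protobuf encoder rejecting a None attribute value. A defensive sketch (a hypothetical helper, not something the instrumentation currently does) that strips None-valued keys before recording:

```python
def drop_none_attributes(attributes):
    """Remove None-valued entries, which the OTLP exporter's
    _encode_value rejects with "Invalid type <class 'NoneType'>"."""
    return {k: v for k, v in attributes.items() if v is not None}
```

Applied to the failing case, an attribute dict like `{"gen_ai.response.model": None}` would yield an empty attribute set instead of raising during export.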
  3. I do not see gen_ai.client.token.usage being generated when used as below. I do see it generated with a simple LLMChain, but not when invoked via an agent.
    import os

    from langchain.agents import AgentExecutor, create_openai_tools_agent
    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    from langchain_openai import AzureChatOpenAI

    llm = AzureChatOpenAI(
        openai_api_version=os.environ.get("AZURE_OPENAI_VERSION"),
        azure_deployment=os.environ.get("AZURE_OPENAI_DEPLOYMENT"),
        azure_endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT"),
        api_key=os.environ.get("AZURE_OPENAI_KEY"),
        model="gpt-4",
    )
    event_message = "this is a test"
    tools = [Tool().useless_tool]  # user-defined tool

    agent_template = categorise_event_prompt  # user-defined system prompt
    input_template = "{input}"

    prompt = ChatPromptTemplate.from_messages([
        ("system", agent_template),
        ("human", input_template),
        MessagesPlaceholder("agent_scratchpad"),
    ])
    agent = create_openai_tools_agent(llm=llm, tools=tools, prompt=prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
    result = agent_executor.invoke({"input": event_message})

Successfully merging this pull request may close these issues.

🐛 Bug Report: Metrics wasn't sent when use chatmodel (openai , chatopenai ) from langchain