Add OpenAI example
Signed-off-by: Adrian Cole <[email protected]>
codefromthecrypt committed Nov 20, 2024
1 parent 19a59e4 commit 6e3a92b
Showing 8 changed files with 120 additions and 2 deletions.
2 changes: 1 addition & 1 deletion .pylintrc
@@ -7,7 +7,7 @@ extension-pkg-whitelist=cassandra

# Add list of files or directories to be excluded. They should be base names, not
# paths.
ignore=CVS,gen,Dockerfile,docker-compose.yml,README.md,requirements.txt,docs
ignore=CVS,gen,Dockerfile,docker-compose.yml,README.md,requirements.txt,docs,.venv

# Add files or directories matching the regex patterns to be excluded. The
# regex matches against base names, not paths.
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -13,9 +13,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Added

- Add example to `opentelemetry-instrumentation-openai-v2`
([#3006](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3006))
- `opentelemetry-instrumentation-sqlalchemy` Update unit tests to run with SQLALchemy 2
([#2976](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2976))
- Add `opentelemetry-instrumentation-openai-v2` to `opentelemetry-bootstrap`
([#2996](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2996))
- `opentelemetry-instrumentation-sqlalchemy` Add sqlcomment to `db.statement` attribute
([#2937](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2937))
@@ -0,0 +1,18 @@
# Update this with your real OpenAI API key
OPENAI_API_KEY=sk-YOUR_API_KEY

# Uncomment to use Ollama instead of OpenAI
# OPENAI_BASE_URL=http://localhost:11434/v1
# OPENAI_API_KEY=unused
# CHAT_MODEL=qwen2.5:0.5b

OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
OTEL_SERVICE_NAME=opentelemetry-python-openai

# Change to 'false' to disable logging
OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
# Change to 'console' if your OTLP endpoint doesn't support logs
OTEL_LOGS_EXPORTER=otlp_proto_http
# Change to 'false' to hide prompt and completion content
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
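The example loads this file via `python-dotenv` (or Docker's `env_file`). As a rough illustration of the format only, here is a minimal stdlib-only sketch of how such a file maps to key/value pairs; real loaders additionally handle quoting and variable expansion:

```python
def parse_env(text):
    """Toy .env parser: skip blanks and '#' comments, split on first '='."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


sample = """\
# comment lines are ignored
OTEL_SERVICE_NAME=opentelemetry-python-openai
OTEL_LOGS_EXPORTER=otlp_proto_http
"""
print(parse_env(sample)["OTEL_SERVICE_NAME"])  # → opentelemetry-python-openai
```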
@@ -0,0 +1,10 @@
# Use an alpine image to make the runtime smaller
FROM docker.io/python:3.12.7-alpine3.20
RUN python -m pip install --upgrade pip

COPY /requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt

COPY main.py /

CMD [ "opentelemetry-instrument", "python", "main.py" ]
@@ -0,0 +1,52 @@
OpenTelemetry OpenAI Instrumentation Example
============================================

This is an example of how to instrument OpenAI calls with zero code changes,
using `opentelemetry-instrument`.

When `main.py <main.py>`_ is run, it exports traces and logs to an
OTLP-compatible endpoint. Traces include details such as the model used and
the duration of the chat request. Logs capture the chat request and the
generated response, providing a comprehensive view of the performance and
behavior of your OpenAI requests.

Setup
-----

Minimally, update the `.env <.env>`_ file with your "OPENAI_API_KEY". An
OTLP-compatible endpoint should be listening for traces and logs on
http://localhost:4318. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.

Run with Docker
---------------

If you have Docker installed, you can run the example in one step:

::

docker-compose run --build --rm python-opentelemetry-openai

You should see a poem generated by OpenAI while traces and logs export to your
configured observability tool.

Run with Python
---------------

If you prefer to run the example with Python, set up a virtual environment for
the example like this:

::

python3 -m venv .venv
source .venv/bin/activate
pip install "python-dotenv[cli]"
pip install -r requirements.txt

Now, run the example like this:

::

dotenv run -- opentelemetry-instrument python main.py

You should see a poem generated by OpenAI while traces and logs export to your
configured observability tool.
@@ -0,0 +1,9 @@
services:
  opentelemetry-python-openai:
    container_name: opentelemetry-python-openai
    build:
      context: .
    env_file:
      - .env
    environment:
      OTEL_EXPORTER_OTLP_ENDPOINT: "http://host.docker.internal:4318"
@@ -0,0 +1,21 @@
import os

from openai import OpenAI


def main():
    client = OpenAI()
    chat_completion = client.chat.completions.create(
        model=os.getenv("CHAT_MODEL", "gpt-4o-mini"),
        messages=[
            {
                "role": "user",
                "content": "Write a short poem on OpenTelemetry.",
            },
        ],
    )
    print(chat_completion.choices[0].message.content)


if __name__ == "__main__":
    main()
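Note that `main.py` contains no OpenTelemetry code at all: `opentelemetry-instrument` wraps the client methods at startup. A toy stdlib-only sketch of that wrapping technique (the class, method, and "span" output here are illustrative, not the real instrumentation):

```python
import functools
import time


class FakeClient:
    """Stand-in for an SDK client we want to observe without editing."""

    def create(self, model):
        return f"completion from {model}"


def instrument(cls, method_name):
    """Replace a method with a wrapper that records a span-like timing."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        start = time.perf_counter()
        result = original(self, *args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"span: {cls.__name__}.{method_name} took {elapsed:.6f}s")
        return result

    setattr(cls, method_name, wrapper)


instrument(FakeClient, "create")
print(FakeClient().create("gpt-4o-mini"))  # timed call, unchanged result
```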
@@ -0,0 +1,6 @@
openai~=1.54.4

opentelemetry-sdk~=1.28.2
opentelemetry-exporter-otlp-proto-http~=1.28.2
opentelemetry-distro~=0.49b2
opentelemetry-instrumentation-openai-v2~=2.0b0
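These pins use `~=`, PEP 440's compatible-release operator: `openai~=1.54.4` allows any `1.54.x` with `x >= 4` but excludes `1.55.0`. A rough sketch of that rule for plain three-part numeric versions (use the `packaging` library for real version handling):

```python
def compatible(version, spec):
    """Rough PEP 440 '~=' check for numeric versions:
    ~=1.54.4 means >= 1.54.4 and == 1.54.*"""
    base = [int(p) for p in spec.split(".")]
    v = [int(p) for p in version.split(".")]
    return v[: len(base) - 1] == base[:-1] and v >= base


print(compatible("1.54.10", "1.54.4"))  # → True
print(compatible("1.55.0", "1.54.4"))   # → False
```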
