
chore: update changelog for version 2.21.0 #12338

Closed (wants to merge 1 commit)

Conversation

erikayasuda
Contributor

  • update changelog for version 2.21.0

@erikayasuda erikayasuda requested a review from a team as a code owner February 13, 2025 21:20

CODEOWNERS have been resolved as:

CHANGELOG.md                                                            @DataDog/apm-python

pr-commenter bot commented Feb 13, 2025

Benchmarks

Benchmark execution time: 2025-02-13 22:01:04

Comparing candidate commit ae08ea9 in PR branch release.script/changelog-update-2.21.0 with baseline commit bcf20e0 in branch main.

Found 0 performance improvements and 0 performance regressions. Performance is the same for 394 metrics; 2 metrics were unstable.

### New Features

- openai: Introduces tracing support to the OpenAI integration for Python versions 3.12 and 3.13.
### Deprecation Notes

- `convert_rc_trace_sampling_rules`: deprecated; this attribute is internal to the tracer.
- `enable_remote_configuration`: deprecated; use the `DD_REMOTE_CONFIGURATION_ENABLED` environment variable instead.
- `get_from`: deprecated; use `ddtrace.trace.Pin` to set instance-level configurations.
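As a rough sketch of the migration path for the deprecated `enable_remote_configuration` setting, the environment-variable equivalent would look like the following (the application entry point `app.py` is a hypothetical placeholder):

```shell
# Control remote configuration via the environment instead of the
# deprecated enable_remote_configuration attribute.
export DD_REMOTE_CONFIGURATION_ENABLED=false

# Launch the instrumented application as usual (app.py is a placeholder).
ddtrace-run python app.py
```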
### Bug Fixes

- LLM Observability: resolves an issue where extracting token metadata from openai streamed chat completion token chunks caused an `IndexError`.
- vertexai: resolves an issue with `chat.send_message()` where the content keyword argument was not parsed correctly.
- profiling: fixes a `SystemError` caused by the memory profiler returning NULL when collecting events.
### Other Changes

@erikayasuda (Contributor, Author)

Closing in favor of consolidating with #12391
