Current State

The opentelemetry plugin_attr section from config-default.yaml:

opentelemetry:                      # Plugin: opentelemetry
  trace_id_source: x-request-id    # Specify the source of the trace ID for OpenTelemetry traces.
  resource:
    service.name: APISIX           # Set the service name for OpenTelemetry traces.
  collector:
    address: 127.0.0.1:4318        # Set the address of the OpenTelemetry collector to send traces to.
    request_timeout: 3             # Set the timeout for requests to the OpenTelemetry collector in seconds.
    request_headers:               # Set the headers to include in requests to the OpenTelemetry collector.
      Authorization: token         # Set the authorization header to include an access token.
  batch_span_processor:
    drop_on_queue_full: false      # Drop spans when the export queue is full.
    max_queue_size: 1024           # Set the maximum size of the span export queue.
    batch_timeout: 2               # Set the timeout for span batches to wait in the export queue before being sent.
    inactive_timeout: 1            # Set the timeout for spans to wait in the export queue before being sent, if the queue is not full.
    max_export_batch_size: 16      # Set the maximum number of spans to include in each batch sent to the OpenTelemetry collector.
  set_ngx_var: false               # Export OpenTelemetry variables to NGINX variables.
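With trace_id_source set to x-request-id, an incoming `x-request-id` header is only usable as a trace ID if it matches `[0-9a-f]{32}`. A minimal client-side sketch of generating a conforming value (the header name and pattern come from the config and docs above; the helper name is illustrative):

```python
import re
import uuid

# Pattern the opentelemetry plugin expects for a header-supplied trace ID.
TRACE_ID_RE = re.compile(r"^[0-9a-f]{32}$")

def make_request_id() -> str:
    """Generate an x-request-id value that is valid as a trace ID."""
    return uuid.uuid4().hex  # 32 lowercase hex characters

rid = make_request_id()
assert TRACE_ID_RE.match(rid), "x-request-id would not be accepted as a trace ID"
print(rid)
```

A client would then send this value in the `x-request-id` request header so that the plugin reuses it as the trace ID.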
The documentation, however, describes these parameters as follows:

| Name | Type | Default | Description |
|------|------|---------|-------------|
| trace_id_source | string | random | Source of the trace ID. Valid values are `random` or `x-request-id`. When set to `x-request-id`, the value of the `x-request-id` header is used as the trace ID. Make sure that it matches the regex pattern `[0-9a-f]{32}`. |
| collector.address | string | 127.0.0.1:4318 | Collector address. If the collector serves on HTTPS, use `https://127.0.0.1:4318` as the address. |
| collector.request_timeout | integer | 3 | Report request timeout in seconds. |
| collector.request_headers | object | | Report request HTTP headers. |
| batch_span_processor | object | | Trace span processor. |
| batch_span_processor.drop_on_queue_full | boolean | true | When set to `true`, drops spans when the queue is full; otherwise, forces processing of batches. |
| batch_span_processor.max_queue_size | integer | 2048 | Maximum queue size for buffering spans for delayed processing. |
| batch_span_processor.batch_timeout | number | 5 | Maximum time in seconds for constructing a batch. |
| batch_span_processor.max_export_batch_size | integer | 256 | Maximum number of spans to process in a single batch. |
| batch_span_processor.inactive_timeout | number | 2 | Time interval in seconds between processing batches. |
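The mismatch is easy to see by diffing the two value sets (values copied verbatim from the config snippet and the documentation table above):

```python
# batch_span_processor values shipped in config-default.yaml.
config = {
    "drop_on_queue_full": False,
    "max_queue_size": 1024,
    "batch_timeout": 2,
    "inactive_timeout": 1,
    "max_export_batch_size": 16,
}

# Defaults stated in the plugin documentation.
documented = {
    "drop_on_queue_full": True,
    "max_queue_size": 2048,
    "batch_timeout": 5,
    "inactive_timeout": 2,
    "max_export_batch_size": 256,
}

# Every documented key whose shipped value differs from the documented default.
mismatches = {k: (config[k], documented[k])
              for k in documented if config[k] != documented[k]}
print(mismatches)
```

All five batch_span_processor keys differ from their documented defaults.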
The trace_id_source and batch_span_processor are not set to their default values as described in the documentation.
Desired State
The batch_span_processor values in the documentation match those in opentelemetry-lua, so the opentelemetry's batch_span_processor section in config-default.yaml needs to be corrected.
Keeping x-request-id as the default for trace_id_source might be a better choice since it hasn't changed since the opentelemetry plugin was added (PR #6119). Changing it could confuse users who have been using the default value for a long time. Only the documentation needs to be updated.
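Under that proposal, only the batch_span_processor block in config-default.yaml would change, aligned with the documented defaults (a sketch; surrounding keys and trace_id_source stay as in the current file):

```yaml
opentelemetry:
  trace_id_source: x-request-id   # kept as-is; the docs are updated to match instead
  # resource and collector sections unchanged
  batch_span_processor:
    drop_on_queue_full: true      # documented default
    max_queue_size: 2048          # documented default
    batch_timeout: 5              # documented default
    inactive_timeout: 2           # documented default
    max_export_batch_size: 256    # documented default
```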