tracer newRecordingSpan memory leak #11875

Open
PowerSurj opened this issue Dec 10, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@PowerSurj

PowerSurj commented Dec 10, 2024

Component(s)

No response

What happened?

Description

We are observing a memory leak in the collector.

[two screenshots attached showing the memory growth]

Steps to Reproduce

- Otelcol pods dedicated to tracing are running in Kubernetes.
- Kubernetes applications export traces to the collector's OTLP gRPC endpoint (a minimal sketch of this application-side setup follows this list).
- The collectors perform probabilistic sampling in equalizing mode and export the traces to AWS X-Ray.

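For context, a minimal sketch of the application-side export described above, assuming the standard OpenTelemetry Go SDK with an OTLP gRPC exporter; the endpoint name otel-collector:4317 and the tracer/span names are illustrative, not taken from our actual services:

package main

import (
	"context"
	"log"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func main() {
	ctx := context.Background()

	// OTLP gRPC exporter pointed at the collector's 0.0.0.0:4317 receiver.
	// "otel-collector:4317" is an illustrative Kubernetes service name.
	exp, err := otlptracegrpc.New(ctx,
		otlptracegrpc.WithEndpoint("otel-collector:4317"),
		otlptracegrpc.WithInsecure(),
	)
	if err != nil {
		log.Fatalf("creating OTLP exporter: %v", err)
	}

	// Batch spans before export and register the provider globally.
	tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exp))
	defer func() { _ = tp.Shutdown(ctx) }()
	otel.SetTracerProvider(tp)

	// Any spans the application creates from here on are sent to the collector.
	_, span := otel.Tracer("example").Start(ctx, "example-operation")
	span.End()
}
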
The full collector configuration is provided below.

Profiles

CPU, memory, and goroutine pprof profiles are attached:

pprof.otelcol-contrib.samples.cpu.001.pb.gz
pprof.otelcol-contrib.goroutine.001.pb.gz
pprof.otelcol-contrib.alloc_objects.alloc_space.inuse_objects.inuse_space.001.pb.gz
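
For reference, a minimal sketch of how such a profile can be pulled from the pprof extension configured below, assuming it serves the standard net/http/pprof endpoints under /debug/pprof/ on localhost:1777 (the output filename is arbitrary):

package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

// Fetches a heap profile from the collector's pprof extension and writes it to a local file.
func main() {
	resp, err := http.Get("http://localhost:1777/debug/pprof/heap")
	if err != nil {
		fmt.Fprintln(os.Stderr, "fetching heap profile:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	out, err := os.Create("heap.pb.gz")
	if err != nil {
		fmt.Fprintln(os.Stderr, "creating output file:", err)
		os.Exit(1)
	}
	defer out.Close()

	if _, err := io.Copy(out, resp.Body); err != nil {
		fmt.Fprintln(os.Stderr, "writing profile:", err)
		os.Exit(1)
	}
	fmt.Println("wrote heap.pb.gz; inspect it with `go tool pprof heap.pb.gz`")
}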

Collector version

0.114.0

Environment information

AWS EKS 1.30; the collectors run as containers.

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
exporters:
  awsxray:
    index_all_attributes: true
  debug/detailed:
    sampling_initial: 1
    verbosity: detailed
  debug/normal:
    verbosity: normal
processors:
  batch:
    send_batch_max_size: 10000
    timeout: 1s
  memory_limiter:
    check_interval: 1s
    limit_percentage: 75
    spike_limit_percentage: 15
  probabilistic_sampler:
    mode: equalizing
    sampling_percentage: 1
extensions:
  health_check:
    endpoint: 0.0.0.0:13133
  pprof:
    endpoint: localhost:1777
  zpages:
    endpoint: localhost:55679
service:
  extensions:
    - health_check
    - zpages
    - pprof
  telemetry:
    logs:
      level: info
    metrics:
      address: 0.0.0.0:8888
  pipelines:
    traces/awsxray:
      exporters:
        - debug/detailed
        - awsxray
      processors:
        - memory_limiter
        - probabilistic_sampler
        - batch
      receivers:
        - otlp

Log output

No errors in the logs.

@PowerSurj added the bug label Dec 10, 2024
@codeboten
Contributor

Based on the pprof output shown here, this looks closely related to #10858.

@dmitryax transferred this issue from open-telemetry/opentelemetry-collector-contrib on Dec 12, 2024
@atoulme
Contributor

atoulme commented Dec 13, 2024

I have reviewed this issue and filed open-telemetry/opentelemetry-go-contrib#6446 to follow up and fix it in the OpenTelemetry Go Contrib gRPC instrumentation library.
