
[prometheusexporter] - Invalid Metric Labels #4419

Open · jseiser opened this issue Feb 4, 2025 · 3 comments
Labels: bug (Something isn't working)


jseiser commented Feb 4, 2025

Component(s)

prometheusexporter

What happened?

Description

The Prometheus metrics will eventually start returning incorrect labels: label values show up under the wrong label keys (e.g. net_host_port="/" instead of http_target="/").

Steps to Reproduce

Minimal reproduction: https://github.com/jseiser/otlp_fastapi_repl/tree/main
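For context, the app in the linked repo is a plain FastAPI service. A minimal sketch of what ui/main.py presumably looks like (the handler below is an assumption inferred from the uvicorn command and curl responses in the steps; the actual repo may differ):

```python
# ui/main.py - hypothetical minimal version of the repro app; the real
# repo may differ. All telemetry comes from opentelemetry-instrument
# auto-instrumentation, not from anything in this file.
from fastapi import FastAPI

app = FastAPI()


@app.get("/")
async def root():
    # Matches the {"message":"Hello World"} responses in the repro steps.
    return {"message": "Hello World"}
```

The Prometheus endpoint on :9464 is served by the auto-instrumentation, presumably via OTEL_METRICS_EXPORTER=prometheus in the .env file that pipenv loads; 9464 is that exporter's default port.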

Expected Result

Correct metric labels

Actual Result

1. Start the app:

pipenv run opentelemetry-instrument uvicorn ui.main:app --proxy-headers --host=0.0.0.0 --port=5000 --log-level=error
Loading .env environment variables..

2. Check metrics - makes sense, no traffic has happened yet:

curl http://localhost:9464/metrics
# HELP python_gc_objects_collected_total Objects collected during gc
# TYPE python_gc_objects_collected_total counter
python_gc_objects_collected_total{generation="0"} 2258.0
python_gc_objects_collected_total{generation="1"} 2915.0
python_gc_objects_collected_total{generation="2"} 60.0
# HELP python_gc_objects_uncollectable_total Uncollectable objects found during GC
# TYPE python_gc_objects_uncollectable_total counter
python_gc_objects_uncollectable_total{generation="0"} 0.0
python_gc_objects_uncollectable_total{generation="1"} 0.0
python_gc_objects_uncollectable_total{generation="2"} 0.0
# HELP python_gc_collections_total Number of times this generation was collected
# TYPE python_gc_collections_total counter
python_gc_collections_total{generation="0"} 170.0
python_gc_collections_total{generation="1"} 15.0
python_gc_collections_total{generation="2"} 1.0
# HELP python_info Python platform information
# TYPE python_info gauge
python_info{implementation="CPython",major="3",minor="11",patchlevel="10",version="3.11.10"} 1.0
# HELP process_virtual_memory_bytes Virtual memory size in bytes.
# TYPE process_virtual_memory_bytes gauge
process_virtual_memory_bytes 4.3268096e+08
# HELP process_resident_memory_bytes Resident memory size in bytes.
# TYPE process_resident_memory_bytes gauge
process_resident_memory_bytes 7.2364032e+07
# HELP process_start_time_seconds Start time of the process since unix epoch in seconds.
# TYPE process_start_time_seconds gauge
process_start_time_seconds 1.73869183682e+09
# HELP process_cpu_seconds_total Total user and system CPU time spent in seconds.
# TYPE process_cpu_seconds_total counter
process_cpu_seconds_total 0.66
# HELP process_open_fds Number of open file descriptors.
# TYPE process_open_fds gauge
process_open_fds 18.0
# HELP process_max_fds Maximum number of open file descriptors.
# TYPE process_max_fds gauge
process_max_fds 1024.0
3. Hit the site once:

curl http://localhost:5000
{"message":"Hello World"}

4. Check metrics - looks good; note http_target="/" and net_host_port="5000":
http_server_duration_milliseconds_bucket{http_flavor="1.1",http_host="127.0.0.1:5000",http_method="GET",http_scheme="http",http_server_name="localhost:5000",http_status_code="200",http_target="/",le="0.0",net_host_port="5000"} 0.0

http_server_response_size_bytes_bucket{http_flavor="1.1",http_host="127.0.0.1:5000",http_method="GET",http_scheme="http",http_server_name="localhost:5000",http_status_code="200",http_target="/",le="0.0",net_host_port="5000"} 0.0
5. Hit the site a few more times:

curl http://localhost:5000
{"message":"Hello World"}
curl http://localhost:5000
{"message":"Hello World"}

6. Check metrics - still good.

7. Hit something else:

curl http://localhost:5000/fakepath
{"detail":"Not Found"}

8. Check metrics - broken. The http_target label has disappeared and its value "/" now appears under net_host_port:
http_server_duration_milliseconds_bucket{http_flavor="1.1",http_host="127.0.0.1:5000",http_method="GET",http_scheme="http",http_server_name="localhost:5000",http_status_code="200",le="0.0",net_host_port="/"} 5.0                           

http_server_duration_milliseconds_bucket{http_flavor="1.1",http_host="127.0.0.1:5000",http_method="GET",http_scheme="http",http_server_name="localhost:5000",http_status_code="404",le="0.0",net_host_port="5000"} 1.0

http_server_response_size_bytes_bucket{http_flavor="1.1",http_host="127.0.0.1:5000",http_method="GET",http_scheme="http",http_server_name="localhost:5000",http_status_code="404",le="0.0",net_host_port="5000"} 0.0
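A quick way to spot the corruption without eyeballing the scrape is to parse the exposition text and flag any net_host_port value that is not a numeric port. This is a hypothetical helper, not part of the repro repo; it assumes the prometheus_client package (already a dependency of the exporter) is available:

```python
# check_labels.py - hypothetical helper, not part of the repro repo.
# Scrapes the exporter and flags samples whose net_host_port label
# is not numeric (e.g. the "/" value seen in step 8).
import urllib.request

from prometheus_client.parser import text_string_to_metric_families

body = urllib.request.urlopen("http://localhost:9464/metrics").read().decode()
for family in text_string_to_metric_families(body):
    for sample in family.samples:
        port = sample.labels.get("net_host_port")
        if port is not None and not port.isdigit():
            print(f"bad label value: {sample.name} {sample.labels}")
```

Against the step 8 scrape this flags the http_server_duration_milliseconds buckets, whose net_host_port carries the old http_target value.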

Collector version

NONE

Environment information

Environment

OS: Linux

OpenTelemetry Collector configuration

Log output

Additional context

No response

jseiser added the bug label Feb 4, 2025

github-actions bot commented Feb 4, 2025

Pinging code owners for exporter/prometheus: @Aneurysm9 @dashpole @ArthurSens. See Adding Labels via Comments if you do not have permissions to add labels yourself. For example, comment '/label priority:p2 -needs-triaged' to set the priority and remove the needs-triaged label.

dashpole (Contributor) commented Feb 5, 2025

Please provide your collector version and collector configuration.

jpkrohling transferred this issue from open-telemetry/opentelemetry-collector-contrib Feb 5, 2025
jpkrohling (Member) commented:

Moved to Python as requested by @emdneto
