opentelemetry-collector-contrib: [prometheusremotewrite] invalid temporality and type combination when remote write to thanos backend

What happened?

Description

I tried to use Prometheus remote write to a Thanos backend and then display the metrics in Grafana. I found many errors in the OTel Collector log like “Permanent error: invalid temporality and type combination”. As a result, Thanos is missing many metrics used in the Grafana dashboard. Any idea or solution for this?

Steps to Reproduce

  1. fluentbit node_metrics output
  2. prometheus remote write exporter
  3. thanos as the backend
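The steps above correspond roughly to a collector configuration like this sketch (the receiver choice, endpoint, and names are assumptions for illustration, not taken from the reporter's actual setup):

```yaml
receivers:
  otlp:                      # assumption: how the node metrics enter the collector
    protocols:
      grpc:

exporters:
  prometheusremotewrite:
    # hypothetical Thanos Receive remote-write endpoint
    endpoint: http://thanos-receive:19291/api/v1/receive

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [prometheusremotewrite]
```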

Expected Result

Actual Result

Collector version

0.61.0

Environment information

Environment

OS: (e.g., “Ubuntu 20.04”)
Compiler (if manually compiled): (e.g., “go 14.2”)

OpenTelemetry Collector configuration

No response

Log output

2022-10-19T11:27:07.375+0800    error   exporterhelper/queued_retry.go:395      Exporting failed. The error is not retryable. Dropping data.    {"kind": "exporter", "data_type": "metrics", "name": "prometheusremotewrite", "error": "Permanent error: invalid temporality and type combination;

Additional context

No response

About this issue

  • State: open
  • Created 2 years ago
  • Comments: 20 (5 by maintainers)

Most upvoted comments

I have the same issue with opentelemetry-collector-contrib 0.81.0

2023-07-24T21:17:44.155Z	info	service/service.go:148	Everything is ready. Begin running and processing data.
2023-07-24T21:18:37.504Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 2, "data points": 2}
2023-07-24T21:18:37.504Z	info	ResourceMetrics #0
Resource SchemaURL: 
Resource attributes:
     -> telemetry.sdk.language: Str(python)
     -> telemetry.sdk.name: Str(opentelemetry)
     -> telemetry.sdk.version: Str(1.19.0)
     -> namespace: Str(develop)
     -> service.name: Str(scrape)
     -> telemetry.auto.version: Str(0.40b0)
ScopeMetrics #0
ScopeMetrics SchemaURL: 
InstrumentationScope opentelemetry.instrumentation.flask 0.40b0
Metric #0
Descriptor:
     -> Name: http.server.active_requests
     -> Description: measures the number of concurrent HTTP requests that are currently in-flight
     -> Unit: requests
     -> DataType: Sum
     -> IsMonotonic: false
     -> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
     -> http.method: Str(GET)
     -> http.host: Str(127.0.0.1:5000)
     -> http.scheme: Str(http)
     -> http.flavor: Str(1.1)
     -> http.server_name: Str(0.0.0.0)
StartTimestamp: 2023-07-24 20:43:23.343899867 +0000 UTC
Timestamp: 2023-07-24 21:18:37.404773474 +0000 UTC
Value: 0
Metric #1
Descriptor:
     -> Name: http.server.duration
     -> Description: measures the duration of the inbound HTTP request
     -> Unit: ms
     -> DataType: Histogram
     -> AggregationTemporality: Cumulative
HistogramDataPoints #0
Data point attributes:
     -> http.method: Str(GET)
     -> http.host: Str(127.0.0.1:5000)
     -> http.scheme: Str(http)
     -> http.flavor: Str(1.1)
     -> http.server_name: Str(0.0.0.0)
     -> net.host.port: Int(5000)
     -> http.status_code: Int(200)
StartTimestamp: 2023-07-24 20:43:23.346501178 +0000 UTC
Timestamp: 2023-07-24 21:18:37.404773474 +0000 UTC
Count: 7
Sum: 16.000000
Min: 1.000000
Max: 3.000000
ExplicitBounds #0: 0.000000
ExplicitBounds #1: 5.000000
ExplicitBounds #2: 10.000000
ExplicitBounds #3: 25.000000
ExplicitBounds #4: 50.000000
ExplicitBounds #5: 75.000000
ExplicitBounds #6: 100.000000
ExplicitBounds #7: 250.000000
ExplicitBounds #8: 500.000000
ExplicitBounds #9: 750.000000
ExplicitBounds #10: 1000.000000
ExplicitBounds #11: 2500.000000
ExplicitBounds #12: 5000.000000
ExplicitBounds #13: 7500.000000
ExplicitBounds #14: 10000.000000
Buckets #0, Count: 0
Buckets #1, Count: 7
Buckets #2, Count: 0
Buckets #3, Count: 0
Buckets #4, Count: 0
Buckets #5, Count: 0
Buckets #6, Count: 0
Buckets #7, Count: 0
Buckets #8, Count: 0
Buckets #9, Count: 0
Buckets #10, Count: 0
Buckets #11, Count: 0
Buckets #12, Count: 0
Buckets #13, Count: 0
Buckets #14, Count: 0
Buckets #15, Count: 0
	{"kind": "exporter", "data_type": "metrics", "name": "logging"}
2023-07-24T21:19:37.422Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 2, "data points": 1}
2023-07-24T21:19:37.422Z	info	ResourceMetrics #0
Resource SchemaURL: 
Resource attributes:
     -> telemetry.sdk.language: Str(python)
     -> telemetry.sdk.name: Str(opentelemetry)
     -> telemetry.sdk.version: Str(1.19.0)
     -> namespace: Str(develop)
     -> service.name: Str(scrape)
     -> telemetry.auto.version: Str(0.40b0)
ScopeMetrics #0
ScopeMetrics SchemaURL: 
InstrumentationScope opentelemetry.instrumentation.flask 0.40b0
Metric #0
Descriptor:
     -> Name: http.server.active_requests
     -> Description: measures the number of concurrent HTTP requests that are currently in-flight
     -> Unit: requests
     -> DataType: Sum
     -> IsMonotonic: false
     -> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
     -> http.method: Str(GET)
     -> http.host: Str(127.0.0.1:5000)
     -> http.scheme: Str(http)
     -> http.flavor: Str(1.1)
     -> http.server_name: Str(0.0.0.0)
StartTimestamp: 2023-07-24 20:43:23.343899867 +0000 UTC
Timestamp: 2023-07-24 21:19:37.407265471 +0000 UTC
Value: 0
Metric #1
Descriptor:
     -> Name: http.server.duration
     -> Description: measures the duration of the inbound HTTP request
     -> Unit: ms
     -> DataType: Empty
	{"kind": "exporter", "data_type": "metrics", "name": "logging"}
2023-07-24T21:19:37.423Z	error	exporterhelper/queued_retry.go:391	Exporting failed. The error is not retryable. Dropping data.	{"kind": "exporter", "data_type": "metrics", "name": "prometheusremotewrite", "error": "Permanent error: invalid temporality and type combination for metric \"http.server.duration\"", "dropped_items": 1}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.81.0/exporterhelper/queued_retry.go:391
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.81.0/exporterhelper/metrics.go:125
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
	go.opentelemetry.io/collector/exporter@v0.81.0/exporterhelper/queued_retry.go:195
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1
	go.opentelemetry.io/collector/exporter@v0.81.0/exporterhelper/internal/bounded_memory_queue.go:47

Clarification

The error occurs because https://github.com/open-telemetry/opentelemetry-go sends metrics that have 0 data points (shown as DataType: Empty in the log above).

The problem also reproduces with VictoriaMetrics as the backend.
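One possible workaround is to drop such empty metrics before they reach the exporter. This is a sketch using the filter processor's OTTL metric context (verify the processor and enum name against your collector-contrib version):

```yaml
processors:
  filter/drop-empty:
    error_mode: ignore
    metrics:
      metric:
        # drop metrics whose data type is Empty (no data points)
        - 'type == METRIC_DATA_TYPE_NONE'
```

With this processor placed in the metrics pipeline ahead of prometheusremotewrite, an Empty metric like the http.server.duration entry in the log above would be discarded instead of causing the whole batch to be dropped.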