opentelemetry-collector-contrib: Log export fails with '413 Request Entity Too Large' status code
Component(s)
exporter/datadog
What happened?
Description
I noticed that one of our applications had stopped pushing logs through the collector, so I checked the error log on the VM running the collector.
The Datadog exporter is failing with a 413 Request Entity Too Large error, retrying several times, and then discarding the data.
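For anyone investigating something similar, the collector's own log verbosity can be raised to capture more detail about the failing export. This is a minimal sketch of the standard service telemetry settings, not part of the original report:

service:
  telemetry:
    logs:
      level: debug   # default is info; debug shows more detail from the exporter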
Steps to Reproduce
Unknown at this point; I have not been able to determine what triggers it.
Expected Result
No errors, with logs flowing through to Datadog.
Actual Result
Logs are not flowing for this particular application.
Collector version
0.62.1
Environment information
Environment
OS: (e.g., “Windows Server 2019 DataCenter 10.0.17763.3650”)
OpenTelemetry Collector configuration
receivers:
  otlp:
    protocols:
      grpc:
  hostmetrics:
    collection_interval: 10s
    scrapers:
      paging:
        metrics:
          system.paging.utilization:
            enabled: true
      cpu:
        metrics:
          system.cpu.utilization:
            enabled: true
      disk:
      filesystem:
        metrics:
          system.filesystem.utilization:
            enabled: true
      load:
      memory:
      network:
      processes:
processors:
  batch:
    timeout: 10s
  resource:
    attributes:
      - key: deployment.environment
        value: "beta"
        action: upsert
exporters:
  datadog:
    api:
      key: ${DATADOG_APIKEY}
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [resource, batch]
      exporters: [datadog]
    metrics:
      receivers: [hostmetrics, otlp]
      processors: [resource, batch]
      exporters: [datadog]
    logs:
      receivers: [otlp]
      processors: [resource, batch]
      exporters: [datadog]
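One way to reduce the chance of an oversized export is to cap batch sizes in the batch processor, so that each request the exporter sends stays smaller. This is only a sketch; the thresholds below are illustrative assumptions, not values from this report, and they cap item count rather than bytes, so very large individual log records could still exceed the intake limit:

processors:
  batch:
    timeout: 10s
    send_batch_size: 1000      # target number of items per batch (default is 8192)
    send_batch_max_size: 1000  # hard upper bound on batch size; 0 (the default) means unlimited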
Log output
2022-12-09T16:05:07.076Z info exporterhelper/queued_retry.go:427 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "logs", "name": "datadog", "error": "413 Request Entity Too Large", "interval": "18.341313917s"}
2022-12-09T16:05:14.630Z error logs/sender.go:68 Failed to send logs {"kind": "exporter", "data_type": "logs", "name": "datadog", "error": "413 Request Entity Too Large", "msg": "{\"errors\":[{\"status\":\"413\",\"title\":\"Request Entity Too Large\",\"detail\":\"Request too large\"}]}", "status_code": "413 Request Entity Too Large"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter/internal/logs.(*Sender).SubmitLogs
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.62.0/internal/logs/sender.go:68
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*logsExporter).consumeLogs
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.62.0/logs_exporter.go:101
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:65
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/common.go:203
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:388
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:132
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:206
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/internal/bounded_memory_queue.go:61
2022-12-09T16:05:25.629Z error logs/sender.go:68 Failed to send logs {"kind": "exporter", "data_type": "logs", "name": "datadog", "error": "413 Request Entity Too Large", "msg": "{\"errors\":[{\"status\":\"413\",\"title\":\"Request Entity Too Large\",\"detail\":\"Request too large\"}]}", "status_code": "413 Request Entity Too Large"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter/internal/logs.(*Sender).SubmitLogs
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.62.0/internal/logs/sender.go:68
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*logsExporter).consumeLogs
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.62.0/logs_exporter.go:101
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:65
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/common.go:203
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:388
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:132
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:206
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/internal/bounded_memory_queue.go:61
2022-12-09T16:05:25.629Z error exporterhelper/queued_retry.go:176 Exporting failed. No more retries left. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "datadog", "error": "max elapsed time expired 413 Request Entity Too Large", "dropped_items": 6604}
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).onTemporaryFailure
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:176
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:411
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:132
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:206
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1
go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/internal/bounded_memory_queue.go:61
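The "No more retries left. Dropping data." line comes from the exporterhelper retry logic. Assuming the Datadog exporter exposes the standard exporterhelper retry_on_failure and sending_queue settings, the knobs below control how long a failing batch is retried before being dropped. Note that retrying a 413 on its own cannot succeed, since the payload stays the same size, so tuning these only delays the drop; the values are illustrative, not a recommendation:

exporters:
  datadog:
    api:
      key: ${DATADOG_APIKEY}
    retry_on_failure:
      enabled: true
      initial_interval: 5s     # wait before the first retry
      max_interval: 30s        # cap on the backoff interval
      max_elapsed_time: 300s   # total retry budget; after this the batch is dropped
    sending_queue:
      enabled: true
      queue_size: 5000         # number of batches buffered while retrying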
Additional context
Nothing relevant at this point.
About this issue
- State: closed
- Created 2 years ago
- Comments: 17 (8 by maintainers)
I did update the collector a while ago and the problem seemed to go away.
@julealgon we have fixed this issue. Can you please try using the latest collector-contrib image?
Pinging code owners for exporter/datadog: @KSerrania @mx-psi @gbbr @knusbaum @amenasria @dineshg13. See Adding Labels via Comments if you do not have permissions to add labels yourself.