opentelemetry-collector-contrib: Instana Exporter did not work properly after the server deployed a self-signed certificate.

What happened?

Description

The Instana exporter cannot send traces to the Instana backend after the server side deployed a self-signed HTTPS certificate. I have already added the root CA to the client's CA store.

The following error message appeared after I triggered an HTTP call in my application:


2022-09-30T01:36:27.740Z        error   exporterhelper/queued_retry.go:361      Exporting failed. Try enabling retry_on_failure config option to retry on retryable errors   {"kind": "exporter", "data_type": "traces", "name": "instana", "error": "failed to send a request: Post \"https://instana.aiops.xxx.cloud.xxx:1444/bundle\": net/http: HTTP/1.x transport connection broken: malformed HTTP response \"\\x00\\x00\\x17\\a\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x01invalid_preface\""}
2022-09-30T01:36:27.740Z        error   exporterhelper/queued_retry.go:297      Exporting failed. Dropping data. Try enabling sending_queue to survive temporary failures.   {"kind": "exporter", "data_type": "traces", "name": "instana", "dropped_items": 3}

(The full log, including the span dump and stack traces, is in the Log output section below.)

Steps to Reproduce

The client side (where the otel-collector was running) is a CentOS 8 Linux system. I updated the CA trust store with the commands below.

cp rootCA.pem /etc/pki/ca-trust/source/anchors/
update-ca-trust force-enable
update-ca-trust extract
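
As a sanity check of the certificate chain itself, the same validation the collector's TLS client performs can be reproduced locally with openssl. The sketch below is self-contained for illustration: it generates a throwaway root CA and a leaf certificate signed by it, then verifies the leaf against the root — with the real rootCA.pem and the server's certificate substituted in, the same `openssl verify` call shows whether the chain actually validates.

```shell
# Generate a throwaway self-signed root CA (stand-in for rootCA.pem).
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out rootCA.pem \
  -days 1 -subj "/CN=demo-root"

# Generate a leaf key and CSR (stand-in for the server's certificate).
openssl req -newkey rsa:2048 -nodes -keyout leaf.key -out leaf.csr \
  -subj "/CN=instana.example"

# Sign the leaf with the throwaway root CA.
openssl x509 -req -in leaf.csr -CA rootCA.pem -CAkey ca.key \
  -CAcreateserial -out leaf.pem -days 1

# Verify the leaf against the root CA; prints "leaf.pem: OK" on success.
openssl verify -CAfile rootCA.pem leaf.pem
```

If this check passes for the real certificates while the exporter still fails, the problem is unlikely to be the trust store contents.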

Then I started the otel-collector and my applications.

Expected Result

Traces should be sent to the Instana backend without any warnings or errors.

Actual Result

Traces could not be sent to the Instana backend.

Collector version

latest (v0.61.0, per the stack traces in the log)

Environment information

Environment

OS: CentOS 8
Compiler: Go 1.19

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: "10.58.159.152:4318"
      http:
        endpoint: "10.58.159.152:4319"
processors:
  batch:

exporters:
  logging:
    logLevel: debug
    #otlp:
    #endpoint: "https://instana.aiops.azure.cloud.xxx:1444"
    #tls:
    #  insecure: true
  instana:
    endpoint: https://instana.aiops.azure.cloud.xxx:1444
    agent_key: __xxxxxxx

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging,instana]

    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging]

  extensions: []
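
One way to take the system trust store out of the equation — assuming the Instana exporter honors the Collector's standard TLS client settings, as exporters built on the common HTTP client helpers do — is to point the exporter at the root CA file explicitly (the path below is an assumption, matching the anchor location used in the reproduction steps):

```yaml
exporters:
  instana:
    endpoint: https://instana.aiops.azure.cloud.xxx:1444
    agent_key: __xxxxxxx
    tls:
      # Assumed path: the same root CA copied into the anchors directory above.
      ca_file: /etc/pki/ca-trust/source/anchors/rootCA.pem
```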

Log output

2022-09-30T01:36:27.694Z        info    TracesExporter  {"kind": "exporter", "data_type": "traces", "name": "logging", "#spans": 3}
2022-09-30T01:36:27.694Z        info    ResourceSpans #0
Resource SchemaURL: https://opentelemetry.io/schemas/1.12.0
Resource attributes:
     -> container.id: STRING(28)
     -> host.arch: STRING(amd64)
     -> host.name: STRING(bba-temp)
     -> os.description: STRING(Linux 4.18.0-193.28.1.el8_2.x86_64)
     -> os.type: STRING(linux)
     -> process.command_line: STRING(/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.312.b07-2.el8_5.x86_64/jre:bin:java -javaagent:opentelemetry-javaagent.jar)
     -> process.executable.path: STRING(/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.312.b07-2.el8_5.x86_64/jre:bin:java)
     -> process.pid: INT(219762)
     -> process.runtime.description: STRING(Red Hat, Inc. OpenJDK 64-Bit Server VM 25.312-b07)
     -> process.runtime.name: STRING(OpenJDK Runtime Environment)
     -> process.runtime.version: STRING(1.8.0_312-b07)
     -> service.name: STRING(morning)
     -> telemetry.auto.version: STRING(1.17.0)
     -> telemetry.sdk.language: STRING(java)
     -> telemetry.sdk.name: STRING(opentelemetry)
     -> telemetry.sdk.version: STRING(1.17.0)
ScopeSpans #0
ScopeSpans SchemaURL:
InstrumentationScope io.opentelemetry.http-url-connection 1.17.0-alpha
Span #0
    Trace ID       : ea8edacd8e761d28542be71388e200a4
    Parent ID      : 50eeea1b819b8776
    ID             : b2fbdf4151b54784
    Name           : HTTP GET
    Kind           : SPAN_KIND_CLIENT
    Start time     : 2022-09-30 01:36:31.1003281 +0000 UTC
    End time       : 2022-09-30 01:36:31.1169516 +0000 UTC
    Status code    : STATUS_CODE_UNSET
    Status message :
Attributes:
     -> http.url: STRING(http://10.58.159.152:8081/test2)
     -> http.response_content_length: INT(10)
     -> net.peer.port: INT(8081)
     -> http.method: STRING(GET)
     -> thread.name: STRING(http-nio-8080-exec-1)
     -> http.status_code: INT(200)
     -> thread.id: INT(20)
     -> net.transport: STRING(ip_tcp)
     -> net.peer.name: STRING(10.58.159.152)
     -> http.flavor: STRING(1.1)
ScopeSpans #1
ScopeSpans SchemaURL:
InstrumentationScope io.opentelemetry.spring-webmvc-3.1 1.17.0-alpha
Span #0
    Trace ID       : ea8edacd8e761d28542be71388e200a4
    Parent ID      : ea5c4a8ddf06d337
    ID             : 50eeea1b819b8776
    Name           : TestController.Test1
    Kind           : SPAN_KIND_INTERNAL
    Start time     : 2022-09-30 01:36:31.0407644 +0000 UTC
    End time       : 2022-09-30 01:36:31.2032473 +0000 UTC
    Status code    : STATUS_CODE_UNSET
    Status message :
Attributes:
     -> thread.name: STRING(http-nio-8080-exec-1)
     -> thread.id: INT(20)
ScopeSpans #2
ScopeSpans SchemaURL:
InstrumentationScope io.opentelemetry.tomcat-7.0 1.17.0-alpha
Span #0
    Trace ID       : ea8edacd8e761d28542be71388e200a4
    Parent ID      :
    ID             : ea5c4a8ddf06d337
    Name           : /test1
    Kind           : SPAN_KIND_SERVER
    Start time     : 2022-09-30 01:36:30.824 +0000 UTC
    End time       : 2022-09-30 01:36:31.2050328 +0000 UTC
    Status code    : STATUS_CODE_UNSET
    Status message :
Attributes:
     -> http.host: STRING(10.58.159.151:8080)
     -> thread.name: STRING(http-nio-8080-exec-1)
     -> net.transport: STRING(ip_tcp)
     -> http.flavor: STRING(1.1)
     -> http.target: STRING(/test1)
     -> net.sock.peer.addr: STRING(10.195.205.37)
     -> http.response_content_length: INT(12)
     -> http.scheme: STRING(http)
     -> http.method: STRING(GET)
     -> http.status_code: INT(200)
     -> thread.id: INT(20)
     -> net.sock.peer.port: INT(51165)
     -> http.route: STRING(/test1)
     -> http.user_agent: STRING(Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.54 Safari/537.36 Edg/101.0.1210.39)
        {"kind": "exporter", "data_type": "traces", "name": "logging"}
2022-09-30T01:36:27.740Z        error   exporterhelper/queued_retry.go:361      Exporting failed. Try enabling retry_on_failure config option to retry on retryable errors   {"kind": "exporter", "data_type": "traces", "name": "instana", "error": "failed to send a request: Post \"https://instana.aiops.xxx.cloud.xxx:1444/bundle\": net/http: HTTP/1.x transport connection broken: malformed HTTP response \"\\x00\\x00\\x17\\a\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x01invalid_preface\""}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
        go.opentelemetry.io/collector@v0.61.0/exporter/exporterhelper/queued_retry.go:361
go.opentelemetry.io/collector/exporter/exporterhelper.(*tracesExporterWithObservability).send
        go.opentelemetry.io/collector@v0.61.0/exporter/exporterhelper/traces.go:134
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).send
        go.opentelemetry.io/collector@v0.61.0/exporter/exporterhelper/queued_retry.go:295
go.opentelemetry.io/collector/exporter/exporterhelper.NewTracesExporter.func2
        go.opentelemetry.io/collector@v0.61.0/exporter/exporterhelper/traces.go:113
go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
        go.opentelemetry.io/collector@v0.61.0/consumer/traces.go:36
go.opentelemetry.io/collector/service/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces
        go.opentelemetry.io/collector@v0.61.0/service/internal/fanoutconsumer/traces.go:75
go.opentelemetry.io/collector/processor/batchprocessor.(*batchTraces).export
        go.opentelemetry.io/collector@v0.61.0/processor/batchprocessor/batch_processor.go:262
go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).sendItems
        go.opentelemetry.io/collector@v0.61.0/processor/batchprocessor/batch_processor.go:176
go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).startProcessingCycle
        go.opentelemetry.io/collector@v0.61.0/processor/batchprocessor/batch_processor.go:143
2022-09-30T01:36:27.740Z        error   exporterhelper/queued_retry.go:297      Exporting failed. Dropping data. Try enabling sending_queue to survive temporary failures.   {"kind": "exporter", "data_type": "traces", "name": "instana", "dropped_items": 3}
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).send
        go.opentelemetry.io/collector@v0.61.0/exporter/exporterhelper/queued_retry.go:297
go.opentelemetry.io/collector/exporter/exporterhelper.NewTracesExporter.func2
        go.opentelemetry.io/collector@v0.61.0/exporter/exporterhelper/traces.go:113
go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
        go.opentelemetry.io/collector@v0.61.0/consumer/traces.go:36
go.opentelemetry.io/collector/service/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces
        go.opentelemetry.io/collector@v0.61.0/service/internal/fanoutconsumer/traces.go:75
go.opentelemetry.io/collector/processor/batchprocessor.(*batchTraces).export
        go.opentelemetry.io/collector@v0.61.0/processor/batchprocessor/batch_processor.go:262
go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).sendItems
        go.opentelemetry.io/collector@v0.61.0/processor/batchprocessor/batch_processor.go:176
go.opentelemetry.io/collector/processor/batchprocessor.(*batchProcessor).startProcessingCycle
        go.opentelemetry.io/collector@v0.61.0/processor/batchprocessor/batch_processor.go:143
2022-09-30T01:36:27.740Z        warn    batchprocessor/batch_processor.go:178   Sender failed   {"kind": "processor", "name": "batch", "pipeline": "traces", "error": "failed to send a request: Post \"https://instana.aiops.xxx.cloud.xxx:1444/bundle\": net/http: HTTP/1.x transport connection broken: malformed HTTP response \"\\x00\\x00\\x17\\a\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x00\\x01invalid_preface\""}


Additional context

_No response_

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Comments: 16 (10 by maintainers)

Most upvoted comments

Thanks for reaching out; I will liaise with our support team to triage the issue.