opentelemetry-collector-contrib: Logs can't export to Loki

Component(s)

exporter/loki

What happened?

Description

The Loki exporter does not seem to work: logs are not exported to Loki in Grafana Cloud.

Steps to Reproduce

Below are the Kubernetes manifests; apply them to the cluster with kubectl apply.

Expected Result

I can explore logs in Grafana Loki.

Actual Result

No logs are exported to Loki.

Checking the otel collector pod logs, it looks like not much is happening after the exporters start (see the log output below).
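One option for getting more detail out of the collector itself (not part of the original report, just a suggestion) is to raise the collector's own telemetry log level so that failed pushes to Loki are logged rather than hidden at the default "info" level. A minimal sketch of the relevant service section:

    service:
      telemetry:
        logs:
          # surface exporter errors (e.g. rejected Loki pushes) in the pod logs
          level: debug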

Collector version

0.66.0

Environment information

Environment

Google Kubernetes Engine 1.22

OpenTelemetry Collector configuration

apiVersion: v1
kind: ConfigMap
metadata:
  name: otel-collector-agent
  labels:
    app: opentelemetry
    component: otel-collector
  namespace: demo
data:
  otel-collector-agent: |
    receivers:
      filelog:
        include:
          - /var/log/pods/*/*/*.log
        start_at: beginning
        include_file_path: true
        include_file_name: true

    exporters:
      logging:
        verbosity: detailed
      loki:
        endpoint: https://xxx:xxxxxx@logs-prod3.grafana.net/loki/api/v1/push
    service:
      pipelines:
        logs:
          receivers: [filelog]
          exporters: [loki, logging]
---

apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: otel-collector-agent
  labels:
    app: opentelemetry
    component: otel-collector
  namespace: demo
spec:
  selector:
    matchLabels:
      app: opentelemetry
      component: otel-collector
  updateStrategy:
    type: RollingUpdate
  template:
    metadata:
      labels:
        app: opentelemetry
        component: otel-collector
        
    spec:  
      # serviceAccountName: otel-collector
      containers:
        - name: otel-collector
          command:
            - "/otelcol-contrib"
            - "--config=/conf/otel-collector-agent.yaml"
          image: "otel/opentelemetry-collector-contrib:0.66.0"
          imagePullPolicy: IfNotPresent
          ports:
            - name: otlp
              containerPort: 4317
              protocol: TCP
              hostPort: 4317
            - name: otlp-http
              containerPort: 4318
              protocol: TCP
              hostPort: 4318
          env:
            - name: MY_POD_IP
              valueFrom:
                fieldRef:
                  apiVersion: v1
                  fieldPath: status.podIP

          resources:
            limits:
              cpu: 256m
              memory: 512Mi
          volumeMounts:
            - mountPath: /conf
              name: otel-collector-conf-vol
            - mountPath: /var/log/pods
              name: varlogpods
              readOnly: true
            - mountPath: /var/lib/docker/containers
              name: varlibdockercontainers
              readOnly: true
      volumes:
        - name: otel-collector-conf-vol
          configMap:
            name: otel-collector-agent
            items:
              - key: otel-collector-agent
                path: otel-collector-agent.yaml
        - hostPath:
            path: /var/log/pods
          name: varlogpods
        - hostPath:
            path: /var/lib/docker/containers
          name: varlibdockercontainers
      hostNetwork: false

Log output

2022/12/09 02:49:10 proto: duplicate proto type registered: jaeger.api_v2.PostSpansRequest
2022/12/09 02:49:10 proto: duplicate proto type registered: jaeger.api_v2.PostSpansResponse
2022-12-09T02:49:10.632Z    info    service/telemetry.go:110    Setting up own telemetry...
2022-12-09T02:49:10.632Z    info    service/telemetry.go:140    Serving Prometheus metrics  {"address": ":8888", "level": "basic"}
2022-12-09T02:49:10.632Z    info    lokiexporter@v0.64.0/next_exporter.go:43    using the new Loki exporter {"kind": "exporter", "data_type": "logs", "name": "loki"}
2022-12-09T02:49:10.717Z    info    service/service.go:89   Starting otelcol-contrib... {"Version": "0.64.1", "NumCPU": 8}
2022-12-09T02:49:10.717Z    info    extensions/extensions.go:41 Starting extensions...
2022-12-09T02:49:10.717Z    info    pipelines/pipelines.go:74   Starting exporters...
2022-12-09T02:49:10.717Z    info    pipelines/pipelines.go:78   Exporter is starting... {"kind": "exporter", "data_type": "logs", "name": "loki"}
2022-12-09T02:49:10.717Z    info    pipelines/pipelines.go:82   Exporter started.   {"kind": "exporter", "data_type": "logs", "name": "loki"}
2022-12-09T02:49:10.717Z    info    pipelines/pipelines.go:86   Starting processors...
2022-12-09T02:49:10.717Z    info    pipelines/pipelines.go:98   Starting receivers...
2022-12-09T02:49:10.717Z    info    pipelines/pipelines.go:102  Receiver is starting... {"kind": "receiver", "name": "filelog", "pipeline": "logs"}
2022-12-09T02:49:10.717Z    info    adapter/receiver.go:55  Starting stanza receiver    {"kind": "receiver", "name": "filelog", "pipeline": "logs"}
2022-12-09T02:49:10.718Z    info    pipelines/pipelines.go:106  Receiver started.   {"kind": "receiver", "name": "filelog", "pipeline": "logs"}
2022-12-09T02:49:10.718Z    info    service/service.go:106  Everything is ready. Begin running and processing data.
2022-12-09T02:49:10.920Z    info    fileconsumer/file.go:159    Started watching file   {"kind": "receiver", "name": "filelog", "pipeline": "logs", "component": "fileconsumer", "path": "/hostfs/var/log/pods/agents_grafana-agent-logs-cloud-94c88_4ca3b7b0-a1e2-4e9d-a0b7-f6b75953e156/grafana-agent-logs-cloud/0.log"}

Additional context

No response

About this issue

  • State: closed
  • Created 2 years ago
  • Comments: 16 (10 by maintainers)

Most upvoted comments

I just ran a Kubernetes cluster locally and applied the manifest from the description, only changing the collector version to 0.70.0, and I can see logs from the pod in Grafana Cloud Loki:

(Screenshot, 2023-01-31: logs from the pod visible in Grafana Cloud Loki)

@wadexu007 could you please try a newer collector version? Does the problem still persist?
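For reference, a minimal sketch of that change against the DaemonSet above (only the image tag differs, everything else stays the same):

      containers:
        - name: otel-collector
          # bump the contrib image from 0.66.0 to the version tested above
          image: "otel/opentelemetry-collector-contrib:0.70.0"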

My issue turned out to be a permission issue. I have the otel contrib collector running as a service on the machine, and it looks like it ships with its own user, which isn't root. I had filelog reading from locations that the otel-collector couldn't reach because of locked-down permissions.
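For the DaemonSet setup in this issue, the equivalent thing to check is whether the container user can read the mounted host log paths. A minimal sketch, assuming running the container as root is acceptable in your cluster:

      containers:
        - name: otel-collector
          # run as root so the filelog receiver can read /var/log/pods and
          # /var/lib/docker/containers mounted read-only from the host
          securityContext:
            runAsUser: 0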