garden: Bug: Terraform module logs sensitive terraform outputs in plain text

Bug

Current Behavior

Using terraform outputs:

output "password" {
  value     = "somesecret"
  sensitive = true
}

and running garden -l=silly -o=yaml, the output is printed in plain text in the console logs.

Expected behavior

The output should be replaced with a placeholder like <sensitive>.
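For illustration: terraform output -json reports each output together with its sensitive flag, so a redaction pass could swap flagged values for the placeholder before anything reaches the logs. A minimal sketch of that idea (the names TerraformOutput and redactOutputs are hypothetical, not Garden's actual API):

```typescript
// Shape of one entry in `terraform output -json` (simplified).
interface TerraformOutput {
  value: unknown
  sensitive: boolean
}

// Replace every value flagged as sensitive with a placeholder,
// leaving non-sensitive values untouched, before logging.
function redactOutputs(
  outputs: Record<string, TerraformOutput>
): Record<string, unknown> {
  const redacted: Record<string, unknown> = {}
  for (const [name, output] of Object.entries(outputs)) {
    redacted[name] = output.sensitive ? "<sensitive>" : output.value
  }
  return redacted
}
```

This only covers the final output map; as noted below in the thread, the value may also surface in other log paths (e.g. verbose command output), which would each need the same treatment.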

Reproducible example

Workaround

Suggested solution(s)

Additional context

Your environment

  • OS:
  • How I’m running Kubernetes:

garden version

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Reactions: 1
  • Comments: 21 (9 by maintainers)

Most upvoted comments

Thanks @stefreak, will have a look and get back if I need any help 🤝

I think there should just be a --ci switch that sets a bunch of sensible defaults

We set logger-type=basic via an env var in CI

snippet from .gitlab-ci.yml

  variables:
    GARDEN_LOGGER_TYPE: basic

example error:

errors:
  - detail:
      results:
        deploy.xxx:
          type: deploy
          description: deploying service 'xxx' (from module 'xxx')
          key: deploy.xxx
          name: xxx
          error:
            shortMessage: >-
              Command failed with exit code 1:
              /root/.garden/tools/kubectl/hash123/kubectl
              --context=aaa/bbb/configuration/deployment:yyy
              apply --output=json -f -
            command: >-
              /root/.garden/tools/kubectl/hash123/kubectl
              --context=aaa/bbb/configuration/deployment:yyy
              apply --output=json -f -
            exitCode: 1
            stdout: ''
            stderr: >-
              Warning: autoscaling/v2beta1 HorizontalPodAutoscaler is deprecated
              in v1.22+, unavailable in v1.25+; use autoscaling/v2
              HorizontalPodAutoscaler
              Error from server: error when retrieving current configuration of:
              Resource: "policy/v1, Resource=poddisruptionbudgets",
              GroupVersionKind: "policy/v1, Kind=PodDisruptionBudget"
              Name: "xxx", Namespace: "zzz"
              from server for: "STDIN": GitLab Agent Server: HTTP->gRPC: failed
              to read gRPC response: rpc error: code = Canceled desc = context
              canceled. Trace ID: hash456
            all: >-
              Warning: autoscaling/v2beta1 HorizontalPodAutoscaler is deprecated
              in v1.22+, unavailable in v1.25+; use autoscaling/v2
              HorizontalPodAutoscaler
              Error from server: error when retrieving current configuration of:
              Resource: "policy/v1, Resource=poddisruptionbudgets",
              GroupVersionKind: "policy/v1, Kind=PodDisruptionBudget"
              Name: "xxx", Namespace: "zzz"
              from server for: "STDIN": GitLab Agent Server: HTTP->gRPC: failed
              to read gRPC response: rpc error: code = Canceled desc = context
              canceled. Trace ID: hash456
            failed: true
            timedOut: false
            isCanceled: false
            killed: false
          startedAt: '2023-02-13T08:50:21.143Z'
          completedAt: '2023-02-13T08:50:24.077Z'
          batchId: 676a0b31-7788-42d7-95f7-8a7ff10444db
          version: v-e73ec71ba6
    type: runtime

I don’t have that error captured without -o=yaml; it’s not reproducible, it just happens sometimes with the GitLab cluster agent.

Yep, I’ll see if I can create a reproducible example.

I see; the thing is that the provider outputs need to contain the sensitive values, and users need to be able to use them later, e.g. in a kubernetes module to connect to the database.

So I think we need to find all the places where it might be logged, or add a way to mark provider outputs as sensitive and prevent logging them in the framework. Thank you for the investigation, and I’m glad to continue to help, but feel free to pick another issue as well (maybe the ones tagged with “good first issue”) if this one gets too complicated. Thank you so much for the effort! 🥇

would like to have a look and help out if no one else is already working on it 🙌🏻