zarf: New CRC Hash not being calculated the same?

Environment

  • Device and OS: VirtualBox 6.1 with guest OS RHEL 8.7
  • App version: zarf_v0.26.4_Linux_amd64
  • Kubernetes distro being used: k3s version v1.24.1+k3s1 (0581808f)
  • Other: go version go1.18.1

Steps to reproduce

  1. Run zarf package create to create a Zarf .tar.zst package that includes an images component and a related charts component within the zarf.yaml. Example:
components:
  - name: required-images
    description: "All images for required services"
    images:
      - some.registry.com/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5
      - some.registry.com/ironbank-mirror/ironbank/opensource/strimzi/kafka:0.28.0-kafka-3.1.0
      - some.registry.com/ironbank-mirror/ironbank/opensource/strimzi/operator:0.28.0

  - name: istio-operator
    description: Installs the istio-operator
    charts:
      - url: https://wiremind.github.io/wiremind-helm-charts
        name: istio-operator
        releaseName: istio-operator
        version: 1.12.0
        namespace: istio-operator
        valuesFiles:
          - ./services/system/istio-operator/values.yaml

Once the package is created, copy it to the target system for testing…

  2. Run zarf init (Y - k3s, N - logging, N - gitea)
  3. Run zarf package deploy <package.tar.zst>

Expected result

The Docker registry running on the local k3s cluster has the appropriate images and tags from the images component, allowing the subsequent chart components to complete successfully.

Actual Result

The component deployment ‘hangs’ with: WARNING: Unable to complete helm chart install/upgrade, waiting 10 seconds and trying again.

Looking at the kubectl describe <pod> output, the tag referenced does not match the tag that was created for the images component with the appended CRC hash.

Inside the local k3s registry, the following two tags exist: “1.12.5” and “1.12.5-zarf-3644333575”:

[root@localhost ~]# curl http://zarf-push:z7UCjyyUdM5I~yJUHFsk6MRU@127.0.0.1:31999/v2/ironbank-mirror/ironbank/opensource/istio/operator/tags/list
{"name":"ironbank-mirror/ironbank/opensource/istio/operator","tags":["1.12.5-zarf-3644333575","1.12.5"]}

However, the helm chart deployment is attempting to pull “1.12.5-zarf-1021806520”.

Since “1.12.5-zarf-1021806520” != “1.12.5-zarf-3644333575”, the helm deployment fails.
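For context (worked out further down in this thread), the -zarf-<number> suffix is a CRC32 checksum of the image reference string, so two different spellings of the same image yield two different suffixes. A minimal Go sketch of that mechanism, using hypothetical reference strings rather than my exact ones:

package main

import (
    "fmt"
    "hash/crc32"
)

// zarfTag mimics the tag suffix observed above: the image version plus
// "-zarf-" plus a CRC32 (IEEE) checksum of the reference string.
// (Illustrative sketch only, not Zarf's actual code.)
func zarfTag(ref, version string) string {
    return fmt.Sprintf("%s-zarf-%d", version, crc32.ChecksumIEEE([]byte(ref)))
}

func main() {
    // Any difference in the string (registry prefix, path, tag form, etc.)
    // changes the checksum, and therefore the tag that gets pushed or pulled.
    for _, ref := range []string{
        "some.registry.com/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5",
        "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5",
    } {
        fmt.Println(ref, "->", zarfTag(ref, "1.12.5"))
    }
}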

Visual Proof (screenshots, videos, text, etc)

Events:
  Type     Reason     Age                     From               Message
  ----     ------     ----                    ----               -------
  Normal   Scheduled  9m38s                   default-scheduler  Successfully assigned istio-operator/istio-operator-8666dd87cf-22fn2 to localhost.localdomain
  Normal   Pulling    8m7s (x4 over 9m38s)    kubelet            Pulling image "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520"
  Warning  Failed     8m7s (x4 over 9m38s)    kubelet            Failed to pull image "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520": rpc error: code = NotFound desc = failed to pull and unpack image "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520": failed to resolve reference "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520": 127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520: not found
  Warning  Failed     8m7s (x4 over 9m38s)    kubelet            Error: ErrImagePull
  Warning  Failed     7m54s (x6 over 9m38s)   kubelet            Error: ImagePullBackOff
  Normal   BackOff    4m26s (x21 over 9m38s)  kubelet            Back-off pulling image "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520"

Severity/Priority

Blocker for me

Additional Context

To quick-fix that single component, I ran the following commands; with a matching tag now in the registry, the stalled component completed during zarf package deploy.

# docker pull 127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-3644333575
# docker tag 127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-3644333575 127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520
# docker push 127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520

However, I have several components, and this manual process is not viable for me. I also cannot fix the helm deployment via helm upgrade in the background, because another operation is in progress:

[root@localhost istio-operator]# helm -n istio-system upgrade istio-operator charts/istio-operator-1.12.0.tgz -f values/istio-operator-1.12.0-0
Error: UPGRADE FAILED: another operation (install/upgrade/rollback) is in progress
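For anyone in the same spot, the per-image quick fix above can at least be looped. A throwaway Go sketch that shells out to the docker CLI (the pairs map is hypothetical and still has to be filled in by hand from the registry tag list and the pod events):

package main

import (
    "fmt"
    "os"
    "os/exec"
)

// retag pulls src, tags it as dst, and pushes dst via the docker CLI,
// exactly like the manual commands above.
func retag(src, dst string) error {
    for _, args := range [][]string{
        {"pull", src},
        {"tag", src, dst},
        {"push", dst},
    } {
        cmd := exec.Command("docker", args...)
        cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
        if err := cmd.Run(); err != nil {
            return fmt.Errorf("docker %v: %w", args, err)
        }
    }
    return nil
}

func main() {
    // Hypothetical pairs: tag that exists in the registry -> tag the chart expects.
    pairs := map[string]string{
        "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-3644333575": "127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5-zarf-1021806520",
    }
    for src, dst := range pairs {
        if err := retag(src, dst); err != nil {
            fmt.Println(err)
            os.Exit(1)
        }
    }
}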

About this issue

  • State: closed
  • Created a year ago
  • Comments: 18 (8 by maintainers)

Most upvoted comments

Resolved!

So in the upstream chart, the key for the image repository is “hub:”. When I changed from 127.0.0.1 -> harbor.mydomain.com/… I used the ‘image’ key instead of ‘hub’. So while I was chasing the first issue and figuring out how the CRC is generated for the image, I inadvertently triggered a separate issue with how the chart is traversed (hub key vs. image key, vs. image with the tag included vs. a separate tag key, etc.).
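To illustrate the mix-up (hypothetical values, not my exact files): the upstream chart assembles the image reference from the hub key plus a tag key, so setting an image key instead leaves hub at its default, and the rendered reference, and therefore its CRC suffix, no longer matches the string Zarf hashed at package time:

# what the upstream istio-operator chart reads (hub + tag become the image reference)
hub: harbor.mydomain.com/ironbank-mirror/ironbank/opensource/istio
tag: 1.12.5

# what I had mistakenly set instead (not read by the chart's templates)
# image: harbor.mydomain.com/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5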

Now the checksums are matched up and the deployment worked!

Huge thanks to @Racer159 and also @Noxsios for helping me through this. I now know more about how Zarf does the checksum during packaging and deployment as a result! I hope this thread can be of value to others. Feel free to scrape anything from this thread for a blog or docs down the road.

Yea, I’m getting there… I wrote this script:

package main

import (
    "bufio"
    "fmt"
    "hash/crc32"
    "os"
    "strconv"
)

func main() {
    readFile, err := os.Open("/tmp/strings")
    if err != nil {
        fmt.Println(err)
        os.Exit(1) // bail out; the scanner below needs a valid file
    }
    defer readFile.Close()

    fileScanner := bufio.NewScanner(readFile)
    fileScanner.Split(bufio.ScanLines)
    for fileScanner.Scan() {
        fmt.Println(fileScanner.Text() + " HAS CRC32: " + strconv.FormatUint(uint64(GetCRCHash(fileScanner.Text())), 10))
    }
}

// GetCRCHash returns the computed CRC32 (IEEE) sum of a given string
func GetCRCHash(text string) uint32 {
    table := crc32.MakeTable(crc32.IEEE)
    return crc32.Checksum([]byte(text), table)
}

Then I pasted all my possible variations into /tmp/strings and ran it. I found that the CRC being generated in the Zarf repo is based on the string defined in my zarf.yaml.
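For illustration, a hypothetical /tmp/strings holding a few spellings of the same image might look like:

some.registry.com/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5
ironbank-mirror/ironbank/opensource/istio/operator:1.12.5
127.0.0.1:31999/ironbank-mirror/ironbank/opensource/istio/operator:1.12.5

Each line prints a different checksum, so only the spelling that exactly matches what Zarf hashed at package time reproduces the tag sitting in the registry.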

So the issue is how the values.yaml CRC is generated, which I now see will never match what is in zarf.yaml for the packaging!

Gotta run, but will re-test tomorrow.
