build-push-action: gha cache not used with tagged releases
I’m having some issues using the gha cache and don’t know if there’s something I’m missing. The build is apparently exporting the cache:
Export Cache
#18 exporting to image
#18 pushing layers 9.3s done
#18 pushing manifest for ***/nr_drainage:v0.1.8@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b
#18 pushing manifest for ***/nr_drainage:v0.1.8@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b 1.8s done
#18 pushing layers 0.6s done
#18 pushing manifest for ***/nr_drainage:latest@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b
#18 pushing manifest for ***/nr_drainage:latest@sha256:293d8187614fe5ae73af1fa30e18ec276bb37772b8e0442e5d86bb3a7a00616b 0.7s done
#18 DONE 56.3s
#22 exporting cache
#22 preparing build cache for export done
#22 writing layer sha256:0f0203ecafcf0ac029c2198191ae8028ca7ae7230dbb946a307cff31753583bd
#22 writing layer sha256:0f0203ecafcf0ac029c2198191ae8028ca7ae7230dbb946a307cff31753583bd 4.6s done
#22 writing layer sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f
#22 writing layer sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f 0.2s done
#22 writing layer sha256:192ba9b3221fa4b50acfd5f0d1410a085379b00a5c7c63f1af5c1990897acce4
#22 writing layer sha256:192ba9b3221fa4b50acfd5f0d1410a085379b00a5c7c63f1af5c1990897acce4 2.2s done
#22 writing layer sha256:39e80151150276578c2b94a27bafb5b4b78025702699b428e7b4d14df909393e
#22 writing layer sha256:39e80151150276578c2b94a27bafb5b4b78025702699b428e7b4d14df909393e 0.2s done
#22 writing layer sha256:3a38a5065324eb257788446643418385bc807cd7c4379f6bace227e9745e82d5
#22 writing layer sha256:3a38a5065324eb257788446643418385bc807cd7c4379f6bace227e9745e82d5 0.2s done
#22 writing layer sha256:51b56d12332dedcd8ee37b12bea1f414dada566781a0c1ee6175ec54e5c403d9
#22 writing layer sha256:51b56d12332dedcd8ee37b12bea1f414dada566781a0c1ee6175ec54e5c403d9 2.3s done
#22 writing layer sha256:565a55e28edd0cd645d1ee09a2eb1174eb90056cd9e62046ec131c92951ff783
#22 writing layer sha256:565a55e28edd0cd645d1ee09a2eb1174eb90056cd9e62046ec131c92951ff783 5.2s done
#22 writing layer sha256:668caffbdcc129d34ace9aaa01f52844b50aa81ec38dd28369f0d57b7ae8c0c8
#22 writing layer sha256:668caffbdcc129d34ace9aaa01f52844b50aa81ec38dd28369f0d57b7ae8c0c8 0.2s done
#22 writing layer sha256:7b65d78f479465d24844da2bd0898bddcea6d27d2bd3a6964f88cced87604f84
#22 writing layer sha256:7b65d78f479465d24844da2bd0898bddcea6d27d2bd3a6964f88cced87604f84 0.2s done
#22 writing layer sha256:838b2dcfb9e4aa0a53a8f968f696b8437bc7451b79813a33b101552d8957d588
#22 writing layer sha256:838b2dcfb9e4aa0a53a8f968f696b8437bc7451b79813a33b101552d8957d588 0.2s done
#22 writing layer sha256:99ea233aafd83fc32f3c32cbec66cfd60ed781d5c26fd74c33c4320ea44b5669
#22 writing layer sha256:99ea233aafd83fc32f3c32cbec66cfd60ed781d5c26fd74c33c4320ea44b5669 0.2s done
#22 writing layer sha256:a70d879fa5984474288d52009479054b8bb2993de2a1859f43b5480600cecb24
#22 writing layer sha256:a70d879fa5984474288d52009479054b8bb2993de2a1859f43b5480600cecb24 1.8s done
#22 writing layer sha256:b50df580e5e95d436d9bc707840266404d5a20c079f0873bd76b4cece327cf0d
#22 writing layer sha256:b50df580e5e95d436d9bc707840266404d5a20c079f0873bd76b4cece327cf0d 0.2s done
#22 writing layer sha256:c4394a92d1f8760cf7d17fee0bcee732c94c5b858dd8d19c7ff02beecf3b4e83
#22 writing layer sha256:c4394a92d1f8760cf7d17fee0bcee732c94c5b858dd8d19c7ff02beecf3b4e83 0.2s done
#22 writing layer sha256:d614cfe64e795d7cd4437846ccc4b8e7da7eac49597a10b8b46b5c0ced4b2c19
#22 writing layer sha256:d614cfe64e795d7cd4437846ccc4b8e7da7eac49597a10b8b46b5c0ced4b2c19 2.6s done
#22 writing layer sha256:d74d771661688e157d0402fa439f318240dcb070f26632407c20669d70dd1e9c
#22 writing layer sha256:d74d771661688e157d0402fa439f318240dcb070f26632407c20669d70dd1e9c 0.2s done
#22 writing layer sha256:dfc8455ab52d21e8800fb4aa291af841849410a129649971bd8296b817fab489
#22 writing layer sha256:dfc8455ab52d21e8800fb4aa291af841849410a129649971bd8296b817fab489 1.7s done
#22 DONE 23.1s
But when building again, it doesn’t appear to use it. Note that the cache was exported by the v0.1.7 tag and the build that should consume it is v0.1.8; both are built from the exact same commit.
Doesn't use cache
/usr/bin/docker buildx build --tag ***/nr_drainage:v0.1.8 --tag ***/nr_drainage:latest --iidfile /tmp/docker-build-push-ORC2bc/iidfile --cache-from type=gha, mode=max, scope=Dev Deploy to ACR --cache-to type=gha, mode=max, scope=Dev Deploy to ACR --push .
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 1.27kB done
#1 DONE 0.0s
#2 [internal] load .dockerignore
#2 transferring context: 2B done
#2 DONE 0.0s
#3 [internal] load metadata for docker.io/rocker/shiny:4.0.3
#3 DONE 0.8s
#8 [internal] load build context
#8 DONE 0.0s
#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 resolve docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a done
#17 DONE 0.0s
#4 importing cache manifest from gha:3101350370151987365
#4 DONE 0.2s
#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 ...
#8 [internal] load build context
#8 transferring context: 187.66kB 0.0s done
#8 DONE 0.0s
#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f 0B / 187B 0.2s
#17 sha256:0f0203ecafcf0ac029c2198191ae8028ca7ae7230dbb946a307cff31753583bd 6.29MB / 214.86MB 0.2s
#17 sha256:d74d771661688e157d0402fa439f318240dcb070f26632407c20669d70dd1e9c 0B / 21.29kB 0.2s
#17 sha256:565a55e28edd0cd645d1ee09a2eb1174eb90056cd9e62046ec131c92951ff783 0B / 287.72MB 0.2s
#17 sha256:10e6159c56c084c858f5de2416454ac0a49ddda47b764e4379c5d5a147c9bf5f 187B / 187B 0.3s done
If I reuse the same tag (e.g. release tag v0.1.7, then delete it and re-release tag v0.1.7), then it grabs from the cache as intended.
Does use the cache
/usr/bin/docker buildx build --tag ***/nr_drainage:v0.1.8 --tag ***/nr_drainage:latest --iidfile /tmp/docker-build-push-mwHmsK/iidfile --cache-from type=gha, mode=max, scope=Dev Deploy to ACR --cache-to type=gha, mode=max, scope=Dev Deploy to ACR --push .
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 1.27kB done
#1 DONE 0.0s
#2 [internal] load .dockerignore
#2 transferring context: 2B done
#2 DONE 0.0s
#3 [internal] load metadata for docker.io/rocker/shiny:4.0.3
#3 DONE 0.7s
#8 [internal] load build context
#8 DONE 0.0s
#17 [ 1/12] FROM docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a
#17 resolve docker.io/rocker/shiny:4.0.3@sha256:8a194c3a17b565a14e79f307ef1afbe9ca67569c4211f965773d7edc6a54f31a done
#17 DONE 0.0s
#4 importing cache manifest from gha:6286612600847900197
#4 DONE 0.3s
#8 [internal] load build context
#8 transferring context: 187.65kB 0.0s done
#8 DONE 0.0s
Behaviour
Steps to reproduce this issue
- Deploy using a tag trigger
- Deploy using a new tag
Expected behaviour
The second deploy should use the cache from the previous one
Actual behaviour
It doesn’t use the cache
deploy yml
# Dev deployment (all tags get pushed)
name: Dev Deploy to ACR
# Controls when the workflow will run
on:
  push:
    tags:
      # Limits to all versions! Imagine that.
      - v*.*.*
      - v*.*.*-*
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-20.04
    # Specify environment (dev/test/etc)
    environment: dev
    env:
      docker_repo_name: nr_drainage
    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      - name: Get the version
        id: get_version
        run: echo ::set-output name=VERSION::$(echo $GITHUB_REF | cut -d / -f 3)
      # Get release version
      - name: Checkout triggered release
        uses: actions/checkout@v2
        with:
          ref: '${{ github.ref }}'
      # Put all packages here. Could even make your own github actions to handle this.
      - name: Checkout and tarball up gen azurestorefuns
        uses: actions/checkout@v2
        with:
          repository: '***'
          ssh-key: '${{ secrets.AZURESTOREFUNS_READ_KEY }}'
          path: 'tmppkg'
          ref: 'v0.1.6-1'
      - run: mkdir -p gen-packages
      - run: tar -czvf gen-packages/azurestorefuns.tar.gz ./tmppkg
      - run: rm -rf ./tmppkg
      # End packages
      # Connect into ACR
      - name: Connect to ACR
        uses: azure/docker-login@v1
        with:
          login-server: ${{ secrets.REGISTRY_LOGIN_SERVER }}
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}
      # This is a separate action that sets up the buildx runner
      - name: Set up Docker Buildx
        id: buildx
        uses: docker/setup-buildx-action@v1
        with:
          version: v0.6.1
      - name: Build and push
        uses: docker/build-push-action@v2
        with:
          context: .
          builder: ${{ steps.buildx.outputs.name }}
          push: true
          tags: ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:${{ steps.get_version.outputs.VERSION }}, ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:latest
          cache-from: type=gha, mode=max, scope=${{ github.workflow }}
          cache-to: type=gha, mode=max, scope=${{ github.workflow }}
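For reference, a minimal sketch of an alternative Build and push step that sidesteps the problem by storing the cache in the registry rather than the gha backend; the :buildcache ref is a hypothetical choice, and the image name and secrets are assumed from the workflow above. A registry-backed cache is keyed by image ref rather than by the triggering git ref, so a build on one tag can reuse layers cached by a build on another:
      - name: Build and push
        uses: docker/build-push-action@v2
        with:
          context: .
          push: true
          tags: ${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:latest
          # the cache lives in the registry under a dedicated (hypothetical) ref,
          # shared by every run regardless of which tag triggered it
          cache-from: type=registry,ref=${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:buildcache
          cache-to: type=registry,ref=${{ secrets.REGISTRY_LOGIN_SERVER }}/${{ env.docker_repo_name }}:buildcache,mode=max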
About this issue
- Original URL
- State: closed
- Created 3 years ago
- Reactions: 5
- Comments: 23 (10 by maintainers)
I’ve produced a repo demonstrating the behaviour.
https://github.com/nik-humphries/buildx-cache-test/actions
- First run: v0.0.1 builds from scratch
- Second run: v0.1.2 builds from scratch, but shouldn’t
- Third run: v0.1.2 builds from cache
I can confirm caching in tagged releases doesn’t work for some reason; however, it works with on: push workflows. Our workflow for tagged releases:
@alextes Thank you for the clarification. So your workflow is similar to ours: a build step that builds the image and pushes it as latest (where the cache works), then a follow-up workflow that triggers on tag to pull the image and re-tag it with the version number. The piece I missed originally is that you added the wait-on to remedy the possibility of both workflows being triggered at the same time, with the re-tag pulling an older version of the image and re-tagging it before the current build run finishes.
Originally I thought about using sha tags so that I could be sure which image is being re-tagged; however, that would mean a significant number of images being pushed up to the registry, which would need a separate maintenance process to clean things up.
Will give your approach a try, as it is a cleaner solution for the time being.
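A minimal sketch of the tag-triggered re-tag workflow described above, assuming the image name and registry secrets from the deploy yml; the wait-on guard mentioned in the comment is omitted for brevity:
name: Re-tag release
on:
  push:
    tags:
      - v*.*.*
jobs:
  retag:
    runs-on: ubuntu-20.04
    steps:
      - name: Get the version
        id: get_version
        run: echo ::set-output name=VERSION::$(echo $GITHUB_REF | cut -d / -f 3)
      - name: Connect to ACR
        uses: azure/docker-login@v1
        with:
          login-server: ${{ secrets.REGISTRY_LOGIN_SERVER }}
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}
      # pull the image that the on-push build workflow pushed as latest,
      # then re-tag and push it under the release version
      - run: docker pull ${{ secrets.REGISTRY_LOGIN_SERVER }}/nr_drainage:latest
      - run: docker tag ${{ secrets.REGISTRY_LOGIN_SERVER }}/nr_drainage:latest ${{ secrets.REGISTRY_LOGIN_SERVER }}/nr_drainage:${{ steps.get_version.outputs.VERSION }}
      - run: docker push ${{ secrets.REGISTRY_LOGIN_SERVER }}/nr_drainage:${{ steps.get_version.outputs.VERSION }}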
https://github.com/tonistiigi/go-actions-cache/blob/master/cache.go#L196-L198 can be used to show which scopes GitHub provides access to.
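A hypothetical debugging step (not from this thread) to dump those scopes from inside a job, by decoding the ACTIONS_RUNTIME_TOKEN that the linked code reads; the token is not exposed to run steps by default, so an action such as crazy-max/ghaction-github-runtime is assumed here to make it available:
      - name: Expose the runtime token to run steps
        uses: crazy-max/ghaction-github-runtime@v1
      - name: Dump cache scopes
        run: |
          # the JWT payload carries the cache scopes in its "ac" claim
          payload=$(echo "$ACTIONS_RUNTIME_TOKEN" | cut -d. -f2 | tr '_-' '/+')
          # re-pad base64url to a multiple of 4 so base64 -d accepts it
          while [ $(( ${#payload} % 4 )) -ne 0 ]; do payload="${payload}="; done
          echo "$payload" | base64 -d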