cache: Extremely slow cache on self-hosted from time to time

Restoring the cache usually takes a few seconds, but from time to time it takes a very long time. I suspect this has something to do with slow responses from the GitHub API, but I can't prove it yet.

I would like some transparency into what happens when the action reports something like Received 0 of XXX (0.0%), 0.0 MBs/sec. That would let us pinpoint and fully solve the problem, or at least find a workaround.


About this issue

  • State: open
  • Created 2 years ago
  • Reactions: 39
  • Comments: 43 (5 by maintainers)

Most upvoted comments

Hey everyone,

I’m Adam, co-founder of BuildJet. Our customers often complain about reliability and speed issues like the one reported here. Today, we launched BuildJet Cache, a GitHub Cache alternative that’s reliable, fast, and free.

It’s fully compatible with GitHub’s Cache: all you need to do is replace actions/cache with buildjet/cache. It works on any runner - be it official, self-hosted, or BuildJet. Whichever runner you choose, BuildJet Cache doubles your free storage, offering a generous 20 GB/repo/week at no cost.
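
A minimal sketch of that swap, keeping the rest of the step unchanged (the Maven path and key below are illustrative, borrowed from the workaround further down this thread; the version tag mirrors actions/cache@v3):

    - name: Cache mvn repository
      uses: buildjet/cache@v3      # only the uses: line changes from a standard actions/cache step
      with:
        path: ~/.m2/repository
        key: cache-m2-${{ hashFiles('pom.xml') }}
        restore-keys: |
          cache-m2-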

Head over to our launch post for more details: https://buildjet.com/for-github-actions/blog/launch-buildjet-cache

Hey, we are seeing this happen fairly regularly with our customers at BuildJet for GitHub Actions.

Most of the reports in this thread are from people self-hosting, but this issue doesn’t seem to be unique to self-hosting: the hosted runners have the same problem. I don’t know whether you have more insight into workflow runs that use native runners, but here is one example from Google Chrome Lighthouse where the actions/cache@v3 step is failing exactly as described in this thread.

As we have a lot of traffic running on our infrastructure, we really notice when bugs happen with commonly used actions, and we’d be more than happy to provide you with insights into how users are hitting these failures. You can reach me at adam@buildjet.com

@vsvipul

@Sytten, I hear you, and I apologize if my prior message came across as too promotional; my intention was simply to help people solve a long-standing issue with actions/cache.

Regarding the github-actions.cache-buildjet package, I totally get your concerns about it being closed source. To ensure full insight, we’ve open sourced it. You can review the repository here.

Lastly, as we wrote in the launch post, if people are interested in using their own object storage, whywaita/actions-cache-s3 is perfect for that. We simply wanted to take that burden off users.

Thanks for pushing back and calling out where we need to improve.

Still an issue

Our workaround

    - name: Cache mvn repository
      uses: actions/cache@v2
      timeout-minutes: 5
      continue-on-error: true
      with:
        path: ~/.m2/repository
        key: cache-m2-${{ hashFiles('pom.xml') }}
        restore-keys: |
          cache-m2-
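
With the five-minute timeout and continue-on-error, a stalled cache restore is cut short and the job simply proceeds without a cache hit instead of hanging until the overall job timeout.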

This issue is stale because it has been open for 200 days with no activity. Leave a comment to avoid closing this issue in 5 days.

I also started my own solution: I forked the actions/cache code and modified it to place a tarball on a local volume instead of uploading/downloading the entire cache every time. You can use it as a drop-in replacement for actions/cache@v3: https://github.com/maxnowack/local-cache
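
A minimal sketch of using it, assuming the fork keeps the same path/key inputs as actions/cache (which "drop-in replacement" implies); the path, key, and version tag here are illustrative, not taken from the repository:

    - name: Cache mvn repository
      uses: maxnowack/local-cache@v2   # version tag is a placeholder; pin a release documented in the linked repo
      with:
        path: ~/.m2/repository
        key: cache-m2-${{ hashFiles('pom.xml') }}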

I can only say: I’m also a happy BuildJet customer and a long-time sufferer of the cache issue, and I’m glad this was posted here. Probably because I’m affected, I did not find this inappropriate; quite the opposite, I’m glad I learned about this and can’t wait to wake up tomorrow and try it out 👌🏼

@thinkafterbefore Since you hijacked the post for promotion, let me push back a little.

Your cache fork is really not well done; it’s just a build on top of the existing cache action with a random closed-source package, https://www.npmjs.com/package/github-actions.cache-buildjet. Fix your shit, please; we have enough supply chain attacks without legit stuff looking sus like that. Ideally, do proper tagging too.

I would be much more supportive of an action that actually allowed people to not be dependent on any one provider, similar to actions-cache-s3. The devs of the GitHub cache action don’t seem to care about that.

@definelicht

We have added an environment variable SEGMENT_DOWNLOAD_TIMEOUT_MINS to time out the download when a segment download exceeds a certain amount of time. You can refer to the README for more info.
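
The variable can be set on the cache step itself; a minimal sketch (the 5-minute value, path, and key are illustrative):

    - name: Cache mvn repository
      uses: actions/cache@v3
      env:
        SEGMENT_DOWNLOAD_TIMEOUT_MINS: 5   # give up on a stuck segment download after 5 minutes
      with:
        path: ~/.m2/repository
        key: cache-m2-${{ hashFiles('pom.xml') }}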

There have been no problems in the last month. [Screenshot, 2022-05-12]