cache: Yarn restores cache, but does not install from it

Hey team!

First, awesome job on this feature; it will immensely help our CI speed for our JavaScript projects. Kudos!

I’ve been running into the “over the limit” error for a yarn project with workspaces enabled:

Post job cleanup.
/bin/tar -cz -f /home/runner/work/_temp/3c08f6f0-f11f-4d8f-bed5-d491e7d8d443/cache.tgz -C /home/runner/.cache/yarn .
##[warning]Cache size of 231440535 bytes is over the 200MB limit, not saving cache.

But when I run the same tar command locally, I get a 100.3 MB bundle. Is there anything I’m missing here?
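For reference, this is roughly how I’m comparing sizes locally (assuming yarn 1.x, where yarn cache dir points at ~/.cache/yarn on Linux):

yarn cache dir                                # print the cache directory the action is archiving
du -sh "$(yarn cache dir)"                    # on-disk size of the yarn cache
tar -czf - -C "$(yarn cache dir)" . | wc -c   # compressed size in bytes, comparable to the warning above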

Here’s my workflow:

name: Test
on:
  push:
    branches:
      - '**'
    tags:
      - '!**'
jobs:
  test:
    name: Test, lint, typecheck and build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Dump GitHub context
        env:
          GITHUB_CONTEXT: ${{ toJson(github) }}
        run: echo "$GITHUB_CONTEXT"
      - name: Use Node.js 10.16.0
        uses: actions/setup-node@v1
        with:
          node-version: 10.16.0
      - name: Cache yarn node_modules
        uses: actions/cache@v1
        with:
          path: ~/.cache/yarn
          key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
          restore-keys: |
            ${{ runner.os }}-yarn-
      - name: Install
        run: yarn install --frozen-lockfile
        # ...

Thanks a lot!

About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 15 (4 by maintainers)

Most upvoted comments

@teohhanhui To be confirmed; we only use yarn, so I can’t tell. I’d say that’s a subject for another issue.

In any case, yarn caching works to an extent here, so I’ll close this issue. Thanks again for your help @joshmgross 👍

You can definitely cache node_modules itself instead of caching the yarn cache directory; that’s how we handle caching with npm (npm has its own cache, but it’s not meant to be populated by the user).
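A minimal sketch of what that could look like in your workflow, reusing the same cache action (the key name is just an example):

      - name: Cache node_modules
        uses: actions/cache@v1
        with:
          path: node_modules
          key: ${{ runner.os }}-node-modules-${{ hashFiles('**/yarn.lock') }}
          restore-keys: |
            ${{ runner.os }}-node-modules-
      - name: Install
        run: yarn install --frozen-lockfile

One caveat with workspaces: yarn hoists most packages into the root node_modules, but any per-package node_modules directories wouldn’t be covered by this single path.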

I’d recommend trying it out and seeing which is faster.

Trying it out right now, thanks a lot for the fast reply 👍