jest: jest-circus memory leak

๐Ÿ› Bug Report

First off… thanks heaps for creating and maintaining Jest. 🙏 JavaScript testing is in a better state because of all you do! 🃏

It looks like a regression was introduced between versions 22.4.4 and 23.1.0 of jest-circus that is causing a rather substantial memory leak (the versions in between errored when I tried to run the tests, so I couldn't narrow it down any further):

$ npm run jest-circus

PASS  __tests__/index-21.jest.js (24 MB heap size)
...
PASS  __tests__/index-79.jest.js (251 MB heap size)
PASS  __tests__/index-49.jest.js (253 MB heap size)
PASS  __tests__/index-65.jest.js (256 MB heap size)
PASS  __tests__/index-73.jest.js (258 MB heap size)

Test Suites: 100 passed, 100 total
Tests:       100 passed, 100 total
Snapshots:   0 total
Time:        13.991s
Ran all test suites.

I had a look at https://github.com/facebook/jest/issues/7274#issuecomment-437510510 as well as #6965. I added this.stack = this.stack to https://github.com/facebook/jest/blob/master/packages/jest-util/src/ErrorWithStack.ts (via https://github.com/facebook/jest/blob/master/packages/jest-circus/src/index.ts) and after every new Error() invocation in https://github.com/facebook/jest/blob/master/packages/jest-circus/src/utils.ts, but that didn't seem to impact the leak at all.
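For reference, the change I tried looks roughly like this (a sketch, not the exact diff; the idea is that eagerly reading the stack forces V8 to format it right away, so the error instance no longer retains the lazy call-site frames):

// ErrorWithStack.ts (sketch of the attempted change)
export default class ErrorWithStack extends Error {
  constructor(message: string | undefined, callsite: Function) {
    super(message);
    if (Error.captureStackTrace) {
      Error.captureStackTrace(this, callsite);
    }
    // Eagerly materialize the stack string so the lazily formatted
    // call-site objects are not kept alive by this error instance.
    this.stack = this.stack;
  }
}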

Other than narrowing down the version that introduced the regression and trying what was mentioned in #6965, I haven't really had time to make any more progress. I also noticed #7274 but that seemed to be focused on memory leaks from graceful-fs, which I confirmed independently, so I thought I would create a new issue that focuses purely on jest-circus.

To Reproduce

I created a sandbox repository with a simple test and 100 test files. I then ran the tests with jest-circus as well as the default runner for (matching) versions: 24.8.0, 23.1.0, and 22.4.4 and recorded the results: https://github.com/BuildingConnected/jest-circus-memory-leak.

Expected behavior

The memory footprint when running tests via jest-circus should match that of the default test runner.

Link to repl or repo (highly encouraged)

https://github.com/BuildingConnected/jest-circus-memory-leak

Run 100 test files:

npm run jest # default runner
npm run jest-circus # jest-circus
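
(Side note for anyone reproducing this elsewhere: the per-suite "(NN MB heap size)" numbers come from Jest's --logHeapUsage flag, and the two npm scripts presumably boil down to something like the following.)

npx jest --runInBand --logHeapUsage                                   # default runner
npx jest --runInBand --logHeapUsage --testRunner=jest-circus/runner   # jest-circus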

Run npx envinfo --preset jest

System:
  OS: macOS 10.14.6
  CPU: (8) x64 Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
Binaries:
  Node: 10.15.1 - ~/.nvm/versions/node/v10.15.1/bin/node
  Yarn: 1.17.3 - /usr/local/bin/yarn
  npm: 6.4.1 - ~/.nvm/versions/node/v10.15.1/bin/npm
npmPackages:
  jest: 24.8.0 => 24.8.0 

About this issue

  • State: closed
  • Created 5 years ago
  • Reactions: 1
  • Comments: 15 (10 by maintainers)

Most upvoted comments

Is there any progress on this one?

Ah, that is super helpful. Thanks for that!

I looked again after #6176 and (you may already be aware of this) it looks like the next memory leak was introduced with this commit: https://github.com/facebook/jest/commit/c5994de495aceb99eeec5d24bd1a3143ca74b9bb.

I think it may be this line: https://github.com/facebook/jest/blob/94067089dd25c77463e24e981fe979fb6cc0b2fb/packages/jest-circus/src/legacy-code-todo-rewrite/jestAdapter.ts#L28

Calling global.gc() and then logging process.memoryUsage().heapUsed immediately before and after the runtime.requireInternalModule(FRAMEWORK_INITIALIZER) invocation shows memory usage increasing by almost exactly the amount it grows for each test file, and then staying roughly constant until the jestAdapter function returns.

I apologize for the crudeness, but here is a simplified example of the logging pattern that I was using followed by some sample outputs:

// jestAdapter.ts
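// Note: global.gc() is only available when Node is started with --expose-gc.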

const FRAMEWORK_INITIALIZER = require.resolve('./jestAdapterInit');

const jestAdapter = async (
  // ...
): Promise<TestResult> => {
  global.gc();
  console.log(`1. START ${filename}: ${process.memoryUsage().heapUsed / 1024 / 1024} MB`);

  const {
    initialize,
    runAndTransformResultsToJestFormat,
  } = runtime.requireInternalModule(FRAMEWORK_INITIALIZER);

  global.gc();
  console.log(`2. AFTER INVOCATION ${filename}: ${process.memoryUsage().heapUsed / 1024 / 1024} MB`);

  // ...

  global.gc();
  console.log(`3. END ${filename}: ${process.memoryUsage().heapUsed / 1024 / 1024} MB`);

  return _addSnapshotData(results, snapshotState);
};

Output:

1. START index-44.jest.js: 220.00968170166016 MB
2. AFTER INVOCATION index-44.jest.js: 222.1803207397461 MB
3. END index-44.jest.js: 222.3431625366211 MB
PASS  __sandbox__/__tests__/index-44.jest.js (222 MB heap size)

1. START index-28.jest.js: 222.53646850585938 MB
2. AFTER INVOCATION index-28.jest.js: 224.7444076538086 MB
3. END index-28.jest.js: 224.8832015991211 MB
PASS  __sandbox__/__tests__/index-28.jest.js (224 MB heap size)

@bcmarinacci Amazing findings! I also tried attacking error stacks and the sourcemap install, but likewise wasn't able to see this resolve.

I have an idea what might be causing this with the framework install. I wonder if the runner state, which is placed on a global, needs to be cleaned up… It really wouldn't surprise me.
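
Purely to illustrate the idea (the names below are hypothetical, not the actual jest-circus internals), the cleanup would amount to dropping that global reference once the adapter has collected its results:

// Hypothetical sketch: if the per-file circus state hangs off a global
// symbol so the describe/test/hook APIs can reach it, then the whole
// describe tree (and whatever its closures capture) stays reachable
// across test files unless that reference is released.
const CIRCUS_STATE = Symbol.for('HYPOTHETICAL_CIRCUS_STATE');

export const cleanupRunnerState = (): void => {
  // Drop the global reference so the state for this test file can be
  // garbage collected along with its sandboxed module registry.
  delete (global as any)[CIRCUS_STATE];
};

// In the adapter, after results are collected:
//   const results = await runAndTransformResultsToJestFormat(/* ... */);
//   cleanupRunnerState();
//   return _addSnapshotData(results, snapshotState);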

I'll spend some time tomorrow on this. Thank you!

I just started getting into the runtime.requireInternalModule() code as the sun was coming up, so I decided to stop there. 😅 I have some rather time-sensitive work to complete at the moment, so I might not be able to spend more time on this for the next few days, but I will post any new findings in this thread. 🙂