jest: jest-circus memory leak
🐛 Bug Report
First off… thanks heaps for creating and maintaining Jest. JavaScript testing is in a better state because of all you do!
It looks like a regression was introduced between versions `22.4.4` and `23.1.0` of `jest-circus` that is causing a rather substantial memory leak (I wasn't able to run the tests without errors on any of the versions between those two):
$ npm run jest-circus
PASS __tests__/index-21.jest.js (24 MB heap size)
...
PASS __tests__/index-79.jest.js (251 MB heap size)
PASS __tests__/index-49.jest.js (253 MB heap size)
PASS __tests__/index-65.jest.js (256 MB heap size)
PASS __tests__/index-73.jest.js (258 MB heap size)
Test Suites: 100 passed, 100 total
Tests: 100 passed, 100 total
Snapshots: 0 total
Time: 13.991s
Ran all test suites.
I had a look at https://github.com/facebook/jest/issues/7274#issuecomment-437510510 as well as #6965. I added `this.stack = this.stack` to https://github.com/facebook/jest/blob/master/packages/jest-util/src/ErrorWithStack.ts (via https://github.com/facebook/jest/blob/master/packages/jest-circus/src/index.ts) and `this.stack = this.stack` after every `new Error()` invocation in https://github.com/facebook/jest/blob/master/packages/jest-circus/src/utils.ts, which didn't seem to impact the leaks at all.
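For reference, the tweak amounted to eagerly evaluating the lazy `stack` getter at construction time, the idea being that V8 can then drop the structured stack-frame data it would otherwise keep alive. Roughly (paraphrased, not the exact jest-util source):

```js
// Paraphrase of the experiment described above, not the exact ErrorWithStack source:
// force the lazily-computed `stack` string to materialize when the error is created.
class ErrorWithStack extends Error {
  constructor(message, callsite) {
    super(message);
    if (Error.captureStackTrace) {
      Error.captureStackTrace(this, callsite);
    }
    this.stack = this.stack; // eagerly evaluate the stack getter
  }
}
```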
Other than narrowing down the version that introduced the regression and trying what was mentioned in #6965, I haven't really had time to make any more progress. I also noticed #7274, but that seemed to be focused on memory leaks from `graceful-fs`, which I confirmed independently, so I thought I would create a new issue that focuses purely on `jest-circus`.
To Reproduce
I created a sandbox repository with a simple test and 100 test files. I then ran the tests with `jest-circus` as well as the default runner for (matching) versions `24.8.0`, `23.1.0`, and `22.4.4` and recorded the results: https://github.com/BuildingConnected/jest-circus-memory-leak.
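Each of the test files is intentionally trivial; a hypothetical sketch of one of them (the repo's actual test content may differ slightly):

```js
// Hypothetical shape of each of the 100 test files (__tests__/index-NN.jest.js);
// the sandbox repo's actual tests may differ. Anything this trivial is enough
// to surface the per-file heap growth under jest-circus.
describe('index', () => {
  it('passes a trivial assertion', () => {
    expect(1 + 1).toBe(2);
  });
});
```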
Expected behavior
The memory footprint when running tests via `jest-circus` should match that of the default test runner.
Link to repl or repo (highly encouraged)
https://github.com/BuildingConnected/jest-circus-memory-leak
Run 100 test files:
npm run jest # default runner
npm run jest-circus # jest-circus
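The two scripts differ only in which test runner they select; the "(NN MB heap size)" suffix in the output above is what Jest prints when run with `--logHeapUsage`. A hypothetical equivalent configuration (the sandbox repo may wire this up differently, e.g. entirely via CLI flags):

```js
// Hypothetical jest.config.js for the `jest-circus` script; the sandbox repo may
// instead pass everything on the command line. `testRunner` switches Jest from
// its default runner to jest-circus.
module.exports = {
  testRunner: 'jest-circus/runner',
};
```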
Run npx envinfo --preset jest
System:
OS: macOS 10.14.6
CPU: (8) x64 Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
Binaries:
Node: 10.15.1 - ~/.nvm/versions/node/v10.15.1/bin/node
Yarn: 1.17.3 - /usr/local/bin/yarn
npm: 6.4.1 - ~/.nvm/versions/node/v10.15.1/bin/npm
npmPackages:
jest: 24.8.0 => 24.8.0
About this issue
- Original URL
- State: closed
- Created 5 years ago
- Reactions: 1
- Comments: 15 (10 by maintainers)
Is there any progress on this one?
Ah, that is super helpful. Thanks for that!
I looked again after #6176 and, as you guys may already be aware, it looks like the next memory leak was introduced with this commit: https://github.com/facebook/jest/commit/c5994de495aceb99eeec5d24bd1a3143ca74b9bb.
I think it may be this line: https://github.com/facebook/jest/blob/94067089dd25c77463e24e981fe979fb6cc0b2fb/packages/jest-circus/src/legacy-code-todo-rewrite/jestAdapter.ts#L28
Calling `global.gc()` and then logging `process.memoryUsage().heapUsed` immediately before and after the `runtime.requireInternalModule(FRAMEWORK_INITIALIZER)` invocation shows memory usage increasing by almost the exact amount it increases for each test file, before staying almost constant until the `jestAdapter` function returns.

I apologize for the crudeness, but here is a simplified example of the logging pattern that I was using, followed by some sample outputs:
Output:
@bcmarinacci Amazing findings! I also tried attacking error stacks and the sourcemap install but (also) wasn't able to see this resolve.
I have an idea what might be causing this with the framework install. I wonder if the runner state, which is placed on a global, needs to be cleaned up… It really wouldn't surprise me.
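To sketch the suspicion (illustrative only, not the actual jest-circus state module): if each run stashes its describe/test tree on the sandbox global and nothing removes it when the file finishes, everything that tree references stays reachable for the life of the worker.

```js
// Illustrative sketch of the suspected pattern, NOT the actual jest-circus code.
// A runner that shares its state through the global object...
const RUNNER_STATE = Symbol('ILLUSTRATIVE_RUNNER_STATE');

function initializeState() {
  // describe/test tree built up as the test file is loaded
  global[RUNNER_STATE] = {rootDescribeBlock: {children: [], hooks: [], tests: []}};
}

function getState() {
  return global[RUNNER_STATE];
}

// ...retains the whole tree unless something like this runs once the file is done:
function cleanupState() {
  delete global[RUNNER_STATE];
}
```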
I'll spend some time tomorrow on this. Thank you!
I just started getting into the `runtime.requireInternalModule()` code as the sun was coming up, so I decided to stop there. I have some rather time-sensitive work to complete at the moment, so I might not be able to spend more time on this for the next few days, but I will post any new findings in this thread.