next.js: Memory Leak with Next.js's global fetch. Tested against http module
Link to the code that reproduces this issue
https://github.com/m-rphy/nextMemoryLeak
To Reproduce

- In one terminal, `cd` into `/express_server` and run `npm start`.
- In a separate terminal, `cd` into `/next_app` and run `npm run inspect`.
- In any browser, go to either `http://localhost:3000/start-fetch` or `http://localhost:3000/start-custom-fetch` to begin the requests.
- Then open Chrome's inspector (`chrome://inspect`) or use any other debugging tool.
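For context, the kind of request loop a route like `/start-fetch` presumably kicks off can be sketched as follows (a sketch only; the function name and iteration count are my assumptions, not the repo's code):

```javascript
// Hypothetical sketch of the loop a /start-fetch style route might run:
// repeatedly hit the express server with the global fetch so heap growth
// can be observed between iterations in the inspector.
async function startFetchLoop(url, iterations = 1000) {
  for (let i = 0; i < iterations; i++) {
    const res = await fetch(url);
    // Fully consume the body so a dangling stream is not what retains memory.
    await res.text();
  }
}
```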
I believe this ticket is also relevant; this repo both reproduces the leak and shows how to avoid it -> #54708
Current vs. Expected behavior
Next.js’s global fetch is holding onto performance metrics or some other data, leading to heap growth after every request.
The heap should not keep growing after the scope closes.
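A minimal way to quantify this kind of growth is to compare heap usage before and after a batch of requests (a sketch, assuming Node is launched with `--expose-gc` so `global.gc` is available; `makeRequest` stands for any of the repro's request functions):

```javascript
// Sketch of a heap-growth probe (assumes node --expose-gc so global.gc exists;
// makeRequest is any async function that performs one request).
async function measureHeapGrowth(makeRequest, iterations = 100) {
  if (global.gc) global.gc(); // settle the heap before measuring
  const before = process.memoryUsage().heapUsed;
  for (let i = 0; i < iterations; i++) {
    await makeRequest();
  }
  if (global.gc) global.gc(); // collect anything that is actually garbage
  const after = process.memoryUsage().heapUsed;
  return after - before; // bytes still retained across the run
}
```

If the leak is real, the returned delta keeps climbing as `iterations` grows instead of leveling off.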
Provide environment information
Operating System:
Platform: darwin
Arch: arm64
Version: Darwin Kernel Version 23.4.0: Fri Mar 15 00:12:49 PDT 2024; root:xnu-10063.101.17~1/RELEASE_ARM64_T6020
Available memory (MB): 16384
Available CPU cores: 10
Binaries:
Node: 20.10.0
npm: 10.2.3
Yarn: N/A
pnpm: N/A
Relevant Packages:
next: 14.2.0-canary.62 // Latest available version is detected (14.2.0-canary.62).
eslint-config-next: 14.1.4
react: 18.2.0
react-dom: 18.2.0
typescript: 5.4.4
Next.js Config:
output: N/A
Which area(s) are affected? (Select all that apply)
Not sure, Data fetching (gS(S)P, getInitialProps)
Which stage(s) are affected? (Select all that apply)
next dev (local), next build (local), next start (local), Other (Deployed)
Additional context
I’ve tested this repo against different canary releases (canary-32 and canary-62), as well as the Next.js LTS release. I haven’t found a version where the leak does not occur.
About this issue
- Original URL
- State: closed
- Created 3 months ago
- Reactions: 20
- Comments: 18 (5 by maintainers)
Commits related to this issue
- Clean-up fetch metrics tracking (#64746) This ensures we only track fetch metrics in development mode as that's the only time we report them currently, this also adds an upper limit on how many met... — committed to vercel/next.js by ijjk 2 months ago
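The commit above describes a common pattern: only record metrics in development (the only time they are reported), and cap how many entries are retained. A hypothetical sketch of that pattern, not the actual Next.js internals (the cap value and names are assumptions):

```javascript
// Hypothetical sketch of the fix's pattern: track fetch metrics only in
// development, and bound the buffer so it cannot grow without limit.
const MAX_FETCH_METRICS = 50; // assumed cap, not the value used in the PR

function trackFetchMetric(store, metric, isDev = process.env.NODE_ENV === 'development') {
  if (!isDev) return; // metrics are only reported in dev, so skip otherwise
  store.push(metric);
  if (store.length > MAX_FETCH_METRICS) store.shift(); // drop the oldest entry
}
```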
hey folks, we think we identified the source of the leak, we’ll try to fix it by next week
Using Next.js fetch
Using Axios (implements Node's http module)
We’re still investigating!
Hi, this has been updated in
v14.3.0-canary.11
of Next.js, please update and give it a try!
@m-rphy in your screenshot, are the graph drops due to a server restart? What I can see for now is that some fetch data is being collected as part of the performance observer metrics, but I think those get flushed, while some other data we're collecting is being retained.
We switched to axios for request handling instead of the native fetch. That’s it. These charts, produced by Grafana for our production environment, illustrate the severity of the memory leak we faced. Our application, which makes thousands of outgoing requests per second, suffered significantly because of it. This repository can both reproduce the memory leak and bypass it by simply changing the HTTP handler. While we use axios in our production app, in this repo’s toy model I implemented a simple “fetch” with Node.js’s http module; the effect is the same. If you bypass Next.js’s fetch and use one that implements Node’s http module, the leak goes away.