next.js: [NEXT-841] FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory

What version of Next.js are you using?

12.0.7

What version of Node.js are you using?

16.6.2

What browser are you using?

Chrome / Safari

What operating system are you using?

macOS

How are you deploying your application?

other

Describe the Bug

We have a monorepo with Nx in which we use Next.js for SSR. We were on Next 11 and wanted to move to Next 12 with SWC. After doing so and making the necessary changes, our app crashes with the error below.

We have tried adding more memory, but we feel that the issue lies elsewhere.

<--- Last few GCs --->

[66122:0x7fe502d00000]   544670 ms: Mark-sweep (reduce) 4060.1 (4143.2) -> 4059.7 (4144.0) MB, 5936.8 / 0.1 ms  (average mu = 0.080, current mu = 0.001) allocation failure scavenge might not succeed
[66122:0x7fe502d00000]   550506 ms: Mark-sweep (reduce) 4060.8 (4144.0) -> 4060.4 (4144.7) MB, 5834.7 / 0.1 ms  (average mu = 0.042, current mu = 0.000) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0x108960ae5 node::Abort() (.cold.1) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 2: 0x1076563a9 node::Abort() [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 3: 0x10765651f node::OnFatalError(char const*, char const*) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 4: 0x1077d5137 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 5: 0x1077d50d3 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 6: 0x10798c0b5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 7: 0x10798aa79 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 8: 0x107996c9a v8::internal::Heap::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
 9: 0x107996d21 v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
10: 0x10796539c v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
11: 0x107d1680e v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
12: 0x10809fab9 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/Users/n0s00jx/.volta/tools/image/node/16.6.2/bin/node]
13: 0x10c684c2e 
14: 0x10c6847f5 

Expected Behavior

The dev server should start and run without running out of memory.

To Reproduce

  • upgrade to next 12.0.7 / 12.0.4 and try running the dev server

NEXT-841

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Reactions: 127
  • Comments: 191 (45 by maintainers)

Commits related to this issue

Most upvoted comments

Can you reopen this?

  • I can positively say that esmExternals: false does not help in all cases (it does not in ours)
  • esmExternals: false is not a solution. Neither is increasing the memory. These are workarounds. This problem should not arise in the first place

I have the same problem:

  • project with next@11.1.3 works fine
  • project with next@12.0.7 crashes with the error above (tried next@12.0.6-canary.6 and the error remains)

In my case the project runs with docker-compose using the image node:16.13-alpine3.14. If the project is run on my machine (Intel macOS with Big Sur) it works fine, but within the container it crashes. Other info that could help: we use next-transpile-modules and treat/webpack-plugin in the next.config.js.


I resolved my issue adding the following to my next.config.js:

experimental: {
  esmExternals: false,
}
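For context, a minimal sketch of a full next.config.js with that flag in place (everything besides the experimental block is assumed, not taken from the comment above):

// next.config.js
/** @type {import('next').NextConfig} */
module.exports = {
  reactStrictMode: true,
  experimental: {
    // workaround: opt out of ESM externals handling
    esmExternals: false,
  },
};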

So I can confirm @ryne2010's theory about named exports being connected to this issue. Over the past week the issue started rising again (we managed to tame it with dynamic imports on our reducers, since we only need them on the client at the moment).

This week our biggest page reached ~2 min for first render. Generally the issue is much worse on pages with dynamic routing; with static routing it's still 5-15 s.
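For reference, a rough sketch of the kind of client-only dynamic import described above, assuming a Redux-style store; the hook name and the '../store/reducers' path are hypothetical, not the poster's code:

import { useEffect } from 'react';
import { useStore } from 'react-redux';

// Loads the reducers barrel only in the browser, so the heavy module graph
// behind it stays out of the server-side compile for this page.
export function useClientReducers() {
  const store = useStore();
  useEffect(() => {
    import('../store/reducers').then(({ default: rootReducer }) => {
      store.replaceReducer(rootReducer);
    });
  }, [store]);
}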

Measurement results

I’ve started rewriting this page's exports to default exports, taking 5 measurements while rewriting. The Next.js-route-change-to-render time went from the original 126012 ms down to 30346 ms, so it's about a 4x improvement just from rewriting ~20 components from named exports to default. There are still plenty more generic components (buttons, links…), hooks, and selectors that use named exports.

Measurement details

  • tested on my MacBook Pro 2020 with 32 GB RAM, but similar results on a Linux machine
  • all tested on localhost, but the issue appears in prod as well, on the first request only (which is weird since we have only SSG and not a single SSR page)
  • first rendered the homepage (since a first render of our biggest page meant an out-of-heap error), then went to the biggest page, and back to the homepage
  • used the homepage measurements (not affected by any changes made) to determine the average deviation to be 40%
  • tested on "next": "12.1.5" with SWC (but there was no difference with the Babel fallback in the past)

We can provide more information; however, providing a minimal reproducible repo is not possible, since the problem scales with code quantity and we can't share all the code because of an NDA.

I’m seeing this pretty regularly with Next.js 13 and appDir. During next dev, memory usage grows steadily until an OOM happens, so I find myself regularly restarting next dev.

Note: I never experienced this problem prior to using Next.js 13 and appDir, so I’m guessing this is related to the new experimental features — which means that it’s likely a separate issue from the original one in this thread.

Here’s an example stack trace from next dev running on https://github.com/transitive-bullshit/next-movie:

  • next 13.0.4
  • node v16.18.0
  • mac os 12.6
<--- Last few GCs --->

[30966:0x7ff719200000] 13966206 ms: Scavenge 4065.3 (4102.1) -> 4061.7 (4103.8) MB, 2.6 / 0.0 ms  (average mu = 0.329, current mu = 0.316) allocation failure 
[30966:0x7ff719200000] 13966216 ms: Scavenge 4066.1 (4103.8) -> 4063.7 (4106.1) MB, 4.6 / 0.0 ms  (average mu = 0.329, current mu = 0.316) task 
[30966:0x7ff719200000] 13966228 ms: Scavenge 4069.3 (4106.1) -> 4066.4 (4124.6) MB, 5.9 / 0.0 ms  (average mu = 0.329, current mu = 0.316) allocation failure 


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x10e8a15f5 node::Abort() (.cold.1) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 2: 0x10d596f49 node::Abort() [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 3: 0x10d59712e node::OOMErrorHandler(char const*, bool) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 4: 0x10d70e300 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 5: 0x10d70e2c3 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 6: 0x10d8b1fa5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 7: 0x10d8b5fed v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 8: 0x10d8b28cd v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
 9: 0x10d8afded v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
10: 0x10d8aeb08 v8::internal::Heap::HandleGCRequest() [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
11: 0x10d8589a1 v8::internal::StackGuard::HandleInterrupts() [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
12: 0x10dc415e8 v8::internal::Runtime_StackGuard(int, unsigned long*, v8::internal::Isolate*) [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
13: 0x10dfeab39 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/Users/tfischer/.nvm/versions/node/v16.18.0/bin/node]
14: 0x11338a6e5 

Posting here because this issue is tagged with "please add a complete reproduction"; however, let me know if I should open a new issue.

I am able to consistently reproduce a build failure due to JS heap OOM with a large number of pages:

In most CI environments this will of course fail as well. The build passes linting, but fails on the ‘Creating an optimized production build’ step.

I can confirm that the repo submitted by @transitive-bullshit is also affected; it seems that most of the time webpack incrementally bumps the number of files watched and then crashes. As mentioned above, there is a bug when people try to use components living outside the pages and node_modules directories.

There are some local files used for caching in the .next/cache folder, but I cannot tell whether they are the culprit. Also, this issue is mentioned in the webpack-dev-server repo as well.

All in all, I tried to find what's wrong by watching for file or size changes, but still no luck. The only indicator is the console shouting about an enormous number of modules used (1000 modules for a simple 404 page does not make sense). I would love to try to solve this one, but spotting the culprit is hard; the webpack configuration is humongous and spread across the next.js core package.

Maybe someone from the core team could actually shed some more light on this, as the issue affects both the 12.x and the 13.x upstream releases. Fortunately, since moving towards the app folder is the new norm for the project, this issue will get more attention.

I am facing the same issue in next.js version 13.0.3

Hi, we recently landed some changes (https://github.com/vercel/next.js/pull/37397) to canary that might help fix this issue, without setting esmExternals: false.

Please try it out by installing next@canary and let us know!

reopened again!

Sorry about that, our stale bot accidentally closed it as it didn’t have the right labels.

We're currently working on a refactor of how the server works that isolates running application code from Next.js itself. This will allow us to further narrow down memory issues.

We recently upgraded to Next.js 12.1.5 and React 18.1.0 and encountered this issue when importing a component without direct default import syntax.


Importing a default function directly did NOT cause this memory issue

import TaggingWorkflowInput from '@components/input/TaggingWorkflowInput';

However, importing this way

import { TaggingWorkflowInput } from '@components/input';

via an “index.ts” file in ‘@components/input’ containing

export { default as TaggingWorkflowInput } from './TaggingWorkflowInput';

DID cause this issue to occur.


I’ve since switched our import syntax for complex components to use default imports and have not had this issue occur.

Same situation here. It happens randomly after the update to 13.5.

Thanks to @federico-moretti setting esmExternals: false solves it for me. Even with the default ESM setting in Next12, not all the ESM packages I use compile well. I still had to use the next-transpile-modules library to support them.

experimental: {
  esmExternals: false,
}

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory

Getting this after upgrading to Next 13.5

I'm also facing this issue on Next.js v13.0.1 and React v18.2.0. I've hit it a bunch of times during local dev already.

Here’s the dump:


<--- Last few GCs --->

[20978:0x148040000]  6327416 ms: Mark-sweep (reduce) 4093.8 (4130.8) -> 4090.9 (4129.4) MB, 134.0 / 0.0 ms  (average mu = 0.319, current mu = 0.233) allocation failure; scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0x1031cc43c node::Abort() [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 2: 0x1031cd6ec node::ModifyCodeGenerationFromStrings(v8::Local<v8::Context>, v8::Local<v8::Value>, bool) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 3: 0x10331ea14 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 4: 0x10331e9c0 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 5: 0x10348e1b0 v8::internal::TimedHistogramScope& v8::base::Optional<v8::internal::TimedHistogramScope>::emplace<v8::internal::TimedHistogram*&, v8::internal::Isolate*&>(v8::internal::TimedHistogram*&, v8::internal::Isolate*&) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 6: 0x10348cd74 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 7: 0x103482858 v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 8: 0x103483034 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
 9: 0x10346c9cc v8::internal::Factory::AllocateRaw(int, v8::internal::AllocationType, v8::internal::AllocationAlignment) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
10: 0x103465a08 v8::internal::FactoryBase<v8::internal::Factory>::AllocateRawArray(int, v8::internal::AllocationType) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
11: 0x1034658f0 v8::internal::FactoryBase<v8::internal::Factory>::NewFixedArrayWithFiller(v8::internal::Handle<v8::internal::Map>, int, v8::internal::Handle<v8::internal::Oddball>, v8::internal::AllocationType) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
12: 0x1036871ec v8::internal::Handle<v8::internal::NameDictionary> v8::internal::HashTable<v8::internal::NameDictionary, v8::internal::NameDictionaryShape>::New<v8::internal::Isolate>(v8::internal::Isolate*, int, v8::internal::AllocationType, v8::internal::MinimumCapacity) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
13: 0x103687758 v8::internal::Handle<v8::internal::NameDictionary> v8::internal::HashTable<v8::internal::NameDictionary, v8::internal::NameDictionaryShape>::EnsureCapacity<v8::internal::Isolate>(v8::internal::Isolate*, v8::internal::Handle<v8::internal::NameDictionary>, int, v8::internal::AllocationType) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
14: 0x1036880e8 v8::internal::Handle<v8::internal::NameDictionary> v8::internal::Dictionary<v8::internal::NameDictionary, v8::internal::NameDictionaryShape>::Add<v8::internal::Isolate>(v8::internal::Isolate*, v8::internal::Handle<v8::internal::NameDictionary>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyDetails, v8::internal::InternalIndex*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
15: 0x10375f548 v8::internal::Runtime_AddDictionaryProperty(int, unsigned long*, v8::internal::Isolate*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
16: 0x10303504c Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/opt/homebrew/Cellar/node/18.11.0/bin/node]
17: 0x10acb8c20
18: 0x10aacef80
19: 0x102ff1998 Builtins_AsyncFunctionAwaitUncaught [/opt/homebrew/Cellar/node/18.11.0/bin/node]
20: 0x10ac56e4c
21: 0x10b10907c
22: 0x10c467808
23: 0x102ff1ef4 Builtins_AsyncFunctionAwaitResolveClosure [/opt/homebrew/Cellar/node/18.11.0/bin/node]
24: 0x1030806f8 Builtins_PromiseFulfillReactionJob [/opt/homebrew/Cellar/node/18.11.0/bin/node]
25: 0x102fe3c4c Builtins_RunMicrotasks [/opt/homebrew/Cellar/node/18.11.0/bin/node]
26: 0x102fbe3a4 Builtins_JSRunMicrotasksEntry [/opt/homebrew/Cellar/node/18.11.0/bin/node]
27: 0x103424eac v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
28: 0x103425468 v8::internal::(anonymous namespace)::InvokeWithTryCatch(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
29: 0x103446944 v8::internal::MicrotaskQueue::RunMicrotasks(v8::internal::Isolate*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
30: 0x103446770 v8::internal::MicrotaskQueue::PerformCheckpointInternal(v8::Isolate*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
31: 0x102fc1a34 Builtins_CallApiCallback [/opt/homebrew/Cellar/node/18.11.0/bin/node]
32: 0x10bd4cb10
33: 0x102fbe4d0 Builtins_JSEntryTrampoline [/opt/homebrew/Cellar/node/18.11.0/bin/node]
34: 0x102fbe164 Builtins_JSEntry [/opt/homebrew/Cellar/node/18.11.0/bin/node]
35: 0x103424edc v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
36: 0x103424468 v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, int, v8::internal::Handle<v8::internal::Object>*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
37: 0x103334dc8 v8::Function::Call(v8::Local<v8::Context>, v8::Local<v8::Value>, int, v8::Local<v8::Value>*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
38: 0x103100b94 node::InternalCallbackScope::Close() [/opt/homebrew/Cellar/node/18.11.0/bin/node]
39: 0x103100fd4 node::InternalMakeCallback(node::Environment*, v8::Local<v8::Object>, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
40: 0x10311b694 node::AsyncWrap::MakeCallback(v8::Local<v8::Function>, int, v8::Local<v8::Value>*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
41: 0x1031d1fc8 node::fs::FSReqCallback::Resolve(v8::Local<v8::Value>) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
42: 0x1031d3410 node::fs::AfterNoArgs(uv_fs_s*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
43: 0x1031c6830 node::MakeLibuvRequestCallback<uv_fs_s, void (*)(uv_fs_s*)>::Wrapper(uv_fs_s*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
44: 0x10577eff8 uv__work_done [/opt/homebrew/Cellar/libuv/1.44.2/lib/libuv.1.dylib]
45: 0x1057823d0 uv__async_io [/opt/homebrew/Cellar/libuv/1.44.2/lib/libuv.1.dylib]
46: 0x1057921e0 uv__io_poll [/opt/homebrew/Cellar/libuv/1.44.2/lib/libuv.1.dylib]
47: 0x1057827d0 uv_run [/opt/homebrew/Cellar/libuv/1.44.2/lib/libuv.1.dylib]
48: 0x103101878 node::SpinEventLoop(node::Environment*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
49: 0x10320bf88 node::NodeMainInstance::Run(int*, node::Environment*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
50: 0x10320bbdc node::NodeMainInstance::Run() [/opt/homebrew/Cellar/node/18.11.0/bin/node]
51: 0x103196038 node::LoadSnapshotDataAndRun(node::SnapshotData const**, node::InitializationResult const*) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
52: 0x1031961cc node::Start(int, char**) [/opt/homebrew/Cellar/node/18.11.0/bin/node]
53: 0x10580108c
                           

It turns out I had a wrong import:

import React from 'react';
import DashboardComponent from '../components/Dashboard/index';

const Dashboard = () => {
  return (
    <>
      <Dashboard />
    </>
  );
};

export default Dashboard;

instead of

import React from 'react';
import DashboardComponent from '../components/Dashboard/index';

const Dashboard = () => {
  return (
    <>
      <DashboardComponent />
    </>
  );
};

export default Dashboard;

So this fixed my issue and it's building well now.

I just upgraded to 12.0.8 hoping this was solved, but I am still getting the FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory error.

Running:

  • Next.js 12.0.8
  • Node 16.13.0
  • MacOS 12.1

The application now compiles after adding:

experimental: {
  esmExternals: false,
},

We also use external packages from our monorepo via next-transpile-modules.

We had the same issue while deploying in Docker on RHEL 7.9, Next.js v12.0.4, Node.js v16.13.0.

This is part of our original package.json

"scripts": {
  "analyze": "cross-env ANALYZE=true next build",
  "dev": "next dev",
  "build": "set NODE_OPTIONS=--max-old-space-size=8192 && next build",
  "start": "next start"
},

We tried changing max-old-space-size to 12192, but it didn't help; we raised it some more to 16192 and it started working again. This is what it looks like now:

"scripts": {
  "analyze": "cross-env ANALYZE=true next build",
  "dev": "next dev",
  "build": "set NODE_OPTIONS=--max-old-space-size=16192 && next build",
  "start": "next start"
},
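Note that set NODE_OPTIONS=... is a Windows-shell construct; since these scripts already use cross-env for ANALYZE, a cross-platform variant (a sketch, not the project's actual scripts) could look like:

"build": "cross-env NODE_OPTIONS=--max-old-space-size=16192 next build"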

I got the same issue in Next.js 13.4.6. I didn’t face such issues in Next.js 12!

  • Try esmExternals [FAILED]
  • Try switch from SWC to babel [FAILED]
  • Try switch from node 16 to 14 [FAILED]
  • export NODE_OPTIONS=--max_old_space_size=4096 before start helps. [SUCCESS]

This is on a fresh Ubuntu EC2 instance with 1 GB memory + 4 GB swap.

On a Mac it works fine without any magic.

Also started getting the error after the 13.5 upgrade. Bumping the heap size for the build seemed to fix it.

@shuding I tried out 13.2.5-canary.12 and compared with and without the experimental webpackBuildWorker flag, but I sadly didn’t see any noticeable differences.

Memory is usually stable when navigating between pages for us, or when performing a page reload or doing hot reloading. The problem we’re having usually shows up when you change some code, hot reloading kicks in, and then you do a hard page reload. At that point, memory goes up by about 400mb (depending on the page) and never goes down again. And devs do this a lot because hot reloading is generally something that does not get much trust 😕

Same issue in Next.js 13.0.5

I’m not seeing significant improvements between 12.1.6 and 12.2.2.

I did some profiling with 1000 generated components and 60 generated pages, no build cache, using node v16.13.1.
These pages all render the same thing (all the components inside a div), but may import the components differently.
The components are just divs with their component name as text.

With 60 pages:

  • 12.1 - 1.85 GB peak (all-imports)
  • 12.2 - 2 GB peak (all-imports)

Doubling the pages to 120:

  • 12.1 - 3 GB peak (all-imports)
  • 12.2 - 2.8 GB peak (all-imports)

Seems much higher than it should be? Or is this what should be expected?

Repo available here
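For anyone who wants to approximate this setup without the linked repo, a hypothetical generator along the same lines (file layout, names, and counts are my assumptions, not the original repo's code):

// generate.js - writes 1000 trivial components and 60 pages that import all of them
const fs = require('fs');
const path = require('path');

const COMPONENTS = 1000;
const PAGES = 60;

fs.mkdirSync(path.join('components', 'generated'), { recursive: true });
for (let i = 0; i < COMPONENTS; i++) {
  fs.writeFileSync(
    path.join('components', 'generated', `Comp${i}.js`),
    `export default function Comp${i}() { return <div>Comp${i}</div>; }\n`
  );
}

const imports = Array.from({ length: COMPONENTS }, (_, i) =>
  `import Comp${i} from '../../components/generated/Comp${i}';`
).join('\n');
const jsx = Array.from({ length: COMPONENTS }, (_, i) => `<Comp${i} />`).join('');

fs.mkdirSync(path.join('pages', 'generated'), { recursive: true });
for (let p = 0; p < PAGES; p++) {
  fs.writeFileSync(
    path.join('pages', 'generated', `page-${p}.js`),
    `${imports}\n\nexport default function Page${p}() {\n  return <div>${jsx}</div>;\n}\n`
  );
}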

Version 12.2 fixed our problem.

I can confirm that upgrading to nextjs 12.2 fixed the issue for us

I’m also having this issue.

  • Setting esmExternals: false does not help at all
  • Using --max_old_space_size=8192 allows the server to boot, but it runs very slowly and after a few page loads all available RAM is consumed and the server crashes
  • Reverting to version 12.0.1 fixes it completely as far as I can tell

My project:

  • Newly created Next.js app with only a few pages
  • Created using Nx with its Next template
  • Imports a large amount of existing TypeScript and React code from modules in a monorepo

For those using next-transpile-module, you might want to follow #35150.

next@13.4.7

I know it sounds simple, but I solved this problem by removing the .next folder and running the build again. None of the methods mentioned here worked; the only thing that worked in my case was removing the folder.

Same as @EvilaMany. However, switching to Next 13 and the native transpilePackages option didn't fix the issue. The only improvement is that now the server automatically restarts before crashing.

As there are a lot of comments on this issue, I’m not sure if it was originally about next build or next dev. On our side, our issue is with next dev, where the memory keeps going up when reloading pages and the server crashes after a while. After upgrading to Next 13.2, the server gets restarted automatically but something is still wrong.

@TkDodo webpackBuildWorker will mostly improve memory usage for next build. I’ll investigate the HMR-related case then!

Having the same issue on next 12.3.1-canary.2. My workaround, just to make the build finish, was to disable type checking during the production build. It's a bad idea, but I don't have a choice at the moment. FYI, mine works fine in dev mode and only breaks during the production build. This is what I added to my next.config.js to make it build in production:

{
  eslint: {
    ignoreDuringBuilds: true
  },
  typescript: {
    ignoreBuildErrors: true
  }
}

The bug seems to be caused by the type-checking step. I insist it's a bad idea; I'm only doing it this way because I know my code works OK during dev and no type errors are found there.
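If you go down this route, one way to keep some type safety (my suggestion, not part of the comment above) is to run the type check as a separate script, e.g. in CI, while next build skips it:

"scripts": {
  "typecheck": "tsc --noEmit",
  "build": "next build"
}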

+1 for @ryne2010 theory with named exports.

We were facing exactly this issue: our production Docker pods (on EKS, 8 GB mem) restarted over 60 times in a couple of days, and setting esmExternals: false didn't help. The codebase uses a mix of named exports and default exports, which is obviously for no good reason ┐(゚~゚)┌

After updating all default exports to named exports this problem disappeared.

Next.js version is 12.0.4.

I have the same issue with 12.1

And setting:

experimental: {
  esmExternals: false,
},

Does not change the issue

I’ll try to do a reproduction repo

I also use next-transpile-module

@timneutkens I have the same problem after upgrading from v11 to v12. Used memory on Mac goes up to 2.5GB before it bombs. Previously it was consuming < 1GB on v11. Happy to provide a full reproduction to you via private github. I’m using next-transpile-modules (required for amcharts4).

Please, if anyone here knows the solution to the above problem, kindly help me out, as I have been stuck on this for four days now.

Hello @dbrxnds, sadly I don't have any exact values, but I can give a rough idea:

At first there were 120 circular dependencies. Around 30 of them spanned more than 20 modules.

The good thing was that pretty much all of them were caused by one big, bad barrel file. Simply fully specifying the exact module (instead of only providing the barrel file path) in about 5 places already eliminated around 20 of those really long circular dependencies.

Afterwards I was already able to start the dev server again. Since I had already written the script above to easily find and track these bad circulars, I continued to eliminate more circular dependencies, starting with the longest ones.

I have now decreased them to around 30 circulars. There are still some circular dependencies in the project, but they only span 2 or 3 modules each - not ideal, but also not a dealbreaker.

The dev server is now very snappy again, a huge improvement in developer experience!

Another benefit of this 2-hour excursion was that the average page bundle size decreased by 50%.
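Their script isn't reproduced in this thread, but for illustration, an off-the-shelf way to list circular import chains (using the madge CLI; an assumption on my part, not necessarily what they used):

# prints every circular import chain under src/; the longest chains are worth untangling first
npx madge --circular --extensions ts,tsx src/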

I'm also facing the same problems with "next": "^12.3.1" and "react": "18.2.0". Any workaround? Thanks.

<--- JS stack trace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0x100f142dc node::Abort() [/Users//.nvm/versions/node/v16.15.0/bin/node]
 2: 0x100f14464 node::errors::TryCatchScope::~TryCatchScope() [/Users//.nvm/versions/node/v16.15.0/bin/node]
 3: 0x101063bc0 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/
/.nvm/versions/node/v16.15.0/bin/node]
 4: 0x101063b54 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users//.nvm/versions/node/v16.15.0/bin/node]
 5: 0x1011e726c v8::internal::Heap::GarbageCollectionReasonToString(v8::internal::GarbageCollectionReason) [/Users//.nvm/versions/node/v16.15.0/bin/node]
 6: 0x1011e5d8c v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users//.nvm/versions/node/v16.15.0/bin/node]
 7: 0x1011f10d4 v8::internal::Heap::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users//.nvm/versions/node/v16.15.0/bin/node]
 8: 0x1011f1168 v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users//.nvm/versions/node/v16.15.0/bin/node]
 9: 0x1011c3ffc v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/Users//.nvm/versions/node/v16.15.0/bin/node]
10: 0x1014fc100 v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
11: 0x10181078c Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/Users//.nvm/versions/node/v16.15.0/bin/node]
12: 0x106b56f14 
13: 0x1070fa45c 
14: 0x106d8a94c 
15: 0x106cc9f0c 
16: 0x106ecbc2c 
17: 0x10797ba34 
18: 0x106fba7d0 
19: 0x106b6ea68 
20: 0x10775b00c 
21: 0x1072cc658 
22: 0x107bd1f00 
23: 0x1072f8ea0 
24: 0x1017a4418 Builtins_InterpreterEntryTrampoline [/Users//.nvm/versions/node/v16.15.0/bin/node]
25: 0x10678e3bc 
26: 0x106e18ee8 
27: 0x1017a1418 construct_stub_create_deopt_addr [/Users//.nvm/versions/node/v16.15.0/bin/node]
28: 0x106e2bec4 
29: 0x106e1bde8 
30: 0x106e356b0 
31: 0x101823508 Builtins_ArrayForEach [/Users//.nvm/versions/node/v16.15.0/bin/node]
32: 0x106ea52ac 
33: 0x1017a4418 Builtins_InterpreterEntryTrampoline [/Users//.nvm/versions/node/v16.15.0/bin/node]
34: 0x106138a14 
35: 0x106948954 
36: 0x107aa63fc 
37: 0x1017a4418 Builtins_InterpreterEntryTrampoline [/Users//.nvm/versions/node/v16.15.0/bin/node]
38: 0x106c91b68 
39: 0x107424330 
40: 0x10790cdc4 
41: 0x10795d864 
42: 0x107424330 
43: 0x10790cdc4 
44: 0x106c90670 
45: 0x1017a4418 Builtins_InterpreterEntryTrampoline [/Users//.nvm/versions/node/v16.15.0/bin/node]
46: 0x106161e74 
47: 0x1017a4418 Builtins_InterpreterEntryTrampoline [/Users//.nvm/versions/node/v16.15.0/bin/node]
48: 0x106ea54c8 
49: 0x105ddb0a0 
50: 0x1017a220c Builtins_JSEntryTrampoline [/Users//.nvm/versions/node/v16.15.0/bin/node]
51: 0x1017a1ea4 Builtins_JSEntry [/Users//.nvm/versions/node/v16.15.0/bin/node]
52: 0x101173a54 v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/Users//.nvm/versions/node/v16.15.0/bin/node]
53: 0x1011730e8 v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, int, v8::internal::Handle<v8::internal::Object>*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
54: 0x101080578 v8::Function::Call(v8::Local<v8::Context>, v8::Local<v8::Value>, int, v8::Local<v8::Value>*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
55: 0x100e61eec node::InternalCallbackScope::Close() [/Users//.nvm/versions/node/v16.15.0/bin/node]
56: 0x100e6248c node::InternalMakeCallback(node::Environment*, v8::Local<v8::Object>, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/Users/=/.nvm/versions/node/v16.15.0/bin/node]
57: 0x100e62774 node::MakeCallback(v8::Isolate*, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/Users//.nvm/versions/node/v16.15.0/bin/node]
58: 0x100ebd2c4 node::Environment::CheckImmediate(uv_check_s*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
59: 0x10178c674 uv__run_check [/Users//.nvm/versions/node/v16.15.0/bin/node]
60: 0x1017863d0 uv_run [/Users//.nvm/versions/node/v16.15.0/bin/node]
61: 0x100e62ccc node::SpinEventLoop(node::Environment*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
62: 0x100f4d6e0 node::NodeMainInstance::Run(int*, node::Environment*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
63: 0x100f4d3ac node::NodeMainInstance::Run(node::EnvSerializeInfo const*) [/Users//.nvm/versions/node/v16.15.0/bin/node]
64: 0x100ee72e0 node::Start(int, char**) [/Users//.nvm/versions/node/v16.15.0/bin/node]
65: 0x105aad088 
error Command failed with signal "SIGABRT".
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.

A developer in our team experienced this issue recently. Everything worked ok in Next.js 13.2.4 but yarn dev started crashing after an upgrade to 13.3.0 / 13.3.1.

Crash log
ready - started server on 0.0.0.0:3000, url: http://localhost:3000

info - Loaded env from /home/username/path/to/project/.env.local
info - Loaded env from /home/username/path/to/project/.env

<--- Last few GCs --->

[7227:0x653f3d0] 39156 ms: Scavenge (reduce) 1794.9 (1976.4) -> 1794.3 (1966.9) MB, 17.8 / 0.0 ms (average mu = 0.736, current mu = 0.811) allocation failure;
[7227:0x653f3d0] 40113 ms: Mark-sweep (reduce) 2050.5 (2218.3) -> 2014.8 (2160.5) MB, 562.9 / 0.0 ms (+ 98.0 ms in 207 steps since start of marking, biggest step 23.2 ms, walltime since start of marking 1027 ms) (average mu = 0.699, current mu = 0.663

<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
 1: 0xb7b3e0 node::Abort() [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 2: 0xa8c8aa [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 3: 0xd69100 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 4: 0xd694a7 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 5: 0xf46ba5 [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 6: 0xf5908d v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 7: 0xfc80e4 v8::internal::ScavengeJob::Task::RunInternal() [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 8: 0xe3883b non-virtual thunk to v8::internal::CancelableTask::Run() [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
 9: 0xbe71c4 [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
10: 0xbea62e node::PerIsolatePlatformData::FlushForegroundTasksInternal() [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
11: 0x166aff6 [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
12: 0x167d534 [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
13: 0x166b95e uv_run [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
14: 0xabda2d node::SpinEventLoop(node::Environment*) [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
15: 0xbc1874 node::NodeMainInstance::Run() [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
16: 0xb36434 node::LoadSnapshotDataAndRun(node::SnapshotData const**, node::InitializationResult const*) [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
17: 0xb3a02f node::Start(int, char**) [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
18: 0x7f8c3083c790 [/usr/lib/libc.so.6]
19: 0x7f8c3083c84a __libc_start_main [/usr/lib/libc.so.6]
20: 0xaba37e _start [/home/username/.asdf/installs/nodejs/18.15.0/bin/node]
error Command failed with exit code 1.

We use mui.com, so we have a lot of named imports from @mui/material and @mui/icons-material. These two entry points are quite bulky, so they may be challenging to parse without bloating memory consumption. Thankfully, we managed to find a workaround like this: https://github.com/vercel/next.js/discussions/37614#discussioncomment-3036716

const nextConfig = {
  /* ... */
  modularizeImports: {
    '@mui/icons-material': {
      transform: '@mui/icons-material/{{member}}',
    },
    // TODO: Consider enabling modularizeImports for material when https://github.com/mui/material-ui/issues/36218 is resolved
    // '@mui/material': {
    //   transform: '@mui/material/{{member}}',
    // },
  },
};
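For clarity, that transform effectively turns barrel imports into per-icon imports, roughly like this (an illustrative example, not code from the comment above):

// What you write:
import { Delete, Edit } from '@mui/icons-material';

// What modularizeImports compiles it to (conceptually):
import Delete from '@mui/icons-material/Delete';
import Edit from '@mui/icons-material/Edit';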

Hope this helps investigate the behavior of swc

Still an issue with latest (13.3.1)

@mdovn Have found out any solution on this one? I tried increasing the Node.js memory threshold, it’s a bit better, but still failing once in a while.

Not yet. It seems it's not related to next-transpile-modules, since I use experimental.transpilePackages and the issue still persists. The dev compile is quite slow too. When I build my app, Next.js shows a weird result (screenshot not reproduced here).

@balazsorban44 I've been following this conversation since the beginning. We had serious deployment issues on our staging server, which has pretty limited memory (Amazon EC2 t2.small instance - 2 GB RAM), shared with our Java back-end as well. Our start-up was failing, even with the solutions mentioned above.

I've tried the canary version 12.1.7-canary.4, and it seems our issues have been eliminated. More than ten deployments have succeeded since we changed to this. Thank you!

@johnson-lau @balazsorban44 problem is solved now for me after removing esmExternals: false and installing next@canary

Hi, we recently landed some changes (#37397) to canary that might help fix this issue, without setting esmExternals: false.

Please try it out by installing next@canary and let us know!

This seems to have fixed the issue in our environment. Any estimate on when this might roll into a stable version?

@aceisScope - did you mean that you fixed the issue by updating all named exports to default exports? Otherwise your statement seems like the reverse of @ryne2010’s post where they state that

The only overlapping theme seems to be “unifying” the imports one way.

We encountered this issue and our app started to crash and restart in a loop just after we deleted our .babelrc and migrated our app to SWC (with Next.js 12.1.4, compiler.styledComponents, and without next-transpile-modules in node:16-alpine). After reverting the change, everything went back to normal.

Disabling esmExternals fixes the issue too:

  experimental: {
    esmExternals: false,
  },
  1. npm i

  2. add "type": "module" to node_modules/@amcharts/amcharts4/package.json and node_modules/@amcharts/amcharts4-geodata/package.json

  3. npm run build

@LukasBombach

Just to be clear and help this thread.

To suppress this error, try export NODE_OPTIONS=\"--max_old_space_size=4096 --trace-warnings\" && <your-node-command>, where 4096 is how much RAM Node can use, in MB (by default it is 512 MB).
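To verify that the variable actually took effect, a quick generic check (not from the comment above) is to print the heap limit the Node process sees:

# prints the effective V8 heap limit in MB under the current NODE_OPTIONS
node -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"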

Error happening after updating to v13.5.2

I have the same problem. None of the solutions helped me. I deleted the @mui/icons-material library, also deleted the .next folder and node_modules, then npm install again. The error has not gone away.

next 13.4.19, typescript 5.1.6

I got the same issue in Next.js 13.4.6. I didn’t face such issues in Next.js 12!

same, I downgraded back to the original version.

Now I fixed the issue. It happens when you use top-level imports, because they pull in many unnecessary modules for a single page. As a result, Next.js needs to load thousands of modules, especially when using MUI icons. The simple solution is to use modularizeImports for MUI icons in your Next.js project. Simply add the code below and you will witness the actual magic of reducing unnecessary imports. In my case, it reduced the number of modules to around 20k. Code:

modularizeImports: {
  '@mui/icons-material/?(((\\w*)?/?)*)': {
    transform: '@mui/icons-material/{{ matches.[1] }}/{{member}}'
  }
}

Simply paste it at the root level of the next.config.js file.

Personally, I think only the impact of @mui/icons-material was reduced, right?

Maybe; so far I have only encountered this issue with @mui/icons-material.

I started to face this issue recently and I managed to pinpoint it to the Edge runtime. See #51298

We think this issue manifested after adding the Carbon multi-select but not adding it to next-transpile-modules.

We added it, in addition to

experimental: {
  esmExternals: false,
}
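For completeness, a minimal sketch of how the two pieces can fit together in next.config.js; the Carbon package name below is an assumption on my part, so substitute whichever package actually ships the multi-select you use:

// next.config.js
const withTM = require('next-transpile-modules')(['carbon-components-react']);

module.exports = withTM({
  experimental: {
    esmExternals: false,
  },
});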

The difference is astounding. (Before/after memory screenshots not reproduced here.)

I started having this issue on an M1 MacBook Pro with 16GB ram running:

    "next": "13.0.6",
    "react": "18.2.0",

Full error output:

wait  - compiling /page (client and server)...
event - compiled successfully in 441 ms (1734 modules)
wait  - compiling...

<--- Last few GCs --->

[21264:0x130008000]   921015 ms: Scavenge (reduce) 4088.7 (4110.5) -> 4088.3 (4111.2) MB, 0.8 / 0.0 ms  (average mu = 0.178, current mu = 0.020) allocation failure 
[21264:0x130008000]   921020 ms: Scavenge (reduce) 4089.1 (4111.2) -> 4088.3 (4111.2) MB, 0.7 / 0.0 ms  (average mu = 0.178, current mu = 0.020) allocation failure 
[21264:0x130008000]   921025 ms: Scavenge (reduce) 4089.2 (4111.2) -> 4088.5 (4111.2) MB, 0.7 / 0.0 ms  (average mu = 0.178, current mu = 0.020) allocation failure 


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x100f471dc node::Abort() [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 2: 0x100f4747c node::ModifyCodeGenerationFromStrings(v8::Local<v8::Context>, v8::Local<v8::Value>, bool) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 3: 0x10106d9b8 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 4: 0x10106d97c v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 5: 0x1011a3c0c v8::internal::Heap::GarbageCollectionReasonToString(v8::internal::GarbageCollectionReason) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 6: 0x1011a64cc v8::internal::Heap::MarkCompactPrologue() [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 7: 0x1011a41d0 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 8: 0x1011a2680 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
 9: 0x1011aaef0 v8::internal::Heap::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
10: 0x1011aaf70 v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
11: 0x101188ff8 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
12: 0x101412e9c v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
13: 0x10168b5ac Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
14: 0x1016ba0fc Builtins_CreateRegExpLiteral [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
15: 0x1084ab5c4 
16: 0x101698388 Builtins_ArrayEvery [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
17: 0x1084a4948 
18: 0x108490490 
19: 0x1084abe10 
20: 0x108bb7f60 
21: 0x108bb7a88 
22: 0x1090dc6b4 
23: 0x108fb04a4 
24: 0x1084b0734 
25: 0x108277bd4 
26: 0x1085af874 
27: 0x1016d4d04 Builtins_PromiseResolveThenableJob [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
28: 0x101640e24 Builtins_RunMicrotasks [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
29: 0x10161cf04 Builtins_JSRunMicrotasksEntry [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
30: 0x10114b2e4 v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
31: 0x10114b698 v8::internal::(anonymous namespace)::InvokeWithTryCatch(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
32: 0x10114b784 v8::internal::Execution::TryRunMicrotasks(v8::internal::Isolate*, v8::internal::MicrotaskQueue*, v8::internal::MaybeHandle<v8::internal::Object>*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
33: 0x101167cc0 v8::internal::MicrotaskQueue::RunMicrotasks(v8::internal::Isolate*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
34: 0x101167b44 v8::internal::MicrotaskQueue::PerformCheckpointInternal(v8::Isolate*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
35: 0x101621914 Builtins_CallApiCallback [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
36: 0x10866d388 
37: 0x108db1794 
38: 0x108033420 
39: 0x10161d02c Builtins_JSEntryTrampoline [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
40: 0x10161ccc4 Builtins_JSEntry [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
41: 0x10114b31c v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
42: 0x10114aa74 v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, int, v8::internal::Handle<v8::internal::Object>*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
43: 0x101081b40 v8::Function::Call(v8::Local<v8::Context>, v8::Local<v8::Value>, int, v8::Local<v8::Value>*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
44: 0x100e9d180 node::InternalMakeCallback(node::Environment*, v8::Local<v8::Object>, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
45: 0x100e9d484 node::MakeCallback(v8::Isolate*, v8::Local<v8::Object>, v8::Local<v8::Function>, int, v8::Local<v8::Value>*, node::async_context) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
46: 0x100ef91b8 node::Environment::CheckImmediate(uv_check_s*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
47: 0x10337f8e0 uv__run_check [/opt/homebrew/Cellar/libuv/1.44.2/lib/libuv.1.dylib]
48: 0x10337a800 uv_run [/opt/homebrew/Cellar/libuv/1.44.2/lib/libuv.1.dylib]
49: 0x100e9d93c node::SpinEventLoop(node::Environment*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
50: 0x100f7cd6c node::NodeMainInstance::Run(int*, node::Environment*) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
51: 0x100f7ca58 node::NodeMainInstance::Run() [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
52: 0x100f1d040 node::Start(int, char**) [/opt/homebrew/Cellar/node@16/16.19.0/bin/node]
53: 0x196b27f28 start [/usr/lib/dyld]
error Command failed with signal "SIGABRT".

I also have the same issue; next dev keeps taking more memory and then crashes and auto-restarts. next build also takes too much memory and causes the system to crash.

BTW, using the trace method as suggested and pruning the imports has solved this issue for me. I really cannot thank everyone enough, but I'll try: THANK YOU!

Was struggling with this all day. I hope that sharing my experience will help others. After upgrading several packages, I started to get this error during build. In my case yarn build would take around 5 minutes (instead of the usual 1.5-2) to build the project and then fail with that (or a similar) message. Ended up on this thread. I set the --max-old-space-size=16192 flag as suggested in a previous comment, and sure enough after 5 minutes I got a reference error. It was next-i18next in my case. The Trans component had a redundant t attribute, which created a circular reference (an infinite type definition tree, or something to that effect, according to i18next). Removing the attribute solved the problem. Now it works like before, without the need for the --max-old-space-size flag. Good luck!

Hi @ValentinH, my case is a bit different: I don't have any API routes, just a health check API. The main reason for failing is that we have a ton of UI modules within a monorepo. I guess the memory leak is caused by the watchers for rebuilding the app in development mode. As a temporary workaround, you can tweak the Next.js compiler's module watcher. There is some movement from the maintainers, so keep an eye on the canary releases.

In 13.0.6 the issue still occurs. There is a bugfix in the canary upstream, but I am not sure if the fix is related to this issue: #43859

I think it is highly related to this problem, but #43859 didn't solve the issue. The Vercel team is sending a new fix (#43958) to canary. Talks go on here.

GitHub Action failed with "Reached heap limit Allocation failed" when building the server image.

  • The build works locally in my machine.
  • I am using a git workflow to build the docker image.

I tried adding --max-old-space-size to the Dockerfile, but it didn't solve the issue. I tried esmExternals: false; it didn't work either.

What worked for me was removing Sentry.

I had created the next.js application with the command yarn create next-app --typescript. After that when I ran yarn build I got this error:

╰─ yarn build                                                                                                                                     ─╯
yarn run v1.18.0
$ next build

<--- Last few GCs --->

[36975:0x140008000]      130 ms: Scavenge 14.3 (28.1) -> 12.5 (29.1) MB, 0.5 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure
[36975:0x140008000]      178 ms: Mark-sweep 24.6 (40.0) -> 22.5 (39.9) MB, 0.8 / 0.0 ms  (+ 0.5 ms in 18 steps since start of marking, biggest step 0.1 ms, walltime since start of marking 10 ms) (average mu = 1.000, current mu = 1.000) finalize incrementa

<--- JS stacktrace --->

FATAL ERROR: wasm code commit Allocation failed - process out of memory
 1: 0x104839d18 node::Abort() [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 2: 0x104839e88 node::OnFatalError(char const*, char const*) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 3: 0x10494170c v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 4: 0x1049416a0 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 5: 0x104d70cdc v8::internal::wasm::WasmCodeManager::TryAllocate(unsigned long, void*) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 6: 0x104d718f8 v8::internal::wasm::NativeModule::CreateEmptyJumpTableInRegion(unsigned int, v8::base::AddressRegion) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 7: 0x104d70ef4 v8::internal::wasm::NativeModule::AddCodeSpace(v8::base::AddressRegion) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 8: 0x104d7174c v8::internal::wasm::NativeModule::NativeModule(v8::internal::wasm::WasmEngine*, v8::internal::wasm::WasmFeatures const&, bool, v8::internal::VirtualMemory, std::__1::shared_ptr<v8::internal::wasm::WasmModule const>, std::__1::shared_ptr<v8::internal::Counters>, std::__1::shared_ptr<v8::internal::wasm::NativeModule>*) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
 9: 0x104d73674 v8::internal::wasm::WasmCodeManager::NewNativeModule(v8::internal::wasm::WasmEngine*, v8::internal::Isolate*, v8::internal::wasm::WasmFeatures const&, unsigned long, bool, std::__1::shared_ptr<v8::internal::wasm::WasmModule const>) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
10: 0x104d798e8 v8::internal::wasm::WasmEngine::NewNativeModule(v8::internal::Isolate*, v8::internal::wasm::WasmFeatures const&, unsigned long, bool, std::__1::shared_ptr<v8::internal::wasm::WasmModule const>) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
11: 0x104d7982c v8::internal::wasm::WasmEngine::NewNativeModule(v8::internal::Isolate*, v8::internal::wasm::WasmFeatures const&, std::__1::shared_ptr<v8::internal::wasm::WasmModule const>) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
12: 0x104d581cc v8::internal::wasm::AsyncCompileJob::CreateNativeModule(std::__1::shared_ptr<v8::internal::wasm::WasmModule const>) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
13: 0x104d5f5a8 v8::internal::wasm::AsyncCompileJob::PrepareAndStartCompile::RunInForeground(v8::internal::wasm::AsyncCompileJob*) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
14: 0x104d5fd30 v8::internal::wasm::AsyncCompileJob::CompileStep::Run(v8::internal::wasm::AsyncCompileJob*, bool) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
15: 0x104d5fc48 v8::internal::wasm::AsyncCompileJob::CompileTask::RunInternal() [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
16: 0x104891910 node::PerIsolatePlatformData::RunForegroundTask(std::__1::unique_ptr<v8::Task, std::__1::default_delete<v8::Task> >) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
17: 0x104890b48 node::PerIsolatePlatformData::FlushForegroundTasksInternal() [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
18: 0x104fb1028 uv__async_io [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
19: 0x104fc059c uv__io_poll [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
20: 0x104fb1440 uv_run [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
21: 0x10486fe9c node::NodeMainInstance::Run() [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
22: 0x104816cdc node::Start(int, char**) [/opt/homebrew/Cellar/node@12/12.22.12/bin/node]
23: 0x1067a508c

  • Node: v12.22.12
  • yarn: 1.18.0
  • next: 12.2.0
  • OS: macOS Monterey
  • Chip: Apple M1 Max
  • Memory: 32 GB

After that I’ve changed my node version to v14.19.3 and it worked!

yarn run v1.18.0
$ next build
info  - Linting and checking validity of types
info  - Creating an optimized production build
info  - Compiled successfully
info  - Collecting page data
info  - Generating static pages (3/3)
info  - Finalizing page optimization

Page                                       Size     First Load JS
┌ ○ /                                      5.7 kB           83 kB
├   └ css/149b18973e5508c7.css             655 B
├   /_app                                  0 B            77.3 kB
├ ○ /404                                   194 B          77.5 kB
└ λ /api/hello                             0 B            77.3 kB
+ First Load JS shared by all              77.3 kB
  ├ chunks/framework-7dc8a65f4a0cda33.js   45.2 kB
  ├ chunks/main-25e5079ab4bd6ecd.js        30.8 kB
  ├ chunks/pages/_app-0da7391c528d976d.js  504 B
  ├ chunks/webpack-69bfa6990bb9e155.js     769 B
  └ css/27d177a30947857b.css               194 B

λ  (Server)  server-side renders at runtime (uses getInitialProps or getServerSideProps)
○  (Static)  automatically rendered as static HTML (uses no initial props)

✨  Done in 3.51s.

Setting --max_old_space_size=8192 seems to fix the issue for me. (Screenshot not reproduced here.)

I fixed mine. I noticed that the name of my imported component was the same as the function name where I imported it. Also, since I am on Kali Linux 2022, I did export NODE_OPTIONS=--max_old_space_size=4096 and was able to resolve it.

In our case, we found out that the problem was caused by Swiper.js (https://swiperjs.com/) that started using ES modules from version 7. When we downgraded Swiper to v6, the problem disappeared.

We also use index.ts barrel files in our project for re-exporting all our components. Curiously, we found out that if we delete the reexports for the Swiper component and import it directly from its source file (e.g. import { Carousel } from "src/components/Carousel/Carousel";), the problem no longer exists and we can keep using Swiper v8. So this initially fixed the problem for us.

Now we tried the next@canary version and it seems to fix the problem completely even if we import Swiper via barrel file. So everything is OK now.

@balazsorban44 this is without adding the esmExternals option or changing our code in any particular way in that regard

@rahulgi and @patroza

In my case, updating all default exports to named exports solved the issue. So I think it works either way, default or named, as long as there's only one kind and not mixed exports in the codebase. "Unifying" is the right term.

We also experience this issue on v12. esmExternals: false didn't help.

It might be caused by some cyclic import of a dependency during SSG, based on info from other related issues ("getStaticProps with { fallback: true } is very slow when SPA routing" and "Dev mode on Next.js v12 breaks with combination of features"), which we also experience. It would be great if Next.js detected these kinds of issues, because they are really hard to debug without access to the implementation.

We did not experience this kind of issue on v10, so it might also be related to SWC.

I have created a repository, nextjs-dynamic-amcharts, with the smallest example reproduction of the problem…

@balazsorban44 @timneutkens take a look.

it works fine, but within the container it crashes.

We tackled that: the dev team was mounting the Node.js modules into the container when they used docker-compose or our K8s Tilt dev environments. We had to exclude those from all of our Docker builds and only mount source code, since the Next.js 12 SWC compiler, written in Rust, is architecture-dependent when built (is that true, @programbo?). Either way, we never faced that on 11.

make sure you have node_modules/ in your .dockerignore or **node_modules/ in your .tiltignore.

@pmbanugo not at the moment.

@balazsorban44 The error is persistent and consistent. Having said that, I had a couple of questions.

We were using the Babel loader. The procedure we followed to upgrade was:

  1. install next 12 (upgrade)
  2. install @swc/core and swc-loader
  3. Replace babel-loader with swc-loader
  4. remove the existing babel config

Now, do we need to install @swc/core separately? I ask because I saw Next install SWC itself.
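For what it's worth, a rough sketch of step 3 (swapping babel-loader for swc-loader) inside a custom webpack() hook in next.config.js; the option shape follows the .swcrc schema and is an assumption on my part, not the actual config from this comment. As far as I know, swc-loader expects @swc/core to be installed alongside it, separately from the SWC binaries Next.js ships for its built-in compiler.

webpack: (config) => {
  // hypothetical rule replacing the previous babel-loader rule
  config.module.rules.push({
    test: /\.(js|jsx|ts|tsx)$/,
    exclude: /node_modules/,
    use: {
      loader: 'swc-loader',
      options: {
        jsc: {
          parser: { syntax: 'typescript', tsx: true },
          transform: { react: { runtime: 'automatic' } },
        },
      },
    },
  });
  return config;
},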

Seems to be a duplicate of #31962