webpack: Using webpack 4 on a large project (or, how to avoid "JavaScript heap out of memory" with production mode)

Do you want to request a feature or report a bug?

Maybe this is a feature request. I’m really looking for suggestions on using webpack 4 with many entry points (~165), a large set of dependencies, and a lot of shared code between entry points.

I guess it could also be considered a bug since webpack --mode production fails and webpack --mode development --watch succeeds.

In this case, we are in the midst of migrating the OpenLayers project to ES modules, and would like to use webpack to build our examples (in development and for production).

What is the current behavior?

Given a webpack config that looks like this:

module.exports = {
  context: src, // absolute path to example directory
  entry: entry, // an object with name: relative path for each of 165 examples
  output: {
    filename: '[name].js',
    path: path.join(__dirname, 'build')
  }
};
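
The config above assumes `src` and `entry` are built earlier in the file. A minimal sketch of how such an entry map could be assembled (the `examples` directory name and the `.js`-only filter are assumptions, not the project's actual layout):

const fs = require('fs');
const path = require('path');

// Absolute path to the directory that holds one .js file per example.
const src = path.join(__dirname, 'examples');

// Build {exampleName: './exampleName.js'} so webpack treats every example
// as a separate entry point.
const entry = {};
for (const file of fs.readdirSync(src)) {
  if (file.endsWith('.js')) {
    entry[path.basename(file, '.js')] = './' + file;
  }
}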

Running webpack in development mode works as expected:

$ webpack --mode development --config path/to/webpack/config.js --watch

Webpack is watching the files…

Hash: e7a7d7a44dc1b2048de7
Version: webpack 4.0.0-beta.0
Time: 4663ms

(4.6 seconds is awesome for 165 entry points.)

However, when I try to run the same in production mode, I get out of memory errors.

$ webpack --config examples/webpack/config.js --mode production
...

<--- Last few GCs --->

[59757:0x103000000]    32063 ms: Mark-sweep 1393.5 (1477.7) -> 1393.5 (1477.7) MB, 109.0 / 0.0 ms  allocation failure GC in old space requested
[59757:0x103000000]    32204 ms: Mark-sweep 1393.5 (1477.7) -> 1393.5 (1427.7) MB, 141.3 / 0.0 ms  last resort GC in old space requested
[59757:0x103000000]    32331 ms: Mark-sweep 1393.5 (1427.7) -> 1393.5 (1423.2) MB, 126.7 / 0.0 ms  last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x24d9482a5ec1 <JSObject>
    2: replace(this=0x24d9a5886dd1 <Very long string[3439790]>,0x24d99ac9b771 <JSRegExp <String[18]: [<>\/\u2028\u2029]>>,0x24d99ac9b7e9 <JSFunction escapeUnsafeChars (sfi = 0x24d903ffafa1)>)
    3: serialize(aka serialize) [/Users/tschaub/projects/openlayers/node_modules/serialize-javascript/index.js:~30] [pc=0x31c5d0bfec51](this=0x24d979482311 <undefined>,obj=0x24d98baf7569 <Object map = 0x24d...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/Users/tschaub/.nvm/versions/node/v8.9.1/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/Users/tschaub/.nvm/versions/node/v8.9.1/bin/node]
 3: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/Users/tschaub/.nvm/versions/node/v8.9.1/bin/node]
 4: v8::internal::Factory::NewRawTwoByteString(int, v8::internal::PretenureFlag) [/Users/tschaub/.nvm/versions/node/v8.9.1/bin/node]
 5: v8::internal::Runtime_StringBuilderConcat(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/tschaub/.nvm/versions/node/v8.9.1/bin/node]
 6: 0x31c5d0b8463d
 7: 0x31c5d0be6e2e
 8: 0x31c5d0be65a8
 9: 0x31c5d0bfec51
Abort trap: 6

If the current behavior is a bug, please provide the steps to reproduce.

Here is a tree with everything: https://github.com/openlayers/openlayers/tree/323a56b06c69af9ef56c2624e877b45ad2dd8fae

Here are the steps to get things set up:

git clone --depth=50 https://github.com/openlayers/openlayers.git openlayers/openlayers
cd openlayers/openlayers
git fetch origin +refs/pull/7740/merge
git checkout -qf 323a56b06c69af9ef56c2624e877b45ad2dd8fae
npm install
npm run build

Note that npm run build -- --mode development succeeds, but production mode fails.

What is the expected behavior?

Since development mode works with many entry points, and production mode works with a few entry points, I was hoping that production mode would work with many entry points.

If this is a feature request, what is motivation or use case for changing the behavior?

Production builds fail on my machine and on Travis CI. I’m hoping that getting a production build to succeed won’t require OS configuration changes for everyone involved.

Please mention other relevant information such as the browser version, Node.js version, webpack version and Operating System.

node@8.9.1 webpack@4.0.0-beta.0 macOS@10.13.2 (16 GB memory, 3.5 GHz Intel Core i7)

Thanks for the amazing work on webpack 4. Really looking forward to making use of it.

About this issue

  • State: closed
  • Created 6 years ago
  • Reactions: 54
  • Comments: 41 (22 by maintainers)

Most upvoted comments

For those looking to increase the memory used by webpack, the solution is to not use the webpack batch file/shell script but call this instead:

node --max_old_space_size=4096 ./node_modules/webpack/bin/webpack.js <the rest of your command line options go here>

That command gives 4GB of memory to webpack.
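
If you would rather not retype that every time, the same flag can be baked into an npm script (a sketch; the script name and the 4096 value are arbitrary choices, not webpack defaults):

{
  "scripts": {
    "build": "node --max_old_space_size=4096 ./node_modules/webpack/bin/webpack.js --mode production"
  }
}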

@skirankumar7 - any time I’ve run into a case where 8 GB wasn’t enough (with any Node script, not just webpack), it was due to a misconfiguration on my part.

I recently had a very similar issue with the current latest stable release of webpack (v4.29.5). The issue indeed came from the minimizer plugin, which is UglifyJS by default. After doing some research I discovered that this is a known issue and that the next version of webpack (v5) intends to use Terser instead, as it offers better performance and robustness (thread can be viewed here).

In summary, I recommend moving to Terser, and hopefully it will also solve your issue. Setup steps:

1. Install the module: $ npm install terser-webpack-plugin --save-dev
2. Edit your webpack.config.js so that it includes:

var TerserPlugin = require('terser-webpack-plugin')

module.exports = (env, options) => ({
  // ...the rest of your config
  optimization: {
    minimizer: [new TerserPlugin()]
  }
  // ...
});

So yeah, if you find yourself on this thread, I hope this helps you!

I have allocated max old space to 8192 but still get the same error. Any ideas or comments?

@boyanio Just a note: I recommend migrating to the official CSS minimizer plugin, https://github.com/webpack-contrib/css-minimizer-webpack-plugin, instead of OptimizeCSSAssetsPlugin. css-minimizer-webpack-plugin supports caching and parallelism, and works well with source maps and other features.
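
A minimal sketch of that swap (TerserPlugin is listed explicitly only because overriding optimization.minimizer replaces the default JS minimizer; plugin versions are left to you):

const TerserPlugin = require('terser-webpack-plugin');
const CssMinimizerPlugin = require('css-minimizer-webpack-plugin');

module.exports = {
  // ...the rest of your config
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin(),       // JS minification
      new CssMinimizerPlugin()  // CSS minification, replacing OptimizeCSSAssetsPlugin
    ]
  }
};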

12.x is not a guarantee: I’ve been using 12.x for months and frequently see OOM crashes.

Some further findings: setting transpileOnly: false in ts-loader’s options and using fork-ts-checker-webpack-plugin separately seems to work fine.
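
A sketch of that setup (rule layout and test pattern are assumptions; the transpileOnly value is the one named in the comment above, although ts-loader’s documentation usually pairs the plugin with transpileOnly: true):

const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
  // ...the rest of your config
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: { transpileOnly: false }, // value as reported above
        exclude: /node_modules/
      }
    ]
  },
  plugins: [
    // Runs the TypeScript type checker in a separate process.
    new ForkTsCheckerWebpackPlugin()
  ]
};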

Reducing parallelism (default = 100) worked for me. I was seeing an OOM during the build process; it would OOM somewhere above 30 active modules (keep an eye on this line and the number of active modules: 37% building 231/269 modules 38 active /path/to/module.js). I dropped parallelism to 10, the build got through without OOMing, and there was no noticeable change in build time.

This is specific to solving an OOM during the module build phase. This doesn’t address OOMs during minification.
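
For reference, that knob is the top-level parallelism option in the webpack config (a sketch; 10 is simply the value that worked for this commenter, not a recommended default):

module.exports = {
  // ...the rest of your config
  // Limit how many modules webpack processes concurrently (default: 100).
  parallelism: 10
};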

Please update terser-webpack-plugin to the latest version.

Update Node version from v8.16.0 to v12.6.0 and that issue is gone!

And still the issue continues.

Thanks for the help with this!

Source https://github.com/webpack-contrib/uglifyjs-webpack-plugin/blob/f66f6f0d0fe1012238743c9a05d6f493e1b3c8f3/src/index.js#L158

@evilebottnawi Wouldn’t a lot of memory be saved if a hash of the cache identifier is used as the task.cacheKey? Something along the lines of https://github.com/webpack/webpack/pull/5997.

I am also experiencing this issue with a reasonably small project. I have two configs exported in webpack.config.js:

  • TypeScript (ts-loader)
  • Less (less-loader -> css-loader -> mini-css-extract-plugin loader)

I get OOM when using devtool: "source-map" in production mode. Actually, I started getting OOM when I added the Less configuration; before that it was fine. Neither of the suggested workarounds has worked for me so far (a rough sketch of the setup is included below). It fails at

[webpack.Progress] 82% [0] additional asset processing
[webpack.Progress] 82% [0] chunk asset optimization
[webpack.Progress] 82% [0] chunk asset optimization TerserPlugin

I am using "terser-webpack-plugin": "^4.2.0"
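
For anyone trying to reproduce this, a condensed sketch of the kind of setup described above (entry paths and output filenames are invented; only the loader chains, the devtool setting, and the multi-config export mirror the comment):

const path = require('path');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

const tsConfig = {
  mode: 'production',
  devtool: 'source-map', // the setting that triggers the OOM here
  entry: './src/index.ts',
  module: {
    rules: [{ test: /\.tsx?$/, use: 'ts-loader', exclude: /node_modules/ }]
  },
  resolve: { extensions: ['.ts', '.tsx', '.js'] },
  output: { filename: 'app.js', path: path.resolve(__dirname, 'dist') }
};

const lessConfig = {
  mode: 'production',
  devtool: 'source-map',
  entry: './src/styles.less',
  module: {
    rules: [{
      test: /\.less$/,
      use: [MiniCssExtractPlugin.loader, 'css-loader', 'less-loader']
    }]
  },
  plugins: [new MiniCssExtractPlugin({ filename: 'styles.css' })],
  output: { path: path.resolve(__dirname, 'dist') }
};

// webpack builds both configs when an array is exported.
module.exports = [tsConfig, lessConfig];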