webpack: large webpack build almost hangs at 91% on an "additional asset processing" step

Do you want to request a feature or report a bug? possible bug

I have already asked on StackOverflow, and it seems this behavior is quite common, but no solutions have been offered.

What is the current behavior? I have a large webpack build that almost hangs at 91% on an “additional asset processing” step. Processing takes about 8 minutes to complete, and the “additional asset processing” step consumes at least half of that time. Webpack does not give me much more information, and I would like to better understand whether this is “normal”, a bug, or what can be done to optimize my build.

56205ms building modules
31ms sealing
0ms optimizing
0ms basic module optimization
15ms module optimization
0ms advanced module optimization
0ms basic chunk optimization
0ms chunk optimization
16ms advanced chunk optimization
14487ms building modules
0ms module and chunk tree optimization
31ms module reviving
0ms module order optimization
16ms module id optimization
0ms chunk reviving
16ms chunk order optimization
31ms chunk id optimization
140ms hashing
0ms module assets processing
265ms chunk assets processing
0ms additional chunk assets processing
0ms recording
206740ms additional asset processing
79781ms chunk asset optimization
1ms asset optimization
906ms emitting
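
For reference, a per-step breakdown like the one above is what the webpack CLI prints when profiling is enabled (standard flags, shown here for anyone who wants to reproduce it):

# print per-step timings like the breakdown above
webpack --progress --profile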

If the current behavior is a bug, please provide the steps to reproduce.

What is the expected behavior? A faster build, or more information about what is currently being done.

If this is a feature request, what is motivation or use case for changing the behavior?

Please mention other relevant information such as the browser version, Node.js version, webpack version and Operating System. node: 6.10.0, webpack: 2.3.1, OS: Windows 7 x64

About this issue

  • State: closed
  • Created 7 years ago
  • Reactions: 56
  • Comments: 41 (1 by maintainers)


Most upvoted comments

Why was this closed? I’m still having this issue after trying several of the workarounds posted, and I’m still seeing a lot of people commenting on various threads who are experiencing the same thing. The only way I can resolve it is by setting the webpack devtool to ‘eval’, but that’s not supposed to be used for production.
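
For context, the devtool workaround mentioned above is a one-line change in webpack.config.js; a minimal sketch, with all other options omitted:

// webpack.config.js (sketch; the rest of the config is assumed)
module.exports = {
  // 'eval' skips the expensive source-map generation that can stall the build,
  // but it is not suitable for production output
  devtool: 'eval',
};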

I’ve been debugging this issue and found out it is probably a bug in UglifyJS, see: https://github.com/mishoo/UglifyJS2/issues/2609

I’m able to avoid the 91% hang by passing { compress: false } to UglifyJSPlugin, e.g.:

// assumes uglifyjs-webpack-plugin v1.x; this goes in the plugins array
const path = require('path');
const UglifyJSPlugin = require('uglifyjs-webpack-plugin');

new UglifyJSPlugin({
  parallel: true,
  uglifyOptions: {
    ecma: 6,
    compress: false // hangs without this
  },
  cache: path.join(__dirname, 'webpack-cache/uglify-cache'),
})

I upgraded to the latest webpack because of this issue, but it’s still the same: a dist build usually takes around 10 minutes, and most of that time is spent hanging at 91% on “additional asset processing”.

I seem to be having the same issue on webpack 4 no matter which plugin I use (UglifyJS, babel-minify, etc.); they all hang at chunk asset processing and sit there until I run out of RAM.

I have this issue too. I finally found out that the combination of webpack 3.5.5 and extract-text-webpack-plugin 3.0.0 causes this problem. Each incremental dev build took about 20 seconds, which is unacceptable; it seems webpack was busy processing and extracting CSS that had not been modified at all. After I changed webpack to 2.7.0 and extract-text-webpack-plugin to 2.1.2, incremental builds took about 3 seconds. 😜
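
For anyone who wants to try the same combination, pinning the exact versions reported above in package.json should reproduce it:

{
  "devDependencies": {
    "webpack": "2.7.0",
    "extract-text-webpack-plugin": "2.1.2"
  }
}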

It’s probably UglifyJSPlugin or UglifyJsParallelPlugin. Try commenting that out and see if that makes things faster.
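
A quick way to test this without deleting config is to make the plugin conditional on an environment variable; a minimal sketch (the NO_MINIFY variable name is made up for illustration):

const UglifyJSPlugin = require('uglifyjs-webpack-plugin');

module.exports = {
  // ...
  plugins: [
    // set NO_MINIFY=1 to skip minification and see whether the 91% hang disappears
    ...(process.env.NO_MINIFY ? [] : [new UglifyJSPlugin()]),
  ],
};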

@EthanStandel I can also confirm that I just removed the UglifyJSPlugin and it fixed the 91% hang in my project

My team’s project always hangs on 91% and then sits for a bit on 92%, but removing UglifyJSPlugin fixed the 91% hang.

Happens for me only when using babili-webpack-plugin

I found a solution. Simply give Node more memory with the --max-old-space-size param: node ... --max-old-space-size=8192
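
If the build runs through an npm script, note that the flag must go to the Node process that runs webpack, not to npm itself; one common way to wire it up (script name and paths are just an example):

{
  "scripts": {
    "build": "node --max-old-space-size=8192 node_modules/webpack/bin/webpack.js --config webpack.config.js"
  }
}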

Same problem when I deploy on the server. In my case, I fixed it by increasing the swap area.

# create a 4 GB swap file, enable it, and persist it across reboots
sudo fallocate -l 4G /swapfile && \
sudo chmod 600 /swapfile && \
sudo mkswap /swapfile && \
sudo swapon /swapfile && \
sudo swapon -s && \
sudo cp /etc/fstab /etc/fstab.bak && \
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab

@yantakus - not sure how you are chunking, but I’ve found it really easy to duplicate a ton of dependencies if you don’t start the plugin chain with something like

new webpack.optimize.CommonsChunkPlugin({
  children: true,
  async: true,
  minChunks: 3
})

Adding this reduced my chunked output size from 208 MB to 6.3 MB

Here are the results of my experiments with uglifyjs-webpack-plugin v1.1.6:

Webpack 3.10.0 with the -p flag, no uglifyjs-webpack-plugin: 326 000 ms build time, 150 MB bundle size

uglifyjs-webpack-plugin v1.1.6 (removed the -p flag), no config: 496 000 ms build time, 183 MB bundle size

uglifyjs-webpack-plugin with the following config:

{
  parallel: true,
  uglifyOptions: {
    ecma: 6,
  },
  cache: path.join(__dirname, 'webpack-cache/uglify-cache'),
}

315 000 ms build time, 183 MB bundle size

Another build with the same config, so the cache takes effect: 83 000 ms build time, 183 MB bundle size

The same config, but with compress: false: 157 000 ms build time, 185 MB bundle size.

So caching makes it much faster, and compress: false seems to make it slower and increase the bundle. But the question is: why does plain webpack -p produce the smallest bundle? Am I missing some bundle-size optimization options in the config?
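
As background for the question above: in webpack 2/3 the -p shorthand is roughly equivalent to the following config, i.e. it both minifies and sets NODE_ENV to production, which may account for the size differences between the runs:

const webpack = require('webpack');

module.exports = {
  // ...
  plugins: [
    // -p implies --optimize-minimize ...
    new webpack.optimize.UglifyJsPlugin(),
    // ... and --define process.env.NODE_ENV="production"
    new webpack.DefinePlugin({
      'process.env.NODE_ENV': JSON.stringify('production'),
    }),
  ],
};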

This has been fixed in uglify-es for a few versions now as far as I can tell. This was merged in December: https://github.com/mishoo/UglifyJS2/pull/2614

Make sure your webpack or uglifyjs-webpack-plugin is using the latest uglify-es version.
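
You can check which uglify-es versions actually end up in your tree with npm:

# lists every copy of uglify-es in the dependency tree; an old duplicate
# nested under another package can still be the one that gets used
npm ls uglify-es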

The release notes for the latest v2.3.3 update say:

Fix performance issue with cheap-source-maps

But I still have the same problem.

I originally opened this issue, and for me the solution was to always use the latest webpack version and to invoke node with the --max_old_space_size=4096 option. This permanently solved the problem for me!

My webpack version: 2.7.0. I can also confirm that commenting out (or removing) UglifyJsPlugin solved my problem. Before, the build spent about 25 seconds hanging at 91% on the “additional asset processing” step; now it takes 1-3 seconds.

/*
new UglifyJsPlugin({
  uglifyOptions: {
    compress: true
  },
  sourceMap: true,
}),
*/

We are still running into this problem even though our uglifyjs-webpack-plugin is using the most recent uglify-es (version 3.3.8)

Is this still a thing? The “compress: false” solution works, but it comes with much larger files 😕