webpacker: Inconsistent file names when building on multiple servers

Hey

Apologies if this is the wrong place for this type of issue.

We have 2 web servers, each running behind a load balancer. During deployment we run the following on each server:

bundle exec rake assets:precompile

This has worked totally fine until we added webpacker into the mix. Now each server's manifest.json contains different file hashes:

{
  "header.css": "/packs/header-436715b1b24e023140032846f44ebb5c.css",
  "header.js": "/packs/header-39a13196ce8946d39988.js",
}
{
  "header.css": "/packs/header-436715b1b24e023140032846f44ebb5c.css",
  "header.js": "/packs/header-9c53e3fbb9e8b6312123.js",
}

Note that the CSS file hashes are the same, but the JS are not.

Where do these hashes come from? Shouldn’t they be consistent like all other assets?

About this issue

  • State: closed
  • Created 6 years ago
  • Reactions: 6
  • Comments: 27 (11 by maintainers)

Most upvoted comments

@scottrobertson You’re not crazy. [chunkhash] isn’t stable by default.

As you can see the bundle’s name now reflects its content (via the hash). If we run another build without making any changes, we’d expect that filename to stay the same. However, if we were to run it again, we may find that this is not the case […] This is because webpack includes certain boilerplate, specifically the runtime and manifest, in the entry chunk.
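The instability described in that quote can be mitigated by extracting the runtime into its own chunk, so that the boilerplate no longer churns the [chunkhash] of every entry. A minimal sketch, assuming webpacker 4's `environment.config.merge` API (this config is an illustration, not from the original thread):

```javascript
// config/webpack/environment.js
const { environment } = require('@rails/webpacker')

// Pull the webpack runtime out of the entry chunks into a single
// 'runtime' chunk, so changes to one entry do not invalidate the
// [chunkhash] of unrelated bundles.
environment.config.merge({
  optimization: {
    runtimeChunk: 'single'
  }
})

module.exports = environment
```

With the runtime isolated, rebuilding without source changes should produce the same chunk hashes, which is the property the quoted docs say is otherwise not guaranteed.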

@kjleitz ha, well that is a rather drastic change to fix 1 issue 😄

We got it!

Thank you very much to @tyvdh, everyone in here, and Cloud66!

const environment = require('./environment')
const WebpackMd5Hash = require('webpack-md5-hash')

environment.plugins.append(
  'WebpackMd5Hash',
  new WebpackMd5Hash()
)

environment.config.set('output.filename', '[name]-[chunkhash].js')
environment.config.set('devtool', 'source-map')

module.exports = environment.toWebpackConfig()

Switching to use https://github.com/erm0l0v/webpack-md5-hash seems to be the key here.

Can this be closed as #2094 is merged?

I encountered this issue after upgrading from webpacker 3.5.5 to 4.0.2. The [chunkhash] naming was inconsistent between servers on my OpsWorks stack, even after creating new instances. Using [contenthash] fixed my problem.

// config/webpack/production.js

process.env.NODE_ENV = process.env.NODE_ENV || 'production';

const environment = require('./environment');

// use [contenthash] naming to ensure uniform names across servers
environment.config.output.filename = 'js/[name]-[contenthash].js';
environment.plugins.get('MiniCssExtract').options.filename = 'css/[name]-[contenthash].css';

module.exports = environment.toWebpackConfig();

For anyone coming to this page searching for a fix while using Cloud66 and load balancers, this was my answer (see the bottom of the page, under "Nominating a dedicated compilation server"):

https://help.cloud66.com/rails/how-to-guides/deployment/enable-disable-asset-pipeline.html

It compiles assets on one server then syncs the assets to the other servers so you don’t have multiple compiles and the possibility of generating different fingerprints.

@kjleitz We ended up using something similar to your commit hash solution.

However, we have since moved to Kubernetes, and therefore just build the image once, so it always has the same ID.

You wrote that the files have the same size but a different chunkhash. What is the diff between the files? The only thing I can think of is a different ordering in the module list that webpack maintains. But then it wouldn't make any sense that the md5 is the same…

Are you using yarn or npm in production?

Please have a look at the diff and share it here (if it does not contain anything secret).

@scottrobertson My interpretation of the docs above is a little different. The example described explains how the [chunkhash] of several chunks can change even if you've modified only one. If you're not building multiple chunks, or haven't modified any files, then [chunkhash] should be stable across builds and across servers.

I tend to agree with @renchap that there may be some differences within or across the environments on your servers causing the hashes to be inconsistent. Curious: do the hashes change across successive builds on the same server (without changes in file contents)?
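To narrow down which bundles actually diverge before diffing file contents, the two manifests can be compared entry by entry. A small Node sketch using the manifests quoted at the top of the thread (the `diffManifests` helper is hypothetical, not part of webpacker):

```javascript
// Manifests as reported by the two servers in the original issue.
const manifestA = {
  "header.css": "/packs/header-436715b1b24e023140032846f44ebb5c.css",
  "header.js": "/packs/header-39a13196ce8946d39988.js"
};
const manifestB = {
  "header.css": "/packs/header-436715b1b24e023140032846f44ebb5c.css",
  "header.js": "/packs/header-9c53e3fbb9e8b6312123.js"
};

// Return the logical asset names whose fingerprinted paths differ
// between the two servers.
function diffManifests(a, b) {
  return Object.keys(a).filter(key => a[key] !== b[key]);
}

console.log(diffManifests(manifestA, manifestB)); // → [ 'header.js' ]
```

Only the entries this reports need a byte-level diff; in the case above that is `header.js` alone, matching the observation that only the JS hashes disagree.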