serverless-webpack: JavaScript heap out of memory when packaging many functions
This is a Bug Report
Description
I’m in the process of trying to upgrade serverless-webpack from version 2.2.3, where I do not experience the following issue. Our serverless configuration has `individually: true` set under `package`, and about 40 functions. When I try to upgrade to a later version of serverless-webpack and run `sls webpack`, the build runs for about a minute and then I get the following error:
lambda:daniel.cottone $ npm run build
> expert-api-lambda@0.1.0 build /Users/daniel.cottone/Projects/expert-api/lambda
> sls webpack --stage dev
Serverless: Bundling with Webpack...
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
<--- Last few GCs --->
[42611:0x104001600] 55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms allocation failure GC in old space requested
[42611:0x104001600] 57889 ms: Mark-sweep 1405.7 (1508.8) -> 1405.5 (1487.3) MB, 1923.4 / 0.0 ms (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 1923 ms) last resort
[42611:0x104001600] 59801 ms: Mark-sweep 1405.5 (1487.3) -> 1405.4 (1486.8) MB, 1903.6 / 0.0 ms last resort
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 0x37341f01ba79 <JS Object>
1: set [native collection.js:~247] [pc=0x29d828934f21](this=0x332730f95301 <a Map with map 0x23d2df14319>,p=0x3dd499abec41 <String[11]: MediaSource>,x=0x2589b9b1c819 <a SymbolObject with map 0x399abfecde11>)
2: /* anonymous */(aka /* anonymous */) [/Users/daniel.cottone/Projects/expert-api/lambda/node_modules/typescript/lib/typescript.js:~23166] [pc=0x29d828ba5830](this=0x37341f002241 <...
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node::Abort() [/usr/local/bin/node]
2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/usr/local/bin/node]
3: v8::Utils::ReportOOMFailure(char const*, bool) [/usr/local/bin/node]
4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/usr/local/bin/node]
5: v8::internal::Factory::NewFixedArray(int, v8::internal::PretenureFlag) [/usr/local/bin/node]
6: v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, v8::internal::JSMapIterator, 2>::Allocate(v8::internal::Isolate*, int, v8::internal::PretenureFlag) [/usr/local/bin/node]
7: v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, v8::internal::JSMapIterator, 2>::Rehash(v8::internal::Handle<v8::internal::OrderedHashMap>, int) [/usr/local/bin/node]
8: v8::internal::Runtime_MapGrow(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
9: 0x29d827e840bd
10: 0x29d828934f21
11: 0x29d828ba5830
12: 0x29d827e86bbb
13: 0x29d828f85beb
Abort trap: 6
If I change my serverless config to not package individually (`individually: false` under `package`), this error goes away. I have tested this with version 3.0.0 and the latest, 4.1.0, with the same results. I don’t have this issue with 2.2.3.
Additional Data
- Serverless-Webpack Version you’re using: 4.1.0
- Webpack version you’re using: 3.10.0
- Serverless Framework Version you’re using: 1.24.0
- Operating System: macOS 10.12.6
- Stack Trace (if available): see above
About this issue
- State: open
- Created 7 years ago
- Reactions: 74
- Comments: 99 (30 by maintainers)
Commits related to this issue
- Serialized compile to address #299 — committed to takeshape/serverless-webpack by asprouse 5 years ago
- Remove serverless-offline references. This plugin eats a lot of RAM, triggering 'out of memory' error. Webpack is not required locally, only when deploying to AWS. https://github.com/serverless-heav... — committed to sav-valerio/serverless-webpack by sav-valerio 4 years ago
- Merge pull request #517 from takeshape/serialized-compile Serialized compile to address #299 — committed to serverless-heaven/serverless-webpack by miguel-a-calles-mba 4 years ago
- Serialized compile to address #299 — committed to vicary/serverless-webpack by asprouse 5 years ago
- Serialized compile to address #299 — committed to serverless-heaven/serverless-webpack by asprouse 5 years ago
An update: it works when I set `transpileOnly: true` for ts-loader.

I have implemented a fix (#570) that uses multiple processes to compile functions when individual packaging is on. This guarantees that memory is cleaned up after every compile, since we kill the process, and it can compile multiple functions at once. It improves performance by quite a bit in the testing I have done.
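For anyone who wants to try the `transpileOnly` workaround, a minimal sketch of the relevant webpack rule might look like the following (this is an illustration, not the commenter's actual config; with type checking disabled you may want to run `tsc --noEmit` separately):

```js
// webpack.config.js (sketch): transpile-only ts-loader rule.
// Skipping the type checker is what saves most of the memory.
module.exports = {
  target: 'node',
  resolve: { extensions: ['.ts', '.js'] },
  module: {
    rules: [
      {
        test: /\.ts$/,
        exclude: /node_modules/,
        use: {
          loader: 'ts-loader',
          options: {
            transpileOnly: true, // skip the type checker in every per-function compile
          },
        },
      },
    ],
  },
};
```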
Ran into the same situation in our project where we are using serverless-webpack to individually package 28 lambdas with typescript. We finally hit the same error - Javascript heap out of memory - that’s already been reported.
Tried the PR from @asprouse - https://github.com/serverless-heaven/serverless-webpack/pull/517 - and can confirm that it fixed the issue for us. Any ETA on when this PR might be reviewed and merged?
Hmmm… that sounds like a memory leak somewhere when using individual packaging. We also have a project with more than 30 functions which works, but I did not check what the memory consumption is there (i.e. if we’re about to hit a limit).

What you can try is to increase Node’s heap memory limit (which is about 1.7 GB by default) with:

`node --max-old-space-size=4096 node_modules/serverless/bin/serverless package`

to set it to 4 GB, and check if it then passes with the full number of functions. If that works, we have to find out where exactly the memory leak comes from and whether it can be fixed by reusing objects.
Bought a new laptop with an i8 quad core and 16 GB of RAM, and this issue is happening more often than on my i5 duo with 8 GB of RAM??
Hi everyone, I spent a couple of hours trying to debug this problem, and my conclusion is that there is a memory leak in webpack or something below webpack. I tried ts-loader, awesome-typescript-loader, thread-loader, cache-loader, happypack, and fork-ts-checker-webpack-plugin in every combination.

I wrote a test script, webpack-test.js, to debug webpack alone, and tried every possible way to drop references so GC could be performed. Screenshot from node-gc-viewer below.

I see a possible workaround, but it’s nasty… invoke a child node process (but please not like fork-ts-checker-webpack-plugin does) to compile the TS with webpack, or… fix webpack 😄
My setup:
tsconfig.json
webpack.config.js
webpack-test.js
package.json
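To make the child-process idea above concrete, here is only a rough sketch (the entry list, the `child-compile.js` helper, and the config layout are made up for illustration; this is not what the plugin does today):

```js
// compile-in-children.js (sketch): run each webpack compile in its own child
// process so all compiler memory is released when that process exits.
const { fork } = require('child_process');

// Hypothetical per-function entry points.
const entries = ['./src/functions/hello.ts', './src/functions/world.ts'];

function compileInChild(entry) {
  return new Promise((resolve, reject) => {
    // child-compile.js (hypothetical) would build a config for `entry`,
    // call webpack's Node API, and then exit.
    const child = fork('./child-compile.js', [entry]);
    child.on('exit', (code) =>
      code === 0 ? resolve() : reject(new Error(`compile failed for ${entry}`))
    );
  });
}

// Compile strictly one after another so only one compiler is alive at a time.
(async () => {
  for (const entry of entries) {
    await compileInChild(entry);
  }
})();
```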
I ran into this problem as well, here’s my experience with several of the alternatives discussed in this thread:
serverless-offline: I managed to fix that for our system and submitted the fix in a PR (soda0289/serverless-webpack#2). Note that to get this to work we had to stop using `webpack-node-externals` in our webpack config, as that causes the config to be serialized with an empty array for `externals`, which makes lambdas compile seemingly fine but then fail when deployed.

Hope this is useful to someone and they don’t have to spend a whole day on it like I did 😄
In my case, I’ve got around 30 lambdas, and I have two problems:
- The only way I’m able to use individual packaging is turning on `transpileOnly` in ts-loader.
- If I use `fork-ts-checker-webpack-plugin`, my machine dies as the plugin spawns something like 30 workers in parallel, and it eats my 16 GB of RAM/swap in a few seconds…

IMHO the only solution is to compile all functions in series, one after the other, by default or via a setting. How’s that going? Any ETA?
Thanks!
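For what it's worth, here is a sketch of capping the checker rather than dropping it entirely. The option names (`workers`, `memoryLimit`) are taken from the fork-ts-checker-webpack-plugin 0.4.x-era README, so treat them as an assumption and check the README of the version you actually use; with individual packaging you still get one checker per compile, so this only limits each checker's appetite:

```js
// webpack.config.js (sketch): ts-loader in transpile-only mode plus a
// memory-capped type checker running in a separate process.
const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
  target: 'node',
  resolve: { extensions: ['.ts', '.js'] },
  module: {
    rules: [
      {
        test: /\.ts$/,
        exclude: /node_modules/,
        use: { loader: 'ts-loader', options: { transpileOnly: true } },
      },
    ],
  },
  plugins: [
    new ForkTsCheckerWebpackPlugin({
      workers: 1,        // do not fan out across all CPUs
      memoryLimit: 2048, // restart the checker process if it exceeds ~2 GB
    }),
  ],
};
```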
Much appreciated effort, Grumpy! When somebody fixes this, instead of all my lambdas weighing 30MB each, most of them will go below 1MB. So trust me, I appreciate efforts like this.
@andrewrothman The workaround that worked for my project is turning off `package.individually: true`. I get bigger deployment bundles, but at least everything works.

I would still want to package functions individually to get more optimized bundles, but it is not my priority at the moment.
One thing I would try is to use Babel (and `babel-loader`) for transpiling TypeScript instead of `awesome-typescript-loader` or `ts-loader`. If you don’t have any other option, maybe you can try this out.

I think changing the title to “JavaScript heap out of memory when packaging many functions” makes more sense now that it has been isolated to just the packaging process and not the deployment process.
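A minimal sketch of what the Babel route suggested above could look like, assuming `@babel/preset-typescript` (Babel only strips types and does no type checking; the Node target here is an assumption, so match it to your Lambda runtime):

```js
// webpack.config.js (sketch): transpile TypeScript with Babel instead of
// ts-loader or awesome-typescript-loader. No type checking happens here.
module.exports = {
  target: 'node',
  resolve: { extensions: ['.ts', '.js'] },
  module: {
    rules: [
      {
        test: /\.ts$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: {
            presets: [
              ['@babel/preset-env', { targets: { node: '8.10' } }], // assumed runtime
              '@babel/preset-typescript', // strips TS types only
            ],
          },
        },
      },
    ],
  },
};
```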
It’s worth checking whether somewhere in your code you are importing the whole `aws-sdk` instead of a specific client, e.g. `aws-sdk/clients/s3`. We have 3 lambdas: one is an Express app, the other two are simple cron jobs. When we added the third lambda, we got the heap-out-of-memory problem, and even 16 GB on a Mac wasn’t enough. We then checked whether some extensive library was being imported somewhere. It was dynamoose, which had the whole aws-sdk as a dependency, resulting in a >100 MB bundle size; plus one other file was importing the whole aws-sdk module instead of the specific client. After these changes everything was good.

I have not seen improvements with 5.4.0. I was helping out a friend on his project and I had to roll back to 5.3.5 to see some stability with the out-of-memory issue. I also had to roll back to an older webpack (4.46.0).
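For reference, the import difference described above looks like this with the v2 SDK:

```js
// Imports the entire aws-sdk: webpack has to parse and bundle the whole SDK
// (tens of MB of source) for every function that touches it.
const AWS = require('aws-sdk');
const s3ViaFullSdk = new AWS.S3();

// Imports only the S3 client: far less for webpack to parse and bundle.
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();
```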
cache-loader and thread-loader significantly helped for me
YMMV, but I’m currently testing what’s in this article about using `cache-loader` and `thread-loader`. Initial results are fine so far, though I have only tested on my MacBook with 16 GB of RAM and will still have to test on our CI, which only has 3 GB of RAM 😃.

Working config so far…
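(The commenter's actual config did not survive the archive. Below is only a rough sketch of the cache-loader + thread-loader + ts-loader chain that this kind of setup typically uses; the worker count is an arbitrary assumption.)

```js
// webpack.config.js (sketch): cache transpile results on disk, fan the work
// out to a small worker pool, and keep ts-loader in transpile-only mode,
// which it needs in order to run inside thread-loader workers.
module.exports = {
  target: 'node',
  resolve: { extensions: ['.ts', '.js'] },
  module: {
    rules: [
      {
        test: /\.ts$/,
        exclude: /node_modules/,
        use: [
          { loader: 'cache-loader' },
          { loader: 'thread-loader', options: { workers: 2 } }, // keep small on low-memory CI
          {
            loader: 'ts-loader',
            options: {
              transpileOnly: true,
              happyPackMode: true, // ts-loader's mode for thread-loader/happypack
            },
          },
        ],
      },
    ],
  },
};
```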
Maybe a solution would be to provide a PR for the ts-checker plugin that limits the number of spawned processes when using multi-compiles in webpack.
The handlers look good. However, version 2.x did not support individual packaging (in fact it only copied the whole artifact per function). So as a next step you should add node externals to your webpack configuration to let the externals be determined automatically by webpack, so that individual packaging can make use of it:

Additionally, webpack > 3.0.0 now uses a `module: rules` structure instead of `module: loaders`. You should change that too.

Please also check if you have set `custom: webpackIncludeModules: true` in your serverless.yml.

Then do a `serverless package` to test if it works. You’ll find the zip packages that would be uploaded in the `.serverless` directory.

If aws-sdk should be packaged, you can either put it into your devDependencies or use

to keep it outside of your packages.
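A sketch of the node-externals and `module: rules` changes described above (not the exact config from this thread; adapt the loader and extensions to your project):

```js
// webpack.config.js (sketch): treat everything in node_modules as external so
// individual packaging only ships the modules each function really needs, and
// use the webpack 3+ `module.rules` syntax instead of `module.loaders`.
const nodeExternals = require('webpack-node-externals');

module.exports = {
  target: 'node',
  externals: [nodeExternals()],
  resolve: { extensions: ['.ts', '.js'] },
  module: {
    rules: [
      {
        test: /\.ts$/,
        exclude: /node_modules/,
        use: { loader: 'ts-loader' },
      },
    ],
  },
};
```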
We were hitting this issue in a 100+ function project. Can confirm that @vicary's solution of using serverless-layers to provide dependencies plus webpack-node-externals to avoid parsing node_modules quartered our RAM usage during build and halved the build time (thank you!).
Outsourcing the typechecking to fork-ts-checker-webpack-plugin helped further, but using serverless-layers + node externals was by far the biggest gain in our situation.
I am the author of #681; my project deals on-and-off with 200 lambda functions.

Recent updates in minor versions introduced this again: subsequent builds in the same process show linear increases in bundle time. This is further confirmed when testing with `thread-loader`, where the timer increases individually in each thread. Upgrading webpack from 5.11 to 5.37.1 slows down the increments, but it still increases gradually, from 70s to 700s+ at the 50th entry.

Using the `serverless-layers` plugin and excluding with `webpack-node-externals` (without the `modulesFromFile` option) stops the build times of subsequent entries from increasing.

My educated guess is that packages in node_modules contain side effects that webpack has no way to clean up after bundling. Try to avoid having webpack dip its toes into node_modules when Lambda Function Layers are available; otherwise, pushing for https://github.com/serverless-heaven/serverless-webpack/pull/570 and helping to rebase it may be your only choice.
EDIT: Also make sure you read https://github.com/webpack/webpack/issues/6389 if you are thinking of downgrading to webpack 4.
We were able to get around this issue by setting a Node env variable on our cloud build server, and locally:

`export NODE_OPTIONS=--max_old_space_size=8192`

See https://github.com/serverless/serverless/issues/6503
Most feasible workaround for this right now is simply to turn off individual packaging.
I did, still crashed with these loaders
I thought a bit about the issue. A workaround could be for the plugin to run the compiles in batches of a few functions at a time. However, I do not know if the webpack library frees the allocated resources after each compile. But it could be worth a try.

According to the crash trace it already happened after 7 compiles (if every ts-loader line is for one function) and was at about 1500 MB: [42611:0x104001600] 55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms allocation failure GC in old space requested

The first thing to try would be to disable some plugins in the webpack config and check whether ts-loader alone allocates all that memory.
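To make the batching idea above concrete, here is a sketch using the plain webpack Node API (the per-function config list and batch size are made up for illustration; this is not what the plugin currently does):

```js
// batch-compile.js (sketch): run webpack compiles a few at a time instead of
// handing every per-function config to webpack in one multi-compile.
const webpack = require('webpack');

// Hypothetical: an array with one webpack config object per function.
const configs = require('./per-function-configs');

function runCompile(config) {
  return new Promise((resolve, reject) => {
    webpack(config, (err, stats) => {
      if (err || stats.hasErrors()) {
        return reject(err || new Error(stats.toString('errors-only')));
      }
      resolve(stats);
    });
  });
}

(async () => {
  const batchSize = 3; // tune to the available memory
  for (let i = 0; i < configs.length; i += batchSize) {
    const batch = configs.slice(i, i + batchSize);
    await Promise.all(batch.map(runCompile)); // at most `batchSize` compilers alive
  }
})();
```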
The slower runtime is expected, because the plugin takes each webpack compile’s output to determine the modules that are really needed for each function and assembles only those into the function package. That takes some time (when using `--verbose` you should see the exact steps including their timing). The longer build is outweighed by the better startup behavior (if the lambdas are cold started), especially when some big dependencies are only used by one function.

That definitely seems to be the problem. I got much further along, looks like about 50% of the way through. If I bump it up to 12 GB then the process finishes after about 8-10 minutes.