serverless-webpack: JavaScript heap out of memory when packaging many functions

This is a Bug Report

Description

I’m in the process of trying to upgrade serverless-webpack from version 2.2.3, where I do not experience the following issue. Our serverless configuration has package: individually: true set and about 40 functions. When I try to upgrade to a later version of serverless-webpack and run sls webpack, the build runs for about a minute and then I get the following error:

lambda:daniel.cottone $ npm run build

> expert-api-lambda@0.1.0 build /Users/daniel.cottone/Projects/expert-api/lambda
> sls webpack --stage dev

Serverless: Bundling with Webpack...
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json
ts-loader: Using typescript@2.5.2 and /Users/daniel.cottone/Projects/expert-api/lambda/tsconfig.json

<--- Last few GCs --->

[42611:0x104001600]    55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms  allocation failure GC in old space requested
[42611:0x104001600]    57889 ms: Mark-sweep 1405.7 (1508.8) -> 1405.5 (1487.3) MB, 1923.4 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 1923 ms) last resort 
[42611:0x104001600]    59801 ms: Mark-sweep 1405.5 (1487.3) -> 1405.4 (1486.8) MB, 1903.6 / 0.0 ms  last resort 


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x37341f01ba79 <JS Object>
    1: set [native collection.js:~247] [pc=0x29d828934f21](this=0x332730f95301 <a Map with map 0x23d2df14319>,p=0x3dd499abec41 <String[11]: MediaSource>,x=0x2589b9b1c819 <a SymbolObject with map 0x399abfecde11>)
    2: /* anonymous */(aka /* anonymous */) [/Users/daniel.cottone/Projects/expert-api/lambda/node_modules/typescript/lib/typescript.js:~23166] [pc=0x29d828ba5830](this=0x37341f002241 <...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/usr/local/bin/node]
 2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/usr/local/bin/node]
 3: v8::Utils::ReportOOMFailure(char const*, bool) [/usr/local/bin/node]
 4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/usr/local/bin/node]
 5: v8::internal::Factory::NewFixedArray(int, v8::internal::PretenureFlag) [/usr/local/bin/node]
 6: v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, v8::internal::JSMapIterator, 2>::Allocate(v8::internal::Isolate*, int, v8::internal::PretenureFlag) [/usr/local/bin/node]
 7: v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, v8::internal::JSMapIterator, 2>::Rehash(v8::internal::Handle<v8::internal::OrderedHashMap>, int) [/usr/local/bin/node]
 8: v8::internal::Runtime_MapGrow(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
 9: 0x29d827e840bd
10: 0x29d828934f21
11: 0x29d828ba5830
12: 0x29d827e86bbb
13: 0x29d828f85beb
Abort trap: 6

If I change my serverless config to not package individually (package: individually: false), this error goes away. I have tested this with version 3.0.0 and the latest, 4.1.0, with the same results. I don’t have this issue with 2.2.3.
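
For reference, this is the relevant excerpt of the config (a minimal sketch; the rest of our serverless.yml is omitted):

# serverless.yml
package:
  individually: true   # packaging ~40 functions this way triggers the OOM; false avoids it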

Additional Data

  • Serverless-Webpack Version you’re using: 4.1.0
  • Webpack version you’re using: 3.10.0
  • Serverless Framework Version you’re using: 1.24.0
  • Operating System: macOS 10.12.6
  • Stack Trace (if available): see above

About this issue

  • State: open
  • Created 7 years ago
  • Reactions: 74
  • Comments: 99 (30 by maintainers)

Most upvoted comments

An update: it works when I set transpileOnly: true for ts-loader.
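
For anyone else trying this, a minimal sketch of the ts-loader rule with transpileOnly enabled (it skips the type checker, which, judging by the stack trace above, is where the memory goes):

// webpack.config.js (sketch)
module: {
    rules: [{
        test: /\.tsx?$/,
        use: [{
            loader: 'ts-loader',
            options: {
                transpileOnly: true  // transpile only, no type checking
            }
        }]
    }]
}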

I have implemented a fix (#570) that uses multiple processes to compile functions when individual packaging is enabled. This guarantees that memory is cleaned up after every compile, since we kill the process, and it can compile multiple functions at once. It improves performance by quite a bit in the testing I have done.
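
For illustration only (this is not the actual #570 code), the idea is roughly the following; compile-entry.js here is a hypothetical worker script that runs a single webpack compile:

const { fork } = require('child_process');

// compile each function in a short-lived child process so the heap is
// reclaimed by the OS when the process exits
function compileInChild(entryKey) {
    return new Promise((resolve, reject) => {
        const child = fork(require.resolve('./compile-entry.js'), [entryKey]);
        child.on('exit', code =>
            code === 0 ? resolve() : reject(new Error(`compile failed for ${entryKey}`)));
    });
}

// several compiles can run concurrently, each within its own memory budget
Promise.all(['handlerA', 'handlerB'].map(compileInChild));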

Ran into the same situation in our project, where we are using serverless-webpack to individually package 28 lambdas with TypeScript. We finally hit the same error - JavaScript heap out of memory - that’s already been reported.

Tried the PR from @asprouse - https://github.com/serverless-heaven/serverless-webpack/pull/517 - and can confirm that it fixed the issue for us. Any ETA on when this PR might be reviewed and merged?

Hmmm… that sounds like a memory leak somewhere when using individual packaging. We also have a project with more than 30 functions that works, but I did not check what the memory consumption is there (i.e. whether we’re about to hit a limit).

What you can try is to increase Node’s heap memory limit (which is about 1.7 GB by default) to 4 GB with node --max-old-space-size=4096 node_modules/serverless/bin/serverless package and check whether it then passes with the full set of functions.

If that works, we have to find out where exactly the memory leak comes from and whether it can be fixed by reusing objects.
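
A minimal sketch of wiring that suggestion into an npm script (the script name build just mirrors the log above):

package.json (excerpt)

"scripts": {
  "build": "node --max-old-space-size=4096 node_modules/serverless/bin/serverless package"
}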

Bought a new laptop with an i8 quad core and 16 GB of RAM, and this issue is happening more often than on my i5 duo with 8 GB of RAM??

Hi everyone, I spent a couple of hours trying to debug this problem, and my conclusion is that there is a memory leak in webpack or something else below webpack. I tried ts-loader, awesome-typescript-loader, thread-loader, cache-loader, happypack, and fork-ts-checker-webpack-plugin in every combination.

I wrote a test script, webpack-test.js, to debug webpack on its own, and tried every possible way of dropping references so GC could reclaim memory. Screenshot from node-gc-viewer below.

I see a possible workaround, but it’s nasty… invoke a child node process (but please not like fork-ts-checker-webpack-plugin) to compile TS with webpack, or… fix webpack 😄

My setup:

tsconfig.json

{
  "compilerOptions": {
    "sourceMap": true,
    "target": "es6",
    "types": [
      "node"
    ],
    "moduleResolution": "node"
  }
}

webpack.config.js

module.exports = {
    entry: {},
    target: 'node',
    output: {
        libraryTarget: 'commonjs2',
        path: '<absolute path to project>/.webpack',
        filename: '[name].js',
    },
    module: {
        rules: [
            {
                test: '\\.ts(x?)$', // kept as a string on purpose; converted to a RegExp in webpack-test.js below
                use: [
                    {
                        loader: 'ts-loader'
                    }
                ],
            }
        ]
    },
    externals: {
        'crypto': true,
        'aws-sdk': true
    },
    resolve: {
        extensions: [
            '.js',
            '.jsx',
            '.json',
            '.ts',
            '.tsx'
        ],
        alias: {
            'handlebars': 'handlebars/dist/handlebars.js'
        }
    }
};

webpack-test.js

const webpackConfig = require('./webpack.config');

const entries = {
    'src/handler/AuthorizerHandler': './src/handler/AuthorizerHandler.ts',
    'src/handler/yyy/LoginHandler': './src/handler/yyy/LoginHandler.ts',
    'src/handler/xxx/GetAllHandler': './src/handler/xxx/GetAllHandler.ts',
    'src/handler/xxx/GetOneHandler': './src/handler/xxx/GetOneHandler.ts',
    'src/handler/xxx/CreateHandler': './src/handler/xxx/CreateHandler.ts',
    'src/handler/xxx/UpdateHandler': './src/handler/xxx/UpdateHandler.ts',
    'src/handler/xxx/DeleteHandler': './src/handler/xxx/DeleteHandler.ts',
    'src/handler/zzz/GetAllHandler': './src/handler/zzz/GetAllHandler.ts',
    'src/handler/zzz/GetOneHandler': './src/handler/zzz/GetOneHandler.ts',
    'src/handler/zzz/CreateHandler': './src/handler/zzz/CreateHandler.ts',
    'src/handler/zzz/UpdateHandler': './src/handler/zzz/UpdateHandler.ts',
    'src/handler/zzz/DeleteHandler': './src/handler/zzz/DeleteHandler.ts',
    'src/handler/zzz/PublishHandler': './src/handler/zzz/PublishHandler.ts',
    // 'src/handler/qqq/GetAllHandler': './src/handler/qqq/GetAllHandler.ts',
    // 'src/handler/qqq/GetOneHandler': './src/handler/qqq/GetOneHandler.ts',
    // 'src/handler/qqq/CreateHandler': './src/handler/qqq/CreateHandler.ts',
    // 'src/handler/qqq/UpdateHandler': './src/handler/qqq/UpdateHandler.ts',
    // 'src/handler/qqq/DeleteHandler': './src/handler/qqq/DeleteHandler.ts',
    // 'src/handler/aaa/GetAllHandler': './src/handler/aaa/GetAllHandler.ts',
    // 'src/handler/aaa/GetOneHandler': './src/handler/aaa/GetOneHandler.ts',
    // 'src/handler/aaa/CreateHandler': './src/handler/aaa/CreateHandler.ts',
    // 'src/handler/aaa/UpdateHandler': './src/handler/aaa/UpdateHandler.ts',
    // 'src/handler/aaa/DeleteHandler': './src/handler/aaa/DeleteHandler.ts'
};


const queue = [];

for (const key of Object.keys(entries)) {
    const value = entries[key];

    queue.push([key, value]);
}

let working = false;

let webpack = null;
let compiler = null;
let config = null;
const configJson = JSON.stringify(webpackConfig);

// poll once a second; compile a single entry at a time and null out every
// reference between compiles so the GC gets a chance to reclaim memory
const interval = setInterval(intervalF, 1000);

function intervalF() {
    if (working) {
        return;
    }

    if (queue.length === 0) {
        console.log('DONE!');
        clearInterval(interval);
        return;
    }

    working = true;

    const [key, value] = queue.pop();


    config = null;
    webpack = null;
    compiler = null;

    config = JSON.parse(configJson);

    config.module.rules[0].test = new RegExp(config.module.rules[0].test);

    config.entry[key] = value;

    console.log(config);

    webpack = require('webpack');
    console.log('COMPILING', key, value);

    compiler = webpack(config, (err, stats) => {
        working = false;
        console.log('COMPILED', key, value);
    });
}

package.json

"devDependencies": {
    "@types/aws-lambda": "0.0.22",
    "@types/handlebars": "^4.0.36",
    "@types/jsonwebtoken": "^7.2.5",
    "@types/node": "^8.0.57",
    "@types/uuid": "^3.4.3",
    "awesome-typescript-loader": "^3.4.1",
    "aws-sdk": "^2.176.0",
    "cache-loader": "^1.2.0",
    "fork-ts-checker-webpack-plugin": "^0.3.0",
    "happypack": "^4.0.1",
    "serverless": "^1.25.0",
    "serverless-domain-manager": "^2.0.2",
    "serverless-dynamodb-local": "^0.2.26",
    "serverless-kms-secrets": "^1.0.2",
    "serverless-offline": "^3.16.0",
    "serverless-webpack": "^4.0.0",
    "thread-loader": "^1.1.2",
    "ts-loader": "^2.3.7",
    "typescript": "^2.5.2",
    "webpack": "^3.6.0"
  },
"dependencies": {
    "handlebars": "^4.0.11",
    "jsonwebtoken": "^8.1.0",
    "uuid": "^3.1.0"
  }

[image: screenshot from node-gc-viewer]

I ran into this problem as well, here’s my experience with several of the alternatives discussed in this thread:

  • Adding additional memory to the process worked for a while but, as the complexity of my system grew, it reached a point where I had to provision more than 12GB for the process not to trigger any faults (and I’d have had to keep increasing it whenever new functions were added).
  • Applying #517 let us compile more functions than without it, but eventually we’d also hit a fault. The number of functions we managed to compile depended on the memory allocated to the process, so this eventually leads to the same problem of having to keep increasing the memory forever.
  • Applying #570 solved our problem but broke serverless-offline; I managed to fix that for our system and submitted the fix in a PR (soda0289/serverless-webpack#2). Note that to get this to work we had to stop using webpack-node-externals in our webpack config, as it causes the config to be serialized with an empty array for externals, which makes the lambdas compile seemingly fine but then fail when deployed.

Hope this is useful to someone and they don’t have to spend a whole day on it like I did 😄

In my case, I’ve got around 30 lambdas, and I have two problems:

  • If I turn off individual packaging, my package exceeds Lambda’s ~250MB code limit
  • If I turn it on, I get the error discussed in this issue (JS heap out of memory)

The only way I’m able to use individual packaging is by turning on transpileOnly in ts-loader.

If I use fork-ts-checker-webpack-plugin, my machine dies, as the plugin spawns something like 30 workers in parallel and eats my 16GB of RAM/swap in a few seconds…

IMHO the only solution is to compile all functions in series, one after the other, either by default or via a setting. How’s that going? Any ETA?

Thanks!

Much appreciated effort, Grumpy! When somebody fixes this, instead of all my lambdas weighing 30MB each, most of them will go below 1MB. So trust me, I appreciate efforts like this.

@andrewrothman The workaround that worked for my project is turning off package: individually (setting it to false). I get bigger deployment bundles, but at least everything works.

I still would want to package functions individually to get more optimized bundles but it is not my priority at the moment.

One thing I would try is to use Babel (and babel-loader) for transpiling TypeScript instead of awesome-typescript-loader or ts-loader. If you don’t have any other option, maybe you can try this out.
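
A minimal sketch of what that rule could look like, assuming Babel 7 with @babel/preset-env and @babel/preset-typescript installed (not a configuration tested on this project):

// webpack.config.js (sketch)
module: {
    rules: [{
        test: /\.tsx?$/,
        exclude: /node_modules/,
        use: {
            loader: 'babel-loader',
            options: {
                // Babel only strips the types (no type checking), so memory
                // usage stays low; run tsc separately if you need the checks
                presets: ['@babel/preset-env', '@babel/preset-typescript']
            }
        }
    }]
}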

I think changing the title to “JavaScript heap out of memory when packaging many functions” makes more sense now that it has been isolated to just the packaging process and not the deployment process.

It’s worth checking whether somewhere in your code you are importing the whole aws-sdk instead of a specific client, e.g. aws-sdk/clients/s3. We have 3 lambdas: one is an express app, the other two are simple cron jobs. When we added the third lambda we hit the heap-out-of-memory problem, and even 16GB on a Mac wasn’t enough. We then checked whether some extensive library was being imported somewhere. It was dynamoose, which pulled in the whole aws-sdk as a dependency and resulted in a >100MB bundle; in addition, one other file was importing the whole aws-sdk module instead of the specific client. After these changes everything was good.
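
In other words (a small illustration; the variable names are just examples):

// before: pulls the entire SDK into the bundle
// const AWS = require('aws-sdk');
// const s3 = new AWS.S3();

// after: pulls in only the S3 client
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();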

I have not seen improvements with 5.4.0. I was helping out a friend on his project, and I had to roll back to 5.3.5 to see some stability with the out-of-memory issue.

I also had to roll back to an older webpack (4.46.0).

cache-loader and thread-loader significantly helped for me

YMMV, but I’m currently testing what’s in this article about using cache-loader and thread-loader.

Initial results are fine so far, though I have only tested on my MacBook with 16GB of RAM and will still have to test on our CI, which only has 3GB of RAM 😃.

Working config so far…

'use strict'

const os = require('os')
const path = require('path')
const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin')
const slsw = require('serverless-webpack')
const webpack = require('webpack')
const nodeExternals = require('webpack-node-externals')


module.exports = {
    context: __dirname,
    entry: slsw.lib.entries,
    target: 'node',
    output: {
        libraryTarget: 'commonjs2',
        path: path.join(__dirname, 'build'),
        filename: '[name].js',
    },
    module: {
        rules: [{
            test: /\.ts$/,
            use: [{
                loader: 'cache-loader',
            }, {
                loader: 'thread-loader',
                options: {
                    // There should be 1 cpu for the
                    // fork-ts-checker-webpack-plugin
                    workers: os.cpus().length - 1,
                },
            }, {
                loader: 'ts-loader',
                options: {
                    // IMPORTANT! use happyPackMode mode to speed-up
                    // compilation and reduce errors reported to webpack
                    happyPackMode: true,
                },
            }],
        }],
    },
    externals: [nodeExternals()],
    resolve: {
        extensions: ['.ts', '.js'],
    },
    devtool: 'source-map',
    plugins: [
        new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true }),
    ],
}

Maybe a solution would be to provide a PR for the ts-checker plugin that limits the number of spawned processes when using multi-compiles in webpack.

The handlers look good. However, version 2.x did not support individual packaging (in fact it only copied the whole artifact per function). So, as a next step, you should add node externals to your webpack configuration to let webpack determine the externals automatically, so that individual packaging can make use of it:

// webpack config
const nodeExternals = require('webpack-node-externals');
...
  externals: [ nodeExternals() ]
...

Additionally, webpack > 3.0.0 now uses a module: rules structure instead of module: loaders. You should change that too.
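
For clarity, a minimal sketch of that structural change:

// old (webpack 1.x style)
module: {
    loaders: [{ test: /\.ts$/, loader: 'ts-loader' }]
}

// new (webpack 2+/3)
module: {
    rules: [{ test: /\.ts$/, use: 'ts-loader' }]
}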

Please also check if you have set custom: webpackIncludeModules: true in your serverless.yml.

Then do a serverless package to test, if it works. You’ll find the zip packages that would be uploaded in the .serverless directory.

If aws-sdk should not end up in your packages (it is provided by the Lambda runtime anyway), you can either put it into your devDependencies or use

# serverless.yml
custom:
  webpackIncludeModules:
    forceExclude:
      - aws-sdk

to keep it outside of your packages.

We were hitting this issue in a 100+ function project. Can confirm that @vicary’s solution of using serverless-layers to provide dependencies plus webpack-node-externals to avoid parsing node_modules quartered our RAM usage during build and halved the build time (thank you!).

Outsourcing the typechecking to fork-ts-checker-webpack-plugin helped further, but using serverless-layers + node externals was by far the biggest gain in our situation.
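
For anyone wanting to try the same combination, a rough sketch (the serverless-layers options themselves are omitted here; check that plugin’s README for the exact configuration):

# serverless.yml (sketch)
plugins:
  - serverless-webpack
  - serverless-layers   # ships node_modules as a Lambda layer

combined with externals: [nodeExternals()] in webpack.config.js (as in the working config shown earlier), so webpack never has to walk node_modules.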

I am the author of #681; my project is on-and-off dealing with 200 lambda functions.

Recent updates in minor versions introduced this again: subsequent builds in the same process show a linear increase in bundle time. This is further confirmed when testing with thread-loader, where the timer increases individually in each thread. Upgrading webpack from 5.11 to 5.37.1 slows down the increments but, still, build time surely increases gradually, from 70s to 700s+ at the 50th entry.

Using the serverless-layers plugin and excluding node_modules with webpack-node-externals (without the modulesFromFile option) stops the build times of subsequent entries from increasing.

My educated guess is that packages in node_modules contain side effects that webpack has no way to clean up after bundling. Try to avoid having webpack dip its toes into node_modules when Lambda function layers are available; otherwise, pushing for https://github.com/serverless-heaven/serverless-webpack/pull/570 and helping to rebase it may be your only choice.

EDIT: Also make sure you read https://github.com/webpack/webpack/issues/6389 if you are thinking of downgrading to webpack 4.

We were able to get around this issue by setting a Node env variable on our cloud build server, and locally.

export NODE_OPTIONS=--max_old_space_size=8192

https://github.com/serverless/serverless/issues/6503

Most feasible workaround for this right now is simply to turn off individual packaging.

I did, but it still crashed with these loaders:

module: {
    rules: [
        {
            test: /\.tsx?$/,
            use: [
                { loader: 'cache-loader' },
                {
                    loader: 'thread-loader',
                    options: {
                        // there should be 1 cpu for the fork-ts-checker-webpack-plugin
                        workers: require('os').cpus().length - 1,
                    },
                },
                {
                    loader: 'ts-loader',
                    options: {
                        happyPackMode: true,
                        transpileOnly: true
                    }
                }
            ]
        }
    ]
}
Serverless: Bundling with Webpack...

<--- Last few GCs --->

[4920:0x391c8b0]   175834 ms: Mark-sweep 1155.1 (1502.1) -> 1155.0 (1503.1) MB, 695.4 / 0.0 ms  allocation failure GC in old space requested
[4920:0x391c8b0]   176489 ms: Mark-sweep 1155.0 (1503.1) -> 1154.9 (1450.6) MB, 655.1 / 0.0 ms  last resort GC in old space requested
[4920:0x391c8b0]   177153 ms: Mark-sweep 1154.9 (1450.6) -> 1154.9 (1437.6) MB, 663.7 / 0.0 ms  last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x1f3b69da5501 <JSObject>
    1: /* anonymous */ [/mnt/c/AccountServices/node_modules/webpack/node_modules/webpack-sources/node_modules/source-map/lib/source-node.js:~342] [pc=0x3dab59846f57](this=0x3597b2d8c389 <JSGlobal Object>,chunk=0x2c2f5e74b4c1 <String[60]\:     xxx('_applySerializers: excludeFields', excludeFields);\n>,original=0x3added1b95b9 <Object map = 0x17ffa9905dd1>)
    2: SourceNode_walk [/mnt/c...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [node]
 2: 0x11f155c [node]
 3: v8::Utils::ReportOOMFailure(char const*, bool) [node]
 4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [node]
 5: v8::internal::Factory::NewUninitializedFixedArray(int) [node]
 6: 0xdf2313 [node]
 7: v8::internal::Runtime_GrowArrayElements(int, v8::internal::Object**, v8::internal::Isolate*) [node]
 8: 0x3dab58f842fd
Aborted (core dumped)

I thought a bit about the issue. A workaround could be for the plugin to run the compiles in batches of a few functions at a time. However, I do not know if the webpack library will free the allocated resources after each compile. But it could be worth a try.
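
Something along these lines (a rough sketch of the idea, not plugin code; whether webpack actually releases memory between batches is exactly the open question):

const webpack = require('webpack');

// run one multi-compile for a batch of per-function configs
function runBatch(configs) {
    return new Promise((resolve, reject) => {
        webpack(configs, (err, stats) => (err ? reject(err) : resolve(stats)));
    });
}

// process the full list of configs a few at a time instead of all at once
async function compileInBatches(allConfigs, batchSize = 5) {
    for (let i = 0; i < allConfigs.length; i += batchSize) {
        await runBatch(allConfigs.slice(i, i + batchSize));
    }
}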

According to the crash trace, it already happened after 7 compiles - if every ts-loader line is for one function - and was at 1500 MB: [42611:0x104001600] 55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms allocation failure GC in old space requested

The first thing to try would be to disable some plugins in the webpack.config and check whether ts-loader alone might be allocating all the memory.

The slower runtime is expected, because the plugin takes each webpack compile’s output to determine the modules that are really needed for each function and assembles only these into the function package. That takes some time (when using --verbose you should see the exact steps, including their timing). The longer build is outweighed by the better startup behavior (if the lambdas are cold started), especially if some big dependencies are only used by one function.

That definitely seems to be the problem. I got much further along - it looks like about 50% of the way through. If I bump it up to 12GB, the process finishes after about 8-10 minutes.