chromium: browser.close() just hangs

Environment

  • chromium Version: 112.0.2
  • puppeteer / puppeteer-core Version: 19.8.5
  • Node.js Version: 16
  • Lambda / GCF Runtime: node 16

Expected Behavior

browser.close() should resolve or reject

Current Behavior

browser.close() just hangs. It does not hang when I pin to puppeteer-core 19.7.2 / chromium 111.0.0.

Steps to Reproduce

I'm doing very standard stuff, mostly straight out of the examples. The one slightly out-of-the-ordinary thing I do is generate a PDF of the page. All of that still works; it's only the call to browser.close() that hangs.

Possible Solution

Not sure; possibly related to #80?

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Reactions: 28
  • Comments: 31 (5 by maintainers)

Most upvoted comments

Had the same problem on AWS Lambda using node v18.

I noticed the following comment on a related bug report, but in the puppeteer GitHub: https://github.com/puppeteer/puppeteer/issues/7922#issuecomment-1549052725

So I went ahead and tried

const pages = await browser.pages();
for (const page of pages) {
  await page.close();
}
await browser.close();

and it seems to solve the problem of browser.close() hanging. This could potentially be a better option than killing the process, but I wonder whether it actually shuts the browser down cleanly.

Same issue with AWS Lambda NodeJS v18.

Adding chromium.setGraphicsMode = false; helped me:

  chromium.setGraphicsMode = false; // <= [*] added this line
  const browser = await puppeteer.launch({
    args: chromium.args,
    defaultViewport: chromium.defaultViewport,
    executablePath: await chromium.executablePath(),
    headless: chromium.headless,
  });
  /* ... */
  await browser.close(); // <= hangs here WITHOUT [*]

I was hitting this problem with v111.0.0 and adding --disable-gpu fixed it for me:

const browser = await playwright.launch({
  args: [...chromium.args, '--disable-gpu'],
  executablePath: await chromium.executablePath(),
});

I tried this on v112.0.2 and browser.close() appears to work again on that too.

Thank you @aaronce for pointing me in this direction

Update / Edit

Per @Sparticuz suggestion below, I changed to this, which works on v111 and v112:

chromium.setGraphicsMode = false;
const browser = await playwright.launch({
  args: chromium.args,
  executablePath: await chromium.executablePath(),
});

I can confirm that not only does the call hang, but the resources used by the browser are also not cleaned up. As a workaround you can kill the browser process manually:

const browser = await puppeteer.launch(...);
const browserPid = browser.process()?.pid;
// ...
if (browserPid) {
  process.kill(browserPid);
}
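The manual-kill workaround above can be hardened into a small helper that races the graceful close against a timeout and only falls back to killing the process when the close actually hangs. This is only a sketch, not a puppeteer API: closeFn and pid are hypothetical stand-ins for browser.close.bind(browser) and browser.process()?.pid.

```javascript
// Sketch: race a graceful close against a timeout, and fall back to a
// hard kill only when the close hangs. `closeFn` and `pid` are
// hypothetical stand-ins for browser.close.bind(browser) and
// browser.process()?.pid — adapt to your launch code.
async function closeWithFallback(closeFn, pid, timeoutMs = 5000) {
  let timer;
  const timedOut = new Promise((resolve) => {
    timer = setTimeout(() => resolve('timeout'), timeoutMs);
  });
  const outcome = await Promise.race([
    closeFn().then(() => 'closed'),
    timedOut,
  ]);
  clearTimeout(timer);
  if (outcome === 'timeout' && pid) {
    try {
      process.kill(pid, 'SIGKILL'); // same idea as the snippet above
    } catch {
      // the process already exited between the timeout and the kill
    }
    return 'killed';
  }
  return outcome;
}
```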

I was hitting this problem with v111.0.0 and adding --disable-gpu fixed it for me:

If you use chromium.setGraphicsMode = false before calling chromium.executablePath(), instead of spreading chromium.args with --disable-gpu, you'll actually gain some extra speed because the WebGL drivers won't be extracted.

chromium.setGraphicsMode = false;
const browser = await playwright.launch({
  args: chromium.args,
  executablePath: await chromium.executablePath(),
});

On Bitbucket Pipelines, using the official node Docker containers node:16, node:18, and node:20, and on macOS 13.5.2, the issue exists. await browser.close() never resolves.

The setGraphicsMode = false solution does nothing on either platform for me.

On macOS, the issue only occurs if browser.connected === false when calling browser.close().

This was resolved with

// …
const connected = browser.connected;
if (connected) {
  // …
} else {
  try {
    const killResult = browser.process().kill();
    console.log(`kill: ${killResult}`);
  } catch (err) {
    console.log('kill failed. giving up');
  }
}
// …

On Bitbucket Pipelines, using the official node Docker containers node:16, node:18, and node:20, the issue also exists when browser.connected === true.

I found out that there seems to be one page open even when no page has been explicitly created. Closing it (and any other pages hanging around) resolved the issue on Bitbucket Pipelines with the official node Docker containers node:16, node:18, and node:20.

// …
const connected = browser.connected;
if (connected) {
  const pages = await browser.pages();
  await Promise.all(pages.map((p) => p.close()));
  await browser.close();
}
// …

My mocha tests on Bitbucket Pipelines now all pass, BUT the mocha process does not end! I suspect there are still dangling browser child processes keeping the mocha process from concluding.
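One way to act on that suspicion is to record every browser PID at launch time and reap any survivors in a mocha after hook (or a process 'exit' handler). This is only a sketch under that assumption; trackPid and reapAll are made-up names, not puppeteer or mocha APIs.

```javascript
// Sketch: remember every launched browser PID and force-kill any
// survivors when the test run ends, so leftover Chromium children
// can't keep the mocha process alive. All names are illustrative.
const launchedPids = new Set();

// Call right after launch, e.g. trackPid(browser.process()?.pid)
function trackPid(pid) {
  if (pid) launchedPids.add(pid);
}

// Call from a mocha `after` hook (or a process 'exit' handler).
function reapAll() {
  for (const pid of launchedPids) {
    try {
      process.kill(pid, 'SIGKILL');
    } catch {
      // ESRCH: the process already exited on its own
    }
  }
  launchedPids.clear();
}
```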

Heads up: we just recently got a bunch of ENOMEM errors in production because Chromium isn't being cleaned up properly. We are trying a singleton approach to reuse the browser for now.
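The singleton idea mentioned here can be sketched roughly as follows; launchBrowser is a hypothetical placeholder for your puppeteer.launch(...) call, and the connected check guards against reusing a handle whose connection dropped between warm invocations.

```javascript
// Sketch of the singleton approach: launch once per warm container and
// reuse the browser across invocations. `launchBrowser` is a
// hypothetical stand-in for your puppeteer.launch(...) call.
let browserPromise = null;

async function getBrowser(launchBrowser) {
  if (!browserPromise) {
    browserPromise = launchBrowser();
  }
  let browser = await browserPromise;
  // Don't reuse a dead handle: relaunch if the connection dropped.
  if (browser.connected === false) {
    browserPromise = launchBrowser();
    browser = await browserPromise;
  }
  return browser;
}
```

Note that this trades launch latency for exactly the cross-invocation leakage risk discussed below, so it pairs well with monitoring for ENOMEM.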

@aldenquimby thanks for that information. I had considered a pattern like that myself, but was worried about leakage across invocations. I will monitor for that error now.

I've also never been able to get this working, but since the Lambda instance terminates Chromium after the request anyway, I'm not sure it matters much to me.

puppeteer says that it returns a void promise, so await browser.close(); should be all you need.