bullmq: Queue error: Connection is closed.

Hey there,

We are adding jobs to a queue within an Express.js application, so we initialize the queue at startup.

We now see the console log message "Connection is closed." fairly regularly. If we attach an error listener, the same error is caught there instead. Nevertheless, new jobs can still be added to the queue successfully.

The same behaviour can be reproduced in a plain Node.js application. The error does not appear for workers.

My question is: how should we deal with this error? Adding an error listener with queue.on("error", ...) and suppressing the log messages feels quite bad. Can anyone give me a hint on where to look for the cause? Is this a bug in BullMQ or an issue in ioredis?

Thank you so much and best regards!

import { Queue } from 'bullmq';

(async () => {
    // The queue is created once at application startup and reused for adding jobs.
    const queue = new Queue("test-queue", {
        connection: {
            host: "host",
            port: 6380,
            password: "topsecret",
            tls: { servername: "host" },
        },
    });

    queue.on("error", (error: Error) => {
        // Suppressing it feels bad...
        // console.log(error.message);
    });
})();
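
For completeness, jobs are added elsewhere in the application roughly like this (the job name and payload below are placeholders for illustration, not our real data):

// Somewhere inside an Express route handler; "example-job" and the payload are placeholders.
await queue.add("example-job", { exampleId: 123 });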

About this issue

  • State: closed
  • Created 2 years ago
  • Reactions: 6
  • Comments: 17 (6 by maintainers)

Most upvoted comments

Same issue for me: a local Redis works, a managed Redis on DigitalOcean doesn't.

The console has been spammed with these error messages since 1.80.4, as far as I can tell: reverting to 1.80.3 resulted in zero of these error messages, and bumping the version back up to 1.80.4 instantly caused them to reappear.

Full stack trace for any version after 1.80.3:

Error: Connection is closed.
    at EventEmitter.RedisConnection.handleClientClose (/some/masked/path/node_modules/.pnpm/bullmq@1.80.4/node_modules/bullmq/dist/cjs/classes/redis-connection.js:54:32)
    at EventEmitter.emit (node:events:527:28)
    at EventEmitter.emit (node:domain:475:12)
    at processTicksAndRejections (node:internal/process/task_queues:78:11)

The commit/change that I assume causes this issue can be found here.

Thanks for taking the time and addressing the issue.

I can confirm this first occurs with 1.80.4 but not with 1.80.3.

@manast I see you likely found the cause and fixed it, great! thanks 🙏

Edit: the PR is from @roggervalf, so kudos there! 😉

We are seeing this error too, using Heroku Redis. We noticed it a couple of weeks ago, and we haven't changed anything in our implementation.

A log snippet:

2022-05-16T14:14:48.039175+00:00 app[worker.1]:     at Redis.emit (node:events:394:28)
2022-05-16T14:14:48.039175+00:00 app[worker.1]:     at Redis.emit (node:domain:475:12)
2022-05-16T14:14:48.039175+00:00 app[worker.1]:     at processTicksAndRejections (node:internal/process/task_queues:78:11)
2022-05-16T14:14:48.138458+00:00 app[worker.1]: Error: Connection is closed.
2022-05-16T14:14:48.138460+00:00 app[worker.1]:     at RedisConnection.handleClientClose (/app/node_modules/bullmq/dist/cjs/classes/redis-connection.js:54:32)
2022-05-16T14:14:48.138460+00:00 app[worker.1]:     at Redis.emit (node:events:406:35)
2022-05-16T14:14:48.138460+00:00 app[worker.1]:     at Redis.emit (node:domain:475:12)
2022-05-16T14:14:48.138461+00:00 app[worker.1]:     at processTicksAndRejections (node:internal/process/task_queues:78:11)
2022-05-16T14:16:21.000000+00:00 app[heroku-redis]: source=REDIS addon=redis-reticulated-83337 sample#active-connections=12 sample#load-avg-1m=0.395 sample#load-avg-5m=0.245 sample#load-avg-15m=0.185 sample#read-iops=0 sample#write-iops=37.643 sample#memory-total=15618680kB sample#memory-free=10019304kB sample#memory-cached=3923728kB sample#memory-redis=16115072bytes sample#hit-rate=0.51211 sample#evicted-keys=0
2022-05-16T14:19:07.000000+00:00 app[heroku-redis]: source=REDIS addon=redis-reticulated-83337 sample#active-connections=12 sample#load-avg-1m=0.33 sample#load-avg-5m=0.225 sample#load-avg-15m=0.18 sample#read-iops=0 sample#write-iops=37.434 sample#memory-total=15618680kB sample#memory-free=10004524kB sample#memory-cached=3934120kB sample#memory-redis=16115088bytes sample#hit-rate=0.51211 sample#evicted-keys=0
2022-05-16T14:19:49.076886+00:00 app[worker.1]: Error: Connection is closed.
2022-05-16T14:19:49.076896+00:00 app[worker.1]:     at RedisConnection.handleClientClose (/app/node_modules/bullmq/dist/cjs/classes/redis-connection.js:54:32)
2022-05-16T14:19:49.076897+00:00 app[worker.1]:     at Redis.emit (node:events:406:35)
2022-05-16T14:19:49.076897+00:00 app[worker.1]:     at Redis.emit (node:domain:475:12)
2022-05-16T14:19:49.076898+00:00 app[worker.1]:     at processTicksAndRejections (node:internal/process/task_queues:78:11)
2022-05-16T14:19:49.077240+00:00 app[worker.1]: Error: Connection is closed.
2022-05-16T14:19:49.077241+00:00 app[worker.1]:     at RedisConnection.handleClientClose (/app/node_modules/bullmq/dist/cjs/classes/redis-connection.js:54:32)
2022-05-16T14:19:49.077241+00:00 app[worker.1]:     at Redis.emit (node:events:394:28)
2022-05-16T14:19:49.077241+00:00 app[worker.1]:     at Redis.emit (node:domain:475:12)
2022-05-16T14:19:49.077242+00:00 app[worker.1]:     at processTicksAndRejections (node:internal/process/task_queues:78:11)
2022-05-16T14:19:49.177073+00:00 app[worker.1]: Error: Connection is closed.
2022-05-16T14:19:49.177075+00:00 app[worker.1]:     at RedisConnection.handleClientClose (/app/node_modules/bullmq/dist/cjs/classes/redis-connection.js:54:32)
2022-05-16T14:19:49.177075+00:00 app[worker.1]:     at Redis.emit (node:events:406:35)
2022-05-16T14:19:49.177075+00:00 app[worker.1]:     at Redis.emit (node:domain:475:12)
2022-05-16T14:19:49.177076+00:00 app[worker.1]:     at processTicksAndRejections (node:internal/process/task_queues:78:11)
2022-05-16T14:19:49.177527+00:00 app[worker.1]: Error: Connection is closed.
2022-05-16T14:19:49.177527+00:00 app[worker.1]:     at RedisConnection.handleClientClose (/app/node_modules/bullmq/dist/cjs/classes/redis-connection.js:54:32)
2022-05-16T14:19:49.177528+00:00 app[worker.1]:     at Redis.emit (node:events:406:35)
2022-05-16T14:19:49.177528+00:00 app[worker.1]:     at Redis.emit (node:domain:475:12)
2022-05-16T14:19:49.177528+00:00 app[worker.1]:     at processTicksAndRejections (node:internal/process/task_queues:78:11)

What Azure tier is it? How many connections are you opening? My hunch is that it’s automatically closing connections during periods of no activity.

The behavior occurs with both the Standard C0 (256 connections) and Premium P1 (7500 connections) tiers. The Redis instance is exclusively used for BullMQ.
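If idle timeouts on the managed Redis side do turn out to be the cause, one mitigation we are experimenting with (not a confirmed fix from this thread, and the values below are assumptions for illustration) is to make the ioredis connection options that BullMQ forwards to the client more resilient:

import { Queue } from 'bullmq';

// Everything under "connection" is a standard ioredis option that BullMQ
// passes through to the underlying client. keepAlive and the retryStrategy
// values are guesses, not recommendations from this thread.
const queue = new Queue("test-queue", {
    connection: {
        host: "host",
        port: 6380,
        password: "topsecret",
        tls: { servername: "host" },
        keepAlive: 30000, // enable TCP keep-alive probes with a 30s initial delay
        retryStrategy: (times) => Math.min(times * 500, 5000), // reconnect with capped backoff
    },
});

// BullMQ/ioredis reconnect automatically; the listener only prevents the
// error from surfacing as an unhandled error event.
queue.on("error", (error: Error) => {
    console.warn("Queue connection error:", error.message);
});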