nodejs-pubsub: DEADLINE_EXCEEDED on subscriber

Environment details

  • OS: Node 12.8.2 Alpine Docker image
  • npm version:
  • @google-cloud/pubsub version: 2.11.0

Steps to reproduce

Every message received by the subscriber triggers this error:

{"serviceName":"notification","ackIds":["id1", "id2",...],"code":4,"details":"Deadline Exceeded","metadata":{"_internal_repr":{},"flags":0},"stack":"Error: Failed to \"modifyAckDeadline\" for 3 message(s). Reason: 4 DEADLINE_EXCEEDED: Deadline Exceeded\n    at /app/node_modules/@google-cloud/pubsub/src/message-queues.ts:258:15","message":"Failed to \"modifyAckDeadline\" for 3 message(s). Reason: 4 DEADLINE_EXCEEDED: Deadline Exceeded","severity":"error"}

I have tried both new PubSub({ grpc }) with grpc version "^1.24.7" and new PubSub({}), and I get the same error in both cases. I have made sure that the IAM policies are correct.

About this issue

  • State: closed
  • Created 3 years ago
  • Reactions: 1
  • Comments: 19 (3 by maintainers)

Most upvoted comments

It sounds to me like this is a case of the client getting overwhelmed and not being able to send acks and modacks back. Let me mention some things about the properties discussed so far:

maxStreams controls the number of streaming pull connections the client keeps open. It should not be set higher unless you believe the client itself can handle more messages but a single stream cannot deliver them fast enough; per the documented resource limits, a single stream can deliver 10 MB/s. Limiting the subscriber to a single stream does not limit it to receiving one message at a time.

It is possible that the subscriber is becoming overwhelmed by the amount of data it is receiving. If so, increasing maxStreams will likely make the problem worse. What you want instead is to limit the number of messages your subscriber handles at once. Reducing maxStreams is a more advanced way to do this; the best way is to change the flow control settings, as in the sketch below. With flow control settings reduced, fewer messages are delivered to the client at the same time, which helps if overload from message processing is what is causing the failed RPCs to the server.
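As an illustration, here is a minimal sketch of setting both options on a subscription with @google-cloud/pubsub 2.x (the subscription name is a placeholder, and the particular limits are only examples):

// A hedged sketch: cap in-flight messages via flow control and keep a
// single streaming pull connection. "my-subscription" is a placeholder.
const {PubSub} = require("@google-cloud/pubsub");

const pubsub = new PubSub();
const subscription = pubsub.subscription("my-subscription", {
  flowControl: {
    maxMessages: 10, // hold at most 10 unacked messages in the client
  },
  streamingOptions: {
    maxStreams: 1, // open a single streaming pull connection
  },
});

subscription.on("message", (message) => {
  // process the message, then ack it
  message.ack();
});

With maxMessages reduced, lease management has fewer outstanding messages to modack at once, which should lower the chance of the modifyAckDeadline RPC hitting its deadline.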

Many different things could contribute to an overloaded subscriber. It may not even be the subscriber itself that is overloaded; it could be running on an overloaded machine. Does the Docker container run anything else that could be using CPU, RAM, or network resources? What about the machine it runs on? Also, depending on where and how you run the subscriber, the VM type may have limited throughput, leaving the requests unable to get sent.

There are some Medium posts that may be of interest.

I asked one of our service folks, and he said that the publisher and subscriber should ideally be completely decoupled. So disregard my publisher flow control comment, sorry.

I reproduced the error with this minimal sample.

Here is the publisher:

// variables.js
const {PubSub} = require("@google-cloud/pubsub");

const pubsub = new PubSub({
  projectId: "mazen-158916",
  keyFile: "key.json",
});
const topicName = "alexandre-testing";

async function getTopic() {
  const [topic] = await pubsub.topic(topicName).get({autoCreate: true});
  return topic;
}

module.exports = {
  pubsub,
  getTopic,
};
// publish.js
const {getTopic} = require("./variables");

async function publishMessage(topic) {
  const dataBuffer = Buffer.from(JSON.stringify({content: "Hello World", date: new Date()}));
  await topic.publish(dataBuffer);
  process.stdout.write(".");
}

async function main() {
  const topic = await getTopic();
  setInterval(() => publishMessage(topic), 1000); // publish one message per second
}

main().catch(console.error);

and here is the subscriber. I intentionally wait 20 seconds before responding (the default ackDeadline is 10 s):

// subscribe.js
const {getTopic, pubsub} = require("./variables");

async function main() {
  const topic = await getTopic();

  const [subscription1] = await pubsub
    .subscription(`alexandre-testing-subscription-${new Date().getTime()}`, {
      topic,
    })
    .get({autoCreate: true});
  subscription1.on("message", (message) => {
    console.log(
      `get message #${message.id} -- receive %o`,
      JSON.parse(message.data.toString())
    );

    setTimeout(() => {
      console.log(`ack message #${message.id}`);
      message.ack();
    }, 20_000); // 20 s: twice the default 10 s ackDeadline
  });
}

main().catch(console.error);

The output led me to think that everything went well:

get message #2574927903389364 -- receive { content: 'Hello World', date: '2021-06-23T15:05:08.726Z' }
...
ack message #2574927903389364

But the messages are stuck.


If I restart node subscribe.js, I can’t get them again…
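For what it's worth, one thing worth trying (a sketch reusing the same variables.js; the ackDeadline tweak is my assumption, not something from the original report) is to raise the client-side ackDeadline above the 20-second handler delay, so the message is acked within a single lease period and no modifyAckDeadline call is needed first:

// subscribe.js (variant): raise the client-side ackDeadline so the 20 s
// handler finishes before the lease expires.
const {getTopic, pubsub} = require("./variables");

async function main() {
  const topic = await getTopic();

  const [subscription] = await pubsub
    .subscription("alexandre-testing-subscription", {
      topic,
      ackDeadline: 30, // seconds; default is 10
    })
    .get({autoCreate: true});

  subscription.on("message", (message) => {
    // ack after 20 s, now within the 30 s deadline
    setTimeout(() => message.ack(), 20_000);
  });
}

main().catch(console.error);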

Any update on this? It's a bit disappointing that this library does not work out of the box, given that Pub/Sub is a stable product and Node.js is one of the most widely used runtimes.