kombu: Latest working celery/redis cannot inspect: Error: No nodes replied within time constraint.

  • I have read the relevant section in the contribution guide on reporting bugs.
  • I have checked the issues list for similar or identical bug reports.
  • I have checked the pull requests list for existing proposed fixes.
  • I have checked the commit log to find out if the bug was already fixed in the master branch.
  • I have included all related issues and possible duplicate issues in this issue (If there are none, check this box anyway).

Mandatory Debugging Information

  • I have included the output of celery -A proj report in the issue. (if you are not able to do this, then at least specify the Celery version affected).
  • I have verified that the issue exists against the master branch of Celery.
  • I have included the contents of pip freeze in the issue.
  • I have included all the versions of all the external dependencies required to reproduce this bug.

Related Issues

Possible Duplicates

  • None

Environment & Settings

Celery version: 4.3.0

celery report Output:

software -> celery:4.3.0 (rhubarb) kombu:4.6.4 py:2.7.16
            billiard:3.6.1.0 redis:3.2.1
platform -> system:Linux arch:64bit
            kernel version:3.10.0-957.27.2.el7.x86_64 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:sentinel results:disabled

CELERY_QUEUES:
    (<unbound Queue celery -> <unbound Exchange celery(direct)> -> celery>,
 <unbound Queue fast -> <unbound Exchange fast(direct)> -> fast>,
 <unbound Queue slow -> <unbound Exchange slow(direct)> -> slow>,
 <unbound Queue mp-fast -> <unbound Exchange mp-fast(direct)> -> mp-fast>,
 <unbound Queue mp-slow -> <unbound Exchange mp-slow(direct)> -> mp-slow>)
BROKER_TRANSPORT_OPTIONS: {
    'master_name': 'staging'}
BROKER_URL: u'sentinel://redis-s1.example.domain.com:26379//'
CELERY_ALWAYS_EAGER: False
CELERY_DISABLE_RATE_LIMITS: True
CELERY_ACCEPT_CONTENT: ['json']
CELERYD_MAX_TASKS_PER_CHILD: 2000
CELERY_IMPORTS:
    ('tasks',)
CELERY_EAGER_PROPAGATES_EXCEPTIONS: True
CELERY_STORE_ERRORS_EVEN_IF_IGNORED: True
CELERY_IGNORE_RESULT: True
CELERY_TASK_SERIALIZER: 'json'

Steps to Reproduce

Required Dependencies

  • Minimal Python Version: 2.7.16
  • Minimal Celery Version: 4.3.0
  • Minimal Kombu Version: 4.6.4
  • Minimal Broker Version: redis 3.0.6
  • Minimal Result Backend Version: N/A or Unknown
  • Minimal OS and/or Kernel Version: N/A or Unknown
  • Minimal Broker Client Version: N/A or Unknown
  • Minimal Result Backend Client Version: N/A or Unknown

Python Packages

pip freeze Output:

ABN==0.4.2
address==0.1.1
akismet==1.0.1
amqp==2.5.1
asn1crypto==0.24.0
attrs==19.1.0
Authlib==0.11
Authomatic==0.0.13
awesome-slugify==1.6.2
Babel==2.6.0
backports.functools-lru-cache==1.5
billiard==3.6.1.0
bleach==1.5.0
boto==2.38.0
cachetools==3.1.1
cas-client==1.0.0
celery==4.3.0
certifi==2017.7.27.1
cffi==1.12.3
chardet==3.0.4
click==6.7
configparser==3.8.1
contextlib2==0.5.5
coverage==4.5.4
cryptography==2.0.3
cssselect==0.9.2
cycler==0.10.0
datadog==0.11.0
ddtrace==0.25.0
decorator==4.4.0
dnspython==1.16.0
docopt==0.4.0
docutils==0.15.2
elasticsearch==6.3.1
enum34==1.1.6
filelock==3.0.12
funcsigs==1.0.2
future==0.17.1
google-auth==1.6.2
hiredis==0.2.0
html5lib==0.9999999
httplib2==0.13.1
idna==2.8
importlib-metadata==0.19
ipaddress==1.0.22
isodate==0.5.4
itsdangerous==0.24
Jinja2==2.7.1
kafka-python==1.4.6
kiwisolver==1.1.0
kombu==4.6.4
lmtpd==6.0.0
lockfile==0.12.2
loginpass==0.2.1
lxml==3.6.1
mandrill==1.0.57
Markdown==2.2.1
MarkupSafe==0.18
matplotlib==2.2.4
mock==1.0.1
more-itertools==5.0.0
mysqlclient==1.3.9
netaddr==0.7.19
numpy==1.16.4
oauth2==1.9.0.post1
packaging==19.1
passlib==1.6.1
pathlib2==2.3.4
paypalrestsdk==0.6.2
Pillow==2.8.1
pluggy==0.6.0
psutil==5.6.3
py==1.8.0
pyasn1==0.4.6
pyasn1-modules==0.2.6
PyBabel-json==0.2.0
pybreaker==0.5.0
pycountry==18.2.23
pycparser==2.19
pycryptodome==3.8.2
PyJWT==0.4.1
pylibmc==1.6.0
pyparsing==2.4.2
pytest==3.5.0
pytest-cov==2.4.0
python-daemon==2.1.2
python-dateutil==2.1
pytz==2014.4
PyYAML==3.12
raven==5.31.0
redis==3.2.1
regex==2018.11.3
requests==2.7.0
rsa==4.0
salmon-mail==3.0.0
scandir==1.10.0
simple-db-migrate==3.0.0
simplejson==3.10.0
six==1.11.0
SQLAlchemy==1.0.6
subprocess32==3.5.4
sudz==1.0.3
termcolor==1.1.0
toml==0.10.0
tox==3.13.2
Unidecode==0.4.21
urllib3==1.25.3
uWSGI==2.0.17.1
vine==1.3.0
virtualenv==16.7.2
Werkzeug==0.11.15
WTForms==1.0.5
zipp==0.5.2

Other Dependencies

N/A

Minimally Reproducible Test Case

Expected Behavior

I expect celery -A app inspect ping (as well as other subcommands of celery inspect) to return output.

Actual Behavior

This configuration and version of celery/redis/sentinel had been working fine until just recently, and I’m not sure what might have changed. I’m guessing it might have something to do with conflicting packages (given how many there are in this python env 👀 ), but I’m not sure what else to check. By looking at the keys in redis and by using tcpdump, I can verify that celery is definitely able to reach the redis servers through the sentinel brokers. The celery deployment is also serving tasks and otherwise seems to be working normally. For some reason, though, I can’t run any of the inspect-style commands without getting Error: No nodes replied within time constraint.

The only thing I see in the debug logs is, again, proof that the celery workers are receiving the message, yet nothing comes back:

[2019-08-20 16:34:23,472: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'routing_key': 'dbc97d66-fe94-3d6d-aa6a-bb965893ae2b', 'exchange': 'reply.celery.pidbox'} ticket:19949cbb-6bf0-4b36-89f7-d5851c0bddd0]

We also captured redis traffic using MONITOR, and we can see that pings are being keyed and populated: https://gist.github.com/jslusher/3b24f7676c93f90cc55e1330f6e595d8

About this issue

  • Original URL
  • State: open
  • Created 5 years ago
  • Reactions: 10
  • Comments: 36 (13 by maintainers)

Most upvoted comments

I was having this issue as well with kombu 4.6.4; downgrading it to version 4.6.3 made it work again.

Yes, 4.6.5 is underway with celery 4.4.0rc4 / final.

I had the same issue. It seems that when I installed celery, it also installed the development version of kombu, which is currently 4.6.5. I uninstalled kombu and downgraded to the stable version, which is 4.5.0. It’s working now.

I need to investigate more. I just tried a clean environment with git checkout v4.4.7 && pip install -e '.[redis]' and it worked with redis-server 6.0.10, so there’s something else going on.

Also seeing this issue with these versions:

  • celery = “5.0.5”
  • kombu = “5.0.2”
  • redis = “3.5.3”

I was thinking that the comment about it being fixed in latest kombu meant that the v5 versions would work?

We are getting the same issue.

  • kombu==4.6.11
  • celery==4.4.6
  • redis_version:5.0.6

We get Error: No nodes replied within time constraint. every time. We deploy the same packages on several other environments and do not get this problem; it pops up for us only in our prod environment, where we have many scheduled tasks.

We were able to resolve this issue by deleting the key _kombu.binding.reply.celery.pidbox from our Redis. It had been left over from a previous deployment using celery 5, and its presence was causing many of the celery remote inspect diagnostic commands (as well as other commands) to fail.

With

  • kombu==4.6.11
  • celery==4.4.6
  • redis_version:5.0.6

With the above version configuration, PING and the other inspect commands worked as expected once we kept ONLY the following kombu-related keys in our Redis backend:

  • "_kombu.binding.celery"
  • "_kombu.binding.celery.pidbox"
  • "_kombu.binding.celeryev"

This meant that we deleted the extra key "_kombu.binding.reply.celery.pidbox" as the corrective action in our case. Make sure to check all Redis DBs if you are using a Redis backend! The Redis command INFO keyspace gives key counts across Redis DBs, so you can be sure you don’t miss a DB that contains these kombu keys.
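The cleanup described in this comment can be sketched with redis-py (already present in this environment as redis==3.2.1). This is only a sketch, not the commenter’s actual script: the expected key names are taken from the list above, while the helper name, scan pattern, and connection details are illustrative assumptions.

```python
# Kombu binding keys that a stock "celery" app is expected to keep
# (taken from the list in the comment above).
EXPECTED_KEYS = {
    "_kombu.binding.celery",
    "_kombu.binding.celery.pidbox",
    "_kombu.binding.celeryev",
}


def stale_binding_keys(keys):
    """Return the _kombu.binding.* keys that are candidates for deletion."""
    return sorted(
        k for k in keys
        if k.startswith("_kombu.binding.") and k not in EXPECTED_KEYS
    )


# Hypothetical usage against a live Redis (host/port/db are placeholders;
# repeat for every DB that INFO keyspace reports):
#
#   import redis
#   client = redis.Redis(host="localhost", port=6379, db=0)
#   keys = [k.decode() for k in client.scan_iter("_kombu.binding.*")]
#   for key in stale_binding_keys(keys):
#       client.delete(key)  # e.g. _kombu.binding.reply.celery.pidbox
```

As the comment stresses, run the scan against each Redis DB in turn so a stale binding key in an overlooked DB doesn’t keep the inspect commands broken.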

This happens if there are tasks pending in the queue from before the update; in that case you need to figure out how to migrate the queued tasks to the newer version.

If those tasks are not important, just purge the queue and start celery.

Looking forward to the update! This issue was driving me crazy.