kafka-python: kafka.errors.NoBrokersAvailable exception when running producer example on Mac

Running a single-node Kafka cluster on localhost on a Mac (OS X 10.11.6). I get an error when trying to instantiate a producer:

>>> from kafka import KafkaProducer
>>> producer = KafkaProducer(bootstrap_servers=['localhost:9092'])

The error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/user1/anaconda/envs/myenv/lib/python2.7/site-packages/kafka/producer/kafka.py", line 347, in __init__
    **self.config)
  File "/Users/user1/anaconda/envs/myenv/lib/python2.7/site-packages/kafka/client_async.py", line 221, in __init__
    self.config['api_version'] = self.check_version(timeout=check_timeout)
  File "/Users/user1/anaconda/envs/myenv/lib/python2.7/site-packages/kafka/client_async.py", line 826, in check_version
    raise Errors.NoBrokersAvailable()
kafka.errors.NoBrokersAvailable: NoBrokersAvailable

Kafka is up and running locally, and a producer from confluent-kafka-python works without issues. Any suggestions on what to look for?

server.properties:
. . . 
listeners=PLAINTEXT://localhost:9092
. . .

About this issue

  • State: closed
  • Created 7 years ago
  • Comments: 31 (5 by maintainers)


Most upvoted comments

Version 1.3.5 of this library (which is the latest on PyPI) only lists API versions 0.8.0 through 0.10.1. So unless you explicitly specify api_version as (0, 10, 1), the client library's attempt to discover the version will cause a NoBrokersAvailable error.

an example from my code:

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=URL,                       # e.g. 'localhost:9092'
    client_id=CLIENT_ID,
    value_serializer=JsonSerializer.serialize,
    api_version=(0, 10, 1),                      # pin so the client skips version probing
)

I met this problem on CentOS 7, and I fixed it with the following code:

producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    request_timeout_ms=1000000,
    api_version_auto_timeout_ms=1000000,
)

It seems that increasing the api_version_auto_timeout_ms value may fix this problem.

After investigating this issue for a while… if you're using wurstmeister/kafka with docker-compose, note that in Kafka's latest versions many parameters have been deprecated. Instead of using:

KAFKA_HOST:
KAFKA_PORT: 9092
KAFKA_ADVERTISED_HOST_NAME: <IP-ADDRESS>
KAFKA_ADVERTISED_PORT: 9092

you need to use:

KAFKA_LISTENERS: PLAINTEXT://:9092
KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://<IP-ADDRESS>:9092
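
For reference, a minimal docker-compose sketch with those variables in place; this assumes the wurstmeister images and a single broker, and the zookeeper wiring and port mappings are illustrative:

version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: PLAINTEXT://:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://<IP-ADDRESS>:9092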

See this link for more details.

When you set api_version, the client will not attempt to probe brokers for version information, so it is the probe operation that is failing. One large difference between the version-probe connections and the general connections is that the former only attempts to connect on a single interface per connection (per broker), whereas the latter, in general operation, will cycle through all interfaces continually until a connection succeeds. #1411 fixes this by switching the version-probe logic to attempt a connection on all found interfaces.
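
To illustrate that difference, here is a hypothetical sketch (not kafka-python's actual code) of the two connection strategies:

import socket

def probe_first_only(host, port, timeout=2.0):
    # Pre-#1411 version probe: a single attempt on the first resolved
    # address; fails outright if that one interface refuses.
    family, socktype, proto, _, sockaddr = socket.getaddrinfo(
        host, port, socket.AF_UNSPEC, socket.SOCK_STREAM)[0]
    sock = socket.socket(family, socktype, proto)
    sock.settimeout(timeout)
    sock.connect(sockaddr)
    return sock

def probe_all(host, port, timeout=2.0):
    # Post-#1411 behavior: getaddrinfo may return several entries
    # (e.g. both ::1 and 127.0.0.1 for 'localhost'); try each in turn.
    for family, socktype, proto, _, sockaddr in socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
        sock = socket.socket(family, socktype, proto)
        sock.settimeout(timeout)
        try:
            sock.connect(sockaddr)
            return sock  # first address that accepts wins
        except socket.error:
            sock.close()
    raise socket.error('no resolved address accepted a connection')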

Although setting api_version may appear to fix the problem, this is a very wrong assessment:

Version 1.3.5 of this library (which is the latest on PyPI) only lists API versions 0.8.0 through 0.10.1. So unless you explicitly specify api_version as (0, 10, 1), the client library's attempt to discover the version will cause a NoBrokersAvailable error.

The issue is not the version check, it is the TCP socket connection itself. If kafka-python can connect to a 1.0.0 broker, it would still be identified as (0, 10, 1). The only thing you achieve by setting api_version explicitly is that you never actually try to open the TCP socket during the version probe. But that probably just means your connection issue will surface later, when you try to send or receive messages.

The deeper issue here is a real connection bug that I believe has to do with failing to use all available DNS lookup data during bootstrap.

I fixed this error in the Kafka config by adding a custom broker setting with the key-value pair advertised.listeners=PLAINTEXT://01.02.03.04:1234.
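
In raw server.properties terms, that corresponds to something like the following sketch; binding to all interfaces via 0.0.0.0 is an assumption for illustration, and the address and port mirror the comment above:

listeners=PLAINTEXT://0.0.0.0:1234
advertised.listeners=PLAINTEXT://01.02.03.04:1234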

When I explicitly set the api_version, I am able to produce and consume events.
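
For completeness, a minimal produce-and-consume sketch with the version pinned; the topic name and consumer settings here are assumptions, not from the thread:

from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers='localhost:9092',
                         api_version=(0, 10, 1))  # skip broker version probing
producer.send('test-topic', b'hello')
producer.flush()

consumer = KafkaConsumer('test-topic',
                         bootstrap_servers='localhost:9092',
                         api_version=(0, 10, 1),
                         auto_offset_reset='earliest',
                         consumer_timeout_ms=5000)  # stop iterating after 5s idle
for message in consumer:
    print(message.value)

Note the caveat above, though: if the underlying TCP connection is broken, pinning api_version only defers the failure to the send or receive step.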