Description
Hello,
I am trying to integrate confluent_kafka into my Django webapp, and it is giving me quite a hard time.
I have several questions; hopefully you can help me with some of them.
My webapp is supposed to serve as an API endpoint for another big service.
Kafka is running in a Docker container orchestrated with docker-compose.
For every message I produce with my producer (every HTTP request is turned into a Kafka message) I want to read the corresponding response (from a different topic) with my consumer.
Therefore every message carries a unique ID, and I match requests to responses by that ID.
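The produce side looks roughly like the sketch below (simplified; the topic name, the `correlation_id` header key, and the hard-coded bootstrap address are placeholders, not my exact code):

```python
import uuid

from confluent_kafka import Producer

# Placeholder bootstrap address; in the real app it comes from env('KAFKA_SERVER') / env('KAFKA_PORT')
producer = Producer({'bootstrap.servers': 'localhost:9092'})


def send_request(payload: bytes) -> str:
    """Turn one HTTP request into one Kafka message, tagged with a unique ID."""
    correlation_id = str(uuid.uuid4())
    producer.produce(
        'requests',                                    # placeholder request topic
        value=payload,
        headers=[('correlation_id', correlation_id)],  # the ID used later to match the reply
    )
    producer.flush()                                   # block until delivery; kept simple for the sketch
    return correlation_id
```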
#1 - The first question: is there a 'best practice' for using the consumer in this setup?
Is it recommended to use WebSockets, or should I simply call the consumer every time after producing a message to the broker?
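For example, one option I am weighing is a single long-lived consumer in a background thread that routes each reply to the waiting request by its ID. This is only a sketch of that idea (topic name, header key, and helper names are placeholders), not code I currently run:

```python
import threading

from confluent_kafka import Consumer

# correlation_id -> {'event': threading.Event, 'value': reply payload}
_pending = {}
_pending_lock = threading.Lock()


def start_response_consumer(conf: dict, topic: str) -> None:
    """Start one long-lived consumer thread for the whole Django process."""
    threading.Thread(target=_response_loop, args=(conf, topic), daemon=True).start()


def _response_loop(conf: dict, topic: str) -> None:
    consumer = Consumer(conf)
    consumer.subscribe([topic])
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            headers = dict(msg.headers() or [])
            cid = (headers.get('correlation_id') or b'').decode()
            with _pending_lock:
                waiter = _pending.pop(cid, None)
            if waiter is not None:               # a request is still waiting for this ID
                waiter['value'] = msg.value()
                waiter['event'].set()
    finally:
        consumer.close()


def register_waiter(correlation_id: str) -> dict:
    """Call this *before* producing, so a fast reply cannot be missed."""
    waiter = {'event': threading.Event(), 'value': None}
    with _pending_lock:
        _pending[correlation_id] = waiter
    return waiter


def wait_for_response(correlation_id: str, waiter: dict, timeout: float = 10.0):
    """Block the Django view until the matching reply arrives, or give up."""
    if not waiter['event'].wait(timeout):
        with _pending_lock:
            _pending.pop(correlation_id, None)   # stop tracking a timed-out request
        return None
    return waiter['value']
```

Compared with creating a new consumer for every request, this keeps a single group member per Django process; I just don't know whether that, WebSockets, or something else entirely is the recommended pattern with confluent_kafka.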
#2 - After setting up the consumer and integrating it in the app's views.py, the consumer keeps crashing (and thereby crashes the Django server).
Sometimes it happens during the consumer's subscription to the topic (consumer.subscribe(topics)), and sometimes while the consumer tries to read a message; both fail with an error that resembles 'No topic was found with that name'.
The consumer config looks like this:
```python
conf = {
    'bootstrap.servers': '{}:{}'.format(env('KAFKA_SERVER'), env('KAFKA_PORT')),
    'group.id': "1",
    'auto.offset.reset': 'largest',
    'allow.auto.create.topics': True,
}
```
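To show where it dies, this is roughly the shape of the consume path in views.py (a simplified sketch with a placeholder topic name; the error checks are something I added here so the failure is visible instead of taking the Django process down):

```python
from confluent_kafka import Consumer, KafkaError, KafkaException

consumer = Consumer(conf)  # conf as above


def read_response(topic: str, timeout: float = 10.0):
    """Subscribe and poll once, surfacing Kafka errors instead of crashing the server."""
    try:
        consumer.subscribe([topic])   # sometimes it crashes here...
        msg = consumer.poll(timeout)  # ...and sometimes here
    except KafkaException as exc:
        print('Kafka error: {}'.format(exc))
        return None
    if msg is None:
        return None                   # nothing arrived within the timeout
    if msg.error():
        if msg.error().code() == KafkaError.UNKNOWN_TOPIC_OR_PART:
            print('Topic {} does not exist (yet)'.format(topic))
        return None
    return msg.value()
```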
How to reproduce
Checklist
Please provide the following information:
- confluent-kafka-python and librdkafka version (`confluent_kafka.version()` and `confluent_kafka.libversion()`): 1.7.0
- Apache Kafka broker version: 6.2.0
- Client configuration: `{'bootstrap.servers': '{}:{}'.format(env('KAFKA_SERVER'), env('KAFKA_PORT')), 'group.id': "1", 'auto.offset.reset': 'largest', 'allow.auto.create.topics': True}`
- Operating system: Ubuntu
- Provide client logs (with `'debug': '..'` as necessary)
- Provide broker log excerpts
- Critical issue