Django + Kafka

Q & A on connecting Apache Kafka and Django

Problem #1

The Kafka consumer should always be listening for new messages in the queue, and it should run in parallel to the GraphSpace app (the producer).

Solution

There are multiple ways to do this.
  1. We could create another Django project, say "GraphSpace notification consumer", which starts alongside the GraphSpace application and establishes a connection with Kafka. This is overkill for a simple consumer.
  2. We could use Celery for multi-threading. Celery itself uses Redis or RabbitMQ as a queue for tasks. This setup might work for a large-scale multi-threading system, but for simply running a consumer it is also overkill: we would be running two queues (Redis and Kafka), one for queuing tasks and another for queuing content. I did try this setup as well.
  3. I recommend keeping the architecture simple: run the consumer code on a separate thread in daemon mode (see the sketch below).
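Below is a minimal sketch of option 3, assuming the kafka-python package is installed; the topic name, broker address and handler are illustrative placeholders, not GraphSpace's actual code.

    import threading

    from kafka import KafkaConsumer


    def handle_notification(value):
        # Placeholder: in GraphSpace this would save a notification model instance.
        print(value)


    def consume_notifications():
        # Blocks forever, reading messages from the notification topic.
        consumer = KafkaConsumer(
            'notifications',                     # placeholder topic name
            bootstrap_servers='localhost:9092',  # read from settings in practice
            group_id='graphspace-notifications',
        )
        for message in consumer:
            handle_notification(message.value)


    def start_consumer():
        # daemon=True: the thread will not keep the server process alive on shutdown.
        thread = threading.Thread(target=consume_notifications, daemon=True)
        thread.start()

Because the thread is a daemon, it exits together with the web server process instead of blocking shutdown.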

Problem #2

Where should this consumer thread be started in Django?

Solution

The current setup of the GraphSpace app with notifications is as follows.
The consumer thread is started in the WSGI configuration. Django's internal server runs the wsgi.py file when loading the project; the location of this file is given in the settings file under the WSGI_APPLICATION variable. We could have started the consumer thread in the settings.py* file, but that might lead to a circular dependency problem, because the consumer thread has to save models, which in turn requires all the settings files (settings.py, local.py/production.py) to be loaded. To start the consumer we also need settings such as the Kafka server location, which differ between the local and production environments. By starting the thread in the wsgi.py file we keep our code modular (see the sketches below).
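For illustration, a hypothetical settings layout where the Kafka broker address differs per environment (the variable and module names are assumptions, not GraphSpace's actual settings):

    # settings/local.py
    KAFKA_HOST = 'localhost:9092'

    # settings/production.py
    KAFKA_HOST = 'kafka.example.com:9092'

    # settings/base.py (shared)
    WSGI_APPLICATION = 'graphspace.wsgi.application'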

*the settings.py file can also be a folder, i.e. multiple configuration/settings files for different environments
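A sketch of starting the thread in wsgi.py, assuming the consumer code shown earlier lives in a hypothetical notifications/consumer.py module:

    # graphspace/wsgi.py
    import os

    from django.core.wsgi import get_wsgi_application

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'graphspace.settings')

    # Settings are fully configured at this point, so the consumer can read
    # KAFKA_HOST and save Django models without a circular import.
    application = get_wsgi_application()

    from notifications.consumer import start_consumer  # hypothetical module path
    start_consumer()

The import is placed after get_wsgi_application() so that Django's settings and app registry are ready before the consumer thread touches any models.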

Comments

  1. Great write-up, really informative.
    I have tried to implement a similar solution, but after starting the consumer thread in wsgi.py I keep getting a 504 HTTP response from the API endpoint. Do you have or know of a workaround for it?

    1. A bit too late for a reply, but have a look at: https://github.com/addu390/django-kafka
