Asynchronous tasks and jobs in Django with RQ (Django, 04.10.2016)

Asynchronous tasks let you move intensive work from the web layer to a background process outside the user request/response lifecycle. This ensures that web requests can return immediately and reduces the compounding performance issues that occur when requests become backlogged.

A good rule of thumb is to avoid web requests which run longer than 500ms.

A task queue is a powerful concept to avoid waiting for a resource-intensive task to finish. Instead, the task is scheduled (queued) to be processed later. Another worker process will pick up this task and do it in the background. This is very useful when the outcome of the task is not important in the current context but it must be executed anyway.


RQ (Redis Queue) makes it easy to add background tasks to your Python applications. RQ uses a Redis database as a queue to process background jobs. You should note that persistence is not the main goal of this data store, so your queue could be erased in the event of a power failure or other crash.

RQ is an alternative to Celery; while not as feature-rich, it is a lightweight solution that is easy to set up and use. RQ is written in Python and uses Redis as its backend for establishing and maintaining the job queue. There is a great package that integrates RQ into your Django project: Django-RQ.

To get started using RQ, you need to configure your application and then run a worker process. Obviously, you need to install and run Redis before you can use RQ.
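If Redis is not installed yet, on a Debian/Ubuntu system it can be installed and checked roughly like this (package names and service management vary by platform):

```shell
# install and start the Redis server (Debian/Ubuntu; adjust for your platform)
sudo apt-get install redis-server

# verify that the server is responding; it should reply with PONG
redis-cli ping
```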

First, install django-rq:

pip install django-rq

Then add it to INSTALLED_APPS in settings.py:

    INSTALLED_APPS = (
        # other apps
        'django_rq',
    )

Configure the queues in settings.py:

    RQ_QUEUES = {
        'default': {
            'HOST': 'localhost',
            'PORT': 6379,
            'DB': 0,
        },
    }

To start a worker, all we need to do is run the rqworker custom management command:

python manage.py rqworker default

You can also set up Supervisor to start a worker on system boot and restart it after a crash. More details can be found in the Supervisor installation documentation.
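A minimal Supervisor program entry for the worker might look like this (the paths and program name are illustrative; adjust them to your project and virtualenv layout):

```
[program:rqworker]
command=/path/to/venv/bin/python manage.py rqworker default
directory=/path/to/project
autostart=true
autorestart=true
stdout_logfile=/var/log/rqworker.log
```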

Django-RQ allows you to easily put jobs into any of the queues defined in RQ_QUEUES.

Create some task:

# vim movies/tasks.py

def update_rating(mark):
    # long task for movies update
    ...

Use the task with the default queue:

import django_rq
from django.contrib import messages
from django.shortcuts import redirect
from django.views.generic import FormView

from movies.forms import RatingForm  # assuming RatingForm lives in movies/forms.py
from movies.tasks import update_rating


class RatingUpdateView(FormView):
    form_class = RatingForm
    template_name = 'rating.html'

    def form_valid(self, form):
        mark = form.cleaned_data['mark']
        django_rq.enqueue(update_rating, mark)
        messages.success(self.request, 'Task enqueued')
        return redirect(self.get_success_url())

You can use a named queue, defined in RQ_QUEUES in settings.py:

queue = django_rq.get_queue('high')
queue.enqueue(update_rating, mark)
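For the 'high' queue to be usable, it must be declared in RQ_QUEUES alongside 'default', for example:

```python
# settings.py -- declare the extra queue next to the default one
RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
    },
    'high': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
    },
}
```

Each queue needs its own rqworker process (or one worker listening on several queues, e.g. `python manage.py rqworker high default`).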

enqueue() returns a job object that provides a variety of information about the job’s status, result, parameters, etc. enqueue() takes the callable to be enqueued as its first parameter, followed by the arguments to call it with.
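As an illustrative sketch (assuming a running Redis server and the queue configuration shown earlier), the returned job object can be inspected like this:

```python
import django_rq
from movies.tasks import update_rating

queue = django_rq.get_queue('default')
job = queue.enqueue(update_rating, 5)

print(job.id)            # unique id of the job in the queue
print(job.get_status())  # e.g. 'queued', 'started', 'finished' or 'failed'
print(job.result)        # return value once the job has finished, else None
```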

To easily turn a callable into an RQ task, you can also use the @job decorator that comes with django_rq:

# vim movies/tasks.py
from django_rq import job


@job
def update_rating(mark):
    # long task for movies update
    ...

Now, in your Django view, all you need to do is schedule the task using delay():

def form_valid(self, form):
    mark = form.cleaned_data['mark']
    update_rating.delay(mark)
    messages.success(self.request, 'Task enqueued')
    return redirect(self.get_success_url())

Once you execute delay(), a message is stored in Redis and the view can return immediately, without the user waiting for the long-running update. The message will be consumed and processed by one of the RQ workers.

Django-RQ also provides a set of views and urls that can provide information about completed and failed jobs. Just add

url(r'^django-rq/', include('django_rq.urls')),

to your site’s urls.py. Enabling these urls requires Django's admin interface, so make sure the admin is enabled in your installed apps and urls.

You can read elsewhere about running multiple Django apps against a single Redis database.