How to set up Celery with Django using Redis as a message broker

This tutorial describes how to set up Celery for a Django app so that tasks can be scheduled through Django's admin interface.

The tutorial consists of three parts: installation, configuration, and instructions on what to do next.

Installation

Run the following commands in your venv:

pip install -U "celery[redis]"
pip install django-celery-beat
pip install django-celery-results
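
Both django-celery-beat and django-celery-results are Django apps that keep schedules and task results in the database, so they also have to be registered and migrated. A rough sketch of the settings.py additions (layout assumed, adjust to your project):

# settings.py
INSTALLED_APPS = [
    # ... your apps ...
    'django_celery_beat',     # periodic task schedules editable in the Django admin
    'django_celery_results',  # task results stored in the database
]

Then apply the migrations shipped with both packages:

python manage.py migrate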

Configuration

Let’s create a module with the Celery configuration. At the time of writing, the project structure looks similar to this:

...
manage.py
apps
base
- tasks
- - __init__.py
- - task01.py
- - task02.py
- celeryconf.py
...

celeryconf.py

import os

from celery import Celery
from kombu import Exchange, Queue

# The Django settings module must be set before Celery reads the Django config.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'base.settings')

app = Celery('tasks')

default_exchange = Exchange('default', type='direct')

app.conf.task_queues = (
    Queue('default', default_exchange, routing_key='default'),
)
app.conf.task_default_queue = 'default'
app.conf.task_default_exchange = 'default'
app.conf.task_default_exchange_type = 'direct'
app.conf.task_default_routing_key = 'default'

# Keep periodic task schedules in the database so they can be edited in the admin.
app.conf.beat_scheduler = 'django_celery_beat.schedulers:DatabaseScheduler'

# Read CELERY_-prefixed settings from Django's settings.py and discover tasks
# in the tasks modules of the installed apps.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# Hack for celery beat: set up Django so that models can be imported in tasks.
import django
django.setup()
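
The config_from_object call above tells Celery to read any Django setting prefixed with CELERY_ (CELERY_BROKER_URL becomes broker_url, and so on). A minimal sketch of the corresponding settings.py entries, assuming Redis runs locally on the default port and task results go to the database through django-celery-results:

# settings.py
CELERY_BROKER_URL = 'redis://localhost:6379/0'   # Redis as the message broker
CELERY_RESULT_BACKEND = 'django-db'              # store results via django-celery-results
CELERY_TIMEZONE = 'UTC'                          # adjust to your project's timezone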

tasks/__init__.py

from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)


@shared_task
def parse_admitad_yml(channel_id=56):
    from .parse_admitad_yml import ParseAdmitadYml
    logger.info('Running parse_admitad_yml for channel_id %s', channel_id)
    parser = ParseAdmitadYml()
    try:
        parser.execute(channel_id)
    except Exception:
        return 1
    return 0


@shared_task
def create_things():
    from .create_things import CreateThings
    logger.info('Running create_things')
    try:
        create_things = CreateThings()
        create_things.execute()
    except Exception:
        return 1
    return 0


@shared_task
def create_things_auto():
    from .create_things_auto import CreateThingsAuto
    logger.info('Running create_things_auto')
    try:
        create_things_auto = CreateThingsAuto()
        create_things_auto.execute()
    except Exception:
        return 1
    return 0


@shared_task
def add_pictures_to_things():
    from .add_pictures_to_things import AddPicturesToThings
    logger.info('Running add_pictures_to_things')
    try:
        add_pictures_to_things = AddPicturesToThings()
        add_pictures_to_things.execute()
    except Exception:
        return 1
    return 0

tasks/task01.py

# coding: utf-8
from admitad.models import Thing


class CreateThings:
    """A class for a celery task"""

    def execute(self):
        Thing.create_shop_things()

Start the Redis server with:

redis-server

Start a Celery worker (the process that executes the tasks) with the command:

celery -A base.celeryconf worker -l info
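
Before scheduling anything, it is worth checking that the worker actually receives tasks. A quick sanity check from python manage.py shell (assuming the tasks package is importable as base.tasks; adjust the import to your layout):

from base.tasks import create_things

result = create_things.delay()  # send the task to the 'default' queue
result.get(timeout=30)          # returns 0 on success, 1 on failure (see the tasks above)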

Then start Celery beat (the scheduler that puts tasks on the queue) with the command:

celery -A base.celeryconf beat -l info
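
Because beat uses the DatabaseScheduler, schedules live in the database and can be managed in the Django admin under "Periodic tasks", or created programmatically through the django-celery-beat models. A small sketch, assuming the task is registered under the dotted name base.tasks.parse_admitad_yml (the exact name depends on your package layout):

import json
from django_celery_beat.models import CrontabSchedule, PeriodicTask

# run at minute 0 of every sixth hour
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='0',
    hour='*/6',
    day_of_week='*',
    day_of_month='*',
    month_of_year='*',
)
PeriodicTask.objects.get_or_create(
    name='Parse admitad YML every six hours',
    task='base.tasks.parse_admitad_yml',
    crontab=schedule,
    kwargs=json.dumps({'channel_id': 56}),
)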

Final words

If you completed the steps described above, you should now have Celery up and running for your Django application. All that's left is to daemonize things: run the Redis server, the Celery worker, and Celery beat as background processes so that tasks scheduled through the Django admin interface keep working. For more details on the subject, see the Celery docs.
