Asynchronous task processing is crucial for web applications that need to handle long-running or resource-intensive tasks. In this tutorial, we’ll walk through the steps of setting up Django with Celery, Redis, and Flower to manage asynchronous tasks, using a webhook as an example.
By the end of this guide, you’ll have a Django app that can handle webhook requests asynchronously using Celery and Redis, while monitoring the status of tasks with Flower.
Prerequisites
- Python (>= 3.7)
- Django (>= 3.0)
- Redis installed and running
- Basic understanding of Django and Python
Let’s get started!
Step 1: Install Required Packages
First, we need to install the necessary packages. Open your terminal and run the following command:
pip install django celery redis flower
This command installs:
- Django: The web framework.
- Celery: The asynchronous task queue.
- Redis: The message broker that Celery will use.
- Flower: A tool for monitoring Celery tasks.
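If you prefer to track dependencies in a file, the same packages can be listed in a requirements.txt (versions left unpinned here; pin whichever versions you verify locally):
# requirements.txt
django
celery
redis
flower
and install them with pip install -r requirements.txt.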
Step 2: Create a New Django Project
Let’s begin by creating a new Django project and app:
django-admin startproject webhook_example
cd webhook_example
python manage.py startapp webhook
Once created, open the settings.py file and add the webhook app to INSTALLED_APPS:
INSTALLED_APPS = [
    # other apps
    'webhook',
]
Step 3: Configure Celery in Django
Now, let’s integrate Celery into the Django project.
- Create a celery.py file in the webhook_example package directory, next to settings.py (webhook_example/celery.py):
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'webhook_example.settings')
app = Celery('webhook_example')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
- Update the __init__.py in the webhook_example directory:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
- Configure Celery in your settings.py. Add the following lines to use Redis as the message broker and result backend:
CELERY_BROKER_URL = 'redis://localhost:6379/0' # Redis as the message broker
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
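A few additional Celery settings are worth knowing about. None of them are required for this tutorial; the values below are only illustrative, and with the CELERY_ namespace configured in celery.py they map to Celery's task_time_limit, result_expires, and timezone options:
# webhook_example/settings.py (optional extras; values are illustrative)
CELERY_TASK_TIME_LIMIT = 60 * 5       # kill tasks that run longer than 5 minutes
CELERY_RESULT_EXPIRES = 60 * 60 * 24  # keep task results in Redis for one day
CELERY_TIMEZONE = 'UTC'               # keep Celery timestamps consistent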
Step 4: Create an Asynchronous Task
Now, let’s define an asynchronous task that will be triggered by our webhook. In the webhook app, create a tasks.py file and define a simple task to log the webhook payload:
# webhook/tasks.py
from celery import shared_task

@shared_task
def process_webhook(payload):
    # Simulate processing the webhook
    print(f"Processing webhook payload: {payload}")
    # Add any additional processing logic here, like saving to the database or notifying users
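If your webhook processing can fail transiently (for example, a call to a flaky third-party API), Celery tasks can retry themselves. The variant below is an optional sketch rather than part of the main flow; it uses Celery's bind=True so the task can call self.retry:
# webhook/tasks.py (optional variant with retries; name is illustrative)
from celery import shared_task

@shared_task(bind=True, max_retries=3, default_retry_delay=10)
def process_webhook_with_retries(self, payload):
    try:
        print(f"Processing webhook payload: {payload}")
        # ...replace with real processing that might fail transiently...
    except Exception as exc:
        # Re-queue the task; Celery waits default_retry_delay seconds between attempts
        raise self.retry(exc=exc)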
Step 5: Create the Webhook View
Next, we’ll define a Django view that will handle incoming webhook requests and trigger the Celery task asynchronously.
- In the views.py file of the webhook app, add the following view (an optional JSON-body variant is sketched after this list):
# webhook/views.py
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

from .tasks import process_webhook

@csrf_exempt  # external services can't send Django's CSRF token with the webhook POST
def webhook(request):
    if request.method == "POST":
        payload = request.POST.get('data')  # Get the payload from the request
        process_webhook.delay(payload)      # Call the Celery task asynchronously
        return JsonResponse({'status': 'success'}, status=200)
    return JsonResponse({'error': 'Invalid request'}, status=400)
- Create a URL pattern for the webhook in the urls.py file of the webhook app:
# webhook/urls.py
from django.urls import path
from . import views
urlpatterns = [
path('webhook/', views.webhook, name='webhook'),
]
- In the project’s main urls.py, include the webhook app’s URLs:
# webhook_example/urls.py
from django.contrib import admin
from django.urls import path, include
urlpatterns = [
path('admin/', admin.site.urls),
path('api/', include('webhook.urls')),
]
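Many webhook providers send a JSON body rather than form-encoded data. The view above reads a form field; the sketch below is an optional JSON-body variant (webhook_json is an illustrative name, not part of the main tutorial) that parses request.body instead:
# webhook/views.py (optional JSON-body variant)
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

from .tasks import process_webhook

@csrf_exempt  # webhook senders can't supply Django's CSRF token
def webhook_json(request):
    if request.method == "POST":
        try:
            payload = json.loads(request.body)  # raw JSON body instead of form data
        except json.JSONDecodeError:
            return JsonResponse({'error': 'Invalid JSON'}, status=400)
        process_webhook.delay(payload)  # dicts serialize fine with the JSON task serializer
        return JsonResponse({'status': 'success'}, status=200)
    return JsonResponse({'error': 'Invalid request'}, status=400)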
Step 6: Running Redis, Celery Worker, and Flower
- Start the Redis server if it isn’t running already. To do so, simply run:
redis-server
- Start the Celery worker to process tasks. Run the following command in your project directory:
celery -A webhook_example worker --loglevel=info
- Start Flower to monitor the Celery tasks:
celery -A webhook_example flower
Flower will be available at http://localhost:5555, where you can see the status of your Celery tasks.
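If port 5555 is already in use, or you don’t want the dashboard open to everyone, Flower accepts a few command-line options. The flags below (--port and --basic_auth) are standard Flower options, but treat this as a sketch and check celery -A webhook_example flower --help for what your installed version supports:
celery -A webhook_example flower --port=5555 --basic_auth=admin:changeme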
Step 7: Test the Webhook
You can now test the webhook by sending a POST request to http://localhost:8000/api/webhook/ with a data parameter. Make sure the Django development server is running (python manage.py runserver) alongside the Celery worker. Using curl or Postman, you can send the test data like this:
curl -X POST http://localhost:8000/api/webhook/ -d "data=Test Payload"
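If you’d rather test from Python, the same request can be sent with the third-party requests library (install it separately with pip install requests):
# send_test_webhook.py (illustrative test client, not part of the tutorial)
import requests

response = requests.post(
    "http://localhost:8000/api/webhook/",
    data={"data": "Test Payload"},  # form-encoded, matching request.POST.get('data') in the view
)
print(response.status_code, response.json())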
Once you send the request, the process_webhook task will be triggered asynchronously by Celery, and you should see the output in your Celery worker console, logging the payload.
You can also monitor the task status in Flower at http://localhost:5555.
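You can also queue the task without going through HTTP at all, which is a quick way to confirm that the worker and broker are wired up correctly. This is just a verification sketch, run from python manage.py shell:
# Run inside `python manage.py shell`
from webhook.tasks import process_webhook

result = process_webhook.delay("Shell test payload")  # returns an AsyncResult immediately
result.get(timeout=10)  # blocks until the worker finishes; needs the Redis result backend configured earlier
print(result.status)    # should be 'SUCCESS' once the worker has processed the task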
Conclusion
In this tutorial, we’ve successfully set up asynchronous task processing in Django using Celery, Redis, and Flower. We’ve:
- Created a Django app to handle webhooks.
- Configured Celery with Redis as the broker and result backend.
- Set up Flower to monitor Celery tasks.
- Demonstrated how to process tasks asynchronously when a webhook is triggered.
This setup is ideal for web applications that need to handle long-running tasks, such as sending emails, processing images, or integrating with third-party APIs, without blocking the main application flow.
Happy coding! 😊