Celery is one of the most popular Python libraries for asynchronous task processing, and many Python developers have used it at least once in their applications. Users today expect fast, responsive applications, and Celery plays an important role in executing tasks in parallel to save time.
In this article, we will discuss Python Celery in depth: its core concepts, how it works, its use cases, and a basic guide to implementing it in your Python applications.
Python Celery is an open-source library for executing processes asynchronously. It is a great tool that runs postponed or dedicated code in a separate process, or even on a separate machine, which can save you significant development time and effort. It provides a task queue that holds tasks and distributes them to workers. It is primarily used for real-time operation, but it also supports scheduling. Celery integrates with various Python web frameworks such as Flask, Pylons, web2py, Tryton, and Tornado.
Python Celery supports both real-time tasks and scheduled tasks. Suppose you are developing a stock exchange application that needs to poll an API every minute, or needs to send scheduled emails to a group of people at a particular time of day. Celery is a very good option for both of these tasks.
Another very popular application of Celery is the asynchronous execution of tasks. Suppose a user sends a request and the page takes relatively long to load. You can use Celery to reduce the page load time by running some parts of the functionality immediately and deferring the rest as postponed tasks, either on the same server or on a different one. The main advantage is that the application continues to respond to client requests, so end users keep getting responses and do not have to wait unnecessarily.
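The "respond now, compute later" idea can be sketched with the standard library alone, with no Celery involved; here a thread pool plays the role of the worker server, and `slow_report` and `handle_request` are made-up stand-ins for a deferred task and a request handler:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# The pool plays the role of Celery's worker processes.
executor = ThreadPoolExecutor(max_workers=2)

def slow_report(user_id):
    # Stand-in for an expensive job (report generation, email, etc.).
    time.sleep(0.1)
    return f"report for user {user_id}"

def handle_request(user_id):
    # Hand the slow work to the pool and answer the client right away,
    # instead of blocking until the report is finished.
    future = executor.submit(slow_report, user_id)
    return "Accepted", future

status, future = handle_request(42)
print(status)           # the client already has a response: Accepted
print(future.result())  # the background result arrives later
```

Celery takes this idea further by moving the "pool" onto separate worker processes or machines, connected through a message broker.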
To achieve faster load times, an application needs to offload some of its tasks from the web server. Celery achieves this through asynchronicity: while the web server is loading the next page, a second server does the required computation in the background. These background task servers are known as "worker servers". Typically there are only a few web servers responding to user requests, but there can be far more worker servers processing tasks in the background.
The worker servers do all the required background work: making changes in the database, updating the UI via callbacks or webhooks, adding items to the cache, processing files, sending emails, enqueuing further tasks, and so on. All of this happens while the main web server remains available to respond to user requests.
Workers are assigned tasks via a message queue. A queue is a basic data structure that works in first-in, first-out (FIFO) order: messages are stored in the queue, and the first message in is the first to be processed. When a worker becomes free, it takes the next task from the front of the queue and begins executing it. If there are many workers, each takes a task in turn. The queue ensures that each worker processes only one task at a time and that each task is handled by a single worker.
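This queue-and-workers arrangement can be modelled with the standard library's thread-safe `queue.Queue`; the worker names and task names here are made up for illustration:

```python
import queue
import threading

task_queue = queue.Queue()   # FIFO: the first task in is the first out
results = []
lock = threading.Lock()

def worker(name):
    while True:
        task = task_queue.get()   # blocks until a task is available
        if task is None:          # sentinel value: no more work
            task_queue.task_done()
            break
        with lock:
            results.append((name, task))
        task_queue.task_done()

# Enqueue four tasks, then one sentinel per worker.
for i in range(4):
    task_queue.put(f"task-{i}")
for _ in range(2):
    task_queue.put(None)

# Two workers each take one task at a time from the front of the queue.
threads = [threading.Thread(target=worker, args=(f"worker-{n}",)) for n in (1, 2)]
for t in threads:
    t.start()
task_queue.join()
for t in threads:
    t.join()

# Every task was processed exactly once.
print(sorted(task for _, task in results))
# → ['task-0', 'task-1', 'task-2', 'task-3']
```

Celery's message broker plays the role of `task_queue` here, except that the queue lives in a separate service (such as Redis) so workers on other machines can consume from it.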
Celery is a very useful library for reducing production load by postponing tasks, since it handles both asynchronous and scheduled jobs. It also offers several other convenient features, such as automatic task retries, periodic scheduling, and worker monitoring.
This is a simple demonstration of implementing Celery with Django to execute tasks asynchronously.
First, create a virtual environment where all the dependencies will be stored.
C:\Users\User\Desktop\celery_django>python -m venv myenv
Once the virtual environment is created, activate it using the command below:
C:\Users\User\Desktop\celery_django>myenv\Scripts\activate
Now you can install Django using the following command:
pip install django
After the installation, create a project. Here, a project named demo_celery is created using the command below:
C:\Users\User\Desktop\celery_django>django-admin startproject demo_celery
After setting up Django, it is time to install Celery, which you can do with this simple command:
pip install celery
After installing Celery, the next step is to configure it in the Django project. Open the settings.py file of your project and add the following configuration. Here, Redis is used as the message broker.
```python
# Celery settings
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Karachi'
```
Message brokers are separate services that enable applications, systems, and services to communicate and share information through a message queue. There are a variety of message brokers available to choose from; in this demonstration, we will use Redis.
Download and then install the Redis .msi file (Redis-x64-5.0.14.msi) from the following link:
https://github.com/tporadowski/redis/releases.
Celery also needs the redis Python package to talk to the Redis broker. It can be installed using the following command:
pip install redis
You can check whether Redis is working properly by typing the following command in the terminal:
redis-cli ping
If it replies with PONG, Redis is installed successfully and working fine.
Once the installation is done, you can start the server using the redis-server command.
Now create a celery.py file inside the demo_celery project package (next to settings.py) and add the code below.
```python
import os

from celery import Celery
from celery.schedules import crontab

# Setting the default Django settings module for the celery program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'demo_celery.settings')

# Passing the project name in Celery(project_name)
app = Celery('demo_celery')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Celery Beat settings
app.conf.beat_schedule = {
    'send-mail-every-day-at-8': {
        'task': 'emailExample.tasks.send_mail_func',
        'schedule': crontab(hour=8, minute=0),  # 8:00 every day
    }
}

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
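One piece the walkthrough above does not show: Celery's Django integration guide recommends importing the Celery app in the project package's __init__.py so it is loaded whenever Django starts. Assuming the app object defined above, that file (demo_celery/__init__.py) would look like this:

```python
# demo_celery/__init__.py
# Ensures the Celery app is loaded when Django starts,
# so that @shared_task picks it up.
from .celery import app as celery_app

__all__ = ('celery_app',)
```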
The Celery tasks will live in a tasks.py file inside a Django app. We can create an app in the working directory (here named emailExample, matching the beat schedule's task path) using the following command:
C:\Users\User\Desktop\celery_django>python manage.py startapp emailExample
Once the application is created, we will create a tasks.py file inside it to hold the tasks. Tasks are regular Python functions that are called through Celery.
See this very basic example of a function that prints the integers from 0 to 99:
```python
from celery import shared_task

@shared_task(bind=True)
def test_func(self):
    for i in range(100):
        print(i)
    return "Done"
```
Lastly, we have to create a view in the app's views.py file.
```python
from django.http import HttpResponse

# importing the task from the tasks.py file
from .tasks import test_func

# views will be created here.
def test(request):
    # call test_func asynchronously using delay()
    test_func.delay()
    return HttpResponse("Done")
```
Now we will map this view to a URL in the urls.py file.
```python
from django.urls import path

from emailExample.views import test

urlpatterns = [
    path('', test, name='test'),
]
```
To view the task results, you will also need to install the following third-party app:
pip install django-celery-results
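Installing the package alone is not enough: per the django-celery-results documentation, the app also has to be registered in settings.py and wired up as the result backend, after which its database tables are created with a migration. A settings fragment would look like this:

```python
# settings.py
INSTALLED_APPS = [
    # ... existing Django and project apps ...
    'django_celery_results',
]

# Store task results in the Django database.
CELERY_RESULT_BACKEND = 'django-db'
```

Then run python manage.py migrate to create the result tables.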
Now it is all done! You can execute your first asynchronous task. Run python manage.py runserver and then open the localhost link:
http://127.0.0.1:8000/
After that, open a new terminal, navigate to the project directory, and activate the virtual environment. To start the Celery worker, run the following command:
C:\Users\User\Desktop\celery_django>celery -A demo_celery worker --pool=solo -l info
Whenever http://127.0.0.1:8000/ is visited and a request is made to the Django server, you will be shown the response immediately, while Celery performs the task asynchronously in the background. This can easily be monitored in the Celery worker's terminal.
Python Celery is an extremely powerful task queue library for running tasks asynchronously in the background. In this article, we demonstrated a very basic implementation of Celery with Django, and covered the basic concepts of Celery and how it works. Beyond sending scheduled emails or speeding up page loads, there are numerous applications where Celery provides a very feasible solution.
Shaharyar Lalani is a developer with a strong interest in business analysis, project management, and UX design. He writes and teaches extensively on themes current in the world of web and app development, especially in Java technology.