This document discusses using asynchronous tasks with Django and Celery. It explains why task queues are useful for processing work outside the request-response cycle, such as video/image processing or communicating with external APIs. It then gives an overview of how Celery works: clients generate tasks, a message broker manages the task queue, and workers receive tasks from the broker and execute them. Finally, it gives the steps for integrating Celery with a Django project: installing dependencies, configuring settings, creating task functions, and running tasks.
2. ALLISSON AZEVEDO
Graduate in Computing Education (Licenciatura em Computação)
Web Developer
http://speakerdeck.com/allisson
http://slideshare.net/allisson
http://github.com/allisson
http://youtube.com/user/allissonazevedo
Monday, June 10, 13
5. WHY DO I NEED A
TASK/JOB QUEUE?
The need to process a task outside the
request-response cycle
Video/image processing
Sending e-mails
Generating complex reports
Communicating with external APIs (Twitter, Facebook)
6. WHY DO I NEED A
TASK/JOB QUEUE?
Scheduling tasks (replacing cron)
Feeding a search engine index
7. HOW DOES IT WORK?
Client
Generates the tasks
Message Broker
Manages the task queue
Worker
Receives tasks from the broker and executes them
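The client → broker → worker flow above can be sketched in plain Python, with a stdlib queue standing in for the message broker. This is only an illustration of the roles involved; in Celery the broker is a separate service such as RabbitMQ or Redis, and workers run in their own processes:

```python
import queue
import threading

# The "broker": holds tasks until a worker picks them up
broker = queue.Queue()
results = []

def worker():
    # The "worker": receives tasks from the broker and executes them
    while True:
        task = broker.get()
        if task is None:  # sentinel value: shut the worker down
            break
        func, args = task
        results.append(func(*args))
        broker.task_done()

def add(x, y):
    return x + y

# The "client": generates tasks and puts them on the broker
t = threading.Thread(target=worker)
t.start()
broker.put((add, (2, 3)))
broker.put((add, (10, 20)))
broker.put(None)
t.join()
print(results)  # → [5, 30]
```

The key point is that the client only enqueues work and moves on; execution happens elsewhere, which is what keeps slow tasks out of the web request.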
9. CELERY
“Distributed Task Queue”
Written in Python
Integrates with the major Python frameworks
(Django, Pyramid, Flask, web2py, Tornado)
Broker backends (RabbitMQ, Redis, SQLAlchemy, Django,
MongoDB)
Result store backends (Redis, memcached, MongoDB)
10. WHICH BROKER SHOULD I USE?
RabbitMQ (http://stackoverflow.com/a/9176046)
Redis
Do not use the broker as a result store
11. INTEGRATING WITH
DJANGO
pip install django-celery
Add djcelery to INSTALLED_APPS
Add these lines to settings.py
import djcelery
djcelery.setup_loader()
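Putting the steps above together, a minimal settings.py fragment might look like this (a sketch assuming the django-celery package of that era; `INSTALLED_APPS` is standard Django, and the placeholder apps are illustrative):

```python
# settings.py
import djcelery
djcelery.setup_loader()  # discovers tasks.py modules in installed apps

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    # ... your own apps ...
    'djcelery',
)
```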
12. INTEGRATING WITH
DJANGO
Choose the broker
BROKER_URL = 'redis://localhost:6379/0' # Redis
Start the worker
python manage.py celery worker --loglevel=info
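With the broker configured and a worker running, a task might be defined and enqueued like this (a sketch: the app name `myapp` and the `add` task are hypothetical; the `task` decorator and `.delay()` match the Celery 3.x API used with django-celery):

```python
# myapp/tasks.py -- "myapp" is a hypothetical Django app
from celery import task

@task()
def add(x, y):
    # Runs in the worker process, outside the request-response cycle
    return x + y

# From a view or the Django shell: enqueue the task on the broker
# and get an AsyncResult back immediately, without blocking.
result = add.delay(2, 3)
result.get(timeout=10)  # fetches the value from the result store
```

Note that `.delay()` returns as soon as the message is on the broker; calling `.get()` blocks, so it is normally avoided inside views.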