Celery
Òscar Vilaplana
Python Geek at Paylogic
February 28, 2012
@grimborg
dev@oscarvilaplana.cat
Outline
self.__dict__
Use task queues
Celery and RabbitMQ
Getting started with RabbitMQ
Getting started with Celery
Periodic tasks
Examples
self.__dict__
{'name': 'Òscar Vilaplana',
'origin': 'Catalonia',
'company': 'Paylogic',
'tags': ['developer', 'architect', 'geek'],
'email': 'dev@oscarvilaplana.cat',
}
Proposal
Take a slow task.
Decouple it from your system
Call it asynchronously
Separate projects
Separate projects allow us to:
Divide your system into sections
e.g. frontend, backend, mailing, report generator...
Tackle them individually
Conquer them and declare them Done:
Clean code
Clean interface
Unit tested
Maintainable
(but this is not only for Celery tasks)
Coupled Tasks
In some cases, it may not be possible to decouple some tasks.
Then, we can:
Have some workers in your system's network
with access to the code of your system
with access to the system's database
They handle messages from certain queues, e.g. internal.#
Candidates
Processes that:
Need a lot of memory.
Are slow.
Depend on external systems.
Need a limited amount of data to work (easy to decouple).
Need to be scalable.
Examples:
Render complex reports.
Import big files
Send e-mails
Example: sending complex emails
Create an independent project: yourappmail
Generator of complex e-mails.
It needs the templates, images...
It doesn't need access to your system's database.
Deploy it on servers of our own, or on Amazon servers
We can add/remove as we need them
On startup:
Join the RabbitMQ cluster
Start celeryd
Normal operation: 1 server is enough
On high load: start as many servers as needed (tps_peak / tps_server)
yourappmail
A decoupled email generator:
Has a clean API
Decoupled from your system's db: It needs to receive all
information
Customer information
Custom data
Contents of the email
Can be deployed to as many servers as we need
Scalable
Not for everything
Task queues are not a magic wand to make things faster
They can be used as one (like a cache),
but that hides the real problem.
Celery
Asynchronous distributed task queue
Based on distributed message passing.
Mostly for real-time queuing
Can do scheduling too.
REST: you can query status and results via URLs.
Written in Python
Celery: Message Brokers and Result Storage
Celery's tasks
Tasks can be async or sync
Low latency
Rate limiting
Retries
Each task has a UUID: you can ask for the result back if you
know the task UUID.
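For example, a minimal sketch of fetching a task's result later from its UUID (assuming you kept the id in a task_id variable and a result backend is configured):
from celery.result import AsyncResult

result = AsyncResult(task_id)  # task_id: the UUID you stored earlier
if result.ready():
    print result.get()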
RabbitMQ
Messaging system
Protocol: AMQP
Open standard for messaging middleware
Written in Erlang
Easy to cluster!
Install the packages from the RabbitMQ website
RabbitMQ Server
Management Plugin (nice HTML interface)
rabbitmq-plugins enable rabbitmq_management
Go to http://localhost:55672/cli/ and download the cli.
HTML interface at http://localhost:55672/
Set up a cluster
rabbit1$ rabbitmqctl cluster_status
Cluster status of node rabbit@rabbit1 ...
[{nodes,[{disc,[rabbit@rabbit1]}]},{running_nodes,[rabbit@ra
...done.
rabbit2$ rabbitmqctl stop_app
Stopping node rabbit@rabbit2 ...done.
rabbit2$ rabbitmqctl reset
Resetting node rabbit@rabbit2 ...done.
rabbit2$ rabbitmqctl cluster rabbit@rabbit1
Clustering node rabbit@rabbit2 with [rabbit@rabbit1] ...done
rabbit2$ rabbitmqctl start_app
Starting node rabbit@rabbit2 ...done.
Notes
Automatic configuration
Use a .config file to describe the cluster.
Change the type of the node
RAM node
Disk node
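A rough sketch with the rabbitmqctl cluster command shown above (RabbitMQ 2.x-era syntax; later releases use join_cluster instead): listing only remote nodes joins as a RAM node, while including the node's own name makes it a disc node.
# RAM node: list only the remote node(s)
rabbit2$ rabbitmqctl cluster rabbit@rabbit1
# Disc node: include the node's own name in the list
rabbit2$ rabbitmqctl cluster rabbit@rabbit1 rabbit@rabbit2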
Install Celery
Just: pip install celery
Define a task
Example tasks.py
from celery.task import task
@task
def add(x, y):
    print "I received the task to add {} and {}".format(x, y)
    return x + y
Configure username, vhost, permissions
$ rabbitmqctl add_user myuser mypassword
$ rabbitmqctl add_vhost myvhost
$ rabbitmqctl set_permissions -p myvhost myuser ".*" ".*" ".*"
Configuration file
Write celeryconfig.py
BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "myusername"
BROKER_PASSWORD = "mypassword"
BROKER_VHOST = "myvhost"
CELERY_RESULT_BACKEND = "amqp"
CELERY_IMPORTS = ("tasks", )
Launch daemon
celeryd -I tasks # import the tasks module
Schedule tasks
from tasks import add
# Schedule the task
result = add.delay(1, 2)
value = result.get() # value == 3
Schedule tasks by name
Sometimes the tasks module is not available on the clients
from tasks import add
# Schedule the task
result = add.delay(1, 2)
value = result.get() # value == 3
print value
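When the tasks module really cannot be imported, a minimal sketch of calling the task by its registered name instead (assuming Celery 2.x's send_task helper):
from celery.execute import send_task

# Schedule the task by name; the worker resolves "tasks.add"
result = send_task("tasks.add", args=[1, 2])
value = result.get()  # value == 3
print value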
Schedule the tasks better: apply_async
task.apply_async has more options:
countdown=n: the task will run at least n seconds in the
future.
eta=datetime: the task will run no earlier than datetime.
expires=n or expires=datetime: the task will be revoked in
n seconds or at datetime
It will be marked as REVOKED
result.get will raise a TaskRevokedError
serializer
pickle: default, unless CELERY_TASK_SERIALIZER says
otherwise.
alternatives: json, yaml, msgpack
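A minimal sketch of these options with the add task (values are only illustrative):
from datetime import datetime, timedelta
from tasks import add

add.apply_async(args=(1, 2), countdown=10)   # run at least 10 seconds from now
add.apply_async(args=(1, 2), eta=datetime.utcnow() + timedelta(minutes=5))
add.apply_async(args=(1, 2), expires=60)     # revoked if not started within 60 seconds
add.apply_async(args=(1, 2), serializer="json")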
Result
A result has some useful operations:
successful: True if task succeeded
ready: True if the result is ready
revoke: cancel the task.
result: if the task has been executed, this contains the result;
if it raised an exception, it contains the exception instance
state:
PENDING
STARTED
RETRY
FAILURE
SUCCESS
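A short sketch of inspecting a result, using the add task from before:
from tasks import add

result = add.delay(1, 2)
result.ready()       # True once the task has finished
result.successful()  # True if it finished without raising
result.state         # PENDING, STARTED, RETRY, FAILURE or SUCCESS
result.result        # the return value, or the exception instance
result.revoke()      # cancel the task if it has not run yet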
TaskSet
Run several tasks at once. The result keeps the order.
from celery.task.sets import TaskSet
from tasks import add
job = TaskSet(tasks=[
    add.subtask((4, 4)),
    add.subtask((8, 8)),
    add.subtask((16, 16)),
    add.subtask((32, 32)),
])
result = job.apply_async()
result.ready() # True -- all subtasks completed
result.successful() # True -- all subtasks successful
values = result.join() # [8, 16, 32, 64]
print values
TaskSetResult
The TaskSetResult has some interesting properties:
successful: if all of the subtasks finished successfully (no
Exception)
failed: if any of the subtasks failed.
waiting: if any of the subtasks is not ready yet.
ready: if all of the subtasks are ready.
completed_count: number of completed subtasks.
revoke: revoke all subtasks.
iterate: iterate over the return values of the subtasks once
they finish (sorted by finish order).
join: gather the results of the subtasks and return them in a
list (sorted by the order in which they were called).
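A sketch of these on the job from the previous slide (method names as in Celery 2.x's TaskSetResult):
result = job.apply_async()
result.completed_count()   # number of finished subtasks
result.waiting()           # True while some subtasks are not ready
result.failed()            # True if any subtask raised an exception
for value in result.iterate():  # results in finish order
    print value
print result.join()        # [8, 16, 32, 64], in call order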
Retrying tasks
If the task fails, you can retry it by calling retry()
@task
def send_twitter_status(oauth, tweet):
    try:
        twitter = Twitter(oauth)
        twitter.update_status(tweet)
    except (Twitter.FailWhaleError, Twitter.LoginError), exc:
        send_twitter_status.retry(exc=exc)
To limit the number of retries, set task.max_retries.
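For example, a minimal sketch capping the retries and spacing them out (max_retries and default_retry_delay are task options; the countdown passed to retry() is optional):
@task(max_retries=3, default_retry_delay=60)  # give up after 3 retries, wait 60s between them
def send_twitter_status(oauth, tweet):
    try:
        Twitter(oauth).update_status(tweet)
    except (Twitter.FailWhaleError, Twitter.LoginError), exc:
        send_twitter_status.retry(exc=exc, countdown=120)  # or override the delay per call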
Routing
apply_async accepts a routing_key parameter, so tasks can be sent to
dedicated RabbitMQ queues, e.g.:
pdf: ticket.#
import_files: import.#
Schedule the task to the appropriate queue
import_vouchers.apply_async(args=[filename],
    routing_key="import.vouchers")
generate_ticket.apply_async(args=barcodes,
    routing_key="ticket.generate")
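A rough sketch of declaring those queues in celeryconfig.py, using the dict-based CELERY_QUEUES format of Celery 2.x (the exchange name "tasks" is an assumption):
CELERY_QUEUES = {
    "pdf": {"exchange": "tasks", "exchange_type": "topic",
            "binding_key": "ticket.#"},
    "import_files": {"exchange": "tasks", "exchange_type": "topic",
                     "binding_key": "import.#"},
}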
celerybeat
from celery.schedules import crontab
CELERYBEAT_SCHEDULE = {
    # Executes every Monday morning at 7:30 A.M.
    "every-monday-morning": {
        "task": "tasks.add",
        "schedule": crontab(hour=7, minute=30, day_of_week=1),
        "args": (16, 16),
    },
}
There can be only one celerybeat running
But we can have two machines that check on each other.
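A sketch of starting the scheduler (Celery 2.x commands: a standalone celerybeat process, or embedded in a worker with -B):
celerybeat            # standalone scheduler (run exactly one)
celeryd -B -I tasks   # or embed the scheduler in a worker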
Import a big file:
tasks.py
@task
def import_bigfile(server, filename):
    with create_temp_file() as tmp:
        fetch_bigfile(tmp, server, filename)
        import_bigfile(tmp)
        report_result(...)  # e.g. send confirmation e-mail
Import a big file: Admin interface, server-side
import tasks

def import_bigfile(filename):
    result = tasks.import_bigfile.delay(filename)
    return result.task_id

class ImportBigfile(View):
    def post_ajax(request):
        filename = request.get('big_file')
        task_id = import_bigfile(filename)
        return task_id
Import a big file: Admin interface, client-side
Post the file asynchronously
Get the task_id back
Put up a "working..." message.
Periodically ask Celery if the task is ready and change
"working..." into "done!"
No need to call Paylogic code: just ask Celery directly
Improvements:
Send the username to the task.
Have the task call back the Admin interface when it's done.
The Backoffice can send an e-mail to the user when the task is
done.
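A minimal sketch of the status endpoint the client could poll (the view and helper names are illustrative, mirroring the server-side slide, not actual Paylogic code):
from celery.result import AsyncResult

class ImportBigfileStatus(View):
    def get_ajax(request):
        result = AsyncResult(request.get('task_id'))
        return 'done!' if result.ready() else 'working...'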
Do a time-consuming task.
from tasks import do_difficult_thing
...stuff...
# I have all data necessary to do the difficult thing
difficult_result = do_difficult_thing.delay(some, values)
# I don't need the result just yet, I can keep myself busy
... stuff ...
# Now I really need the result
difficult_value = difficult_result.get()