Celery

Distributed Task Queue - Celery

Python Istanbul

Transcript

  • 1. Distributed Queue Management: CELERY
       Fatih Erikli
       http://fatiherikli.com
       fatiherikli@gmail.com
  • 2. What is Celery?
       ● A distributed & asynchronous task queue implementation.
       ● It is just a wrapper, not a broker.
       ● The default message broker is RabbitMQ.
  • 3. Why use it?
       ● Moving long-running jobs out of the request & response cycle.
       ● Minimizing request & response cycle duration.
       ● Distributing jobs across different machines.
       ● Schedulable & retryable jobs.
  • 4. Use cases
       ● Email jobs
       ● Long-running database operations (e.g. denormalization)
       ● Communication with external APIs
       ● Image & video processing
       ● Search indexing
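
As a sketch of the first use case, an email job can be written in the same style as the task examples later in the deck; the task name, queue, and message below are illustrative, and Django's send_mail is assumed as the mail backend:

       # app/tasks.py - hypothetical email task, same style as the deck's later examples
       from celery.task import task
       from django.core.mail import send_mail

       @task(queue="email", name="send_welcome_email")
       def send_welcome_email(address):
           """Send the welcome mail outside of the request/response cycle."""
           send_mail(
               subject="Welcome!",
               message="Thanks for signing up.",
               from_email="noreply@example.com",
               recipient_list=[address],
           )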
  • 5. How it works
       Publisher -> Broker -> Workers -> Result Store
       ● Publisher: the user makes a request, for example to a Django view, which publishes a job.
       ● Broker: redirects the job to a related worker.
       ● Workers: run the job; whether it completes successfully or fails, the result is sent to the result store.
       ● Result Store: MongoDB, RabbitMQ, Redis, or the Django database.
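
A minimal sketch of the publisher step described above, assuming a Django view and the web_site_status task defined later in the deck (the view name and URL parameter are illustrative):

       # app/views.py - hypothetical publisher: a view that enqueues a job and returns immediately
       from django.http import HttpResponse
       from app.tasks import web_site_status

       def check(request):
           url = request.GET.get("url", "http://google.com")
           result = web_site_status.delay(url)   # publish the job to the broker
           return HttpResponse(result.task_id)   # respond without waiting for the worker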
  • 6. Installation
       RabbitMQ
       $ apt-get install rabbitmq-server
       (it will start automatically after the installation.)

       Celery
       $ pip install celery
       $ pip install django-celery

       settings.py
       INSTALLED_APPS += ("djcelery",)
  • 7. Configuration
       BROKER_URL = "amqp://guest:guest@localhost:5672//"
       CELERY_RESULT_BACKEND = "database"
       CELERY_RESULT_DBURI = "sqlite:///mydatabase.db"
       CELERYD_CONCURRENCY = 10
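
Not shown on the slide: django-celery also expects its loader to be set up in settings.py; a minimal sketch, assuming the django-celery release contemporary with this deck:

       # settings.py - alongside the options above
       import djcelery
       djcelery.setup_loader()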
  • 8. A job
       app/tasks.py

       from celery.task import task
       import requests

       @task(queue="check-site", name="web_site_status")
       def web_site_status(url):
           """ Down for everyone or just me! """
           status_code = requests.get(url=url).status_code
           return status_code
  • 9. Workers
       ./manage.py celeryd -Q check-site
       ./manage.py celeryd -Q email
       ./manage.py celeryd -Q image
       ./manage.py celeryd -Q video
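
The queue a job lands in is set by the queue option in the task decorator (slide 8), but it can also be chosen per call; a minimal sketch using Celery's apply_async, with the task and queue names from the deck:

       # route a single call to a specific queue at call time
       from app.tasks import web_site_status

       result = web_site_status.apply_async(args=("http://google.com",), queue="check-site")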
  • 10. Calling a job
       ./manage.py shell

       >>> from app.tasks import web_site_status
       >>> task = web_site_status.delay("http://google.com")
       # the asynchronous request has started
       >>> task.task_id
       7b233971-36d4-4e9a-a4e9-f8d76fd9de8e
       # wait for the task to complete and get its result
       >>> task.get()
       200
  • 11. djcelery.views
       urls.py

       urlpatterns = patterns("",
           url(r"^check-site$", "djcelery.views.apply",
               {"task_name": "web_site_status"}, name="check"),
           url(r"^get-status/(.+)$", "djcelery.views.task_status",
               {}, name="status"),
       )
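
A hedged sketch of how a client could use the two URLs above, assuming the requests library and a local development server; djcelery.views.apply runs the task with the request parameters, and both views return JSON whose exact keys depend on the django-celery version:

       # hypothetical client for the check/status endpoints above
       import requests

       base = "http://localhost:8000"  # assumed host serving the urls.py above

       # enqueue the task; the JSON response includes the task id
       apply_response = requests.get(base + "/check-site", params={"url": "http://google.com"})
       print(apply_response.json())

       # ask for the state and result of a given task id
       task_id = "7b233971-36d4-4e9a-a4e9-f8d76fd9de8e"  # substitute the id returned above
       status_response = requests.get(base + "/get-status/" + task_id)
       print(status_response.json())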
  • 12. Periodic tasks with ...
       settings.py

       from datetime import timedelta

       CELERYBEAT_SCHEDULE = {
           "runs-every-ten-minutes": {
               "task": "web_site_status",
               "schedule": timedelta(minutes=10),
               "args": ("http://google.com",),
           },
       }
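
Besides timedelta intervals, celerybeat also accepts crontab-style schedules; a minimal sketch with an illustrative schedule, added next to the entry above:

       # settings.py - a crontab-style entry alongside the interval-based one above
       from celery.schedules import crontab

       CELERYBEAT_SCHEDULE["check-every-monday-morning"] = {
           "task": "web_site_status",
           "schedule": crontab(hour=7, minute=30, day_of_week=1),
           "args": ("http://google.com",),
       }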
  • 13. ... celerybeat
       ./manage.py celerybeat

       or, if you are using just one worker:
       ./manage.py celeryd -B
  • 14. Retrying
       app/tasks.py

       @task(queue="check-site", name="web_site_status")
       def web_site_status(url):
           """ Down for everyone or just me! """
           try:
               status_code = requests.get(url=url).status_code
           except Exception as e:
               web_site_status.retry(exc=e, countdown=60)
           return status_code
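
retry() re-raises, so nothing after it runs in that branch; Celery's default retry limit (max_retries) is 3, and both the limit and the delay can be overridden on the task. A minimal sketch with illustrative values, using Celery's max_retries and default_retry_delay options:

       # app/tasks.py - same task, but giving up after three attempts
       from celery.task import task
       import requests

       @task(queue="check-site", name="web_site_status",
             max_retries=3, default_retry_delay=60)
       def web_site_status(url):
           try:
               return requests.get(url=url).status_code
           except Exception as e:
               # once max_retries is exceeded, the original exception is re-raised
               raise web_site_status.retry(exc=e)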
  • 15. Demo Application
       http://pyist-celery-demo.myadslot.com:8000/
       Source Code
       https://github.com/fatiherikli/downforeveryoneorjustme
  • 16. Celery
       http://celeryproject.org
       RabbitMQ
       http://rabbitmq.com
       Thanks :)
