Using a queue service for batch processing


  1. Using a queue service for batch processing
  2. Job Queue: (1) a data source converts its data into jobs and sends them to the queue.
  3. Adds workers (×5): (2) the workers pull jobs from the queue and consume them.
  4. Adds a Result Queue: (3) each worker sends its result to another queue.
  5. (4) The aggregator pulls the results and feeds them back to the data source.
  6. (4) The aggregator pulls the results and makes a report.
  7. Example: video resolution
  8. Job Queue: (1) a video table {id, url, how-to-parse} is the data source; each row becomes a job.
  9. (2) Workers (×5) pull jobs and consume them.
  10. Result Queue: (3) each worker sends a result {id, width, height, codec}.
  11. (4) The aggregator pulls the results and runs: UPDATE video SET width=?, height=?, codec=? WHERE id=?
  12. Example: checking whether a stream is alive
  13. Job Queue: (1) a streaming table {id, url} is the data source; each row becomes a job.
  14. (2) Workers (×5) pull jobs and consume them.
  15. Result Queue: (3) each worker sends a result {id, is_alive}.
  16. (4) The aggregator pulls the results and runs: UPDATE streaming_status SET status=? WHERE id=?
  17. Deployment
  18. Job Queue, workers, Result Queue, and the aggregator can all be deployed locally, but...
  19. ...deploying the workers on the cloud makes the system more scalable.
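The generic pipeline on slides 2–6 can be sketched locally with Python's standard `queue` and `threading` modules. This is a minimal sketch, not the deck's implementation: the `SENTINEL` shutdown convention and the doubling "work" are illustrative stand-ins for real job processing.

```python
import queue
import threading

job_queue = queue.Queue()
result_queue = queue.Queue()
NUM_WORKERS = 5
SENTINEL = None  # illustrative: tells a worker there are no more jobs

def worker():
    # 2: pull jobs from the queue and consume them.
    while True:
        job = job_queue.get()
        if job is SENTINEL:
            break
        result = job * 2          # stand-in for real processing
        result_queue.put(result)  # 3: send the result to another queue

# 1: the data source converts its data into jobs and sends them to the queue.
for record in range(10):
    job_queue.put(record)

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for _ in threads:
    job_queue.put(SENTINEL)  # one sentinel per worker
for t in threads:
    t.join()

# 4: the aggregator pulls the results and makes a report.
results = []
while not result_queue.empty():
    results.append(result_queue.get())
print(sorted(results))  # → [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

In production the two `queue.Queue` objects would be replaced by a networked queue service, which is what lets the workers move to separate machines.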
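The video-resolution example (slides 8–11) might look like the following, with an in-memory SQLite table standing in for the video table; `probe_video` is a hypothetical stand-in for a real media probe (e.g. running ffprobe on the URL).

```python
import queue
import sqlite3
import threading

# In-memory stand-in for the video table {id, url, how-to-parse}.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE video (id INTEGER PRIMARY KEY, url TEXT,"
           " width INTEGER, height INTEGER, codec TEXT)")
db.execute("INSERT INTO video (id, url) VALUES (1, 'http://example.com/a.mp4')")

job_queue = queue.Queue()
result_queue = queue.Queue()

def probe_video(url):
    # Hypothetical: a real worker would download and probe the file.
    return {"width": 1920, "height": 1080, "codec": "h264"}

def worker():
    # 2: pull jobs and consume them; 3: send {id, width, height, codec} on.
    while True:
        job = job_queue.get()
        if job is None:  # sentinel: no more jobs
            break
        info = probe_video(job["url"])
        result_queue.put({"id": job["id"], **info})

# 1: convert the table rows into jobs and send them to the queue.
for vid, url in db.execute("SELECT id, url FROM video").fetchall():
    job_queue.put({"id": vid, "url": url, "how-to-parse": "probe"})
job_queue.put(None)

t = threading.Thread(target=worker)
t.start()
t.join()

# 4: the aggregator pulls the results and updates the video table.
while not result_queue.empty():
    r = result_queue.get()
    db.execute("UPDATE video SET width=?, height=?, codec=? WHERE id=?",
               (r["width"], r["height"], r["codec"], r["id"]))

print(db.execute("SELECT width, height, codec FROM video").fetchone())
# → (1920, 1080, 'h264')
```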
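The streaming example (slides 13–16) follows the same shape. In this sketch `check_alive` is a hypothetical stand-in for a real HTTP health check, and a plain dict stands in for the streaming_status table the aggregator would update.

```python
import queue

job_queue = queue.Queue()
result_queue = queue.Queue()

# Stand-in for the streaming table {id, url}.
streams = [{"id": 1, "url": "http://live.example.com/a"},
           {"id": 2, "url": "http://live.example.com/b"}]

def check_alive(url):
    # Hypothetical: pretend only stream "a" responds to a health check.
    return url.endswith("/a")

# 1: convert the table rows into jobs and send them to the queue.
for row in streams:
    job_queue.put(row)

# 2: pull jobs and consume them; 3: send {id, is_alive} to the result queue.
while not job_queue.empty():
    job = job_queue.get()
    result_queue.put({"id": job["id"], "is_alive": check_alive(job["url"])})

# 4: the aggregator pulls the results and updates streaming_status.
streaming_status = {}
while not result_queue.empty():
    r = result_queue.get()
    streaming_status[r["id"]] = r["is_alive"]

print(streaming_status)  # → {1: True, 2: False}
```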
