using Queue Server for batch processing
  • Got it, thanks.
  • We don't use many queue servers; one ZooKeeper instance is enough for us. Concurrent connections are between 100 and 300 clients, and memory usage is limited with -Xmx512m. We don't have many powerful servers; if we plan to add a new Queue Server, I will put it on another machine.
  • I use a homemade queue based on ZooKeeper. The Queue Server is not deployed on the cloud, because that would cost Network-Transfer-Out: when a mass of jobs flows out of the Queue Server, a lot of data is transferred out. With the queue outside the cloud, any kind of job input becomes Data-Transfer-In, and only the result is Data-Transfer-Out. In many cases the result is very small.
  • Which Queue Server do you use? There is only one Queue Server in your slides; do you recommend using one Queue Server to handle all jobs and results? Will you deploy the Queue Server on the cloud?

using Queue Server for batch processing: Presentation Transcript

  • using Queue Service for batch processing
  • Job Queue flow: (1) the data source converts the data into jobs and sends them to the Job Queue; (2) workers pull the jobs and consume them; (3) each worker sends its result to the Result Queue; (4) the aggregator pulls the results, feeds them back to the data source, and makes a report.
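A minimal sketch of this flow in Python, using the standard-library queue.Queue and threads to stand in for the Job Queue, the Result Queue, the workers, and the aggregator (a real setup would talk to a networked queue server instead; all names here are illustrative):

```python
import queue
import threading

job_queue = queue.Queue()     # stands in for the Job Queue server
result_queue = queue.Queue()  # stands in for the Result Queue server

def data_source(rows):
    # 1. convert each row of data into a job and send it to the queue
    for row in rows:
        job_queue.put(row)

def worker():
    # 2. pull a job and consume it
    while True:
        job = job_queue.get()
        if job is None:          # poison pill: stop this worker
            job_queue.task_done()
            break
        result = {"id": job["id"], "value": len(job["payload"])}  # do the real work here
        # 3. send the result to another queue
        result_queue.put(result)
        job_queue.task_done()

def aggregator():
    # 4. pull the results and make a report
    results = []
    while True:
        result = result_queue.get()
        if result is None:
            break
        results.append(result)
    print("report:", results)

rows = [{"id": i, "payload": "x" * i} for i in range(5)]
workers = [threading.Thread(target=worker) for _ in range(3)]
agg = threading.Thread(target=aggregator)
for t in workers:
    t.start()
agg.start()

data_source(rows)
job_queue.join()          # wait until every job has been consumed
for _ in workers:
    job_queue.put(None)   # stop the workers
for t in workers:
    t.join()
result_queue.put(None)    # stop the aggregator
agg.join()
```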
  • Example for video resolution
  • Video resolution flow: (1) each row of the video table {id, url, how-to-parse} is converted into a job and sent to the Job Queue; (2) workers pull the jobs and consume them; (3) each worker sends a result {id, width, height, codec} to the Result Queue; (4) the aggregator pulls the results and runs UPDATE video SET width=?, height=?, codec=? WHERE id=?.
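A sketch of the worker and aggregator sides for this example; it assumes ffprobe (from ffmpeg) is available for reading the video metadata and uses a hypothetical SQLite video table to stand in for the real database:

```python
import json
import sqlite3
import subprocess

def consume_video_job(job):
    # worker side: probe the video at job["url"] with ffprobe
    probe = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=width,height,codec_name",
         "-of", "json", job["url"]],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(probe.stdout)["streams"][0]
    # this dict is what would be sent to the Result Queue
    return {"id": job["id"], "width": stream["width"],
            "height": stream["height"], "codec": stream["codec_name"]}

def aggregate_video_result(conn, result):
    # aggregator side: write the probed metadata back to the video table
    conn.execute(
        "UPDATE video SET width=?, height=?, codec=? WHERE id=?",
        (result["width"], result["height"], result["codec"], result["id"]),
    )
    conn.commit()

# usage sketch (hypothetical database file and job)
conn = sqlite3.connect("videos.db")
result = consume_video_job({"id": 1, "url": "http://example.com/a.mp4",
                            "how-to-parse": "ffprobe"})
aggregate_video_result(conn, result)
```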
  • Example for streaming alive
  • Streaming-alive flow: (1) each row of the streaming table {id, url} is converted into a job and sent to the Job Queue; (2) workers pull the jobs and consume them; (3) each worker sends a result {id, is_alive} to the Result Queue; (4) the aggregator pulls the results and runs UPDATE streaming_status SET status=? WHERE id=?.
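A similar sketch for the streaming-alive check, assuming a plain HTTP HEAD request is enough to decide whether a stream is alive and using a hypothetical SQLite streaming_status table on the aggregator side:

```python
import sqlite3
import urllib.error
import urllib.request

def consume_stream_job(job, timeout=5):
    # worker side: consider the stream alive if its URL answers an HTTP request
    request = urllib.request.Request(job["url"], method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            is_alive = 200 <= response.status < 400
    except (urllib.error.URLError, OSError):
        is_alive = False
    # this dict is what would be sent to the Result Queue
    return {"id": job["id"], "is_alive": is_alive}

def aggregate_stream_result(conn, result):
    # aggregator side: record the liveness in the streaming_status table
    conn.execute(
        "UPDATE streaming_status SET status=? WHERE id=?",
        (1 if result["is_alive"] else 0, result["id"]),
    )
    conn.commit()

# usage sketch (hypothetical database file and job)
conn = sqlite3.connect("streams.db")
result = consume_stream_job({"id": 7, "url": "http://example.com/live.m3u8"})
aggregate_stream_result(conn, result)
```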
  • Deployment
  • Everything (Job Queue, workers, Result Queue, aggregator) can be deployed locally, but…
  • Deploying the workers on the cloud makes the system more scalable.
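One way to let the same worker code run either locally or on the cloud is to read the queue endpoint from the environment. A sketch assuming the queues are exposed as Redis lists; the host, port, and key names are illustrative, not from the slides:

```python
import json
import os

import redis  # assumes the redis-py client is installed

# point the worker at a local queue or at a central queue server in the cloud
QUEUE_HOST = os.environ.get("QUEUE_HOST", "localhost")
QUEUE_PORT = int(os.environ.get("QUEUE_PORT", "6379"))

r = redis.Redis(host=QUEUE_HOST, port=QUEUE_PORT)

def run_worker():
    # pull jobs from the Job Queue and push results to the Result Queue
    while True:
        _, raw = r.blpop("job_queue")           # blocks until a job is available
        job = json.loads(raw)
        result = {"id": job["id"], "ok": True}  # do the real work here
        r.rpush("result_queue", json.dumps(result))

if __name__ == "__main__":
    run_worker()
```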