Long Haul Hadoop

Mombasa: long-haul Hadoop

  • 1. Long Haul Hadoop
    Steve Loughran
    © 2009 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice.
  • 2. Recurrent theme
    Issue          Title
    HADOOP-3421    Requirements for a Resource Manager
    HADOOP-4559    REST API for job/task statistics
    MAPREDUCE-454  Service interface for Job submission
    MAPREDUCE-445  Web service interface to the job tracker
    HADOOP-5123    Ant tasks for job submission
  • 3. In-datacentre: binary
    - Hadoop RPC, Avro, Apache Thrift...
    - Space-efficient text and data
    - Performance through efficient marshalling and Writable re-use
    - Brittle: fails with odd messages
    - Insecure: the caller is trusted!
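The space-efficiency and the brittleness of binary RPC go together. The following sketch is not Hadoop's actual wire format; it is a hypothetical length-prefixed encoding that shows both properties: the record is compact, but a sender that appends an extra field produces bytes an older reader either misparses or rejects with an unhelpful error, and nothing in the stream identifies the caller.

```python
import struct

# Illustrative only: a compact binary encoding of a (job_id, priority)
# record, in the spirit of Writable/Thrift-style marshalling.
def encode_v1(job_id: str, priority: int) -> bytes:
    data = job_id.encode("utf-8")
    # 2-byte big-endian length prefix, the string bytes, a 4-byte int
    return struct.pack(">H", len(data)) + data + struct.pack(">i", priority)

def decode_v1(buf: bytes):
    (n,) = struct.unpack_from(">H", buf, 0)
    job_id = buf[2:2 + n].decode("utf-8")
    (priority,) = struct.unpack_from(">i", buf, 2 + n)
    return job_id, priority

# Compact on the wire: 21 bytes here, far smaller than an XML envelope.
wire = encode_v1("job_200908_0001", 5)
print(len(wire))

# Brittle: a "v2" sender that appends a new field is silently
# half-understood by a v1 reader, which ignores the trailing bytes.
wire_v2 = wire + struct.pack(">i", 42)
job_id, priority = decode_v1(wire_v2)
print(job_id, priority)
```

Nothing in the format carries a version number or credentials, which is exactly the "brittle and insecure" point the slide makes.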
  • 4. Requirements
    - Works through proxies and firewalls
    - Robust against cluster upgrades
    - Deploys tasks and sequences of tasks
    - Monitors jobs and logs
    - Pauses, resumes and kills jobs
    - Only trusted users may submit and manage jobs
  • 5. WS-*
  • 6. WS-*
    - Integrates with the SOA story
    - Integration with the WS-* management layer
    - BPEL-ready
    - Auto-generation of WSDL from server code
    - Auto-generation of client code from WSDL
    - Stable Apache stacks (Axis2, CXF)
    - Security through HTTPS and WS-Security
  • 7. WS-* Problems
    - Generated WSDL is a maintenance mess
    - Generated client and server code is brittle
    - No stable WS-* management layer (yet)
    - WS-Security
    - Little/no web browser integration
    - Binary data is tricky
  • 8. REST
  • 9. Pure REST
    - PUT and DELETE
    - Cleaner, purer
    - Restlet is the best client (LGPL)
    - Needs an HTML5 browser for in-browser PUT
    Model jobs as URLs you PUT; state through attributes you manipulate.
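The "jobs as URLs you PUT" model can be sketched with the standard library alone. The `/jobs/<name>` URL scheme and the job attributes below are illustrative assumptions, not a real Hadoop or Restlet API: the client names the job in the URL, PUT creates or replaces it, and DELETE kills it.

```python
import http.client
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

JOBS = {}  # server-side job table, keyed by the job's own URL

class JobHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        JOBS[self.path] = json.loads(body)   # create or replace the job
        self.send_response(201)
        self.end_headers()

    def do_DELETE(self):
        JOBS.pop(self.path, None)            # kill the job
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):            # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), JobHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# PUT a job description to the URL that names the job...
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("PUT", "/jobs/wordcount-1",
             json.dumps({"state": "queued", "priority": 5}),
             {"Content-Type": "application/json"})
resp = conn.getresponse()
put_status = resp.status                     # 201: created
resp.read()

# ...and DELETE that same URL to kill it.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("DELETE", "/jobs/wordcount-1")
del_status = conn.getresponse().status       # 204: gone
server.shutdown()
print(put_status, del_status)
```

State changes (pause, resume) would be PUTs to attributes of the same URL, which is why the slide notes that in-browser use needs an HTML5 browser: plain HTML forms only speak GET and POST.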
  • 10. HTTP POST
    - Easy to use from a web browser
    - Include URLs in bug reports
    - Security through HTTPS only
    - HttpClient for the client API
    - Binary files through form/multipart
    Have a job manager you POST to; URLs for each created job.
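The POST variant differs from pure REST in who names the job: the client POSTs to one well-known job-manager URL, and the server mints a URL for the new job and hands it back in the `Location` header. The `/jobs` endpoint and that header convention are assumptions for illustration, not a documented interface.

```python
import http.client
import itertools
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

counter = itertools.count(1)
created = {}  # job URL -> submitted job description

class ManagerHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        job_url = "/jobs/%d" % next(counter)   # server names the job
        created[job_url] = json.loads(body)
        self.send_response(201)
        self.send_header("Location", job_url)  # the new job's own URL
        self.end_headers()

    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), ManagerHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# POST the job to the manager; the response points at the created job.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("POST", "/jobs", json.dumps({"jar": "wordcount.jar"}),
             {"Content-Type": "application/json"})
resp = conn.getresponse()
post_status = resp.status                      # 201: created
job_url = resp.getheader("Location")           # URL to paste into a bug report
resp.read()
server.shutdown()
print(post_status, job_url)
```

Because the returned job URL is an ordinary GET-able address, it is exactly the kind of link the slide suggests including in bug reports; a real submission would carry the JAR itself as a form/multipart part rather than JSON.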
  • 11. JobTracker or Workflow?
    1. HADOOP-5303: Oozie, Hadoop Workflow
    2. Cascading
    3. Pig
    4. BPEL
    5. Constraint-based languages
    The REST model could be independent of the back-end.
  • 12. Status as of Aug 6 2009
    1. Can deploy SmartFrog components, and hence workflows or constrained models
    2. Portlet engine runs in the datacentre
    3. No REST API
    4. Need access to logs and the rest of the JobTracker
    5. Issue: how best to configure the work?