Putting Hadoop on Any Cloud (Big Data Spain)

The massive computing and storage resources needed to support big data applications make cloud environments an ideal fit. Now more than ever, there is a growing number of cloud infrastructure providers to choose from: Amazon AWS, OpenStack offered by the likes of HP, Rackspace and soon even Dell, and VMware vCloud, as well a...

Including:
- Effectively managing your Hadoop stack in any data center (on-premises, cloud, hybrid…)
- Maintaining the flexibility to choose the right cloud for the job in an ever-changing environment
- Consistently managing your Hadoop deployment alongside the other elements of your big data system, such as a NoSQL DB, web tier, etc.

Notes
  • Santa Maria, Niña and the Pinta were the three Christopher Columbus ships (http://www.marineinsight.com/marine/life-at-sea/maritime-history/christopher-columbus-ships-vessels-that-discovered-america/)
  • Cloudify + BigInsights integration:
    • Dynamic Scaling – provision and add cloud servers to the BigInsights cluster on demand.
    • Failover – transparently restart nodes or start new nodes in the case of failure.
    • Role assignments – assign new roles to the BigInsights nodes from the Cloudify console.
    • Monitoring – monitor deployments using the Cloudify UI, with smooth integration to the BigInsights UI.
    • Data rebalancing – kick off Hadoop cluster rebalancing from the Cloudify console.
    • Hadoop operations – run Hadoop DFS & DFSAdmin operations from the Cloudify console (a hedged sketch of such operations follows these notes).
  • Any app - All clouds
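
The integration notes above mention running Hadoop DFS and DFSAdmin operations from the Cloudify console. As a rough illustration of what such operations look like at the Hadoop API level, here is a minimal Java sketch using the standard org.apache.hadoop.fs.FileSystem client; the namenode URI (hdfs://namenode:9000) and the /user/biginsights path are placeholders for this example, not values from the talk.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FsStatus;
    import org.apache.hadoop.fs.Path;

    /**
     * Minimal sketch of the kind of HDFS operations a management console
     * (such as the Cloudify console described in these notes) might trigger.
     * The namenode address and paths below are placeholders.
     */
    public class HdfsReport {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder namenode address; a real deployment would pick this up
            // from the cluster's core-site.xml.
            conf.set("fs.defaultFS", "hdfs://namenode:9000");

            try (FileSystem fs = FileSystem.get(conf)) {
                // Rough equivalent of the capacity section of `hdfs dfsadmin -report`.
                FsStatus status = fs.getStatus();
                System.out.printf("capacity=%d used=%d remaining=%d%n",
                        status.getCapacity(), status.getUsed(), status.getRemaining());

                // A plain DFS operation: list a (placeholder) user directory.
                for (FileStatus f : fs.listStatus(new Path("/user/biginsights"))) {
                    System.out.println(f.getPath() + " " + f.getLen());
                }
            }
        }
    }

In the integration itself, such calls (or the equivalent hdfs dfs / hdfs dfsadmin commands) would presumably be run on the cluster by the Cloudify agents rather than from a standalone client like this one.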

Transcript

  • 1. Columbus & The Cloud
  • 2. http://code.zynga.com/2012/02/the-evolution-of-zcloud/
  • 3. http://code.mixpanel.com/2010/11/08/amazon-vs-rackspace/
  • 4. Realization: What You Really Care About Is App Portability
  • 5. http://techblog.netflix.com/2012/07/benchmarking-high-performance-io-with.html
  • 6. A Typical Big Data App…
  • 7. • Auto start VMs • Install and configure app components • Monitor • Repair • (Auto) Scale • Burst…
  • 8. Making the deployment, installation, scaling, and fail-over look the same through the entire stack (see the sketch after this transcript)
  • 9. Running Bare-Metal for high I/O workloads, Public Cloud for sporadic workloads…
  • 10. • Available under different distributions: • Cloudera • IBM BigInsights • MapR • Hortonworks
  • 11. Putting Cloudify and Hadoop Together: • Run on Any Cloud • Consistent Management • Dynamic Scaling • Auto Recovery • Auto Scaling • Role Assignments • Monitoring • Simple maintenance
  • 12. How it works: 1. Upload your recipe. 2. Cloudify creates VMs and installs agents. 3. The agents install and manage your app. 4. Cloudify automates the scaling.
  • 13. Few Snippets..
  • 14. Demo Time..
  • 15. http://www.cloudifysource.org • http://github.com/CloudifySource • https://github.com/CloudifySource/cloudify-recipes/tree/master/services/biginsights
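
Slides 7, 8 and 12 describe one lifecycle (start VMs, install and configure components, monitor, repair, scale) applied uniformly to every tier of the stack, with Cloudify recipes and agents doing the work. The sketch below is a hypothetical Java illustration of that uniform-lifecycle idea only; it is not Cloudify's API or recipe format, and every type and method name in it is invented for this example.

    import java.util.Arrays;
    import java.util.List;

    /**
     * Hypothetical illustration (not the Cloudify API) of the idea in slides 7-8:
     * every tier of a big data app exposes the same lifecycle, so deployment,
     * scaling and fail-over look the same through the entire stack.
     */
    interface ServiceLifecycle {
        void install();            // install and configure the component on its VMs
        void start();              // start the service
        boolean isHealthy();       // monitoring hook
        void repair();             // restart or replace failed nodes
        void scaleTo(int nodes);   // add or remove instances
    }

    /** Toy stand-in for one tier; a real recipe would run install/start scripts. */
    class LoggedTier implements ServiceLifecycle {
        private final String name;
        LoggedTier(String name) { this.name = name; }
        public void install()      { System.out.println(name + ": install"); }
        public void start()        { System.out.println(name + ": start"); }
        public boolean isHealthy() { return true; }
        public void repair()       { System.out.println(name + ": repair"); }
        public void scaleTo(int n) { System.out.println(name + ": scale to " + n + " nodes"); }
    }

    /** A toy orchestrator that drives the Hadoop, NoSQL and web tiers identically. */
    public class Orchestrator {
        public static void main(String[] args) {
            List<ServiceLifecycle> tiers = Arrays.asList(
                    new LoggedTier("hadoop"), new LoggedTier("nosql-db"), new LoggedTier("web-tier"));
            for (ServiceLifecycle tier : tiers) {   // deploy: same path for every tier
                tier.install();
                tier.start();
            }
            for (ServiceLifecycle tier : tiers) {   // heal: same fail-over path for every tier
                if (!tier.isHealthy()) tier.repair();
            }
        }
    }

The point the slides make is exactly this uniformity: because each tier exposes the same operations, deployment, fail-over and scaling are driven the same way for Hadoop as for the NoSQL database or the web tier.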