Big Workflow Launch from Adaptive Computing


In this slidecast, Rob Clyde of Adaptive Computing describes Big Workflow: the convergence of cloud, big data, and HPC in enterprise computing.

"The explosion of big data, coupled with the collisions of HPC and cloud, is driving the evolution of big data analytics," said Rob Clyde, CEO of Adaptive Computing. "A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage. We are confident that Big Workflow will enable enterprises across all industries to leverage big data that inspires game-changing, data-driven decisions."


  • Adaptive Computing has proven leadership in accelerating IT to speed and improve business, based on our 10+ years delivering workload management and IT decision engine software. This leadership has been recognized in many ways: 50+ patents filed or approved; 51% revenue growth in 2010; acceleration-centric investments in 2010 from leaders like Intel Capital, Epic Ventures and Tudor Ventures, who recognize our HPC market strength and cloud solution leadership and opportunity; global partnerships with organizations like HP, IBM, Cray, Microsoft and many others, who utilize our innovative and leading decision engine to make their solutions more competitive for their customers; and, most importantly, by customers who trust us to accelerate and manage their IT environments, the most dynamic and scale-intensive on the planet, a few of which are called out here: heterogeneous, extreme-scale, multi-data-center environments with complex workloads, resources, priorities and decisions that we help accelerate in a self-optimizing environment. In fact, our software is used on well over $2 billion of hardware.
  • Adaptive got its start in managing workloads for HPC, so traditionally we’ve played a huge role in accelerating productivity, increasing uptime, enforcing SLAs and extending benefits across multiple cluster locations and clouds. Moab HPC Suite – Enterprise Edition provides enterprise-ready HPC workload management that brings together the key enterprise HPC use cases and capabilities, plus implementation and 24x7 support services, into a single integrated product to speed the realization of benefits from an HPC system, including: Productivity acceleration, to get more results done faster at a lower cost, with the scalability and intelligent management to maximize utilization and throughput, and even lower power consumption when workload demand fluctuates. This includes overall system and resource productivity (including scarce resources like GPGPUs and Intel Xeon Phi coprocessors) as well as user and admin productivity, with fast and simple job submission and management and reduced complexity, time and management of the cluster and its workload. Uptime automation ensures workload completes successfully and reliably, avoiding failures and missed organizational opportunities and objectives. Auto SLA enforcement schedules and adjusts workload to consistently meet service guarantees and business priorities, so the right workloads are completed at the optimal times while department usage budgets are enforced. Grid- and cloud-ready HPC management extends the benefits of your traditional HPC environment to more efficiently manage workload and better meet workload demand, with pay-for-use showback and chargeback and the ability to manage and share workload across multiple remote clusters to meet growing workload demand or surges (with purchase of the Moab Grid Suite option).
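The SLA-aware scheduling idea in the note above — start the most urgent workloads first, within a fixed resource budget — can be illustrated with a toy priority scheduler. This is a minimal sketch with hypothetical names, not Moab's actual algorithm or API:

```python
# Toy sketch of priority-driven job scheduling within a core budget.
# Illustrative only: names and logic are hypothetical, not Moab's.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    priority: int                    # lower value = more urgent (e.g. tighter SLA)
    name: str = field(compare=False)
    cores: int = field(compare=False)

def schedule(jobs, total_cores):
    """Greedily start the most urgent jobs that fit the free-core budget."""
    heap = list(jobs)
    heapq.heapify(heap)              # min-heap ordered by priority
    started, free = [], total_cores
    while heap:
        job = heapq.heappop(heap)
        if job.cores <= free:        # job fits: start it and reserve its cores
            free -= job.cores
            started.append(job.name)
    return started

jobs = [Job(2, "report", 8), Job(1, "urgent-sim", 16), Job(3, "batch", 16)]
print(schedule(jobs, 24))            # urgent-sim and report start; batch waits
```

A real workload manager layers much more on top of this (backfill, reservations, fair-share, budgets), but the core trade-off — ordering by business priority under a finite resource pool — is the same.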
  • In 2010 we announced our cloud management suite, which gave data centers a full cloud lifecycle solution, saving time and money.
  • Speaker notes: According to Gartner, across a multitude of surveys, the need to harness the power of big data is growing, to create business opportunity from the results it yields. Business leaders feel those results will enhance business performance through better innovation that delivers a competitive advantage for the business and ultimately ensures success. #1: In Gartner's 2013 big data study, information and business leaders most often associate the term "opportunity" with "big data." (Survey Analysis: Big Data Adoption in 2013 Shows Substance Behind the Hype, Lisa Kart, Nick Heudecker, Frank Buytendijk. September 13, 2013. Page 1, subhead.) #2: Enhanced information use, including big data, is key to enhancing business performance in many industries. (Top Industries Predicts 2014: The Pressure for Fundamental Transformation Continues to Accelerate, Kimberly Harris-Ferrante. October 4, 2013. Page 1, 3rd bullet under Key Findings.) #3: Results from the Gartner-Forbes 2012 Board of Directors Survey revealed that 50% of respondents see IT as a way to change the rules of competition in their industry. (Three Big Data Challenges for the CIO, Stephen Prentice. July 24, 2012, refreshed November 7, 2013. Page 2, 2nd paragraph under Analysis.)
  • The enterprise is advancing simulation and big data analysis to pursue game-changing discoveries for the business. These discoveries are not possible without big workflow optimization and scheduling software that enhances the complex, compute-intensive simulation process and analyzes big data faster, more accurately and more cost-effectively. With converged environments, data analysts can speed the time to discovery. Eureka.
  • With this constellation, we are able to task, capture and deliver imagery to our customers in as little as 90 minutes, delivering critical, life-saving information when seconds count.
  • Our archive holdings exceed 2.3 billion sq km.
  • That’s over 15 copies of the earth, and we are now adding a new copy about every 30 days.
  • That’s 2 PB a year of new data being added.
  • For example, when Typhoon Haiyan devastated large portions of the Philippines, our 16 PB archive of previously collected imagery was combined with the latest acquisitions to build near-real-time updates for first responders. This image set shows a before-and-after sequence of damage during the event. You can see the devastation to the temple and the destruction of hundreds of homes.
  • Quote 1: Earl Joseph, VP of Technical Computing at IDC. Quote 2: Chirag Dekate, Ph.D., Research Manager, High-Performance Systems at IDC.

    1. Big Workflow: Accelerate Insights. Unify – Optimize – Guarantee. © 2013 ADAPTIVE COMPUTING, INC.
    2. Adaptive Computing Highlights ▪ Experts in scheduling and optimization software for cloud, big data and HPC ▪ 10+ years of battle-tested technology ▪ 50+ patents issued or pending ▪ Backed by top-tier investors ▪ Global partnerships and partners
    3. Broad Customer Base ▪ Oil & Gas/Energy ▪ Finance, Insurance ▪ Manufacturing ▪ Government ▪ Research ▪ Pharma, Life Sciences, Bioinformatics, Atmospheric, etc.
    4. Intelligent Workload Management for HPC (system, user and admin) ▪ Productivity acceleration: max utilization 90-99%, max job throughput, 20%+ lower power, simple submit/manage, faster time to results, reduced management time/cost/complexity ▪ Uptime automation: prevent job failures, auto-respond to failures, reduce missed deadlines, reduce support costs ▪ Auto SLA enforcement: auto-adjust workload to meet SLAs, balance competing priorities, share resources fairly, auto-enforce usage budgets ▪ Grid- & cloud-ready management: more efficiently meet workload demand, pay-for-use showback and chargeback, extend collaboration, dynamic resource provision/re-purpose, share workload across wide-area grids
    5. Full Cloud Lifecycle: Across the Service Lifecycle, from Basic Cloud to Optimized Cloud ▪ Agile Service Delivery: deliver services fast, efficiently and successfully – self-service catalog requests, auto provisioning, auto decommissioning ▪ Adaptive Services and Resources: adapt services to meet SLAs, demand, and changing conditions – optimized service placement, auto VM migration to maintain performance, auto self-healing, auto incident response ▪ Automated Management: maximize utilization, reduce costs/complexity – showback and chargeback for usage, reporting and management dashboard, auto power management to reduce costs, capacity management optimization, auto maintenance and future reservations
    6. It’s not just about HPC or cloud for efficiency. It’s about HPC + Cloud for a purpose: Accelerate Insights. Solving the Big Data Challenge.
    7. Traditional IT vs. Big Data ▪ Traditional IT / Cloud: equilibrium/uptime, steady state (run forever), many apps per server, light compute load, light data load, no scheduling needed. Examples: email servers, web servers, CRM systems ▪ Simulation & Big Data Analysis: changing needs and load, run until insights, many servers per app, compute intensive, data intensive, scheduling crucial. Examples: Monte Carlo simulations, recommender systems, weather prediction
    8. Leverage Big Data for Data-Driven Decisions. Today’s business must accelerate insights to gain a competitive advantage.
    9. What’s the Big Deal About Big Data? Gartner quotes: ▪ In Gartner's 2013 big data study, information and business leaders most often associate the term "opportunity" with “big data.” ▪ Enhanced information use, including big data, is key to enhancing business performance in many industries. ▪ Results from the Gartner-Forbes 2012 Board of Directors Survey revealed that 50% of respondents see IT as a way to change the rules of competition in their industry.
    10. Challenges in the Data Center ▪ The data center is the logjam ▪ Process-intensive data analysis and simulation: manual, time-consuming, multi-step, complex dependencies, multi-application, under/over-used siloed environments ▪ IT can collect and store data ▪ IT struggles to extract results that impact the business
    11. Eliminate the Logjam ▪ Streamline the workflow by leveraging scheduling and optimization, advanced policies, and all data center environments: HPC, cloud and big data ▪ Big Data + Workflow = Big Workflow ▪ Eliminate the logjam with Big Workflow
    12. Big Workflow Accelerates Insights that Inspire Game-Changing, Data-Driven Decisions ▪ Unify data center resources as a single, adaptive ecosystem: public and private cloud, HPC, VMs and bare metal ▪ Optimize the analysis process: increase throughput and productivity; reduce cost, complexity and errors ▪ Guarantee service to the business: ensure SLAs, maximize uptime, prove services were delivered, verify resources were allocated fairly
    13. REALLY BIG DATA: DigitalGlobe Case Study, a Global Satellite Imagery Provider
    14. DigitalGlobe
    15. DigitalGlobe’s Big Data Challenges
    16. Big Data Challenges
    17. Big Data Challenges: Each Year DigitalGlobe Collects
    18. Typhoon Haiyan ▪ How can DigitalGlobe respond in <90 mins? ▪ Moab is at the data center core: breaks down silos, increases maximum workflow capacity, optimizes the analysis process ▪ “Without Moab, DigitalGlobe could not do what we do responsively” ▪ Several hours before Typhoon Haiyan made landfall, DG activated FirstLook. In the first few days, 19,000 square kilometers of imagery in the hardest-hit areas was collected.
    19. IDC Agrees ▪ "The most surprising findings of the 2013 study are the substantially increased penetration of co-processors and accelerators at HPC sites around the world, along with the large proportion of sites that are applying Big Data technologies and methods to their problems, and the steady growth in cloud computing for HPC.” ▪ Big data growth: $10.9 billion in 2013 to $23.8 billion in 2016 ▪ 2013 Worldwide HPC End User Study (October 2013): ▪ High-performance data analysis report: 67% said they perform big data analysis on their HPC systems, with 30% of available computing cycles devoted on average to big data analysis work ▪ Cloud computing report: the share of sites exploiting cloud computing to address parts of their HPC workloads rose from 13.8% in 2011 to 23.5% in 2013, with public and private cloud use about equally represented ▪ “Two thirds of HPC sites are now performing big data analysis as part of their HPC workloads, as well as an uptick in combined uses of cloud computing and supercomputing. As there is no shortage of big data to analyze and no sign of it slowing down, combined uses of cloud and HPC will occur with greater frequency, creating market opportunities for solutions such as Adaptive’s Big Workflow.”
    20. BIG WORKFLOW: It’s not just about HPC or cloud for efficiency. It’s about HPC + Cloud for a purpose: Accelerate Insights.