Advances in supercomputers have come at a steady pace over the past 20 years. The next milestone is to build an Exascale computer; however, this requires not only speed improvements but also significant advances in energy efficiency and massive parallelism. This paper examines the technological progress of supercomputer development to identify the innovative potential of three leading technology paths toward Exascale development: hybrid systems, multicore systems, and manycore systems. Performance measurement and rate-of-change calculations were made by technology forecasting using data envelopment analysis (TFDEA). The results indicate that the current level of technology and rate of progress can achieve Exascale performance between early 2021 and late 2022, as either hybrid systems or manycore systems.
This slide deck presents the paper: Lim, D.-J., Anderson, T. R., & Shott, T. (2015). Technological forecasting of supercomputer development: The march to Exascale computing. Omega, 51, 128–135.
Technological forecasting of supercomputer development: The march to exascale computing
1. ETM
Extreme Technology Analytics Research Group – tfdea.com
- TFDEA application -
Technological forecasting of supercomputer
development: The march to exascale computing
Fall 2014
Department of Engineering and Technology Management
Dong-Joon Lim
Portland State University
Maseeh College of Engineering and Computer Science
Why Exa?
- Expected breakthroughs -
Transitioning this technology to future Exascale
platforms will have a transformative impact upon
simulation-based engineering design, making possible
the design of aerodynamically optimized vehicles
including integrated effects of propulsion, structures,
and active controls, a “Grand Challenge” of
aerodynamic design. (DOE 2010)
Why Exa?
- Expected breakthroughs -
Models that synthesize our
observations and theories of the
Earth system as accurately as
possible are central to research
on climate change.
…
Exascale models will be
capable of predicting how
anthropogenic pollutants and
land-surface alterations interact
with natural chemical and
ecological processes.
…
These fully coupled models
are capable of simulating the
climate at scales of 25 km – a
resolution comparable to the
size of an average U.S. county.
(DOE 2010)
Why Exa?
- Expected breakthroughs -
Center for Exascale simulation of Combustion in Turbulence (ExaCT)
- Mission: Reduce petroleum use by 25% by 2020, greenhouse gas emission by 80% by 2050
- Objective: 50% improvement in engine efficiency
- Approach: Combines modeling and simulation (M&S) with experimentation using Exascale programming models
Why Exa?
- Expected breakthroughs -
A real-time human-brain-scale simulation would take about 1–10 Exaflops at 20 MW
Even under the best assumptions, the human brain would still be a million times more power efficient
(The human brain consumes about 20 W)
The march to Exascale computing
- Case study in Supercomputer development -
Objective
- Examines technological progress of supercomputer development to identify the
innovative potential of three leading technology paths toward Exascale
development: hybrid system, multicore system and manycore system
Background
- Advances in supercomputers have come at a steady pace over the past 20 years
- The next milestone is the Exascale computer;
a machine capable of doing a quintillion
(10^18) operations per second
- Three technology paths
: Hybrid system – CPU+GPU/Accelerator
: Multicore system – Multi-complex-cores
: Manycore system – Many-simpler-low power-cores
- Which path can reach the goal first?
- Is 2020 an achievable goal?
Lim, D.-J., Anderson, T. R., & Shott, T. (2015). Technological forecasting of supercomputer development: The march to Exascale computing. Omega, 51, 128–135.
The march to Exascale computing
- Case study in Supercomputer development -
Naïve anticipation
- Straightforward extrapolation envisions Exascale computers in 2018
“However, this rate of progress may not continue. There are great challenges ahead as we scale
towards Exascale such as energy consumption, multicore architecture, etc.” (Robert 2013)
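The naive 2018 figure can be reproduced with back-of-the-envelope arithmetic. The sketch below is not the paper's fit; it is just a log-linear extrapolation through two well-known TOP500 milestones (Roadrunner reaching roughly 1 Petaflops in 2008 and Tianhe-2's 33.86 Petaflops in 2013):

```python
import math

# Two well-known TOP500 milestones (Rmax in Petaflops):
year0, perf0 = 2008, 1.0      # Roadrunner, ~1 Pflops
year1, perf1 = 2013, 33.86    # Tianhe-2, 33.86 Pflops

# Straight-line fit in log space: a constant exponential growth rate.
rate = (math.log10(perf1) - math.log10(perf0)) / (year1 - year0)

# Extrapolate to 1 Exaflops = 1000 Pflops.
exa_year = year1 + (math.log10(1000.0) - math.log10(perf1)) / rate
print(round(exa_year, 1))  # ~2017.8, i.e. "Exascale in 2018" if the trend held
```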
The march to Exascale computing
- Case study in Supercomputer development -
[Chart: projection without tradeoff]
Key challenge: power consumption
- Projections range up to 130 MW, which would cost up to $150 million annually
- Few sites in the U.S. will be able to host the Exascale computing systems due to
limited availability of facilities with sufficient power and cooling capabilities
Design goal
- 100M cores, 20MW power, and 1Exaflops
- Average power efficiency of today’s
top 10 systems is 2.2 Petaflops/MW
- Improvement of power efficiency
by a factor of 23 is required
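The factor of 23 follows directly from the design goal's own numbers:

```python
# Design goal: 1 Exaflops (= 1000 Petaflops) within a 20 MW power envelope.
target_efficiency = 1000.0 / 20.0   # 50 Petaflops/MW required
current_efficiency = 2.2            # average of today's top-10 systems (Petaflops/MW)

factor = target_efficiency / current_efficiency
print(round(factor))                # ~23x improvement in power efficiency needed
```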
Questions to be answered
- How much would performance improvement be restricted?
- How fast have energy-efficient systems been evolving?
- How long would it take to achieve a given design goal?
(with new system trade-offs)
[Charts: performance vs. time approaching 1 Exaflops, and power vs. time against the 20 MW feasible limit, with adjusted projections from 2014]
The march to Exascale computing
- Case study in Supercomputer development -
Dataset
- TOP500 lists from 1993 to 2013
- Includes 1,199 machines from 2002 to 2013
: Number of cores ranging from 960 to 3.12 million
: Power consumption ranging from 19 kW to 17.81 MW
: Rmax ranging from 9 Teraflops to 33.86 Petaflops
- Variables
: Name (text): name of machine
: Year (year): year of installation/last major update
: Total Cores (number): number of processors
: Rmax (Gigaflops): maximal LINPACK performance achieved
: Power (Kilowatts): power consumption
: Interconnect family (text): interconnect being used
: Processor technology/family (text): processor architecture being used
- Model parameters
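A minimal sketch of the record structure behind the variables listed above; the field names are illustrative, not the TOP500 file's own headers, and the filter simply encodes the study's 2002–2013 window:

```python
from dataclasses import dataclass

@dataclass
class Machine:
    """One TOP500 entry, with the variables used in the study."""
    name: str
    year: int              # year of installation / last major update
    total_cores: int
    rmax_gflops: float     # maximal LINPACK performance achieved
    power_kw: float
    interconnect: str
    processor_family: str

def in_study_window(m: Machine) -> bool:
    # The study covers 2002-2013 machines with reported power figures.
    return 2002 <= m.year <= 2013 and m.power_kw > 0

# The largest machine in the dataset, Tianhe-2 (2013).
tianhe2 = Machine("Tianhe-2", 2013, 3_120_000, 33_862_700.0, 17_808.0,
                  "TH Express-2", "Intel Xeon/Xeon Phi")
print(in_study_window(tianhe2))  # True
```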
The march to Exascale computing
- Case study in Supercomputer development -
13 state-of-the-art (SOA) supercomputers
The march to Exascale computing
- Case study in Supercomputer development -
Performance trajectory using DEA scores
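The paper's TFDEA model handles multiple inputs and outputs and is solved as a linear program; the sketch below shows only the degenerate single-input/single-output case, where the output-oriented constant-returns-to-scale score reduces to a productivity ratio, using made-up numbers rather than the study's data:

```python
def output_oriented_crs_score(power_mw, rmax_pflops, k):
    """Output-oriented CRS DEA score of unit k in the special
    single-input/single-output case.  A score of 1.0 means the unit
    is on the frontier (state of the art); a score of 2.0 means the
    frontier could deliver twice its output from the same power budget."""
    best_productivity = max(y / x for x, y in zip(power_mw, rmax_pflops))
    return best_productivity * power_mw[k] / rmax_pflops[k]

# Three hypothetical machines: (power in MW, Rmax in Pflops).
power = [1.0, 2.0, 4.0]
rmax = [2.0, 4.0, 4.0]
print([output_oriented_crs_score(power, rmax, k) for k in range(3)])
# [1.0, 1.0, 2.0] -- the first two define the frontier, the third lags it
```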
The march to Exascale computing
- Case study in Supercomputer development -
Model validation using a rolling-origin hold-out sample test
- Provides a measure of accuracy in both the near term and the far term without being
affected by occurrences unique to a certain fixed origin
- Deviation statistics provide confidence intervals for the forecasting results
- Benchmark methods
: Planar – regression-based model using a fixed tradeoff
: Random walk – simply assumes that new technology will be as good as today’s
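A minimal sketch of the rolling-origin scheme paired with the random-walk benchmark, on a synthetic series rather than the study's data:

```python
def rolling_origin_errors(series, horizon, min_train=3):
    """Rolling-origin hold-out test with a random-walk baseline:
    at each origin t, 'forecast' series[t + horizon] as series[t]
    (new technology assumed only as good as today's) and record the
    deviation.  The spread of these deviations across origins is what
    yields the confidence intervals quoted with each forecast."""
    errors = []
    for t in range(min_train - 1, len(series) - horizon):
        forecast = series[t]              # random-walk forecast
        actual = series[t + horizon]
        errors.append(actual - forecast)
    return errors

# Synthetic doubling series: the random walk underestimates at every origin.
print(rolling_origin_errors([1, 2, 4, 8, 16], horizon=1))  # [4, 8]
```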
The march to Exascale computing
- Case study in Supercomputer development -
Hybrid systems
- Exascale performance is forecasted to be achieved earliest in 2021.13
- A high individualized RoC of 2.22%, with the best current level of performance
represented by Tianhe-2
- One could expect the arrival of a hybrid Exascale system within the 2020
timeframe considering the possible deviations (-1.32)
- Business environment
: Improvement for hybrid systems has come mostly from a combination of advances in
Cray systems, such as their transverse cooling system, Cray interconnects,
AMD processors and NVidia coprocessors
: Intel purchased the Cray interconnect division and is expected to design the next
generation Cray interconnect optimized for Intel processors and Xeon Phi coprocessors
: The Cray/Intel collaboration may result in an RoC greater than 2.22% and might reach
the Exascale goal earlier
“GPU/Accelerator based systems will be more popular
in TOP500 list for their outstanding energy efficiency,
which may spur the Exascale development” (Simon, 2013)
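The arithmetic connecting an annual rate of change (RoC) to an arrival year can be sketched as follows. The gap score below is assumed purely for illustration and is not a number from the paper; it is chosen so that a 2.22%/yr RoC, counted from a 2013 frontier, lands near the reported 2021 range:

```python
import math

def years_to_target(gap_score, roc):
    """Years until the frontier reaches a target whose DEA score sits a
    factor of `gap_score` beyond today's frontier, at an annual rate of
    change `roc`.  Solves (1 + roc)**years == gap_score."""
    return math.log(gap_score) / math.log(1.0 + roc)

# Hypothetical gap of 1.19 at the hybrid systems' 2.22%/yr RoC:
print(round(years_to_target(1.19, 0.0222), 1))  # 7.9 years, i.e. ~2021 from 2013
```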
The march to Exascale computing
- Case study in Supercomputer development -
Multicore systems
- Exascale performance is forecasted far beyond 2020: 2031.74
(The planar model also estimated the arrival of a multicore-based Exascale system far
beyond the 2020 timeframe)
- A slow individualized RoC of 1.19%, with underperforming current systems
- Innovative engineering efforts are required for multicore based architecture
to be scaled up to the Exaflop performance
- Business environment
: IBM’s cancellation of the Blue Waters contract and its recent move toward becoming a
design house raise questions about the prospects of multicore-based HPCs
: RIKEN has embarked on a project to develop an Exascale system, continuing the
preceding success of the K computer
“Multicore with complex cores (x86, SPARC, Power7)
may be nearing the end of the line” (Simon, 2013)
“The innovative technology that IBM ultimately developed
was more complex and required significantly increased
financial and technical support by IBM beyond its
original expectations” (NCSA, 2011)
The march to Exascale computing
- Case study in Supercomputer development -
Manycore systems
- The first manycore Exascale system is expected to reach the target by 2022.28
- Even allowing the maximum deviation (-1.49), the arrival of manycore Exascale
systems within the 2020 timeframe seems dubious
- Despite a fast individualized RoC of 2.34%, manycore systems may not
overcome the current performance gap with hybrid systems in the Exascale race
- Progress has been led mostly by IBM’s Blue Gene architecture
- Business environment
: IBM’s stable business environment may be more effective moving forward while
Intel/Cray work out their new relationship
: The Exascale system will likely be built by Cray or IBM, especially after Cray purchased Appro in 2012
“Utilizing the available peak of a GPU is a difficult
challenge. The Blue Gene, however, is closer to
traditional designs, so realizing performance on these
platforms presents fewer programming challenges, as
long as the algorithms themselves scale.”
(Lazou, 2010)
GPU-free, risk-free
The march to Exascale computing
Exascale computing
- Conclusion -
- Will enable transformations that touch many disciplines
(Molecular modeling, genomics research, climate simulation, astrophysical recreation, etc.)
- Requires improvement of power efficiency by a factor of 23
- Past steady pace solely driven by speed may have to be adjusted
Forecasting
- Current development target of 2020 might entail technical risks
- Either a Cray-built hybrid system or an IBM-built Blue Gene system will likely
achieve the Exascale goal between early 2021 and late 2022
- Manycore systems might accomplish the Exascale goal earlier given their faster RoC
Matters for future work
- External factors that can stimulate/constrain
the technological progress
: Innovation in interconnect and/or synchronization
- Advancement of small cores (ARM) based systems
: Mont-Blanc project / NVidia project
Editor's Notes
2.780 times
2.751 times
2.783 times
Hybrid systems suggest smaller cluster solutions for next-generation HPC with promising performance potential
However, the Blue Gene architecture demonstrates an alternate direction: massively parallel quantities of independently operating cores with fewer programming challenges