Unraveling Earthquake Dynamics Through Extreme-Scale Multi-Physics Simulations
ALICE GABRIEL (LUDWIG MAXIMILIAN UNIVERSITY OF MUNICH, GERMANY)
Earthquakes are highly non-linear multiscale problems, encapsulating the geometry and rheology of faults within the Earth’s crust torn apart by propagating shear fracture and emanating seismic wave radiation.
This talk will focus on using physics-based scenarios, modern numerical methods and hardware specific optimizations to shed light on the dynamics, and severity, of earthquake behaviour. It will present the largest-scale dynamic earthquake rupture simulation to date, which models the 2004 Sumatra-Andaman event - an unexpected subduction zone earthquake which generated a rupture of over 1,500 km in length within the ocean floor followed by a series of devastating tsunamis.
The core components of the simulation software will be described, highlighting the benefits of strong collaborations between domain and computational scientists. Lastly, future directions in coupling the short-term elastodynamics phenomena to long-term tectonics and tsunami generation will be discussed.
https://pasc18.pasc-conference.org/program/keynote-presentations/
1. Unraveling earthquake dynamics
through extreme-scale multi-physics simulations
Alice-Agnes Gabriel
Computational model for a large-scale scenario of the 2004 Mw9.1 Sumatra-Andaman earthquake (Uphoff et al., SC2017)
2. Acknowledgements
The LMU earthquake physics team: Betsy Madden, Kenneth Duru, Stephanie Wollherr, Thomas Ulrich
Former collaborators: Christian Pelties (now MunichRe), Alexander Breuer (now SDSC), Alexander Heinecke (now Intel)
The TUM HPC team: Carsten Uphoff, Leonhard Rannabauer
3. I. Computational wave propagation and earthquake rupture
Schematic view of on-going seismic rupture of the Parkfield segment of the San Andreas Fault, Caltech/Tim Pyle
Wave simulations of the 2009 L’Aquila earthquake using SeisSol,
Igel 2017, Wenk et al., 2009
4. Computational wave propagation
• Seismology is data-rich and can often be treated as a linear system
• Computational seismology has been a pioneering field for, and has itself been driven by, HPC
• Key activities: calculation of synthetic seismograms in the 3D Earth and solving seismic inverse problems
• Key achievements: imaging Earth’s interior, understanding the dynamics of the mantle, tracking down energy resources
• Common approach: time-domain solutions of the space-dependent seismic wavefield, solved via domain decomposition
On May 5th, the NASA “InSight” lander set off to investigate the internal structure of Mars, carrying a seismometer. Forward simulations of seismic waves travelling through Mars have been performed on “Piz Daint” in real time, solving 10 billion degrees of freedom over 300,000 time steps (Bozdag et al., 2017 & PASC presentation of Hapla et al., Wed. 11:15)
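The “common approach” above can be illustrated in one dimension: a minimal time-domain sketch of the velocity-stress wave equation on a staggered grid (illustrative only; SeisSol itself uses high-order ADER-DG on tetrahedral meshes, and all parameter values here are made up):

```python
import numpy as np

def propagate_1d(nx=401, nt=400, dx=10.0, dt=1e-3, rho=2700.0, mu=3e10):
    """Explicit time-domain update of the 1D velocity-stress wave equation:
    rho dv/dt = d(sigma)/dx  (conservation of momentum)
    d(sigma)/dt = mu dv/dx   (constitutive relationship)."""
    v = np.zeros(nx)          # particle velocity on grid points
    s = np.zeros(nx - 1)      # stress on staggered points
    v[nx // 2] = 1.0          # impulsive source in the middle
    for _ in range(nt):
        s += dt * mu * np.diff(v) / dx           # constitutive relation
        v[1:-1] += dt * np.diff(s) / (rho * dx)  # momentum update
    return v

v = propagate_1d()
# shear wave speed sqrt(mu/rho) ~ 3333 m/s: after 0.4 s the pulse has
# travelled ~1333 m (~133 grid points) away from the centre
```

The CFL number here is dt·c/dx ≈ 0.33, so the explicit update is stable; domain decomposition would simply split the grid across ranks with halo exchange of the boundary values.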
5. Computational wave propagation
• On-going challenges: computational efficiency (resolving high frequencies), meshing (irregular geometries), and the need for community solutions keep us busy
6. Earthquake rupture
• Earthquake source studies are ill-constrained and highly
non-linear
• Earthquakes are ubiquitous on Earth - potentially
disastrous, as well as an invaluable source of information
• While most of today’s knowledge of the structure of the
Earth is imaged by propagating waves, earthquake
source processes are still mysterious
W-phase CMT solutions obtained for all Mw>=6.5 earthquakes
occurring between 1990-2011 (Duputel et al., 2012)
7. Earthquake rupture
• Direct (geological/borehole/laboratory studies) or indirect
(seismic radiation) observation is difficult
➡ Theoretical difficulties
➡ Scaling problems w.r.t. laboratory experiments
➡ Naturally limited number of large, strongly radiating earthquakes (sparse data, ill-posed inversion problem)
➡ We know that earthquakes are the result of ruptures that nucleate, grow and terminate, in most cases along pre-existing faults (Gilbert, 1884)
Exemplary drilling, seismic and geodetic observational
approaches to understand earthquake physics
8. Earthquake rupture
• Earthquake prediction escapes us - we address this deficiency through understanding earthquake source physics
9. Observations
• Recent well-recorded earthquakes, as well as laboratory experiments, resolve striking variability in terms of source dynamics:
‣ super-shear propagation
‣ slip-reactivation
‣ nucleation with/without slow-slip precursors
‣ variability of rupture style (pulses vs cracks)
‣ rupture cascading and “jumping”
‣ propagation along both locked and creeping fault sections during the same earthquake
Tohoku-Oki back projection: Indicating major areas of high-
frequency radiation on the fault (Meng et al., 2012)
Source inversion model of Tohoku-Oki event (Japan) 2011, from
combined local ground motion, teleseismics, GPS & multiple
time window parametrization of slip rate. (Lee and Wang, 2011)
10. Observations
Supershear Mach cone emanating
from rupture tip in laboratory
experiment (Xia et al., 2004)
Denali EQ PS10 record: rupture in the form of two slip pulses, the first at supershear speed and the second at subshear speed (Dunham et al.)
11. Seismic hazard assessment
• Earthquake source effects are assessed by empirical (probabilistic) and (over-)simplified approaches
• Reliability depends heavily on the definition of a seismic faulting model underpinned by realistic geological and physical constraints, including fault zone rock type, state of stress, fault geometry, fault yield strength, friction and rupture laws
• Understanding earthquake source physics would shed light on stress conditions, crustal processes, and the fundamentals of friction, and lead to physics-based seismic hazard assessment (improve building codes; provide more reliable hazard maps; enable forecasting)
Empirical attenuation relation (Boore et al., 1997)
12. • Earthquakes are in many senses unique, despite occurring in potentially the same (or a nearby) location
• Which physical processes are dominant and relevant at
a given spatio-temporal scale (and in real earthquakes)?
Can we justify the (most often computational) cost of their
inclusion?
• Singular effects can be studied conceptually (2D dynamic
rupture modeling) and analytically (fracture mechanics)
• Large-scale dynamic rupture simulations aiming to
understand “in-scale” which of the aforementioned
“complexities” provides the first order influence on source
dynamics and the resulting observables for a given
geological region (tectonic setting), or fault system, or
type of fault system
1992 Landers dynamic rupture earthquake scenario, resolving 10
Hz wave propagation employing multi-petaflop performance.
(Heinecke et al., SC14)
• Large-scale dynamic rupture simulations aiming to
understand on “natural-scale” which of the
aforementioned “complexities” provides the first order
influences
Modelling challenges - The search for required minimum complexity
13. Modelling challenges - The search for required minimum complexity
➡ Requires: Integrative view of multi-scale physics of rock
fracture, dynamic rupture propagation, and emanated seismic
radiation
➡ Representation of complex 3D geometries
➡ Computationally expensive
14. II. Multi-physics dynamic rupture earthquake simulations
2016 Kaikoura, New Zealand dynamic rupture earthquake scenario, resolving the most complex rupture observed to date (Ulrich et al., 2018, under revision; PASC presentation Gabriel et al., Tue. 16:30)
15. Multi-physics earthquake simulations
“Input”: geological structure, CAD & mesh generation, initial fault stresses, failure criterion
SOLVER
“Output”: ground motion, synthetic seismograms
• Physics-based approach: solving for spontaneous dynamic earthquake rupture as the non-linear interaction of frictional failure and seismic wave propagation
16. Harris et al., SRL 2018
Multi-physics
earthquake simulations
➡ Many methods successfully solve (idealised) community benchmarks
17. • Non-planar, intersecting faults
• Non-linear friction
• Heterogeneities in stress and strength
• Dynamic damage around the fault
• Fault roughness on all scales
• Bi-material effects
• Low velocity zones surrounding faults
• Thermal pressurization of fault zone fluids
• Thermal decomposition
• Dilatancy of the fault gouge
• Flash heating, melting, lubrication
… this list grows continuously
Multi-physics
earthquake simulations
➡ Few methods support all modelling requirements
Multitude of spatio-temporal scales: fault geometry spans hundreds of km; the frictional process zone is on the m (or even cm) scale; tectonic loading (the seismic cycle) spans 10-10,000 years; rise time is on the second scale
18. SeisSol - ADER-DG
A unique modelling framework
www.seissol.org
We develop and host an open-source Arbitrary high-order
DERivative Discontinuous Galerkin (ADER-DG) software
package. SeisSol solves the seismic wave equations in
elastic, viscoelastic, and viscoplastic media on unstructured
tetrahedral meshes.
Our method, by design, permits:
• representing complex geometries - by discretising the
volume via a tetrahedral mesh
• modelling heterogenous media - elastic, viscoelastic,
viscoplastic, anisotropic
• multi-physics coupling - flux based formulation is natural
for representing physics defined on interfaces
• high accuracy - modal flux based formulation allows us to
suppress spurious (unresolved) high frequencies
• high resolution - suitable for parallel computing
environments
Representation of the shear
stress discontinuity across the
fault interface. Spontaneous
rupture = internal boundary
condition of flux term.
M. Käser and M. Dumbser, 2006; M. Dumbser and M. Käser, 2006
J. de la Puente et al., 2008; C. Pelties et al., 2014
github.com/SeisSol
Wave field of a point source
interacting with the
topography of Mount Merapi
Volcano.
PRACE ISC Award for producing the first simulations that reached the “magical” performance milestone of 1 Petaflop/s (10^15 floating point operations per second) at the Munich Supercomputing Centre.
Due to the properties of the
exact Riemann solver, solutions
on the fault remain free of
spurious oscillations
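The flux-based fault coupling above rests on the exact Riemann solver; in 1D the idea reduces to a characteristic splitting of the Jacobian, sketched below (a toy 2x2 velocity-stress system with illustrative material values, not SeisSol's implementation):

```python
import numpy as np

# 1D elastic wave equation in velocity-stress form, q = (sigma, v):
#   q_t + A q_x = 0,  A = [[0, -mu], [-1/rho, 0]]
# The upwind (Godunov) flux at an element interface follows from the exact
# solution of the Riemann problem: split A into left- and right-going parts.
rho, mu = 2700.0, 3e10
A = np.array([[0.0, -mu], [-1.0 / rho, 0.0]])

lam, R = np.linalg.eig(A)                         # eigenvalues are -c, +c
Rinv = np.linalg.inv(R)
A_plus = R @ np.diag(np.maximum(lam, 0)) @ Rinv   # right-going waves
A_minus = R @ np.diag(np.minimum(lam, 0)) @ Rinv  # left-going waves

def upwind_flux(q_left, q_right):
    """Godunov flux: take right-going information from the left state and
    left-going information from the right state."""
    return A_plus @ q_left + A_minus @ q_right

# sanity check: the characteristic splitting reassembles A
assert np.allclose(A_plus + A_minus, A)
```

Because the flux only ever uses upwinded characteristic information, a frictional jump condition imposed at a fault interface enters naturally through the same mechanism, and the solution stays free of spurious oscillations.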
19. SeisSol - ADER-DG
Numerics in a nutshell
• Elastic wave equation in velocity stress formulation: conservation of momentum plus the constitutive relationship in terms of velocity form a linear hyperbolic system
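In standard notation, the velocity-stress form referred to above reads (a reconstruction in standard notation, not transcribed from the slide):

```latex
% q = (sigma_xx, sigma_yy, sigma_zz, sigma_xy, sigma_yz, sigma_xz, u, v, w)^T
% collects the six stress components and three velocity components;
% A, B, C are the (material-dependent) flux Jacobians:
\frac{\partial q}{\partial t}
  + A \frac{\partial q}{\partial x}
  + B \frac{\partial q}{\partial y}
  + C \frac{\partial q}{\partial z} = 0
```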
20. SeisSol - ADER-DG: numerics in a nutshell (DG discrete form and DG operators)
21. SeisSol - ADER-DG
Numerics in a nutshell
• Elastic wave equation in velocity stress formulation
• ADER: high-order time integration + DG: high-order
space discretisation
• DG with orthogonal basis functions (modal)
• Exact Riemann-Solver computes the upwind flux = state
at the element interfaces
• Locality of the computations: only neighbouring
elements exchange data
➡ ADER-DG boils down to small matrix-matrix multiplications, where the dimensions of the matrices depend on the order of the scheme (~75% of runtime).
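The kernel structure can be sketched with illustrative shapes (56 basis functions correspond to order 6; the matrices and values below are random placeholders, not SeisSol's generated kernels):

```python
import numpy as np

# For order O=6 in 3D there are B = O(O+1)(O+2)/6 = 56 basis functions and
# Q = 9 elastic quantities, so each element carries a 56x9 DoF matrix.
# A single volume-kernel contribution then has the shape K_xi @ dofs @ A_star:
# a chain of small dense matrix-matrix products.
B, Q = 56, 9
rng = np.random.default_rng(0)
K_xi = rng.standard_normal((B, B))    # stiffness matrix (reference element)
A_star = rng.standard_normal((Q, Q))  # flux Jacobian in the xi-direction
dofs = rng.standard_normal((B, Q))    # modal degrees of freedom

update = K_xi @ dofs @ A_star         # (56x56)(56x9)(9x9) -> 56x9
assert update.shape == (B, Q)
```

Because the matrices are this small, generic BLAS is inefficient - which motivates the code-generation approach described later.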
22. SeisSol
Optimisation on all software levels
“Geophysics” version:
• Fortran 90
• MPI parallelised
• ASCII-based, serial I/O
Landers scenario (96 billion DoF, 200,000 time steps):
• MPI+OpenMP parallelisation
• Parallel I/O (HDF5, incl. mesh initialisation)
• Assembler-level DG kernels
• Multi-physics off-load scheme for many-core architectures
Sumatra scenario (111 billion DoF, 3,300,000 time steps):
• Cluster-based local time stepping
• Code generator also for advanced PDEs such as viscoelastic attenuation
• ASAGI (XDMF) geoinformation server
• Asynchronous input/output
• Overlapping computation and communication
➡ Goal: end-to-end optimisation of operational geophysics software
Breuer et al., ISC14; Heinecke et al., SC14; Breuer et al., IEEE16; Heinecke et al., SC16; Rettenberger et al., EASC16; Uphoff & Bader, HPCS’16; Uphoff et al., SC17
23. SeisSol
Optimisation on all software levels
• > 1 PFlop/s performance
• 90% parallel efficiency
• 45% of peak performance
• 5x-10x faster time-to-solution
• 10x-100x bigger problems
1992 Landers dynamic rupture
earthquake scenario (Heinecke et al.,
Gordon Bell Prize Finalist Paper at
SC14)
24. SeisSol
Optimisation on all software levels
Partial kernel before (top) and after (bottom) removing irrelevant entries in matrix chain products
➡ A code generator automatically detects and exploits sparse block patterns
➡ Hardware-specific full “unrolling” and vectorization of all element operations
➡ Customised code for each matrix-matrix multiplication via the libxsmm back-end
➡ Efficiently exploits the hardware available as of 2014 (AVX, MIC), reaching up to 8.6 PFLOPS on Tianhe-2
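A toy version of this idea - emitting fully unrolled code only for structurally non-zero entries - might look like the following (the `generate_unrolled_gemm` helper is hypothetical; the real generator emits libxsmm-backed small GEMM kernels):

```python
import numpy as np

def generate_unrolled_gemm(A, name="kernel"):
    """Emit fully unrolled C code for y += A @ x, skipping entries that are
    structurally zero -- a toy sketch of sparsity-aware code generation."""
    lines = [f"void {name}(const double *x, double *y) {{"]
    for i, row in enumerate(A):
        for j, a in enumerate(row):
            if a != 0.0:                      # exploit the sparse pattern
                lines.append(f"  y[{i}] += {a} * x[{j}];")
    lines.append("}")
    return "\n".join(lines)

A = np.array([[2.0, 0.0], [0.0, 3.0]])        # block-diagonal toy matrix
code = generate_unrolled_gemm(A)
# only the two structurally non-zero entries generate statements
assert code.count("+=") == 2
```

Unrolling like this removes loop overhead and lets the compiler (or a hand-tuned back-end such as libxsmm) vectorize each element operation.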
25. III. The 2004 Sumatra megathrust earthquake
A geophysics and HPC challenge …
Illustration of the subduction zone hosting the 26 December 2004 Mw 9.1-9.3 Sumatra-Andaman megathrust earthquake
• Huge event that triggered a devastating tsunami claiming > 230,000 lives in 14 countries; there was no early warning
III. The 2004 Sumatra megathrust earthquake
… which is about the people
The tsunami hits Thailand (wikipedia.com) A village near the coast of Sumatra lays in ruin (US Navy)
27. III. The 2004 Sumatra megathrust earthquake
… which is about the people
➡ Earthquakes and tsunamis are not predictable hazards.
Challenge is in the physics: we do not know how earthquakes begin,
grow and sometimes arrest; we do not know when a large earthquake
triggers a tsunami; we do not know how subduction zones ‘operate’.
Rupture extent on the order of 1000 km; Shearer and Bürgmann (2010), Fig. 1
• An unexpected, very large earthquake (old oceanic crust, slow convergence rates)
• Rupturing faults over 1,300 to 1,500 km, at a consistently slow rupture velocity (2 to 3 km/s), with a long duration of 8 to 10 minutes
• Complex, non-planar fault intersections at shallow angles - CAD and mesh generation is a bottleneck
• Small “pop-up” fractures splaying off the megathrust may be crucial for tsunami generation
A geophysics challenge
Tectonic plates involved in the Sumatra-Andaman
earthquake. Complex 3D geometry of the Sumatra
subduction zone model. The curved megathrust is
intersecting bathymetry, as are the 3 adjacent splay
faults: one forethrust and two backthrusts. The
subsurface consists of horizontally layered continental
crust and subducting layers of oceanic crust. Each layer
is characterized by a different wave speed and thus
requires a different mesh resolution.
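The quoted rupture parameters are mutually consistent - a quick arithmetic check with the slide's values:

```python
# 1,300-1,500 km of rupture at 2-3 km/s brackets the quoted 8-10 minute duration
fastest_min = 1300 / 3.0 / 60   # shortest plausible duration, ~7.2 min
slowest_min = 1500 / 2.0 / 60   # longest plausible duration, ~12.5 min
assert fastest_min < 8 and 10 < slowest_min
```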
29. An HPC challenge
• Spatial resolution (400 m on-fault, O6) and 2.2 Hz wave propagation required a mesh with 220 million finite elements (~111 × 10^9 DoF).
• Incorporation of high-resolution geodata requires
geoinformation server for fast loading and generalised
initialisation for large 3D datasets, parallel meshing and
file formats
• The unique capability of incorporating realistic geometries causes highly varying element sizes, due to static adaptivity and the intersection of faults with the sea floor or material layers.
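The quoted DoF count can be checked against the element count (assuming the standard modal basis size for an order-6 scheme and 9 elastic quantities):

```python
# An order-6 (O6) scheme in 3D has B = O(O+1)(O+2)/6 = 56 modal basis
# functions per element; the elastic wave equation carries 9 quantities
# (6 stress components + 3 velocity components).
O = 6
basis_functions = O * (O + 1) * (O + 2) // 6   # 56
dof = 220e6 * basis_functions * 9              # ~110.9e9, i.e. ~111 x 10^9
assert basis_functions == 56
assert abs(dof - 111e9) / 111e9 < 0.01
```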
30. An HPC challenge
• Local time-stepping: Each element may have its own
time-step, limited by a CFL condition. Theoretical speed-
up of 14.3 with (perfect) LTS.
• Problem: Irregular update scheme not well-suited for
modern hardware. Idea: Partition elements into time
clusters.
⇒ Speed-up of 9.9x due to clustering (vs. 14.3 with per-cell LTS) - an LTS scheme with petascale performance
➡ Sacrificing part of the theoretical speed-up in favour of hardware-oriented data structures and efficient load balancing
➡ Only 4% of elements hold dynamic rupture faces, but these are crucial for optimisation by local time stepping and for relaxing mesh generation
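The clustering idea can be sketched numerically (a toy rate-2 clustering; the function and values are illustrative, not SeisSol's implementation):

```python
import numpy as np

def cluster_timesteps(dt_elements, rate=2):
    """Assign each element to time cluster k, so that it is updated with
    interval dt_min * rate**k <= dt_element (rate-2 clustered LTS idea)."""
    dt_min = dt_elements.min()
    # largest k with dt_min * rate**k <= dt; tiny epsilon guards fp rounding
    k = np.log(dt_elements / dt_min) / np.log(rate)
    return np.floor(k + 1e-9).astype(int)

dts = np.array([1.0, 1.9, 2.0, 7.9, 8.0])  # per-element CFL time steps
clusters = cluster_timesteps(dts)
# dt in [1,2) -> cluster 0, [2,4) -> cluster 1, [4,8) -> 2, [8,16) -> 3
assert clusters.tolist() == [0, 0, 1, 2, 3]
```

Elements in the same cluster share one regular update schedule, which restores the dense, hardware-friendly loop structure that per-cell LTS destroys.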
31. An HPC challenge
Note: on KNL we measured 467 TFLOPS, a speed-up of 1.28 compared to 512 nodes of Shaheen (86,016 cores).
32. SeisSol
Optimisation on all software levels
• Optimized for Intel KNL
• Speed-up of 14x: 14 hours, compared to almost 8 days for the Sumatra scenario on SuperMUC Phase 2
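The quoted speed-up follows directly from the wall-clock times:

```python
# "almost 8 days" vs 14 hours on KNL: 8 * 24 / 14 ~ 13.7, consistent with ~14x
speedup = 8 * 24 / 14
assert 13 < speedup < 14.5
```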
33. Sumatra megathrust
& splay faults scenario
Best Paper Supercomputing Conference SC17
C. Uphoff, S. Rettenberger, M. Bader,
B. Madden, T. Ulrich, S. Wollherr, A.-A. Gabriel
➡Largest, longest scale dynamic rupture simulation performed so far
34. Sumatra megathrust
& splay faults scenario
• Replicating first-order observations (slip, ground deformation) as well as producing unexpected features:
• The backthrust splay fault breaks delayed and in reversed direction, contributing considerably to vertical uplift
• Slow rupture speed and slip pulses: due to subducting layers of lower wave speeds surrounding the fault (a megathrust low-velocity zone)
Synthetic horizontal (left) and vertical (right) sea-floor displacement. Arrows depict
comparison to observations from geodetic and tsunami data summarized in Bletery et
al., 2016.
35. • “More” multi-physics based on new matrix-based code
generator: viscoelastic attenuation, off-fault plasticity
Sumatra megathrust
& splay faults scenario
36. Sumatra megathrust & splay faults scenario
• Generated high-resolution sea-floor displacement as
initial condition for tsunami models based on ASAGI
• Coupling with geodynamic thermo-mechanical models to
provide constraints on fault rheology and the state of
stress
37. Outlook
Beyond scenario based simulations
• Uncertainty quantification
• Dynamic source inversion
• Adjoint calculations
• Urgent computing: real time scenarios
• Ensemble simulations
2D spacetree in ExaHyPE (Weinzierl et al., 2014)
Rupture Complexities of Fluid
Induced Microseismic Events
at the Basel EGS Project
(Folesky et al., 2016)
Highly non-unique kinematic slip models for the
1999 Izmit earthquake (Ide et al., 2005)
➡ Exascale systems
38. Instead of Conclusions:
A reproducibility challenge!
• A setup including a mesh with over 3 million elements for
the 2004 Sumatra-Andaman earthquake can be obtained
from Zenodo https://dx.doi.org/10.5281/zenodo.439946.
$ git clone --recursive https://github.com/SeisSol/SeisSol
$ cd SeisSol
$ git checkout 201703
$ git submodule update
$ scons order=6 compileMode=release generatedKernels=yes arch=dhsw parallelization=hybrid commThread=yes netcdf=yes
$ export OMP_NUM_THREADS=<threads >
$ mpiexec -n <processes > ./SeisSol parameters.par
# SuperMUC Phase 2
$ export OMP_NUM_THREADS=54
$ export KMP_AFFINITY=compact,granularity=thread
40. SeisSol
Features and Scales
• Viscoelastic attenuation
• Kinematic sources
• Modern friction laws
• Off-fault plasticity
• Fault roughness
• Thermal pressurisation (2D)
• Fast loading of 3D datasets with
ASAGI
• Adjoint (2D)
• Checkpointing
• Parallel I/O
• Initial parametrization with EASI
• Full local time stepping
• Tested meshing workflow
up to 925 million elements
• Tools for pre- and post-processing
• Overnight builds / code testing (using Travis, Jenkins, …)
Sumatra: 14 million element mesh, 400 s: ~2 h on 300 nodes
Landers: 10 million elements, 100 s, 200 m fault resolution, 500 m topography resolution, 3D velocity model: ~1 h on 100 nodes (with plasticity: 6.2% increase)
Kaikoura: 29 million elements, 90 s: 2 hours on 3000 Sandy Bridge cores
2D SeisSol: laptop
Editor's Notes
- Thank the organisers for giving me, a seismologist interested in earthquakes, the opportunity to present our approach to shedding light on what happens when rock masses slide upon each other - tearing apart Earth’s crust by propagating shear fracture and emanating seismic wave radiation.
highlighting the benefits of strong collaborations between domain and computational scientists
Why have Earth and Mars developed so differently although their original structure and chemical composition seem so similar? How large, thick and dense are the core, mantle and crust? What is their structure? The scientists are hoping to gain fundamental insights into the general formation of rocky planets such as Mars, Earth, Mercury and Venus.
In distinction to the wave propagation problem, which is “considered to be solved”, predictions escape us - we choose to address this deficiency through understanding the source physics.
Inaccessibility of in-situ observations from several kilometers deep in the seismogenic zone (restriction to surface observation)
Complexity of natural geological settings (poor knowledge of small-scale features limits useable frequency band)
Multiple factors affecting recorded ground motion (contamination of source effects)
Naturally limited amount of large, strongly radiating earthquakes (sparse data, ill-posed inversion problem)
Scaling problems w.r.t. laboratory experiments
Earthquake source effects are routinely assessed by empirical (probabilistic) and (over-) simplified approaches
Understanding earthquake source physics (nucleation, dynamics) sheds light on stress conditions, crustal processes, fundamentals of friction
Physics-based seismic hazard assessment (improve building codes; provide reliable hazard maps; enable forecasting)
Isolated effects can be studied conceptually (2D dynamic rupture) and analytically (fracture mechanics)
the only software that allows for rapid setup of models with realistic non- planar fault systems while exploiting the accuracy of a high-order numerical method.
→ hardware specific full “unrolling” and vectorization of all element operations
→ small size matrix chain products
SeisSol utilises hardware specific, customised code for each matrix-matrix multiplication via the libxsmm back-end. This approach effectively exploits the available hardware (AVX, MIC), reaching unto 8.6 PFLOPS on Tianhe-2.
the code generator automatically detects and exploits sparse block patterns within the matrices for hardware specific full “unrolling” and vectorization of all element operations
an improved code generator facilitates the implementation of advanced PDE models, such as viscoelastic attenuation, which accounts for frequency dependent damping of seismic wave propagation.
Extreme multi-scale problem in both space and time
Each layer is characterized by a different wave speed and thus requires a different mesh resolution.
led to long-period, abrupt vertical displacements of the seafloor, and thus to an increased tsunami risk. At present, this capability of incorporating such realistic geometries into physical earthquake models is unique worldwide.”
-sacrificing part of theoretical speed up in favour of hardware oriented data structures and efficient load-balancing
- reformulating the numerical scheme in terms of matrix-chain products
I/O is practically free, since we reserve one core for communication anyway
2nd plot: Note that peak performance implies nothing about time-to-solution!!!
Production run not on KNL
Clustered local time-stepping for dynamic rupture
⇒ Speed-up of 6.8
Asynchronous I/O for writing 13 TB of data for checkpoints and 2.8 TB of data
for visualisation and post-processing