Unraveling Earthquake Dynamics Through Extreme-Scale Multi-Physics Simulations
ALICE GABRIEL (LUDWIG MAXIMILIAN UNIVERSITY OF MUNICH, GERMANY)
Earthquakes are highly non-linear multiscale problems, encapsulating geometry and rheology of faults within the Earth’s crust torn apart by propagating shear fracture and emanating seismic wave radiation.
This talk will focus on using physics-based scenarios, modern numerical methods and hardware specific optimizations to shed light on the dynamics, and severity, of earthquake behaviour. It will present the largest-scale dynamic earthquake rupture simulation to date, which models the 2004 Sumatra-Andaman event - an unexpected subduction zone earthquake which generated a rupture of over 1,500 km in length within the ocean floor followed by a series of devastating tsunamis.
The core components of the simulation software will be described, highlighting the benefits of strong collaborations between domain and computational scientists. Lastly, future directions in coupling the short-term elastodynamics phenomena to long-term tectonics and tsunami generation will be discussed.
https://pasc18.pasc-conference.org/program/keynote-presentations/
A study on severe geomagnetic storms and earth’s magnetic field H variations,... (IJERA Editor)
For our study we selected ten severe geomagnetic storms that occurred between 1994 and 2015, with minimum Dst index values ranging from -422 nT to -17 nT. These storms are significant not only because of their extremely high magnetic activity but also because of their strong impact on the magnetosphere. We studied the relation between severe geomagnetic storms and the horizontal (H) component of the Earth’s magnetic field, and the relation between the Dst index and sunspot number. The H-component data are from the Kyoto data centre; the Dst, Ap, and Kp indices are from the OMNI data centre. We found that Dst drops to its lowest level during the storm period, that the Ap and Kp indices increase during severe storms, and that the H component likewise reaches its lowest level during storms. We also found that the geomagnetic storms induced cyclone formation within 29 days, and that increases in sunspot number preceded the storms by 5-15 days.
Recent progress of volcano deformation studies (Yosuke Aoki), 2019-10-29
The summary discusses the development of volcano geodesy, driven by new observational techniques such as GNSS and SAR and by new modeling methods. It notes that while conventional observations and simple modeling are still useful, sophisticated numerical techniques are powerful but have limitations. Deformation from phreatic eruptions is complicated. Recent unrest at Hakone Volcano may offer insights for monitoring Tatun Volcano.
Earthquake disaster prevention in Thailand, sent 13-5-2013 (Tanakrom Pangam)
The document discusses several topics related to earthquake disaster prevention, including:
1) Past major earthquakes like Kobe and Fukushima caused widespread damage due to poor preparedness, highlighting the importance of prevention measures.
2) Thailand is at risk of earthquakes from nearby tectonic plate boundaries and is working to implement preventative measures.
3) Numerical modeling has been used to simulate tsunamis and their risks, flooding potential, and damage in order to improve disaster response.
4) Research on earthquake impacts, building resistance, and improving structural designs can help reduce damage and losses from future seismic events.
Rupture processes of the 2012 September 5 Mw 7.6 Nicoya earthquake, constrained... (Allan Lopez)
On September 5, 2012, a Mw 7.6 earthquake ruptured beneath the Nicoya Peninsula in northwestern Costa Rica. Extensive geodetic and seismological observations from dense near-field strong motion sensors, GPS networks, and global seismic networks provide a unique opportunity to investigate the rupture process. Through a non-linear joint inversion of high-rate GPS waveforms, static GPS offsets, strong motion data, and teleseismic body waves, the authors obtained a robust rupture model. The earthquake was dominantly a pure thrust event with a maximum slip of 3.5 m located below the hypocenter, spanning about 50 km along dip and 110 km along strike. The static stress drop was approximately 3.4 MPa.
The benefit of hindsight in observational science - Retrospective seismologica... (Elizabeth Entwistle)
This document discusses a new technique called retrospective seismology, which uses seismic interferometry theory to obtain seismic recordings for times before, during, or after a seismometer was physically installed. As an example, the document constructs seismograms from two past earthquakes using data from a seismometer installed afterwards. This allows novel information to be obtained about both the earth structure and earthquake sources. The key is that ambient seismic noise recordings can be used to synthesize deterministic signals from earthquake sources. This counterintuitive finding provides a new way to retrospectively observe seismic events.
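The core idea, that cross-correlating ambient noise recorded at two stations concentrates coherent energy at the inter-station travel time, can be sketched numerically. This is a toy model with synthetic noise and an assumed 0.5 s delay, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                       # sampling rate (Hz), assumed
n = 4000
source = rng.standard_normal(n)  # a shared ambient-noise wavefield

# Two "stations" record the same noise field with a relative
# travel-time offset of 0.5 s (50 samples), plus incoherent local noise.
delay = 50
sta_a = source[:-delay] + 0.1 * rng.standard_normal(n - delay)
sta_b = source[delay:] + 0.1 * rng.standard_normal(n - delay)

# Cross-correlation concentrates the coherent energy at the
# inter-station travel time: the noise field itself acts as the source.
xcorr = np.correlate(sta_a, sta_b, mode="full")
lags = np.arange(-len(sta_b) + 1, len(sta_a))
best_lag = lags[np.argmax(xcorr)]
print(best_lag / fs)  # 0.5
```

The recovered lag is the deterministic travel-time information that makes "retrospective" seismograms possible.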
- Researchers in China noticed significant gravity changes in a region covering the south-north earthquake belt before the 2008 Wenchuan earthquake (Mw 7.9). In 2006, they suggested a major earthquake could occur near Wenchuan in 2007-2008 based on these gravity variations.
- Repeated regional gravity surveys were conducted in 1998, 2000, 2002, and 2005 using absolute and relative gravity measurements. Gravity variations at some locations near Wenchuan were significant but more research is needed to determine if they could be considered precursors.
- Limitations in the data include measurement errors, effects of hydrology and crustal movements on gravity readings, coarse station density, and long time intervals between surveys. Improved
Options and uncertainties in planetary defense: Mission planning and vehicle ... (Sérgio Sacani)
This document discusses options and uncertainties in defending Earth from potential impacts by near-Earth objects (NEOs). It focuses on a case study using the asteroid 101955 Bennu to examine the feasibility of using kinetic impactors or nuclear explosives to deflect such threats. The authors find that for large threats like Bennu, kinetic impactors require very high impact speeds and masses to be effective, and nuclear explosives may be necessary if response time is limited. They present tools to analyze required launch opportunities and payloads, and propose a modular spacecraft design called HAMMER that could function as either a kinetic impactor or nuclear carrier.
This document summarizes a student's term project on the origin of intraplate earthquakes. It includes an outline, introduction, two case studies on intraplate earthquakes in North China and the New Madrid Seismic Zone, a proposed unified model, and conclusions. The unified model suggests that intraplate earthquakes occur when localized stress buildup at geological features approximates regional tectonic stresses. The conclusions emphasize the roaming nature and lack of periodicity in intraplate seismicity.
What Do Ground Motion Prediction Equations Tell Us? (Ali Osman Öncel)
Ground motion prediction equations (GMPEs) provide simple equations to estimate ground motion levels based on magnitude, distance, site conditions, and other variables. While useful for engineering applications, GMPEs tell us little about ground motion variability near faults as they provide average motions from many recordings. Near-fault recordings of large earthquakes can provide more insight, showing variations in amplitude and polarization due to nonuniform fault slip, site effects, and fault zone effects. The dense network of stations recording the 2004 M6.0 Parkfield earthquake revealed less spatial variability for longer period ground motions compared to higher frequencies.
This document discusses variations in multiple atmospheric parameters (outgoing longwave radiation, surface latent heat flux, air temperature, relative humidity, and air pressure) prior to the 2008 Mw 8.0 Wenchuan earthquake in China. The results show significant anomalies in these parameters starting 2 weeks before the quake. Outgoing longwave radiation anomalies were first observed 13 days beforehand and covered an area of 20,000 km² along the fault zone. Surface latent heat flux showed anomalies the day before the earthquake. The variations are attributed to energy accumulation and release related to the tectonic stress buildup along the fault that caused the earthquake.
This document summarizes the seismic hazard assessment conducted for the Kathmandu valley in Nepal. It describes the procedures used, including setting scenario earthquakes, developing a ground model, and assessing characteristics of the 2015 Gorkha earthquake. Scenario earthquakes of magnitudes 7.8-8.6 were set, and a ground model was developed using over 400 drilling data points, microtremor measurements, and geological cross sections. Site response analyses were then conducted to estimate seismic ground motions and risks of liquefaction and slope failure across the valley.
1) An earthquake occurs when rocks underground break due to accumulated stress exceeding their strength, releasing seismic waves.
2) Seismic waves include body waves that travel through the earth's interior and surface waves that travel along its surface.
3) Earthquake location is determined by measuring the time delays between P and S wave arrivals at multiple seismograph stations and triangulating the epicenter.
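The S-minus-P timing step can be made concrete. Assuming illustrative crustal velocities (vp = 6.0 km/s, vs = 3.5 km/s, not values from the document), the distance to the source follows from the arrival-time difference:

```python
# Epicentral distance from the S-minus-P arrival-time delay.
# Velocities are assumed illustrative values for a generic crust.
def sp_distance(sp_delay_s, vp=6.0, vs=3.5):
    """Distance (km) implied by an S-P delay: d = dt / (1/vs - 1/vp)."""
    return sp_delay_s / (1.0 / vs - 1.0 / vp)

# A 10 s S-P delay puts the source roughly 84 km away; three such
# distances from different stations locate the epicenter at the
# intersection of the three circles (triangulation).
d = sp_distance(10.0)
print(round(d, 1))  # 84.0
```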
1) The document summarizes the steps taken to perform a seismic hazard assessment of Khyber Pakhtunkhwa (KPK) province in Pakistan. These steps include compiling an earthquake catalog from various sources, homogenizing the magnitudes, de-clustering the catalog, performing completeness analysis, defining seismic zones, and developing Gutenberg-Richter recurrence models.
2) Shallow seismic zones were defined based on clustering of shallow earthquakes in the de-clustered catalog. Deep seismic zones were also identified based on deep earthquake locations.
3) Gutenberg-Richter recurrence models were developed for each seismic zone to obtain cumulative frequency of earthquakes per year needed for probabilistic seismic hazard analysis.
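A Gutenberg-Richter recurrence fit of the kind described reduces to a straight-line fit of log10 N = a - b·M per zone. A sketch on an invented cumulative catalogue (the counts below are synthetic, not KPK data):

```python
import numpy as np

# Hypothetical cumulative annual counts N(>= M) for one seismic zone.
mags = np.array([4.0, 4.5, 5.0, 5.5, 6.0])
counts = np.array([100.0, 31.6, 10.0, 3.16, 1.0])  # events/yr

# Least-squares fit of log10 N = a - b*M (np.polyfit returns slope, intercept).
slope, a = np.polyfit(mags, np.log10(counts), 1)
b = -slope
print(a, b)  # a ~ 6.0, b ~ 1.0

# The model then gives the annual exceedance rate needed for PSHA,
# e.g. for M >= 6.5:
rate = 10 ** (a - b * 6.5)
```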
The document summarizes an experimental investigation into debris motion under tsunami-like flow conditions. Key points:
- Experiments were conducted in a tsunami wave basin to track debris motion using smart debris and image processing.
- Methods were developed and validated to non-invasively track debris movement with high accuracy.
- Preliminary results show debris motion is repeatable and dependent on debris-debris and debris-ground interactions.
- Further experiments are needed to better understand how flow conditions affect debris entrainment and impact forces on structures.
1) The study develops a new attenuation relationship for Turkey by combining strong motion data from soil and rock sites.
2) To use the soil site data, boreholes were drilled at 64 soil sites to measure soil properties and remove soil amplification effects from the records.
3) Various regression models were tested using magnitude, distance, and peak ground acceleration to establish the new attenuation relationship for Turkey.
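The regression step can be illustrated with a generic attenuation form, ln PGA = c1 + c2·M + c3·ln R. The functional form, coefficients, and data below are invented for illustration; they are not the study's actual relationship for Turkey.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic records drawn from an assumed "true" attenuation model
# plus aleatory scatter; purely illustrative values.
c_true = np.array([1.5, 1.0, -1.8])            # c1, c2, c3
M = rng.uniform(4.0, 7.5, 200)                 # magnitudes
R = rng.uniform(5.0, 150.0, 200)               # distances (km)
ln_pga = (c_true[0] + c_true[1] * M + c_true[2] * np.log(R)
          + 0.3 * rng.standard_normal(200))    # observation scatter

# Ordinary least squares for ln PGA = c1 + c2*M + c3*ln R.
X = np.column_stack([np.ones_like(M), M, np.log(R)])
coef, *_ = np.linalg.lstsq(X, ln_pga, rcond=None)
print(coef)  # fitted c1, c2, c3
```

Testing several such candidate forms against the data, as the study describes, amounts to comparing residual scatter across fits.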
Limit radius in a binary system: Cosmological and Post-Newtonian effects (Premier Publishers)
Frequently, in dynamical astronomy, the quantitative effect of the large-scale cosmological expansion on local systems is studied within the Newtonian approach. We, however, analyze the influence of cosmological expansion on binary systems (galaxies or black holes) within the post-Newtonian approximation. Furthermore, we obtain the new radius at which the acceleration due to the cosmological expansion has the same magnitude as the two-body attraction; the classical limit radius is recovered when the Schwarzschild radius approaches zero (for example, in the Solar System).
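A Newtonian-order version of the balance described (cosmological acceleration equal to the two-body attraction) can be sketched as follows. The dark-energy acceleration term Ω_Λ·H₀²·r and all numerical values are assumptions for illustration; the paper's post-Newtonian corrections are not included.

```python
# Newtonian-order estimate of the radius where the acceleration from
# the cosmological expansion balances the two-body attraction G*M/r^2.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
H0 = 2.27e-18       # Hubble constant, s^-1 (~70 km/s/Mpc), assumed
OMEGA_L = 0.7       # dark-energy density parameter, assumed
MPC = 3.086e22      # meters per megaparsec
M_SUN = 1.989e30    # kg

def limit_radius(mass_kg):
    """r solving G*M/r^2 = Omega_L*H0^2*r, i.e. r = (G*M/(Omega_L*H0^2))**(1/3)."""
    return (G * mass_kg / (OMEGA_L * H0 ** 2)) ** (1.0 / 3.0)

# For a Milky Way-scale mass (~1e12 solar masses) this gives ~1 Mpc,
# the scale beyond which expansion dominates the pair's attraction.
r = limit_radius(1e12 * M_SUN)
print(r / MPC)
```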
This document discusses how satellite observations over the past 50 years have revolutionized the field of earth sciences. It describes how early satellite missions taught scientists not only about the earth but how to improve satellite technology. Precise measurements from satellites have enabled major advances in understanding plate tectonics, topography, seismology and more. The ubiquity of GPS has provided vital data on phenomena like sea level change, earthquakes and volcanoes. Open data policies have maximized the benefits of earth observations.
This document provides an overview of deterministic seismic ground motions and the PEER NGA ground motion prediction equations (GMPEs). It discusses what GMPEs are and how they are used. The document outlines the development of GMPEs, including the data and functions used as well as comparisons of median predictions and uncertainties. It also reviews the PEER NGA-West2 GMPEs and compares predictions to Greek data. Key variables in GMPEs like magnitude, distance, site effects and response spectra are defined.
Before 1980 the number of seismic stations in Turkey was probably fewer than 50, so earthquake-statistics studies across Turkey had to be done by dividing the country into very large areas. In the places I indicate with arrows, the magnitude range is far too limited. In this study a 4x4 gridding was used, dividing the region into 400 km x 400 km cells. When there is no data you are forced to enlarge the area... the result is therefore a macro-statistical analysis. I have never used the a/b ratio in my own work, but it strikes me as a useful parameter: it gives the largest earthquake magnitude expected within one year. Accordingly, this study finds the largest earthquake expected within a year to be M=5, in an area somewhere between 39 E and 41 B... probably around the Karlıova Triple Junction.
1. A seismic hazard assessment was conducted for Khyber Pakhtunkhwa (KPK) province in Pakistan using Cornell's probabilistic seismic hazard assessment (PSHA) methodology.
2. Earthquake data was compiled from various sources and homogenized to generate a composite earthquake catalogue for the region dating back to 1500 AD.
3. The catalogue was processed which included declustering, completeness analysis, and developing Gutenberg-Richter recurrence models for seismic zones.
4. Hazard curves were computed for KPK using the CRISIS 2007 software and PSHA methodology.
This document presents a preliminary seismic microzonation map of Sivas city in Turkey based on microtremor measurements. The researchers conducted microtremor measurements at 114 sites across the city to determine the dominant periods of vibration in the sediments. They divided the city into four zones based on variations in dominant periods, which likely correspond to different levels of seismic hazard. Refraction microtremor measurements along two profiles validated the microzonation map, but further studies are needed to fully characterize seismic hazards in the area.
The response of the interplanetary medium to the geomagnetic storm of April 20... (Alexander Decker)
This document examines the behavior of the interplanetary medium during the geomagnetic storm that occurred between April 5-7, 2010. It analyzes data on the southward component of the Interplanetary Magnetic Field (Bz), the Disturbance Storm Time (Dst) index, solar wind speed, and the H and Z components of the Earth's magnetic field recorded at equatorial and polar stations. The storm had a sudden commencement phase when a strong solar wind compressed the magnetosphere, a main phase when Dst reached a minimum of -73 nT, and a recovery phase as Bz and the solar wind speed decreased. There was a sharp decrease in the H component across all latitudes during the storm in response to changes
This document discusses the development of a uniform, testable, global seismic hazard model. It proposes using a hybrid approach combining smoothed seismicity and tectonic strain rate data. This would provide spatially consistent seismic hazard values at a high resolution of 0.1° x 0.1°. The model would integrate global datasets on earthquake activity rates, fault sources, magnitude scaling, depth distributions, and ground motion prediction equations. The goal is to produce a fully transparent and reproducible global seismic hazard map and model that can then be refined at regional scales using more detailed local data and information.
A basic introduction to available geophysical test methods for the use of Geotechnical engineers presented at the USACE Infrastructure Conference in Atlanta, June 2011.
This document discusses kinetic energy of rainfall and its relationship to soil erosion. It provides background on kinetic energy and defines it as the energy from an object's motion. Raindrop kinetic energy depends on velocity and mass/size, and impacts soil detachment and erosion. The Universal Soil Loss Equation uses a rainfall erosivity index (R-factor) that incorporates kinetic energy and intensity. Various methods are discussed for measuring raindrop size distribution, velocity, kinetic energy, and developing relationships between kinetic energy and intensity for estimating soil loss. A piezoelectric force transducer is proposed as a simple, inexpensive method for routine kinetic energy measurements in soil erosion studies.
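The basic quantity involved, the kinetic energy of a single falling drop, is a short calculation; the drop size and fall velocity below are assumed illustrative values, not measurements from the document.

```python
import math

# Kinetic energy of one raindrop, KE = (1/2) * m * v^2, treating the
# drop as a sphere of water.
RHO_WATER = 1000.0  # kg/m^3

def drop_kinetic_energy(diameter_m, velocity_ms):
    """KE (J) of a spherical water drop of the given diameter and velocity."""
    mass = RHO_WATER * (math.pi / 6.0) * diameter_m ** 3
    return 0.5 * mass * velocity_ms ** 2

# A 3 mm drop near an assumed terminal velocity of ~8 m/s:
ke = drop_kinetic_energy(3e-3, 8.0)  # ~4.5e-4 J
```

Summing such per-drop energies over a measured drop-size distribution is what links kinetic energy to rainfall intensity in erosivity indices like the USLE R-factor.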
Earthquake and its predictions, by Engr. Ghulam Yasin Taunsvi (Shan Khan)
Earthquakes occur where tectonic plates meet, along fractures called faults.
California lies on one of the most active faults in the world, the San Andreas Fault.
Methods for predicting earthquakes on these faults vary, and none of them is 100% accurate.
Predictions are generally given for a time frame rather than an exact date.
The document discusses methods for predicting earthquakes, which scientists have tried with varying degrees of success. It outlines several contemporary prediction methods, such as observing unusual animal behavior, changes in water levels and radon emissions, and analyzing seismic electric signals. However, the document concludes that scientists have not achieved 100% accurate predictions yet, though prediction capabilities have improved over time as more data is collected and patterns analyzed.
This document provides information about earthquakes, including what causes them, the different types of seismic waves, how earthquakes are located, determined their magnitude, and the hazards they can cause. It defines key terms like focus, epicenter, Richter scale, intensity scale and explains the processes of triangulation of seismic waves to locate the epicenter of an earthquake. Diagrams are included to illustrate seismic wave propagation and tsunami movement. Web resources for further information on earthquakes are also listed.
A basic introduction to available geophysical test methods for the use of Geotechnical engineers presented at the USACE Infrastructure Conference in Atlanta, June 2011.
This document discusses kinetic energy of rainfall and its relationship to soil erosion. It provides background on kinetic energy and defines it as the energy from an object's motion. Raindrop kinetic energy depends on velocity and mass/size, and impacts soil detachment and erosion. The Universal Soil Loss Equation uses a rainfall erosivity index (R-factor) that incorporates kinetic energy and intensity. Various methods are discussed for measuring raindrop size distribution, velocity, kinetic energy, and developing relationships between kinetic energy and intensity for estimating soil loss. A piezoelectric force transducer is proposed as a simple, inexpensive method for routine kinetic energy measurements in soil erosion studies.
Earthquake and its predictions. by engr. ghulam yasin taunsviShan Khan
Earthquakes occur where tectonic plates meet, called faults.
California lies on one of the most active faults in the world, the San Andreas Fault.
Methods for predicting earthquakes on these faults vary, none of them being 100% accurate.
Predictions are generally given for a time frame instead of an exact date
The document discusses methods for predicting earthquakes, which scientists have tried with varying degrees of success. It outlines several contemporary prediction methods, such as observing unusual animal behavior, changes in water levels and radon emissions, and analyzing seismic electric signals. However, the document concludes that scientists have not achieved 100% accurate predictions yet, though prediction capabilities have improved over time as more data is collected and patterns analyzed.
This document provides information about earthquakes, including what causes them, the different types of seismic waves, how earthquakes are located, determined their magnitude, and the hazards they can cause. It defines key terms like focus, epicenter, Richter scale, intensity scale and explains the processes of triangulation of seismic waves to locate the epicenter of an earthquake. Diagrams are included to illustrate seismic wave propagation and tsunami movement. Web resources for further information on earthquakes are also listed.
1) Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic energy.
2) Earthquake magnitude is measured using several scales based on the amplitude and period of seismic waves, with the most commonly used being the Moment Magnitude scale.
3) Earthquake geography and hazards assessment involves locating faults, estimating recurrence, measuring crustal deformation, assessing shaking intensity, liquefaction potential, tsunami risk and more to determine earthquake probabilities in a region.
Introduction to Structural Geology-Patrice intro to_sgRoland Guillen
Charles Darwin developed an approach to scientific investigation based on building a model from facts and using that model to make testable predictions that could be verified through new data or experiments. If predictions were not verified, the model could be modified iteratively to strengthen it. This iterative process of building, testing, and modifying models leads to robust explanations for observations. Structural geology follows scientific methods to understand deformation structures based on field observations, data analysis, model building and testing. The discipline aims to characterize deformation features, kinematics, and driving forces at various scales from microns to kilometers.
This document discusses characteristics of ground motions from earthquakes. It covers several topics: direct information that can be obtained from an accelerogram; wave types involved in strong ground motion; factors that affect strong ground motion; empirical and theoretical prediction of ground motions; probabilistic seismic hazard analysis; and uncertainty in ground motion prediction. The document serves as an introduction to key concepts and literature on analyzing ground motion characteristics rather than providing detailed information.
This document summarizes a study that performed broadband frequency simulations of strong ground motion in the Sea of Marmara region of Turkey based on fault rupture scenarios. Three earthquake scenarios were modeled involving rupture of the Central Marmara Fault and North Boundary Fault, which pose the largest hazard to Istanbul. A hybrid technique was used that combines deterministic and semi-stochastic methods. The location of the hypocenter was found to be a critical parameter for predicting ground motions in Istanbul. Anelasticity was also found to significantly affect regional attenuation of peak ground accelerations. The simulated ground motions resulted in large acceleration response spectra at long periods that could be critical for building damage in Istanbul during an actual earthquake.
The October 2004 Mw=7.1 Nicaragua earthquake: Rupture process, aftershock loc...Gus Alex Reyes
The subduction zone off the Nicaragua
coastline has been the site of several large
earthquakes in the past decades, including
the 1992 tsunami earthquake that was
anomalous in the size of the tsunami relative
to moment release [Kanamori and
Kikuchi, 1993]. As a focus site for both
the MARGINS-SEIZE and SubFac initiatives,
it is an area of keen interest for
scientists interested in earthquake rupture
and volcanic processes.
1) The document discusses different statistical distributions that can be used to model earthquake magnitude, including the pure power law, truncated power law, exponential taper models, and extreme value theory distributions.
2) It reviews the history of statistical modeling of earthquake magnitudes dating back to the 1930s and highlights different approaches researchers have taken over time to characterize the tail of the magnitude distribution.
3) The author expresses a wish to develop a model that recovers the power law in the body, ensures finite first moment, allows for a "soft" cutoff, integrates hierarchical data sets, and fully characterizes uncertainties through Bayesian Monte Carlo methods.
In this deck from the 2018 HPC User Forum in Tucson, Christine Goulet from the Southern California Earthquake Center presents: HPC Use for Earthquake Research.
"The Southern California Earthquake Center (SCEC) was founded as a Science & Technology Center in 1991, with joint funding by the NSF and the U. S. Geological Survey. SCEC coordinates fundamental research on earthquake processes using Southern California as its principal natural laboratory. This research program is investigator-driven and supports core research and education in seismology, tectonic geodesy, earthquake geology, and computational science. The SCEC community advances earthquake system science through three basic activities: (a) gathering information from seismic and geodetic sensors, geologic field observations, and laboratory experiments; (b) synthesizing knowledge of earthquake phenomena through physics-based modeling, including system-level hazard modeling; and communicating our understanding of seismic hazards to reduce earthquake risk and promote community resilience."
Watch the video: https://wp.me/p3RLHQ-imT
Learn more: https://www.scec.org/about
and
http://hpcuserforum.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
Earthquakes AND ITS EFFECTS1111111111111111111111111sanketsanghai
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing energy as seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Geography of earthquakes is influenced by tectonic plate boundaries and fault zones.
- Seismic hazards include shaking, liquefaction, landslides and tsunamis.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Various scales are used to measure earthquake magnitude based on the energy released and intensity of shaking. The Richter scale is a common logarithmic scale for measuring magnitude.
- Earthquake geography is influenced by tectonic plate interactions and locations of faults. Hazards include shaking, liquefaction, landslides and tsunamis.
- Recurrence of earthquakes can be estimated by studying historical records and geology to assess probabilities of future seismic events. Accurate prediction remains difficult but mitigation efforts can reduce risks.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
- Earthquakes are caused by the accumulation of strain along faults until rupture occurs, releasing seismic waves.
- Their magnitude is measured using different scales based on the amplitude and period of seismic waves or the rupture area and displacement.
- Recurrence refers to the frequency of earthquakes in a given area, which can be estimated from historical records and geology.
- While prediction of individual quakes remains difficult, hazards can be assessed through evaluating faults, recurrence, and the effects of local geology on shaking intensity. Preparedness involves building design, codes, and public education.
The dynamic loads mainly derive from earthquakes, operation of heavy machinery, blasts, and wave or wind forces, etc. Common soil dynamics topics include the determination of dynamic earth pressures, the analysis and design of foundations under dynamic loads and dynamic soil-structure interaction problems. In civil engineering, earthquakes are the most common phenomena from which dynamic loads affect structures.
Understanding the dynamic behavior of soils is critical to prevent any structural or ground failure under earthquake loads. The properties that are needed to be determined to evaluate the dynamic behavior of soil are the following:
Dynamic Young’s modulus (E) and dynamic shear modulus (G) and their variation with shear strain (typically referred to as Shear Modulus Reduction curves)
Damping ratio (ξ) and its variation with shear strain (typically referred to as material damping curves)
Poisson’s ratio (ν)
Other parameters related to liquefaction (e.g. cyclic shearing stress ratio and cyclic deformation)
ESR spectroscopy in liquid food and beverages.pptxPRIYANKA PATEL
With increasing population, people need to rely on packaged food stuffs. Packaging of food materials requires the preservation of food. There are various methods for the treatment of food to preserve them and irradiation treatment of food is one of them. It is the most common and the most harmless method for the food preservation as it does not alter the necessary micronutrients of food materials. Although irradiated food doesn’t cause any harm to the human health but still the quality assessment of food is required to provide consumers with necessary information about the food. ESR spectroscopy is the most sophisticated way to investigate the quality of the food and the free radicals induced during the processing of the food. ESR spin trapping technique is useful for the detection of highly unstable radicals in the food. The antioxidant capability of liquid food and beverages in mainly performed by spin trapping technique.
Authoring a personal GPT for your research and practice: How we created the Q...Leonel Morgado
Thematic analysis in qualitative research is a time-consuming and systematic task, typically done using teams. Team members must ground their activities on common understandings of the major concepts underlying the thematic analysis, and define criteria for its development. However, conceptual misunderstandings, equivocations, and lack of adherence to criteria are challenges to the quality and speed of this process. Given the distributed and uncertain nature of this process, we wondered if the tasks in thematic analysis could be supported by readily available artificial intelligence chatbots. Our early efforts point to potential benefits: not just saving time in the coding process but better adherence to criteria and grounding, by increasing triangulation between humans and artificial intelligence. This tutorial will provide a description and demonstration of the process we followed, as two academic researchers, to develop a custom ChatGPT to assist with qualitative coding in the thematic data analysis process of immersive learning accounts in a survey of the academic literature: QUAL-E Immersive Learning Thematic Analysis Helper. In the hands-on time, participants will try out QUAL-E and develop their ideas for their own qualitative coding ChatGPT. Participants that have the paid ChatGPT Plus subscription can create a draft of their assistants. The organizers will provide course materials and slide deck that participants will be able to utilize to continue development of their custom GPT. The paid subscription to ChatGPT Plus is not required to participate in this workshop, just for trying out personal GPTs during it.
Phenomics assisted breeding in crop improvementIshaGoswami9
As the population is increasing and will reach about 9 billion upto 2050. Also due to climate change, it is difficult to meet the food requirement of such a large population. Facing the challenges presented by resource shortages, climate
change, and increasing global population, crop yield and quality need to be improved in a sustainable way over the coming decades. Genetic improvement by breeding is the best way to increase crop productivity. With the rapid progression of functional
genomics, an increasing number of crop genomes have been sequenced and dozens of genes influencing key agronomic traits have been identified. However, current genome sequence information has not been adequately exploited for understanding
the complex characteristics of multiple gene, owing to a lack of crop phenotypic data. Efficient, automatic, and accurate technologies and platforms that can capture phenotypic data that can
be linked to genomics information for crop improvement at all growth stages have become as important as genotyping. Thus,
high-throughput phenotyping has become the major bottleneck restricting crop breeding. Plant phenomics has been defined as the high-throughput, accurate acquisition and analysis of multi-dimensional phenotypes
during crop growing stages at the organism level, including the cell, tissue, organ, individual plant, plot, and field levels. With the rapid development of novel sensors, imaging technology,
and analysis methods, numerous infrastructure platforms have been developed for phenotyping.
The technology uses reclaimed CO₂ as the dyeing medium in a closed loop process. When pressurized, CO₂ becomes supercritical (SC-CO₂). In this state CO₂ has a very high solvent power, allowing the dye to dissolve easily.
ESPP presentation to EU Waste Water Network, 4th June 2024 “EU policies driving nutrient removal and recycling
and the revised UWWTD (Urban Waste Water Treatment Directive)”
Immersive Learning That Works: Research Grounding and Paths ForwardLeonel Morgado
We will metaverse into the essence of immersive learning, into its three dimensions and conceptual models. This approach encompasses elements from teaching methodologies to social involvement, through organizational concerns and technologies. Challenging the perception of learning as knowledge transfer, we introduce a 'Uses, Practices & Strategies' model operationalized by the 'Immersive Learning Brain' and ‘Immersion Cube’ frameworks. This approach offers a comprehensive guide through the intricacies of immersive educational experiences and spotlighting research frontiers, along the immersion dimensions of system, narrative, and agency. Our discourse extends to stakeholders beyond the academic sphere, addressing the interests of technologists, instructional designers, and policymakers. We span various contexts, from formal education to organizational transformation to the new horizon of an AI-pervasive society. This keynote aims to unite the iLRN community in a collaborative journey towards a future where immersive learning research and practice coalesce, paving the way for innovative educational research and practice landscapes.
hematic appreciation test is a psychological assessment tool used to measure an individual's appreciation and understanding of specific themes or topics. This test helps to evaluate an individual's ability to connect different ideas and concepts within a given theme, as well as their overall comprehension and interpretation skills. The results of the test can provide valuable insights into an individual's cognitive abilities, creativity, and critical thinking skills
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxMAGOTI ERNEST
Although Artemia has been known to man for centuries, its use as a food for the culture of larval organisms apparently began only in the 1930s, when several investigators found that it made an excellent food for newly hatched fish larvae (Litvinenko et al., 2023). As aquaculture developed in the 1960s and ‘70s, the use of Artemia also became more widespread, due both to its convenience and to its nutritional value for larval organisms (Arenas-Pardo et al., 2024). The fact that Artemia dormant cysts can be stored for long periods in cans, and then used as an off-the-shelf food requiring only 24 h of incubation makes them the most convenient, least labor-intensive, live food available for aquaculture (Sorgeloos & Roubach, 2021). The nutritional value of Artemia, especially for marine organisms, is not constant, but varies both geographically and temporally. During the last decade, however, both the causes of Artemia nutritional variability and methods to improve poorquality Artemia have been identified (Loufi et al., 2024).
Brine shrimp (Artemia spp.) are used in marine aquaculture worldwide. Annually, more than 2,000 metric tons of dry cysts are used for cultivation of fish, crustacean, and shellfish larva. Brine shrimp are important to aquaculture because newly hatched brine shrimp nauplii (larvae) provide a food source for many fish fry (Mozanzadeh et al., 2021). Culture and harvesting of brine shrimp eggs represents another aspect of the aquaculture industry. Nauplii and metanauplii of Artemia, commonly known as brine shrimp, play a crucial role in aquaculture due to their nutritional value and suitability as live feed for many aquatic species, particularly in larval stages (Sorgeloos & Roubach, 2021).
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
Alice Gabriel - Pasc18 - Keynote
1. Unraveling earthquake dynamics through extreme-scale multi-physics simulations
Alice-Agnes Gabriel
Computational model for a large-scale scenario of the 2004 Mw9.1 Sumatra-Andaman earthquake (Uphoff et al., SC2017)
2. Acknowledgements
The LMU earthquake physics team: Betsy Madden, Kenneth Duru, Stephanie Wollherr, Thomas Ulrich
Former collaborators: Christian Pelties (now MunichRe), Alexander Breuer (now SDSC), Alexander Heinecke (now Intel)
The TUM HPC team: Carsten Uphoff, Leonhard Rannabauer
3. I. Computational wave propagation and earthquake rupture
Schematic view of on-going seismic rupture of the Parkfield segment of the San Andreas Fault (Caltech/Tim Pyle)
Wave simulations of the 2009 L'Aquila earthquake using SeisSol (Igel, 2017; Wenk et al., 2009)
4.-5. Computational wave propagation
• Seismology is data-rich and can often be treated as a linear system
• Computational seismology has been a pioneering field and has long been driven by HPC
• Key activities: calculation of synthetic seismograms in the 3D Earth and solving seismic inverse problems
• Key achievements: imaging Earth's interior, understanding the dynamics of the mantle, tracking down energy resources
• Common approach: time-domain solutions of the space-dependent seismic wavefield, solved by domain decomposition
• On-going challenges: computational efficiency (resolving high frequencies), meshing (irregular geometries), and the need for community solutions keep us busy
On May 5th, 2018, the NASA "InSight" lander set off to investigate the internal structure of Mars carrying a seismometer. Forward simulations of seismic waves travelling through Mars have been performed on "Piz Daint" in real time, solving 10 billion degrees of freedom and 300,000 time steps (Bozdag et al., 2017 & PASC presentation of Hapla et al., Wed. 11:15)
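The time-domain approach sketched above can be illustrated with a minimal example: a 1-D acoustic wave equation stepped explicitly in time with finite differences. This is purely illustrative and is not SeisSol's method (SeisSol uses high-order ADER discontinuous Galerkin schemes on unstructured tetrahedral meshes); the grid size, wave speed, and source below are arbitrary choices.

```python
import numpy as np

# Minimal 1-D acoustic wave propagation sketch (explicit finite differences).
# Illustrative only: production codes such as SeisSol use high-order
# discontinuous Galerkin schemes on unstructured tetrahedral meshes.
nx, nt = 400, 800          # grid points, time steps (arbitrary)
dx, c = 10.0, 3000.0       # grid spacing (m), wave speed (m/s)
dt = 0.5 * dx / c          # time step satisfying the CFL stability condition

u_prev = np.zeros(nx)      # wavefield at t - dt
u = np.zeros(nx)           # wavefield at t
u[nx // 2] = 1.0           # impulsive point source in the middle

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]   # discrete Laplacian
    u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next   # advance one time step (leapfrog)

print(np.max(np.abs(u)))   # amplitude of the propagated wavefield
```

Domain decomposition in real codes splits such a grid (in 3D, an unstructured mesh) across compute nodes, exchanging only the wavefield values along subdomain boundaries each time step.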
6.-8. Earthquake rupture
• Earthquake source studies are ill-constrained and highly non-linear
• Earthquakes are ubiquitous on Earth: potentially disastrous, as well as an invaluable source of information
• While most of today's knowledge of the structure of the Earth is imaged by propagating waves, earthquake source processes are still mysterious
• Direct (geological/borehole/laboratory studies) or indirect (seismic radiation) observation is difficult
➡ Difficulties in theoretical aspects
➡ Scaling problems w.r.t. laboratory experiments
➡ Naturally limited number of large, strongly radiating earthquakes (sparse data, ill-posed inversion problem)
➡ We know that earthquakes are the result of ruptures that nucleate, grow and terminate, in most cases along pre-existing faults (Gilbert, 1884)
• Earthquake prediction still escapes us; we address this deficiency through understanding earthquake source physics
Exemplary drilling, seismic and geodetic observational approaches to understand earthquake physics
W-phase CMT solutions obtained for all Mw>=6.5 earthquakes occurring between 1990-2011 (Duputel et al., 2012)
9.-10. Observations
• Recent well-recorded earthquakes, as well as laboratory experiments, resolve striking variability in terms of source dynamics:
‣ super-shear propagation
‣ slip-reactivation
‣ nucleation with/without slow-slip precursors
‣ variability of rupture style (pulses vs. cracks)
‣ rupture cascading and "jumping"
‣ propagation along both locked and creeping fault sections during the same earthquake
Tohoku-Oki back projection: indicating major areas of high-frequency radiation on the fault (Meng et al., 2012)
Source inversion model of the 2011 Tohoku-Oki event (Japan), from combined local ground motion, teleseismics, GPS & multiple-time-window parametrization of slip rate (Lee and Wang, 2011)
Supershear Mach cone emanating from rupture tip in laboratory experiment (Xia et al., 2004)
Denali EQ PS10 record: rupture in the form of two slip pulses, first at supershear speed and second at subshear speed (Dunham et al.)
11. Seismic hazard assessment
• Earthquake source effects are assessed by empirical (probabilistic) and (over-)simplified approaches
• Reliability depends heavily on the definition of a seismic faulting model underpinned by realistic geological and physical constraints, including fault zone rock type, state of stress, fault geometry, fault yield strength, friction and rupture laws
• Understanding earthquake source physics would shed light on stress conditions, crustal processes and the fundamentals of friction, and lead to physics-based seismic hazard assessment (improve building codes; provide more reliable hazard maps; enable forecasting)
Empirical attenuation relation (Boore et al., 1997)
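An empirical attenuation relation (ground-motion prediction equation, GMPE) such as the one cited above predicts ground motion from magnitude and distance. The sketch below uses a typical functional form (linear magnitude scaling plus geometric spreading with a depth term); the coefficients `b1`, `b2`, `b3`, `h` are hypothetical placeholders, not the published regression values of Boore et al. (1997).

```python
import math

def gmpe_ln_pga(magnitude, r_jb_km, b1=-0.5, b2=0.8, b3=-1.0, h=6.0):
    """Generic GMPE sketch: returns ln(peak ground acceleration).

    The functional form (linear magnitude scaling plus geometric
    spreading over an effective distance) is typical of empirical
    relations; the default coefficients are illustrative placeholders,
    NOT published regression values.
    """
    r = math.sqrt(r_jb_km ** 2 + h ** 2)   # effective distance with depth term
    return b1 + b2 * (magnitude - 6.0) + b3 * math.log(r)

# Larger magnitude -> stronger predicted shaking;
# larger distance  -> weaker predicted shaking.
near = gmpe_ln_pga(7.0, 10.0)
far = gmpe_ln_pga(7.0, 100.0)
```

Physics-based scenarios, as advocated in this talk, aim to replace or complement such purely empirical scaling with simulated ground motions from dynamic rupture models.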
12. • Earthquakes are in many sense unique, despite occurring
in potentially the same location (or nearby location)
• Which physical processes are dominant and relevant at
a given spatio-temporal scale (and in real earthquakes)?
Can we justify the (most often computational) cost of their
inclusion?
• Singular effects can be studied conceptually (2D dynamic
rupture modeling) and analytically (fracture mechanics)
• Large-scale dynamic rupture simulations aiming to
understand “in-scale” which of the aforementioned
“complexities” provides the first order influence on source
dynamics and the resulting observables for a given
geological region (tectonic setting), or fault system, or
type of fault system
1992 Landers dynamic rupture earthquake scenario, resolving 10
Hz wave propagation employing multi-petaflop performance.
(Heinecke et al., SC14)
• Large-scale dynamic rupture simulations aiming to
understand on “natural-scale” which of the
aforementioned “complexities” provides the first order
influences
Modelling challenges -
The search for required minimum
complexity
13.
➡ Requires: Integrative view of multi-scale physics of rock
fracture, dynamic rupture propagation, and emanated seismic
radiation
➡ Representation of complex 3D geometries
➡ Computationally expensive
Modelling challenges -
The search for required minimum
complexity
14. II. Multi-physics dynamic rupture earthquake simulations
2016 Kaikoura, New Zealand dynamic rupture earthquake scenario, resolving the most complex rupture
observed to date (Ulrich et al., 2018, under revision, PASC presentation Gabriel et al. Tue. 16:30 )
15. Multi-physics
earthquake simulations
“Input”: geological structure, CAD & mesh generation, initial fault stresses, failure criterion
SOLVER
“Output”: synthetic seismograms, ground motion
• Physics-based approach: Solving for spontaneous
dynamic earthquake rupture as non-linear interaction of
frictional failure and seismic wave propagation
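As a concrete example of the frictional failure entering this non-linear interaction, here is a minimal sketch of linear slip-weakening friction, the classic failure criterion of dynamic rupture benchmarks. Parameter values are illustrative (typical of SCEC community benchmark setups), and this is not SeisSol's implementation.

```python
# Sketch: linear slip-weakening friction. Fault strength drops from static
# to dynamic friction over a critical slip distance d_c; rupture propagates
# spontaneously wherever shear traction reaches this strength. Parameter
# values are illustrative (typical of SCEC community benchmarks).

def fault_strength(slip, sigma_n, mu_s=0.677, mu_d=0.525, d_c=0.4):
    """Frictional shear strength (Pa) for accumulated slip (m) and
    effective normal stress sigma_n (Pa)."""
    mu = mu_s - (mu_s - mu_d) * min(slip, d_c) / d_c
    return mu * sigma_n

# Strength weakens linearly until slip reaches d_c, then stays dynamic:
unbroken = fault_strength(0.0, 120e6)
weakened = fault_strength(1.0, 120e6)
assert unbroken > weakened
```

The solver evaluates such a friction law at every fault point and time step, coupling it to the seismic wave field.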
16. Harris et al., SRL 2018
Multi-physics
earthquake simulations
➡ Many methods successfully solve (idealised) community benchmarks
17. • Non-planar, intersecting faults
• Non-linear friction
• Heterogeneities in stress and strength
• Dynamic damage around the fault
• Fault roughness on all scales
• Bi-material effects
• Low velocity zones surrounding faults
• Thermal pressurization of fault zone fluids
• Thermal decomposition
• Dilatancy of the fault gouge
• Flash heating, melting, lubrication
… this list grows continuously
Multi-physics
earthquake simulations
➡ Few methods support all modelling requirements
Multitude of spatio-temporal scales: fault geometry spans hundreds of
km; the frictional process zone is on the m (or even cm) scale; tectonic
loading (the seismic cycle) acts over 10 to 10,000 years; rise time is on the
scale of seconds
18. SeisSol - ADER-DG
A unique modelling framework
www.seissol.org
We develop and host an open-source Arbitrary high-order
DERivative Discontinuous Galerkin (ADER-DG) software
package. SeisSol solves the seismic wave equations in
elastic, viscoelastic, and viscoplastic media on unstructured
tetrahedral meshes.
Our method, by design, permits:
• representing complex geometries - by discretising the
volume via a tetrahedral mesh
• modelling heterogeneous media - elastic, viscoelastic,
viscoplastic, anisotropic
• multi-physics coupling - flux based formulation is natural
for representing physics defined on interfaces
• high accuracy - modal flux based formulation allows us to
suppress spurious (unresolved) high frequencies
• high resolution - suitable for parallel computing
environments
Representation of the shear
stress discontinuity across the
fault interface. Spontaneous
rupture = internal boundary
condition of flux term.
fault
M. Käser and M. Dumbser, 2006; M. Dumbser and M. Käser, 2006
J. de la Puente et al., 2008; C. Pelties et al., 2014
github.com/SeisSol
Wave field of a point source
interacting with the
topography of Mount Merapi
Volcano.
PRACE ISC Award for
producing the first simulations
that obtained the “magical”
performance milestone of 1
Petaflop/s (10^15 floating point
operations per second) at the
Munich Supercomputing
Centre.
Due to the properties of the
exact Riemann solver, solutions
on the fault remain free of
spurious oscillations
19. SeisSol - ADER-DG
Numerics in a nutshell
• Elastic wave equation in velocity stress formulation
(constitutive relationships in terms of velocity; conservation of momentum; a linear hyperbolic system)
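The velocity-stress formulation can be written out as follows (standard notation, reconstructed from common usage rather than transcribed from the slide; $\lambda$, $\mu$ are the Lamé parameters, $\rho$ the density):

```latex
% Velocity-stress formulation of the elastic wave equation
\begin{align}
  \partial_t \sigma_{ij} - \lambda\,\delta_{ij}\,\partial_k u_k
    - \mu\,(\partial_i u_j + \partial_j u_i) &= 0
    && \text{constitutive relations in terms of velocity} \\
  \rho\,\partial_t u_i - \partial_j \sigma_{ij} &= 0
    && \text{conservation of momentum}
\end{align}
With $q = (\sigma_{xx},\sigma_{yy},\sigma_{zz},\sigma_{xy},\sigma_{yz},\sigma_{xz},u,v,w)^T$
this is a linear hyperbolic system
\begin{equation}
  \partial_t q + A\,\partial_x q + B\,\partial_y q + C\,\partial_z q = 0 .
\end{equation}
```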
20. SeisSol - ADER-DG
Numerics in a nutshell
• Elastic wave equation in velocity stress formulation
• ADER: high-order time integration + DG: high-order
space discretisation
• DG with orthogonal basis functions (modal)
• Exact Riemann-Solver computes the upwind flux = state
at the element interfaces
(DG discrete form; DG operators)
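The exact Riemann solver behind the upwind flux can be illustrated in 1D. This is a sketch under simplifying assumptions (1D elastic wave equation in velocity-stress form, equal impedance on both sides of the interface); the function name is hypothetical, and this is not SeisSol's 3D implementation.

```python
# Sketch: exact Riemann solver for the 1D elastic wave equation in
# velocity-stress form (sigma, v). Conceptual illustration only.

def godunov_state(sigma_l, v_l, sigma_r, v_r, rho, c):
    """Exact interface (Godunov) state for equal impedance on both sides.
    The right-going characteristic carries sigma - Z*v from the left,
    the left-going one carries sigma + Z*v from the right; intersecting
    them yields the state used to evaluate the upwind flux."""
    Z = rho * c  # seismic impedance
    sigma = 0.5 * (sigma_l + sigma_r) + 0.5 * Z * (v_r - v_l)
    v = 0.5 * (v_l + v_r) + 0.5 * (sigma_r - sigma_l) / Z
    return sigma, v

# With identical states on both sides the interface state is that state,
# i.e. no spurious jump is introduced:
s, v = godunov_state(1.0e6, 0.2, 1.0e6, 0.2, rho=2700.0, c=3464.0)
```

Because the interface state is exact, solutions on the fault remain free of spurious oscillations, as noted above.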
21. SeisSol - ADER-DG
Numerics in a nutshell
• Elastic wave equation in velocity stress formulation
• ADER: high-order time integration + DG: high-order
space discretisation
• DG with orthogonal basis functions (modal)
• Exact Riemann-Solver computes the upwind flux = state
at the element interfaces
• Locality of the computations: only neighbouring
elements exchange data
➡ ADER-DG boils down to small matrix-matrix
multiplications, where the dimension of the matrices
depends on the order of the scheme (about 75% of the
runtime).
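A minimal sketch of why the update reduces to small GEMMs, with shapes for an order-6 scheme (56 basis functions, 9 elastic quantities); the operator names and random values are illustrative, not SeisSol's actual kernels.

```python
import numpy as np

# Sketch: an ADER-DG element update as a chain of small dense matrix
# products. Shapes correspond to order 6 (polynomial degree 5).

order = 6
n_basis = order * (order + 1) * (order + 2) // 6   # 56 for order 6
n_quantities = 9                                   # 6 stresses + 3 velocities

rng = np.random.default_rng(0)
dofs = rng.standard_normal((n_basis, n_quantities))         # element DoFs
K_xi = rng.standard_normal((n_basis, n_basis))              # stiffness operator
A_star = rng.standard_normal((n_quantities, n_quantities))  # flux Jacobian

# One term of the volume kernel: (56x56) x (56x9) x (9x9).
update = K_xi @ dofs @ A_star
```

Because the matrices are small and their shapes are fixed at compile time, they are ideal targets for generated, fully unrolled kernels.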
22. “Geophysics” Version
• Fortran 90
• MPI parallelised
• ASCII-based, serial I/O
Landers scenario
(96 billion DoF,
200,000 time steps)
• MPI+OpenMP parallelisation
• Parallel I/O (HDF5, inc. mesh init.)
• Assembler-level DG kernels
• multi-physics off-load scheme for
many-core architectures
Sumatra scenario
(111 billion DoF,
3,300,000 time steps)
• Cluster-based local time stepping
• Code generator also for advanced
PDEs such as viscoelastic attenuation
• ASAGI (XDMF) geoinformation
server
• Asynchronous input/output
• Overlapping computation and
communication
SeisSol
Optimisation on all software levels
Breuer et al.,ISC14, Heinecke et al.,SC14
Breuer et al.,IEEE16, Heinecke et al.,SC16
Rettenberger et al., EASC16
Uphoff & Bader, HPCS’16
Uphoff et al., SC17
➡ Goal: End-to-end optimisation
on operational geophysics
software
23.
• > 1 PFlop/s performance
• 90% parallel efficiency
• 45% of peak performance
• 5x-10x faster time-to-solution
• 10x-100x bigger problems
1992 Landers dynamic rupture
earthquake scenario (Heinecke et al.,
Gordon Bell Prize Finalist Paper at
SC14)
SeisSol
Optimisation on all software levels
24.
Partial kernel before (top) and after (bottom) removing
irrelevant entries in matrix chain products
➡ A code generator automatically detects and exploits
sparse block patterns
➡ Hardware-specific full “unrolling” and vectorization of
all element operations
➡ Customised code for each matrix-matrix
multiplication via the libxsmm back-end
➡ Efficiently exploits the hardware available as of 2014
(AVX, MIC), reaching up to 8.6 PFLOPS on Tianhe-2
1992 Landers dynamic rupture
earthquake scenario (Heinecke et al.,
Gordon Bell Prize Finalist Paper at
SC14)
SeisSol
Optimisation on all software levels
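The sparse-pattern idea can be sketched as a toy generator that emits unrolled multiply-add statements only for non-zero entries. This is a hypothetical illustration of the principle, not the actual SeisSol/libxsmm tool chain.

```python
# Sketch: detect the sparsity pattern of a small matrix at generation time
# and emit fully unrolled C-like multiply-add statements only for entries
# that are structurally non-zero. Toy generator, for illustration only.

def generate_unrolled_gemm(A, name="A"):
    """Emit statements for C += A @ B, skipping zero entries of A."""
    lines = []
    for i, row in enumerate(A):
        for k, a in enumerate(row):
            if a != 0.0:  # entry known to be zero: no code generated
                lines.append(f"C[{i}][j] += {name}[{i}][{k}] * B[{k}][j];")
    return lines

stiffness = [[0.0, 2.0], [0.0, 0.0]]  # mostly zero, as DG operators often are
code = generate_unrolled_gemm(stiffness)
# Only one statement survives for this sparsity pattern.
```

Generating code per pattern removes both the zero-entry arithmetic and the loop overhead, which is what makes the small kernels vectorize well.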
25. III. The 2004 Sumatra megathrust earthquake
A geophysics and HPC challenge …
Illustration of the subduction zone hosting the Christmas 2004 Mw 9.1 to 9.3 Sumatra-Andaman megathrust earthquake
26. • A huge event that triggered a devastating tsunami, claiming > 230,000 lives in 14 countries; no early warning was available
III. The 2004 Sumatra megathrust earthquake
… which is about the people
The tsunami hits Thailand (wikipedia.com) A village near the coast of Sumatra lays in ruin (US Navy)
27. III. The 2004 Sumatra megathrust earthquake
… which is about the people
The tsunami hits Thailand (wikipedia.com) A village near the coast of Sumatra lays in ruin (US Navy)
➡ Earthquakes and tsunamis are not predictable hazards.
Challenge is in the physics: we do not know how earthquakes begin,
grow and sometimes arrest; we do not know when a large earthquake
triggers a tsunami; we do not know how subduction zones ‘operate’.
28. Shearer and Bürgmann (2010), Fig. 1 (map scale: 1000 km)
• An unexpected, very large earthquake (old oceanic crust,
slow convergence rates)
• Rupture along 1,300 to 1,500 km of faults at a consistently
slow rupture velocity (2 to 3 km/s), with a long duration of
8 to 10 minutes
• Complex, non-planar fault intersections at shallow angles -
CAD and mesh generation is a bottleneck
• Small “pop-up” fractures splaying off the megathrust
may be crucial for tsunami generation
A geophysics challenge
Tectonic plates involved in the Sumatra-Andaman
earthquake. Complex 3D geometry of the Sumatra
subduction zone model. The curved megathrust is
intersecting bathymetry, as are the 3 adjacent splay
faults: one forethrust and two backthrusts. The
subsurface consists of horizontally layered continental
crust and subducting layers of oceanic crust. Each layer
is characterized by a different wave speed and thus
requires a different mesh resolution.
29. An HPC challenge
• Spatial resolution (400 m on-fault, order 6) and 2.2 Hz wave
propagation required a mesh with 220 million finite
elements (~111 x 10^9 DoF).
• Incorporation of high-resolution geodata requires
geoinformation server for fast loading and generalised
initialisation for large 3D datasets, parallel meshing and
file formats
• Unique capability of incorporating realistic geometries
causes highly varying element sizes due to static
adaptivity and intersection of fault with sea-floor or
material layers.
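The quoted problem size can be checked from the discretisation parameters: an order-6 (degree-5) tetrahedral basis has 56 functions per element, and the elastic system carries 9 quantities per basis function.

```python
# Sketch: checking the quoted problem size from the discretisation
# parameters (order-6 scheme, elastic wave equation).

order = 6
basis_per_element = order * (order + 1) * (order + 2) // 6  # 56
quantities = 9                       # 6 stress + 3 velocity components
elements = 220_000_000

dof = elements * basis_per_element * quantities
print(dof)  # 110_880_000_000, i.e. the ~111 x 10^9 DoF quoted above
```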
30. An HPC challenge
• Local time stepping: Each element may have its own
time step, limited by a CFL condition. Theoretical speed-
up of 14.3 with (perfect) per-element LTS.
• Problem: The irregular update scheme is not well suited for
modern hardware. Idea: Partition elements into time
clusters.
⇒ Speed-up of 9.9x with clustering (vs. 14.3 with per-element
LTS), but an LTS scheme with petascale performance
➡ Sacrificing part of the theoretical speed-up in favour of
hardware-oriented data structures and efficient load
balancing
➡ Only 4% of elements hold dynamic rupture faces, but
they are crucial to optimise by local time stepping and to
relax mesh generation
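The clustering idea can be sketched as snapping each element's CFL time step down to dt_min * 2^k ("rate-2 clusters") and comparing update counts against global time stepping. The time-step values below are illustrative, not the Sumatra mesh, and the function names are hypothetical.

```python
import math

# Sketch: clustered local time stepping. Snapping each element's CFL time
# step down to dt_min * 2^k trades part of the theoretical per-element
# speed-up for regular, hardware-friendly update patterns.

def cluster(dt, dt_min):
    """Largest dt_min * 2^k not exceeding the element's CFL time step."""
    k = int(math.floor(math.log2(dt / dt_min)))
    return dt_min * 2 ** k

def speedup(dts, step_of):
    """Cost of global time stepping (all elements at min(dts)) divided by
    the cost when each element updates at step_of(its CFL time step)."""
    dt_min = min(dts)
    gts_updates = len(dts) / dt_min
    lts_updates = sum(1.0 / step_of(dt) for dt in dts)
    return gts_updates / lts_updates

dts = [1.0, 1.9, 4.5, 8.2]                   # per-element CFL time steps
ideal = speedup(dts, lambda dt: dt)          # perfect per-element LTS
clustered = speedup(dts, lambda dt: cluster(dt, min(dts)))
assert clustered <= ideal                    # clustering sacrifices some gain
```

The same trade-off is behind the 9.9x (clustered) versus 14.3 (per-element) numbers above.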
31. An HPC challenge
Note: on KNL we measured 467 TFLOPS with a speed-up of 1.28
compared to 512 nodes of Shaheen.
(86,016 cores)
32. SeisSol
Optimisation on all software levels
• Optimized for Intel KNL
• Speed-up of 14x
• 14 hours compared to
almost 8 days for the
Sumatra scenario on
SuperMUC Phase 2
33. Sumatra megathrust
& splay faults scenario
Best Paper Supercomputing Conference SC17
C. Uphoff, S. Rettenberger, M. Bader,
B. Madden, T. Ulrich, S. Wollherr, A.-A. Gabriel
➡ The largest, longest dynamic rupture simulation performed so far
34. Sumatra megathrust
& splay faults scenario
• Replicating first-order observations (slip, ground
deformation) as well as producing unexpected features:
• The backthrust splay fault breaks delayed and in a
reversed sense, contributing considerably to vertical uplift
• Slow rupture speed and slip pulses: due to subducting
layers of lower wave speeds surrounding the fault
(megathrust LVZ)
Synthetic horizontal (left) and vertical (right) sea-floor displacement. Arrows depict
comparison to observations from geodetic and tsunami data summarized in Bletery et
al., 2016.
35. • “More” multi-physics based on new matrix-based code
generator: viscoelastic attenuation, off-fault plasticity
Sumatra megathrust
& splay faults scenario
36.
• Generated high-resolution sea-floor displacement as
initial condition for tsunami models based on ASAGI
• Coupling with geodynamic thermo-mechanical models to
provide constraints on fault rheology and the state of
stress
Sumatra megathrust
& splay faults scenario
37. Outlook
Beyond scenario based simulations
• Uncertainty quantification
• Dynamic source inversion
• Adjoint calculations
• Urgent computing: real time scenarios
• Ensemble simulations
2D spacetree in ExaHyPE (Weinzierl et al., 2014)
Rupture Complexities of Fluid
Induced Microseismic Events
at the Basel EGS Project
(Folesky et al., 2016)
Highly non-unique kinematic slip models for the
1999 Izmit earthquake (Ide et al., 2005)
➡ Exascale systems
38. Instead of Conclusions:
A reproducibility challenge!
• A setup including a mesh with over 3 million elements for
the 2004 Sumatra-Andaman earthquake can be obtained
from Zenodo https://dx.doi.org/10.5281/zenodo.439946.
$ git clone --recursive https://github.com/SeisSol/SeisSol
$ cd SeisSol
$ git checkout 201703
$ git submodule update
$ scons order=6 compileMode=release generatedKernels=yes arch=dhsw parallelization=hybrid commThread=yes netcdf=yes
$ export OMP_NUM_THREADS=<threads>
$ mpiexec -n <processes> ./SeisSol parameters.par

# Example environment for SuperMUC Phase 2
$ export OMP_NUM_THREADS=54
$ export KMP_AFFINITY=compact,granularity=thread
40. SeisSol
Features and Scales
• Viscoelastic attenuation
• Kinematic sources
• Modern friction laws
• Off-fault plasticity
• Fault roughness
• Thermal pressurisation (2D)
• Fast loading of 3D datasets with
ASAGI
• Adjoint (2D)
• Checkpointing
• Parallel I/O
• Initial parametrization with EASI
• Full local time stepping
• Tested meshing workflow
up to 925 million elements
• Tools for pre- and post-
processing
• Overnight builds / code testing
(using Travis, Jenkins, …)
Sumatra: 14 million-element
mesh, 400 s: ~2 h
on 300 nodes
Landers: 10 million elements, 100 s,
200 m fault resolution, 500 m topo
resolution, 3D velocity model:
~1 h on 100 nodes (plasticity adds
6.2%)
Kaikoura: 29 million elements, 90 s,
2 hours on 3,000 Sandy Bridge cores
2D SeisSol (Laptop)
Editor's Notes
- thank the organisers for giving me the opportunity, as a seismologist interested in earthquakes, to present our approach to shedding light on what happens when rock masses slide upon each other - tearing apart Earth’s crust by propagating shear fracture and emanating seismic wave radiation.
highlighting the benefits of strong collaborations between domain and computational scientists
Why have Earth and Mars developed so differently although their original structure and chemical composition seem so similar? How large, thick and dense are the core, mantle and crust? What is their structure? The scientists are hoping to gain fundamental insights into the general formation of rocky planets such as Mars, Earth, Mercury and Venus.
In distinction to the wave propagation problem, which is “considered to be solved”, predictions escape us - we choose to address this deficiency through understanding the source physics
Inaccessibility of in-situ observations from several kilometers deep in the seismogenic zone (restriction to surface observation)
Complexity of natural geological settings (poor knowledge of small-scale features limits useable frequency band)
Multiple factors affecting recorded ground motion (contamination of source effects)
Naturally limited amount of large, strongly radiating earthquakes (sparse data, ill-posed inversion problem)
Scaling problems w.r.t. laboratory experiments
Earthquake source effects are routinely assessed by empirical (probabilistic) and (over-) simplified approaches
Understanding earthquake source physics (nucleation, dynamics) sheds light on stress conditions, crustal processes, fundamentals of friction
Physics-based seismic hazard assessment (improve building codes; provide reliable hazard maps; enable forecasting)
Isolated effects can be studied conceptually (2D dynamic rupture) and analytically (fracture mechanics)
the only software that allows for rapid setup of models with realistic non- planar fault systems while exploiting the accuracy of a high-order numerical method.
→ hardware specific full “unrolling” and vectorization of all element operations
→ small size matrix chain products
SeisSol utilises hardware specific, customised code for each matrix-matrix multiplication via the libxsmm back-end. This approach effectively exploits the available hardware (AVX, MIC), reaching up to 8.6 PFLOPS on Tianhe-2.
the code generator automatically detects and exploits sparse block patterns within the matrices for hardware specific full “unrolling” and vectorization of all element operations
an improved code generator facilitates the implementation of advanced PDE models, such as viscoelastic attenuation, which accounts for frequency dependent damping of seismic wave propagation.
Extreme multi-scale problem in both space and time
Each layer is characterized by a different wave speed and thus requires a different mesh resolution.
led to long-period, abrupt vertical displacements of the seafloor, and thus to an increased tsunami risk. At present, this capability of incorporating such realistic geometries into physical earthquake models is unique worldwide.”
-sacrificing part of theoretical speed up in favour of hardware oriented data structures and efficient load-balancing
- reformulating the numerical scheme in terms of matrix-chain products
I/O is practically free since we reserve one core for communication anyway
2nd plot: Note that peak performance implies nothing about time-to-solution!!!
Production run not on KNL
Clustered local time-stepping for dynamic rupture
⇒ Speed-up of 6.8
Asynchronous I/O for writing 13 TB of data for checkpoints and 2.8 TB of data
for visualisation and post-processing
A setup including a mesh with over 3 million elements for the 2004 Sumatra-Andaman earthquake can be obtained from Zenodo https://dx.doi.org/10.5281/zenodo.439946.