Producing realistic earthquake shake maps using cloud computing: A cloud
implementation of a stochastic simulation approach to compute expected ground
motions for strong earthquakes
Papazachos C.B., G. Spyrou and A.A. Skarlatoudis
Geophysical Laboratory, Aristotle University of Thessaloniki, Greece
Abstract
With the proposed technique we aim to exploit the Microsoft Azure cloud
infrastructure for performing scientific computations for the near-real-time assessment of
seismic motions after the occurrence of a strong earthquake. The main problem motivating
this effort is that, immediately after a significant earthquake, only limited information is
available regarding the seismic motions that were experienced and, especially, the
potential damage to the infrastructure and lifelines of the affected urban areas. Even
worse, information from the areas of highest damage is not only scarce but often
misleading, since in most cases reports (e.g. phone calls to the police or fire brigade)
arrive from areas of lower impact. Therefore, the implementation of an automated
application is proposed, in order to produce realistic maps of the expected (estimated)
spatial distribution of appropriate ground motion measures and possibly induced damage,
within a very short time (near-real time) after the occurrence of a large earthquake.
The proposed application uses the automatic information for an earthquake,
distributed by the Seismological Station of the Aristotle University
(http://geophysics.geo.auth.gr/ss/station_index_en.html), and produces synthetic ground
motion parameters and seismograms, in order to derive the necessary ground motion
measures. The simulations for producing the synthetic seismograms are performed with
the stochastic method (using both point and finite source models) for a dense grid of
receivers that covers the study area. Stochastic simulations of strong ground motion
(either point source or more advanced finite-fault simulations) have been used during the
last 15 years, mainly at a research level, for simulations of individual earthquakes. The
original method, as well as its later modifications, is described in detail in [1-4]. In the
proposed application we mainly rely on the modifications to EXSIM [5] introduced by
Boore (2009) [6].
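To convey the flavor of the approach, the sketch below is a highly simplified, illustrative point-source stochastic simulation in Python: windowed Gaussian noise is spectrally shaped by a Brune source spectrum with simple path (geometric spreading and anelastic attenuation) and site (kappa) filters, and transformed back into an acceleration time series from which PGA is read. It omits the many refinements of SMSIM/EXSIM (duration and spectral-normalization details, finite-fault effects, site amplification), and all parameter values shown are illustrative only.

```python
# Crude point-source stochastic simulation in the spirit of [1, 3]; order-of-magnitude only.
import numpy as np

def stochastic_pga(m0_dyne_cm, dist_km, stress_bars=60.0, kappa=0.035,
                   beta_km_s=3.5, rho_g_cm3=2.7, q0=600.0, dur_s=20.0, dt=0.005, seed=0):
    """Return an illustrative PGA estimate (cm/sec^2) for a point source at dist_km."""
    rng = np.random.default_rng(seed)
    n = int(dur_s / dt)
    noise = rng.standard_normal(n) * np.hanning(n)             # windowed Gaussian noise
    spec = np.fft.rfft(noise)
    spec /= np.sqrt(np.mean(np.abs(spec) ** 2))                # unit average spectral amplitude
    freq = np.fft.rfftfreq(n, dt)

    fc = 4.9e6 * beta_km_s * (stress_bars / m0_dyne_cm) ** (1.0 / 3.0)  # Brune corner frequency
    const = 0.55 * 2.0 * 0.707 / (4.0 * np.pi * rho_g_cm3 * beta_km_s ** 3 * 1e20)
    source = const * m0_dyne_cm * (2 * np.pi * freq) ** 2 / (1.0 + (freq / fc) ** 2)
    path = np.exp(-np.pi * freq * dist_km / (q0 * beta_km_s)) / dist_km  # attenuation and 1/R
    site = np.exp(-np.pi * kappa * freq)                       # near-surface high-frequency decay

    acc = np.fft.irfft(spec * source * path * site / dt, n)    # acceleration time series
    return float(np.max(np.abs(acc)))

# Illustrative call: moment of an M~5.5 event (Hanks-Kanamori relation) at 20 km distance
print(stochastic_pga(m0_dyne_cm=10 ** (1.5 * 5.5 + 16.05), dist_km=20.0))
```

In the actual application this step is performed by the SMSIM/EXSIM codes themselves, which additionally model duration, finite-fault geometry and other effects described in [1-6].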
The stochastic method has been selected mainly because of its proven accuracy in
seismic impact assessment and wide applicability despite its simplicity compared to
more elaborate and/or deterministic methods. In addition, the execution time of the
various implementations of the method for a single receiver is of the order of a few
minutes, which ranks the method among the most favorable in terms of the trade-off
between computational cost and accuracy of the results. Nevertheless, when a significant
number of sequential simulations has to be computed (e.g. for the dense grid of receivers
used in our application), the total execution time can become much larger and inadequate
for practical applications. Therefore, implementing the proposed application on the
Microsoft Azure cloud infrastructure, with its large number of available computing nodes,
allows the otherwise sequential simulation jobs to be completed within a few minutes.
Moreover, the application demands can be efficiently scaled up when necessary (e.g.
more scenarios, additional simulation sites, etc.), taking advantage of the elasticity
properties of the cloud-computing infrastructure.
From a technical point of view, the proposed method utilizes the Microsoft Azure
infrastructure as a computation backend, implementing the map-reduce execution
model. The first step is to populate a table in Azure Table Storage with the parameters
of the grid of virtual receivers for which the stochastic simulations will be performed
(the receivers table). The process that manages the parameters stored in the receivers
table can be executed either manually, giving the user the ability to add, delete and
modify the contents of the table, or automatically, based on preselected parameters
such as area coverage, epicenter location (inland or offshore), etc.
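As an illustration of this first step, a minimal sketch of how the receivers table could be populated is given below. It uses the current Azure SDK for Python purely for readability (the original implementation relies on the Windows Azure worker-role model); the table name, property names and grid bounds are hypothetical.

```python
# Sketch: populate the receivers table in Azure Table Storage with the virtual-receiver grid.
# Assumptions: azure-data-tables package; hypothetical table name, properties and grid bounds.
import os
from azure.data.tables import TableServiceClient

service = TableServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
table = service.create_table_if_not_exists("receivers")

lat_min, lat_max, lon_min, lon_max, step = 40.35, 40.85, 22.70, 23.20, 0.02  # example grid

row = 0
lat = lat_min
while lat <= lat_max:
    lon = lon_min
    while lon <= lon_max:
        table.upsert_entity({
            "PartitionKey": "thessaloniki_grid",   # one partition per study area
            "RowKey": f"receiver_{row:05d}",
            "Latitude": lat,
            "Longitude": lon,
            "SiteClass": "B",                      # placeholder site condition
        })
        row += 1
        lon = round(lon + step, 4)
    lat = round(lat + step, 4)
```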
The map-reduce model is implemented in Windows Azure with the following three
simple worker roles, similar to [7]: a partitioner role, a simulation role and a reducer
role. Inter-process communication within the Azure datacenter, as well as with external
systems such as the servers of the Seismological Data Center, is achieved using the
Azure Queue service. Whenever an earthquake is automatically located on the
Seismological Station servers (running the Linux operating system), a message with
the event information is sent through an Azure Queue to the Windows Azure data
center. This message is handled by the partitioner role, which queries Azure Table
Storage and, based on the results, sends to the simulation role a number of messages
via an Azure Queue that define the map-reduce pattern. Each simulation role instance
performs the following procedure to produce its results (a code sketch of the partitioner
and simulation roles is given after the list):
1) Creates the input file based on the station (receiver) parameters and the earthquake data.
2) Executes the legacy simulation process (implemented in FORTRAN) and stores
the output files in local storage.
3) Uploads the output files to Azure Blob Storage for persistence.
4) De-queues the specific message from the Azure Queue, so that the instance can
continue with the next simulation task.
5) Finally, when all simulation tasks are completed, the reducer role gathers the
necessary data from Azure Blob Storage and creates several output products, such as
a color shake-map file representing a specific ground motion measure, e.g. PGA (Peak
Ground Acceleration; see the example in Figure 1 and the sketch that follows it).
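The sketch below outlines the logic of the partitioner and simulation roles in Python. The original roles are implemented as Windows Azure worker roles around the legacy FORTRAN executable, so the queue names, message format and executable name used here are hypothetical, and the Azure calls use the current Python SDKs (azure-data-tables, azure-storage-queue, azure-storage-blob) only to keep the example concise.

```python
# Sketch of the partitioner and simulation roles (hypothetical names and message format).
import json, os, subprocess
from azure.data.tables import TableClient
from azure.storage.queue import QueueClient
from azure.storage.blob import BlobServiceClient

CONN = os.environ["AZURE_STORAGE_CONNECTION_STRING"]

def partition(event_message: str) -> None:
    """Fan the incoming event out into one simulation task per virtual receiver."""
    event = json.loads(event_message)  # e.g. {"id": ..., "lat": ..., "lon": ..., "mag": ...}
    receivers = TableClient.from_connection_string(CONN, "receivers")
    tasks = QueueClient.from_connection_string(CONN, "simulation-tasks")
    for rec in receivers.query_entities("PartitionKey eq 'thessaloniki_grid'"):
        tasks.send_message(json.dumps({
            "event": event,
            "receiver": {"id": rec["RowKey"], "lat": rec["Latitude"], "lon": rec["Longitude"]},
        }))

def simulation_worker() -> None:
    """One pass of the simulation role: take tasks, run the legacy code, persist the output."""
    tasks = QueueClient.from_connection_string(CONN, "simulation-tasks")
    results = BlobServiceClient.from_connection_string(CONN).get_container_client("results")
    for msg in tasks.receive_messages():
        task = json.loads(msg.content)
        # 1) Create the input file for the simulation code (the real format is EXSIM/SMSIM-style).
        with open("params.in", "w") as f:
            f.write(json.dumps(task))
        # 2) Execute the legacy FORTRAN simulation (hypothetical executable name).
        subprocess.run(["./exsim_stochastic", "params.in"], check=True)
        # 3) Upload the output to Blob storage for persistence.
        with open("synthetics.out", "rb") as f:
            results.upload_blob(f"{task['event']['id']}/{task['receiver']['id']}.out", f,
                                overwrite=True)
        # 4) De-queue the message so the instance can move on to the next task.
        tasks.delete_message(msg)
```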
The fact that earthquakes are by nature random, spike-type phenomena makes them
a perfect candidate for exploiting the elastic capabilities of a cloud infrastructure. When
an earthquake occurs, the number of worker nodes can be allocated dynamically based
on its parameters (such as preliminary location, magnitude, depth, etc.), so that the
simulation is executed in the minimum amount of time. After the simulation is
completed, these computational resources may be released in order to minimize the
cost. By using the elasticity of Windows Azure, the Seismological Data Center can both
execute the simulation in minimum time and keep the cost of these calculations at an
optimal level.
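As a rough illustration of this elasticity, the following is a hypothetical sizing heuristic that derives a target number of worker nodes from the preliminary event parameters; the thresholds are placeholders and not those of the actual application, and the resizing of the deployment itself (done through the Azure management interface) is not shown.

```python
# Hypothetical sizing heuristic: choose a worker-node count from preliminary event parameters.
def target_worker_nodes(magnitude: float, n_receivers: int, n_scenarios: int) -> int:
    if magnitude < 4.5:
        return 0                                  # too small: no shake-map run is triggered
    max_nodes = 200 if magnitude >= 6.0 else 50   # cap the deployment size by event importance
    tasks = n_receivers * n_scenarios             # one simulation task per receiver/scenario pair
    return min(max_nodes, max(1, tasks // 100))   # aim for roughly 100 tasks per node

nodes = target_worker_nodes(magnitude=5.5, n_receivers=600, n_scenarios=30)
# The deployment would then be resized to `nodes` instances and scaled back down once
# the reducer has produced the final maps, so that cost accrues only during the run.
```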
Figure 1. Typical shake map for Peak Ground Acceleration (PGA in cm/sec²), generated
for a Santorini volcano intra-caldera earthquake with magnitude M=5.5.
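A minimal sketch of the reducer stage (step 5 above), which could assemble the per-receiver PGA values into a map like Figure 1, is given below; the blob layout, output-file format and plotting choices are assumptions rather than the actual implementation.

```python
# Sketch of the reducer: collect per-receiver PGA values from Blob storage and draw a shake map.
# Assumes each result blob is named <event_id>/<receiver_id>.out and that its first line
# contains "lon lat pga_cm_s2" (hypothetical output format).
import os
import matplotlib.pyplot as plt
from azure.storage.blob import BlobServiceClient

CONN = os.environ["AZURE_STORAGE_CONNECTION_STRING"]

def build_shake_map(event_id: str, out_png: str = "shake_map_pga.png") -> None:
    results = BlobServiceClient.from_connection_string(CONN).get_container_client("results")
    lons, lats, pgas = [], [], []
    for blob in results.list_blobs(name_starts_with=f"{event_id}/"):
        first_line = results.download_blob(blob.name).readall().decode().splitlines()[0]
        lon, lat, pga = map(float, first_line.split())
        lons.append(lon); lats.append(lat); pgas.append(pga)
    # Contour the PGA values over the receiver grid and save a colored map.
    plt.tricontourf(lons, lats, pgas, levels=20, cmap="jet")
    plt.colorbar(label="PGA (cm/sec$^2$)")
    plt.xlabel("Longitude"); plt.ylabel("Latitude")
    plt.title(f"Shake map (PGA), event {event_id}")
    plt.savefig(out_png, dpi=150)
```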
Based on the previous description, the proposed workflow after the automatic
location of an earthquake becomes available comprises the following steps:
a) Execution of stochastic simulations assuming a point-source model, for a grid of
several hundred (e.g. 400-1000) virtual receivers (earthquake recording stations), in
order to produce preliminary strong ground motion spatial distribution maps, using the
automatic location of the earthquake.
b) Using existing typical focal mechanisms (estimated from current knowledge of the
seismotectonic properties of the area), stochastic simulations will be performed for the
same grid of receivers in order to produce more accurate results, assuming a finite-
source model. The strong ground motion spatial distribution maps will be updated with
the more accurate results from this modeling.
c) Critical information such as potential high-damage areas and maps with strong ground
motion spatial distribution will be disseminated to Aristotle University Seismological
Station scientists for scientific assessment.
d) The previous steps will be re-executed immediately after a manual solution for the
location of the earthquake or its focal mechanism becomes available. All maps will be
updated after each re-execution, in order to provide more accurate spatial shake
maps.
Critical information such as potential high-damage areas and maps with strong ground
motion spatial distribution will be disseminated to Aristotle University Seismological
Station scientists for scientific assessment, as well as to local and state authorities, in
order to proceed with all necessary risk mitigation actions.
The benefits of the proposed application have multiple levels of impact. The use of
the stochastic method for near-real-time purposes is computationally novel, as this will
be, to the best of our knowledge, the first time that the method is realized on a cloud
infrastructure. A typical application for a single earthquake involves simulations for 600
receivers, for which 30 typical rupture scenarios would be examined. The sequential
execution for this number of receivers corresponds to an execution time of 1000-3000
minutes (depending on the configuration), with minimal memory requirements (a few MB)
per simulation. Such computations can easily be realized on personal computers;
however, the results would require several hours (or even days), making their use for
near-real-time assessment of seismic motions impossible. This limitation can be
efficiently handled by cloud-computing infrastructures, since:
1) The large number of available computing nodes allows the realization of the
involved sequential simulation jobs within a few minutes. For example, execution of the
previously described application (single event, 30 scenarios, 300 simulation sites) on e.g.
200 cloud computing CPU cores would require a few minutes, making the near real-time
use possible.
2) The application demands can be efficiently scaled up when necessary (e.g. more
scenarios, additional simulation sites, etc.), taking advantage of the cloud elasticity,
without the need to invest the large cost involved with permanent computing
infrastructures.
3) Even when the application is used for high-seismicity areas (e.g. the broader
Aegean area), relatively strong earthquakes occur rather sparsely (e.g. roughly once a
month), corresponding to ~10-20 annual “production” runs. Therefore, the maintenance
of a large dedicated computational infrastructure for this specific application (even at a
national level) corresponds to a very high-cost investment that would be used relatively
sparsely and practically never depreciated. Hence, cloud computing is a very cost-
efficient approach, since computational resources are allocated and used only when
the corresponding application need arises.
The method is currently being tested for preselected areas (namely the broader
Thessaloniki area and Santorini island, Greece), chosen both in order to check the
reliability of the technique and to calibrate the cloud-based procedure against pre-existing
advanced modeling results [8-10], and because of the availability of instrumental and
modeling data for these test areas. The produced results can have a significant impact
on scientists, as well as on the wider public. Scientists such as seismologists and soil
and civil engineers, who are interested in computing or using information related to the
direct impact of strong earthquakes (e.g. expected or observed seismic motions and
damage distribution after a major seismic event), as well as more diverse disciplines
(sociologists, psychologists, etc.) interested in indirect, earthquake-induced effects such
as mass behavior, could benefit from our results. Moreover, the produced results could
be used by agencies that participate in seismic risk mitigation, especially those
responsible for organizing post-earthquake relief measures, such as general or
specialized governmental agencies and non-governmental organizations involved in
natural-disaster support actions. Finally, parts of the produced results could be used for
public awareness and/or to inform people who either wish to share their earthquake
experience and disseminate this information through the web, or simply want reliable
information about the expected or even observed consequences of strong seismic
events.
Acknowledgements
This work has been partly funded by the CLOUD-QUAKE pilot project of the VENUS-
C research initiative (http://www.venus-c.eu), funded by the European Union through the
7th Framework Programme (Grant Agreement number 261565). All cloud computing
was performed on the Windows Azure infrastructure (https://windows.azure.com), to
which Microsoft Corporation and Microsoft Research provided free access (~70,000
CPU hours) as part of the VENUS-C project. The help of both Microsoft
Corporation/Microsoft Research and the VENUS-C project is gratefully acknowledged.
References
[1] Boore, D. M. (1983). Stochastic simulation of high-frequency ground motions based
on seismological models of the radiated spectra, Bull. Seism. Soc. Am., 73, 1865-1894.
[2] Boore, D. M. (1996). SMSIM--Fortran programs for simulating ground motions from
earthquakes: version 1.0, U.S. Geol. Surv. Open-File Rept. 96-80-A and 96-80-B, 73 pp.
[3] Boore, D. M. (2003). Simulation of ground motion using the stochastic method, Pure
and Applied Geophysics 160, 635-675.
[4] Joyner, W.B. and D.M. Boore (1988). Measurement, characterization, and prediction
of strong ground motion, in Earthquake Engineering and Soil Dynamics II, Proc. Am.
Soc. Civil Eng. Geotech. Eng. Div. Specialty Conf., June 27-30, 1988, Park City,
Utah, 43-102.
[5] Motazedian, D., and G. Atkinson (2005). Stochastic finite-fault model based on
dynamic corner frequency, Bull. Seism. Soc. Am., 95, 995–1010.
[6] Boore, D. M. (2009). Comparing stochastic point-source and finite-source ground-
motion simulations: SMSIM and EXSIM, Bull. Seism. Soc. Am. 99, 3202-3216.
[7] Carrión, A., I. Blanquer, V. Hernández (2012), A Service-based BLAST command
tool supported by Cloud Infrastructures, Proceedings of the HealthGrid 2012,
Amsterdam, 21-23 May 2012, (accepted and publication pending).
[8] Skarlatoudis A.A., C.B. Papazachos, N. Theodoulidis, J. Kristek and P. Moczo,
(2010). Local site-effects for the city of Thessaloniki (N. Greece) using a 3D Finite-
Difference method: A case of complex dependence on source and model parameters,
Geoph. J. Int.,182, 279-298.
[9] Skarlatoudis A.A., C.B. Papazachos and N. Theodoulidis, (2011). Spatial distribution
of site-effects and wave propagation properties in Thessaloniki (N. Greece) using a 3D
Finite Difference method, Geoph. J. Int., 185, 485-513.
[10] Skarlatoudis A.A., C.B. Papazachos and N. Theodoulidis, (2012). Site response
study of the city of Thessaloniki (N. Greece), for the 04/07/1978 (M5.1) aftershock, using
a 3D Finite-Difference wave propagation method, Bull. Seism. Soc. Am., (accept. publ.).
