Seismic Hazard Model for the Middle-East Region, Laurentiu Danciu, Swiss Seismological Service, ETH Zurich, Switzerland & GEM Hazard Modeler; Karin Sesetyan & Mine Demircioglu, Kandilli Observatory and Earthquake Research Institute, Istanbul, Turkey
This document summarizes a seismic hazard model for the Middle East region. It includes 3 area source models, 9 fault source models, and a spatially smoothed seismicity model developed based on a declustered earthquake catalog. The models were constructed through a collaborative process involving multiple experts. The key elements summarized are:
- 143 area source zones defined based on seismicity patterns and tectonic features.
- Fault sources were selected based on being capable and having slip rates above 0.1 mm/year, with 3 confidence classes.
- Maximum magnitudes were assigned through various methods with sensitivity analysis performed.
- A logic tree incorporates the alternative source models and characterizations.
- The models were developed to be stable over the coming years, unless significant new seismic information calls for a major revision.
1. Seismic Hazard Model for the Middle-East Region
Laurentiu Danciu, Swiss Seismological Service, ETH Zurich, Switzerland & GEM Hazard Modeler
Karin Sesetyan & Mine Demircioglu, Kandilli Observatory and Earthquake Research Institute, Istanbul, Turkey
EMME Final Meeting, September 30th – October 2nd, Istanbul, Turkey
5. Stability
1. Provide assurance that the numerical hazard results will be stable for the coming years (50 years?)
2. Unless significant new seismic information, which could occur at any time, calls for a major revision
7. What does "Consensus" stand for?
• 1) There is not likely to be "consensus" (as the word is commonly understood) among the various experts, and
• 2) no single interpretation concerning a complex earth-sciences issue is the "correct" one.
SSHAC: Recommendations for PSHA: Guidance on Uncertainty and Use of Experts
Most likely, the only consensus is that there is no consensus!
8. SHARE Project – DB Stats
• 3 source models
• 960 end-branches
• 12 intensity measure types
• 7 return periods [50 to 10,000 years]
• Mean, median and four quantiles
• 130,000 sites
‣ Hazard maps: 504
‣ Hazard curves: 9.36 million
‣ Uniform hazard spectra: 5.46 million
‣ Disaggregation: ongoing
EMME Project – DB Stats (Dynamic Model): 55,000 sites
9. Overview
• Earthquake Catalog
• Maximum Magnitude
• Seismic Source Models
– Area Source Model
– Fault Source Model
– Spatially Smoothed Seismicity
10. EMME Earthquake Catalog
• Historical part (up to 1900)
• Early and modern instrumental (up to ~2006)
• Harmonized in terms of Mw
Total: 27,174 events
11. EMME Earthquake Catalog
• Seismicity models require:
– A declustered earthquake catalog of independent events
– Completeness intervals for estimating the Poissonian (time-independent) earthquake rates
• Declustering method:
– Windowing approach (Gardner and Knopoff, 1974), based on the windows provided by Grünthal (1985)
– After declustering: 10,524 events (see the sketch below)
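The windowing idea can be sketched in a few lines of Python. This is a generic illustration, not the EMME implementation: the window coefficients below are the commonly cited Gardner & Knopoff (1974) fits, whereas the model itself used the windows of Grünthal (1985).

```python
import numpy as np

def gk_windows(mw):
    """Space (km) / time (days) windows; commonly cited Gardner &
    Knopoff (1974) fits, used here as placeholders for the Gruenthal
    (1985) windows adopted by the model."""
    dist = 10.0 ** (0.1238 * mw + 0.983)
    time = np.where(mw >= 6.5,
                    10.0 ** (0.032 * mw + 2.7389),
                    10.0 ** (0.5409 * mw - 0.547))
    return dist, time

def decluster(days, lons, lats, mags):
    """Keep mainshocks; flag events falling inside the space-time
    window of a larger shock as dependent (fore/aftershocks)."""
    keep = np.ones(len(mags), dtype=bool)
    for i in np.argsort(mags)[::-1]:           # largest magnitude first
        if not keep[i]:
            continue
        d_win, t_win = gk_windows(mags[i])
        dy = 111.2 * (lats - lats[i])          # rough degrees -> km
        dx = 111.2 * np.cos(np.radians(lats[i])) * (lons - lons[i])
        dependent = (np.hypot(dx, dy) <= d_win) \
            & (np.abs(days - days[i]) <= t_win) & (mags < mags[i])
        keep[dependent] = False
    return keep
```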
17. Maximum Magnitude
• The largest magnitude that a seismogenic region is capable of generating
• Upper-bound magnitude for the earthquake recurrence (frequency-magnitude) curve
• Maximum magnitude assessment (super-zones):
– Historical seismicity record
– Location uncertainties
– Analogies to tectonic regions
– Added increment (0.30); see the worked example below
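As a worked illustration of the super-zone recipe above (largest observed magnitude plus the 0.30 increment); the magnitudes are made-up values:

```python
def superzone_mmax(observed_mags, increment=0.30):
    """Largest observed magnitude in the super-zone plus a fixed
    increment allowing for the shortness of the catalogue."""
    return max(observed_mags) + increment

print(superzone_mmax([6.8, 7.1, 7.4]))  # 7.4 observed -> ~7.7
```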
23. Maximum Magnitude: Sensitivity
• Maximum magnitude impacts the activity computation, if used to anchor the expert fitting
• 5 to 20% increased hazard values
• Return-period dependent
• The pair of maximum magnitude and recurrence rates needs to be carefully revised together
25. Area Source Model
• Classical area source zones based on tectonic findings and their correlation with up-to-date seismicity
• Derived from seismicity patterns
– Ensure the zonation adequately reflects this pattern
• Surface projection of identified active faults (capable of generating earthquakes)
26. Model Construction: Phase One
• Country-based models
• Phase one:
– Overlapping sources at national borders [trying to keep the original information]
– Remove duplicates (the same source defined within multiple countries)
– Eliminate zones too small to be analyzed (spatially smoothed seismicity takes care of these)
27. Model Construction: Phase Two
– Simplify unnecessarily or artificially complex zonation
– Reshape according to the known main seismogenic features (i.e., known faults)
– Local experts' feedback
– Reconcile different interpretations
– New sources re-defined after technical discussions among the national representatives/local experts
28. Area Source Model
• 143 shallow crustal area source zones,
• 6 for modeling the deep seismicity, and
• 5 complex faults
29. Source Characterization
• Main assumptions:
– Homogeneous, declustered catalogue
– Completeness defined for 18 super-zones spanning the entire region
– Maximum likelihood approach (Weichert, 1984)
– Truncated Gutenberg-Richter magnitude-frequency distribution:
• 10^a – annual number of events of magnitude greater than or equal to zero
• b-value
• Truncated at each assigned maximum magnitude
• For each source, three magnitude-frequency distributions were derived
• A Matlab toolbox was developed (a Python sketch of the truncated G-R rates follows below)
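A minimal Python sketch of the truncated Gutenberg-Richter parameterisation described above, turning (a, b, Mmax) into annual rates per magnitude bin; the Matlab toolbox and OpenQuake implement their own, more complete versions:

```python
import numpy as np

def truncated_gr_rates(a_val, b_val, m_min, m_max, bin_width=0.1):
    """Annual rates per magnitude bin for a Gutenberg-Richter
    distribution truncated at m_min and m_max.  10**a is the annual
    number of events with magnitude >= 0, as on the slide."""
    beta = b_val * np.log(10.0)
    n_mmin = 10.0 ** (a_val - b_val * m_min)   # cumulative rate above m_min
    edges = np.arange(m_min, m_max + bin_width / 2, bin_width)
    # Doubly truncated CDF between m_min and m_max
    cdf = (1.0 - np.exp(-beta * (edges - m_min))) \
        / (1.0 - np.exp(-beta * (m_max - m_min)))
    return edges[:-1] + bin_width / 2.0, n_mmin * np.diff(cdf)

# Example with illustrative values: a = 4.0, b = 1.0, Mmax = 7.5
mags, rates = truncated_gr_rates(a_val=4.0, b_val=1.0, m_min=5.0, m_max=7.5)
```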
30. Source Characterization: Issues
• Sources with a limited number of events:
– Sources with fewer than 15 events were assigned a default activity rate corresponding to their region
– 30 area source zones have fewer than 15 events
• Estimation stability is achieved with more than 30 events per source
47. EMME Faults Dataset
• Fault source model derived from the faults database collected within WP02:
– Total number: 3,397 fault segments
– Total length: 91,551 km
48. Fault Sources
• Criteria for selecting active faults to be used for hazard assessment (a selection sketch follows after this list):
– Identified active faults [capable of earthquakes]: Northern Anatolian Faults, Marmara Faults, Zagros Transform Faults
– Slip rate of at least 0.10 mm/year (1 m in 10,000 years – Neogene)
– Maximum magnitude of at least 6.20
– Fully parameterized:
• Geometry
• Slip rates
– Confidence classes:
• Class A: complete information provided by the compiler
• Class B: partial information provided by the compiler
• Class C: limited information provided
• Class D: only top trace available
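A toy filter expressing the criteria above. The field names are hypothetical, and the assumption that classes A-C are retained (class D having only a top trace) is ours, not stated on the slide:

```python
MIN_SLIP_RATE_MM_YR = 0.10
MIN_MMAX = 6.20

def select_fault_sources(faults):
    """Apply the slide's selection criteria to a list of fault records
    (dicts with hypothetical field names)."""
    return [
        f for f in faults
        if f["capable"]
        and f["slip_rate_mm_yr"] >= MIN_SLIP_RATE_MM_YR
        and f["m_max"] >= MIN_MMAX
        and f["confidence_class"] in ("A", "B", "C")  # assumption: D excluded
    ]
```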
49. Fault Sources
• Confidence classes:
• Class A (red): complete information provided by the compiler
• Class B (green): partial information provided by the compiler
• Class C (blue): limited information provided
53. Fault Sources – Class C
• Class C: fault trace and fault type information available
• Maximum magnitude:
– Estimated from fault size
– Slip rate:
• First slip-rate estimate as proposed by the USGS
57. Active Faults
How to characterize the seismic potential of the faults?
- Convert slip-rates to seismicity
58. Fault Source Model Characterization
Procedure:
1. Generate a buffer region of 20 km around each fault
59. Fault Source Model Characterization
Procedure, continued (a buffering sketch follows below):
2. Remove earthquakes within the buffer zone
3. Activity on faults computed from slip rates
4. Activity in the background, based on the "outside" catalogue
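Steps 1-2 can be sketched with shapely. The 1° ≈ 111 km conversion is a rough flat-earth shortcut, and the event record layout is hypothetical; a production version would buffer in a projected coordinate system:

```python
from shapely.geometry import LineString, Point

def split_catalogue_by_buffer(trace_lonlat, events, buffer_km=20.0):
    """Separate catalogue events inside vs. outside a fault buffer.
    `events` is a list of dicts with 'lon'/'lat' keys (hypothetical)."""
    zone = LineString(trace_lonlat).buffer(buffer_km / 111.0)
    inside, outside = [], []
    for ev in events:
        bucket = inside if zone.contains(Point(ev["lon"], ev["lat"])) else outside
        bucket.append(ev)
    return inside, outside  # inside -> fault activity; outside -> background
```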
62. Faults Characterization
Activity rates are calculated from geologic information:
• Slip rate
• Fault length / aspect ratio
• Maximum magnitude
Recurrence rate model:
• Anderson & Luco (1983), Model 2 (a Python transcription follows below):
• b-value assumed from the corresponding completeness super-zones
• Integration from Mmin = 5.00 to the fault's Mmax

$$N_2(M) = \left(\frac{\bar d - \bar b}{\bar b}\right)\left(\frac{\dot s}{\bar\beta}\right)\left[e^{\bar b\,(M_{\max} - M)} - 1\right] e^{-(\bar d/2)\,M_{\max}}$$
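A direct transcription of the relation above into Python, taking the standard Anderson & Luco conventions b̄ = b ln 10 and d̄ = d ln 10 with d ≈ 1.5 from moment scaling. The β̄ term bundling seismic-moment and fault-area scaling is left as an input, since its EMME parameterisation is not given on the slide:

```python
import numpy as np

def anderson_luco_model2(mags, slip, m_max, b_val, beta_bar, d_val=1.5):
    """Cumulative annual rate N2(M >= m) from Anderson & Luco (1983),
    Model 2.  `slip` is the fault slip rate; `beta_bar` is the
    moment/area scaling constant (assumed input)."""
    b_bar = b_val * np.log(10.0)
    d_bar = d_val * np.log(10.0)
    mags = np.asarray(mags, dtype=float)
    return ((d_bar - b_bar) / b_bar) * (slip / beta_bar) \
        * (np.exp(b_bar * (m_max - mags)) - 1.0) \
        * np.exp(-(d_bar / 2.0) * m_max)

# Rates from Mmin = 5.0 up to the fault's Mmax, as on the slide
# (slip and beta_bar values are illustrative only):
rates = anderson_luco_model2(np.arange(5.0, 7.5, 0.1), slip=2.0,
                             m_max=7.5, b_val=1.0, beta_bar=1.0e3)
```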
63. Activity Rates – Background
• Smoothed spatially with a variable kernel (a smoothing sketch follows below):

$$K(\vec r, d_i) = \frac{c(d_i)}{\left(r^2 + d_i^2\right)^{1.5}}$$

– r: epicentral distance
– d_i: variable epicentral distance to the n_v-th nearest neighbor
• Optimization of the distance parameter with retrospective tests
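A sketch of the variable-bandwidth smoothing: each event receives a bandwidth d_i equal to its distance to the n_v-th nearest neighbour, and the power-law kernel above is stacked on a grid. The normalisation c(d_i) = d_i / (2π), which makes each event integrate to unit weight in 2-D, is our choice; the model's exact normalisation is not given on the slide:

```python
import numpy as np

def adaptive_kernel_rates(grid_xy, event_xy, n_v=3):
    """Stack K(r, d_i) = c(d_i) / (r**2 + d_i**2)**1.5 over all events,
    with per-event bandwidth d_i = distance to the n_v-th neighbour.
    Coordinates are (n, 2) arrays in km."""
    pair_d = np.linalg.norm(event_xy[:, None, :] - event_xy[None, :, :], axis=-1)
    d_i = np.sort(pair_d, axis=1)[:, n_v]       # column 0 is the event itself
    rates = np.zeros(len(grid_xy))
    for xy, d in zip(event_xy, d_i):
        r2 = np.sum((grid_xy - xy) ** 2, axis=1)
        rates += (d / (2.0 * np.pi)) / (r2 + d * d) ** 1.5
    return rates  # event density per unit area
```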
64. Kernel Optimization: Retrospective Testing
• Optimize the kernel using likelihood tests (a scoring sketch follows below)
• Split the catalog into a learning period and a target period
• Optimize on a 5-year target period
• Use the best likelihood value to generate the model rates
• Learning period: 1000–2002; target period: 2002–2007
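The retrospective test can be scored with a joint Poisson log-likelihood over grid cells, a standard choice for such kernel optimisations; this is a sketch, as the exact EMME test statistic is not stated on the slide:

```python
import numpy as np
from scipy.special import gammaln

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed cell counts given a forecast
    (rates built from the 1000-2002 learning catalogue)."""
    lam = np.maximum(np.asarray(forecast_rates, float), 1e-12)  # guard log(0)
    k = np.asarray(observed_counts, float)
    return float(np.sum(k * np.log(lam) - lam - gammaln(k + 1.0)))

# Usage idea: score forecasts for a range of kernel parameters against
# the 2002-2007 target catalogue and keep the best-scoring parameter.
```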
70. 15 Years of Seismicity, Mw >= 6.5
2013-09-24 Awaran, Pakistan
2013-04-16 East of Khash, Iran
2011-10-23 Eastern Turkey
2011-01-18 Southwestern Pakistan
2010-12-20 Southeastern Iran
2009-01-03 Hindu Kush region, Afghanistan
2008-10-05 Kyrgyzstan
2005-12-12 Hindu Kush region, Afghanistan
2005-10-08 Pakistan
2004-04-05 Hindu Kush region, Afghanistan
2003-12-26 Southeastern Iran
2002-06-22 Western Iran
2002-03-03 Hindu Kush region, Afghanistan
2002-02-03 Western Turkey
2001-01-26 Gujarat, India
2000-12-06 Turkmenistan
2000-11-25 Caspian Sea, offshore Azerbaijan
1999-11-12 Western Turkey
1999-11-08 Hindu Kush region, Afghanistan
1999-08-17 Western Turkey
1999-03-04 Southern Iran
1998-05-30 Hindu Kush region, Afghanistan
1998-03-14 Eastern Iran
71. Before the 24th September 2013 Event in Pakistan
EMME results, before the earthquake
73. Spatially Smoothed Seismicity
• Based on:
– Up-to-date seismicity
– The declustered catalogue
• Main assumption:
– Earthquakes' self-similarity: earthquakes occur near clusters of previous smaller earthquakes
– Derived equally spaced [10 x 10 km] cells
– 53,300 non-overlapping cells
– The earthquake rates determined for the cells are spatially smoothed using a Gaussian smoothing kernel [Frankel, 1995] (a sketch follows below)
– Constant kernel size of 25 km
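A sketch of the fixed-kernel smoothing step: counts on the 10 x 10 km grid are convolved with Frankel's (1995) exp(-d²/c²) Gaussian weight using c = 25 km. Truncating the kernel at 3c is our simplification:

```python
import numpy as np
from scipy.signal import convolve2d

def frankel_smooth(counts, cell_km=10.0, c_km=25.0):
    """Smooth gridded earthquake counts with the Frankel (1995)
    exp(-d**2 / c**2) kernel; `counts` is a 2-D array of cell counts."""
    c = c_km / cell_km                       # correlation distance in cells
    half = int(np.ceil(3.0 * c))             # truncate kernel at 3c
    ax = np.arange(-half, half + 1, dtype=float)
    gx, gy = np.meshgrid(ax, ax)
    kernel = np.exp(-(gx**2 + gy**2) / c**2)
    kernel /= kernel.sum()                   # preserve the total event count
    return convolve2d(counts, kernel, mode="same")
```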
77. Summary
• Building a regional seismic hazard model is a collective effort
• Aim at generating an up-to-date, flexible, and scalable database that will permit continuous update, refinement, and analysis
• Data will be parameterized and input into the database in a specific format
[Diagram: Data, Interpretations, and Assumptions flow as INPUT through an "Easy Review" box into the Hazard Software "Black Box", which produces the OUTPUT]
78. Summary
• Transparent computational procedure, with all input files available as well as the software packages (Hazard Modeller's Toolkit, OpenQuake)
• Each dataset has a certain degree of completeness, but there is room for improvement; specifically:
– The depth information of the events
– Maximum magnitude definition
– More parameterized faults
– Velocities from GPS data
• Revision of all source models
• What are the weak points of each model?
• Road map to the final deliverable