This document summarizes a seismic hazard model for the Middle East region. It includes 3 area source models, 9 fault source models, and a spatially smoothed seismicity model developed based on a declustered earthquake catalog. The models were constructed through a collaborative process involving multiple experts. The key elements summarized are:
- 143 area source zones defined based on seismicity patterns and tectonic features.
- Fault sources were selected based on being capable and having slip rates above 0.1 mm/year, with 3 confidence classes.
- Maximum magnitudes were assigned through various methods with sensitivity analysis performed.
- A logic tree incorporates the alternative source models and characterizations.
- The models were developed to be stable.
This study conducted a seismic risk assessment for Portugal using probabilistic seismic hazard analysis together with newly developed exposure and vulnerability models. A probabilistic seismic hazard model was developed considering logic trees for ground motion models, seismic source characterizations, and other parameters. An exposure model was created using a national building census to characterize Portugal's building stock. Fragility functions were developed for reinforced concrete and masonry structures. Probabilistic loss estimates were calculated at different spatial scales to identify regions in Portugal where risk mitigation measures should be prioritized.
The document discusses tools and datasets for seismic hazard analysis from site-specific to global scales. It describes the OpenQuake engine and Hazard Modeller's Toolkit (HMTK) which can be used for classical and event-based probabilistic seismic hazard analysis (PSHA) at various scales. The OpenQuake Ground Motion Toolkit helps with selection and weighting of ground motion prediction equations. These tools are applied in site-specific analyses, and for developing national, regional, and global seismic hazard models using various data sources on earthquakes, faults, and strain.
The document summarizes the products and applications of GEM's Hazard program. It outlines five global datasets created through international projects including historical earthquake archives, instrumental seismicity catalogs, active fault databases, and ground motion prediction equations. It also describes regional seismic hazard models compiled in a database and the OpenQuake open-source software for calculating seismic hazard and risk. Key applications of the products include use in building codes, insurance catastrophe modeling, and site-specific engineering analyses.
During the first phase of GEM Risk, several key datasets and tools were developed by the scientific community and made available through GEM. This includes a building taxonomy, exposure models, vulnerability assessment guidelines, and global earthquake consequences and exposure databases that are accessible on the OpenQuake platform. GEM also conducted regional workshops and technical training to facilitate risk assessment collaboration and technology transfer.
This document summarizes a study that analyzed the damage scenarios for reinforced concrete precast industrial structures in Tuscany, Italy due to earthquakes. The study generated a population of building models based on inventory data and fragility curves. Nonlinear analyses were performed under earthquake ground motions. Limit states like yielding and collapse were defined. The results showed that accounting for both flexural and connection failures provided more accurate fragility curves compared to flexural failures alone. Connection failures were highly dependent on the assumed friction coefficient. Finally, probabilistic collapse maps for a Mw 6.5 scenario earthquake in Tuscany were presented.
The document summarizes the development of satellite modeling for the National Solar Radiation Database (NSRDB) to provide accurate surface solar radiation data. It describes the evolution from empirical to physical models using satellite measurements and ancillary data as inputs to radiative transfer models. Validation shows the new 2005-2012 dataset has a mean bias error of less than 5% for GHI and DNI compared to surface measurements, though uncertainty remains for cloudy cases. Future work aims to improve the model with higher resolution data and better representation of aerosols and surfaces.
5 IGARSS_Riishojgaard July 25 2011_rev2.ppt (grssieee)
The document discusses the Joint Center for Satellite Data Assimilation's (JCSDA) work related to the upcoming launch of the National Polar-orbiting Partnership (NPP) satellite. The JCSDA is preparing operational weather prediction services to assimilate data from NPP by improving radiative transfer models, developing emissivity databases, and conducting observing system simulation experiments. After launch, the JCSDA will monitor NPP data and work to incorporate it into operational weather forecasting systems to improve predictions and generate tens of billions of dollars in economic benefits annually.
This document summarizes work from an optimization subgroup in remote sensing. It discusses three main topics: 1) defining optimization problems and algorithms in remote sensing, 2) the role of optimization in remote sensing retrievals, and 3) potential projects focused on improving temperature and humidity profile retrievals from satellite data and intersatellite calibration. The subgroup explored using neural networks and Gaussian processes to develop atmospheric temperature and humidity profiles from HIRS satellite data.
The document discusses a methodology for improving wind speed forecasts through synergizing outputs from two numerical weather prediction (NWP) models - the Global Environmental Multiscale model (GEM) and the North American Mesoscale model (NAM). Wind speed measurements from four meteorological towers are used to evaluate the individual NWP models and their combined forecasts. Results show the combined GEM-NAM forecasts reduce root mean square error by up to 20% compared to the individual models, indicating improved forecast accuracy through optimal combination of the two NWP models.
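A minimal sketch of how such an optimal linear combination of the two NWP outputs could be computed, assuming time-aligned forecast and tower-measurement series are available as NumPy arrays (the data and weighting scheme below are illustrative, not the study's actual method):

```python
import numpy as np

def combine_forecasts(f_gem, f_nam, observed):
    """Least-squares weights for a linear blend of two wind-speed forecasts.

    Fits w0 + w1*f_gem + w2*f_nam to the tower observations on a training
    period; the weights can then be applied to new forecasts.
    """
    X = np.column_stack([np.ones_like(f_gem), f_gem, f_nam])
    w, *_ = np.linalg.lstsq(X, observed, rcond=None)
    return w

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Illustrative synthetic numbers (not data from the study):
rng = np.random.default_rng(0)
truth = 8.0 + 2.0 * rng.standard_normal(500)
gem = truth + 1.2 * rng.standard_normal(500)
nam = truth + 1.5 * rng.standard_normal(500)
w = combine_forecasts(gem, nam, truth)
blend = w[0] + w[1] * gem + w[2] * nam
print(rmse(gem, truth), rmse(nam, truth), rmse(blend, truth))
```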
This document discusses the Copernicus Programme and use of Sentinel satellite data for agriculture and forestry. It provides an overview of the Copernicus programme, including its three components: space, in-situ, and services. It describes the five Sentinel satellite missions and their characteristics. The document outlines how Sentinel data can be used for applications like crop monitoring, soil moisture mapping, and detection of clearcuts. It highlights the Copernicus Land Monitoring Service and available agriculture products. In conclusion, it discusses benefits of the open data policy and upcoming Copernicus user events.
New features presentation: meteodyn WT 4.8 software - Wind Energy (Jean-Claude Meteodyn)
New features of meteodyn WT, a CFD software package for wind resource assessment and wind park optimisation: a worldwide terrain database, convergence improvements, and other improvements.
Math 390 - Machine Learning Techniques PresentationDarragh Punch
This document discusses various machine learning techniques for modeling solar radiation, including artificial neural networks (ANNs), support vector machines (SVMs), radial basis functions (RBFs), support vector regression (SVR), Gaussian processes (GP), and numerical weather prediction (NWP). ANNs can predict optimal photovoltaic system layouts and "learn" from examples. SVMs using RBF kernels are more accurate than other models for solar radiation forecasting. SVR provides better representations than multi-class SVMs. GP is the best predictor of solar irradiance. NWP models sample current weather conditions to predict future conditions up to 6 hours ahead, which is relevant for long-term solar farm planning.
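As an illustration of one of the listed techniques, the sketch below fits an RBF-kernel support vector regression to synthetic irradiance data with scikit-learn; the features, values, and hyperparameters are placeholders rather than anything taken from the presentation:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Placeholder features: hour of day and previous-hour clear-sky index.
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(6, 18, 300), rng.uniform(0.2, 1.0, 300)])
y = 900 * np.sin(np.pi * (X[:, 0] - 6) / 12) * X[:, 1]  # synthetic GHI, W/m^2

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=10.0))
model.fit(X, y)
print(model.predict([[12.0, 0.8]]))  # predicted irradiance near solar noon
```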
Quality of ground data for assessment and benchmarking (IrSOLaV Pomares)
This document discusses the importance of assessing the quality of ground-based solar radiation data used for model development, benchmarking, and assessment. It outlines several existing quality control procedures from organizations like BSRN, ARM, and NREL that check for physically realistic values and consistency between radiation components. Common errors found in some databases are also described, such as errors in the recorded time reference affecting clearness index calculations and erroneous beam radiation near sunrise/sunset. The document raises questions about whether Task 36 should propose a general quality control procedure and which criteria should be included in a solar radiation data guide.
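A minimal sketch of the kind of checks such quality-control procedures apply; the thresholds below are illustrative only, and the exact limits differ between BSRN, ARM, and NREL:

```python
import numpy as np

def qc_flags(ghi, dni, dhi, sza_deg, closure_tol=0.08):
    """Flag samples failing simple physical-limit and component-closure checks.

    ghi, dni, dhi : global, direct-normal and diffuse irradiance (W/m^2)
    sza_deg       : solar zenith angle (degrees)
    """
    cos_z = np.cos(np.radians(sza_deg))
    flags = {}
    flags["negative"] = (ghi < 0) | (dni < 0) | (dhi < 0)
    flags["above_physical_limit"] = ghi > 1400  # rough upper bound, W/m^2
    # Component consistency: GHI should be close to DHI + DNI*cos(zenith) by day.
    closure = dhi + dni * np.clip(cos_z, 0.0, None)
    daytime = sza_deg < 85
    flags["closure"] = daytime & (np.abs(ghi - closure) >
                                  closure_tol * np.maximum(closure, 50.0))
    return flags
```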
Met Éireann has expanded from monitoring Irish climate to conducting climate modelling. It was initially involved in regional climate modelling through projects like C4I. It has since joined the EC-Earth consortium to run its own global climate model. EC-Earth simulations will be contributed to CMIP5 and used for national climate impact research. Met Éireann also maintains regional modelling capabilities and plans high-resolution regional simulations.
1. The document discusses using sky imagers for short-term solar forecasting, as traditional methods lack sufficient spatial and temporal resolution for small-scale applications.
2. The proposed sky imager forecast model involves 7 steps: image analysis, cloud detection, cloud projection, shadow projection, irradiance modeling, predicting cloud motion to generate forecasts, and PV power modeling.
3. Accurate cloud detection, projection, and shadow projection are challenging due to issues like cloud inhomogeneity, perspective errors with distance from camera, and sensitivity to errors in estimated cloud base height.
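The sensitivity to cloud base height mentioned in point 3 can be seen from first-order shadow geometry (a generic sketch, not the model's actual projection step): a cloud element at height $h$ above the ground, with solar zenith angle $\theta_z$, casts its shadow a horizontal distance

$$ d = h \tan\theta_z $$

from the point directly beneath it, so an error $\delta h$ in the estimated cloud base height displaces the projected shadow by roughly $\delta d = \delta h \tan\theta_z$, which grows rapidly at low sun elevations.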
Solar Radiation Estimation based on Digital Image Processing (Prashant Pal)
This document discusses a method for estimating solar radiation using digital image processing of sky camera images. Sky camera images are divided into three areas and pixel values from clear and cloudy day images are analyzed. Correlations between pixel values and solar altitude are used to create databases for clear and cloudy conditions. Beam, diffuse, and global radiation values are estimated based on averages of pixel values from the three areas and compared to measured values, achieving reasonably low error rates. The method provides information that can improve solar plant performance and operation based on meteorological conditions.
The European Copernicus programme with its Sentinel satellites is one of the most ambitious Earth observation programmes to date, with all data being freely accessible. Copernicus addresses several thematic areas including land, marine, atmosphere, climate change, emergency management and security. Different satellite types have been launched and more will follow; hence, weather-independent radar data as well as optical and infrared data are now available. In Europe the revisit time is between 3 and 5 days, allowing the same areas to be monitored at high frequency. Actual land use, forest structure, and vegetation phases can be recorded promptly, to name only a few examples. While the Copernicus programme is well known in the Earth observation community, the new data sets are still widely unnoticed or underused in the GIS community as well as in public administration, partly due to the sheer amount of available data in the petabyte range and the need for considerable computational power. This is a great opportunity for specialized service providers to develop new applications for administration, science, and business in order to find new ways of retrieving information from petabytes of raw data.
In our talk we will present an open source approach to processing such data in a cloud-based system, providing standardized OGC Web Services through the GeoServer and MapProxy software. The backend of the system post-processes and analyzes Sentinel data in a timely, automated way using the GRASS GIS and GDAL software. We have developed a REST-API-based system that allows the user to automatically derive thematic data layers based on algorithms provided by the portal. This greatly simplifies the user's work, since custom thematic layers can be generated without deep technical knowledge of software, hardware, or time-series management. We believe that this approach is likely to widen the potential user group of the Copernicus programme. At the same time, it connects two worlds that are often unnecessarily kept apart: the GIS and the remote sensing communities.
The presentation is completed by some examples and practical use cases, illustrating the idea of the workflow and the architecture of the portal.
The document summarizes research on simulating satellite brightness temperature (BT) data using land surface models and observations. Key points:
- Researchers developed a two-phase system to simulate gridded AMSR-E BT data using the Community Land Model (CLM), a microwave emissivity model, and calibration with SCE-UA algorithm and AMSR-E observations.
- The system calculates sub-grid land states from CLM, simulates BT from each patch, and calibrates wetland emissivity to minimize differences from observed BT.
- Results showed the calibrated wetland emissivities transferred well to another location and improved soil moisture estimates when assimilated using an ensemble Kalman filter.
The document provides an introduction to seismic hazard analysis which includes four main steps:
1) Characterization of seismic sources and estimation of seismicity parameters for each source.
2) Selection of ground motion attenuation models.
3) Quantification of seismic hazard by calculating probabilities of exceeding certain ground motion levels.
4) Mapping of seismic hazard across a region.
It then discusses source characterization in more detail, including defining fault rupture zones, magnitude-area relations, earthquake catalogs, and magnitude-frequency distribution models such as the characteristic earthquake and Gutenberg-Richter models.
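For reference, the Gutenberg-Richter magnitude-frequency relation mentioned above is conventionally written as

$$ \log_{10} N(\geq m) = a - b\,m , $$

where $N(\geq m)$ is the annual number of earthquakes of magnitude $m$ or larger in a source and $a$, $b$ are source-specific constants. In the classical (Cornell-type) formulation, step 3 then amounts to integrating over magnitude and distance for every source to obtain the annual rate of exceeding a ground-motion level $x$ at the site,

$$ \lambda(X > x) = \sum_i \nu_i \iint P(X > x \mid m, r)\, f_M(m)\, f_R(r)\, \mathrm{d}m\, \mathrm{d}r , $$

with $\nu_i$ the activity rate of source $i$ and $P(X > x \mid m, r)$ supplied by the attenuation model of step 2. These are standard textbook forms rather than equations taken from the document itself.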
We perform stochastic finite fault simulations of ground motions in Istanbul, Turkey based on fault rupture scenarios. The region is divided into grids and synthetic site-specific ground motions are generated for each grid using calibrated source, path, and site models. The models are calibrated by simulating a 2011 M5.1 earthquake in the Marmara Sea. Grid-based synthetic ground motions are generated for Istanbul for a M7.4 earthquake scenario. Model parameters are verified by comparing to ground motion prediction equations.
This document discusses extending the Rangeland Hydrology and Erosion Model (RHEM) from hillslopes to watershed and large areas using the KINEROS2 and AGWA hydrology models. It provides an overview of KINEROS2 and AGWA capabilities for modeling hydrology, erosion, and sediment transport at various scales. It also discusses challenges in obtaining RHEM parameters over large areas and potential approaches using data from the National Resources Inventory, ecological site descriptions, remote sensing, and regression relationships. The document concludes with next steps around improving parameterization and integrating state and transition models and remote sensing data.
This document presents a Bayesian methodology for retrieving soil parameters like moisture from SAR images. It begins by introducing the importance of soil moisture monitoring and the opportunity provided by Argentina's upcoming SAOCOM SAR satellite. It then discusses limitations of traditional retrieval models in accounting for speckle noise and terrain heterogeneity. The document proposes a Bayesian approach using a multiplicative speckle model within a likelihood function to estimate soil moisture and roughness from SAR backscatter measurements. Simulation results show the Bayesian method retrieves soil moisture across the full measurement space and provides error estimates, with improved precision at higher numbers of looks.
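A schematic of the Bayesian formulation described above (a generic sketch of the approach, not the paper's exact equations): writing $\theta = (m_v, s)$ for soil moisture and roughness, an $L$-look multiplicative speckle model makes the measured backscatter $\sigma^0$ gamma-distributed around the prediction $\sigma^0_{\mathrm{mod}}(\theta)$ of a surface-scattering model, so that

$$ p(\theta \mid \sigma^0) \propto p(\sigma^0 \mid \theta)\, p(\theta), \qquad p(\sigma^0 \mid \theta) = \frac{L^{L}\,(\sigma^0)^{L-1}}{\Gamma(L)\,\big[\sigma^0_{\mathrm{mod}}(\theta)\big]^{L}} \exp\!\left( -\frac{L\,\sigma^0}{\sigma^0_{\mathrm{mod}}(\theta)} \right) . $$

The posterior yields both point estimates and credibility intervals for $m_v$, and its spread narrows as the number of looks $L$ increases, consistent with the improved precision reported above.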
The document describes the principles of operation and first results of SMOS, a satellite mission to measure soil moisture and ocean salinity. It discusses the basic principles of synthetic aperture radiometry used by SMOS and describes the MIRAS instrument, including its array topology, receivers, digital correlator system, and calibration system. It also addresses instrument performance metrics like angular resolution and radiometric sensitivity. Lastly, it discusses image reconstruction algorithms and geolocalization of retrieval products.
The document discusses applications of the integral equation model (IEM) in microwave remote sensing for retrieving land surface parameters. It presents the IEM and its advantages over traditional models. It then describes using IEM and AIEM simulations to develop parameterized emission models and inversion algorithms for retrieving soil moisture from satellites like AMSR-E and sensors like SMOS and SMAP. Validation results using ground measurements show the algorithms can accurately estimate soil moisture.
Fracture prediction using low coverage seismic data in area of complicated st... (Mario Prince)
This document presents a workflow for predicting fractures in a limestone reservoir using 3D seismic data with low fold coverage in an area with complicated structures in Colombia. Key steps included: 1) applying interpolation and azimuthal division to overcome data limitations, 2) performing PSTM on azimuthal volumes to maintain structure while enhancing image quality, and 3) using relative impedance attributes to detect anisotropy and predict fracture orientation and intensity, with two dominant orientations identified. Comparison to well data showed excellent agreement between seismic-derived and FMI-measured fracture orientations, validating the technique for reliable fracture prediction with low coverage seismic data.
This document summarizes a study that used wide-swath interferometric synthetic aperture radar (InSAR) time series to map large-scale ground deformation over the Danakil depression in the Afar region of Ethiopia between 2006 and 2009. The time series analysis revealed deformation signals consistent with magmatic intrusions and inflation/deflation of volcanic centers. Modeling of the deformation supported deep magma intrusion beneath the central segment and lateral magma propagation and chamber inflation beneath Dabbahu volcano in the northern segment. The study demonstrated the potential of wide-swath InSAR time series for mapping long-wavelength ground deformation over large areas.
Presentation made by Prof. Adriano Camps (Universitat Politècnica de Catalunya) at ICMARS 2010 (India, 16-December-2010) on the MIRAS instrument aboard ESA's SMOS mission.
This document discusses a methodology for seismic risk assessment in Huánuco, Peru. The objectives are to contribute to reducing seismic risk in Huánuco by providing tools to help decide intervention criteria. The methodology involves evaluating seismic hazard, structural vulnerability of existing buildings, specific and overall seismic risk based on hazard and vulnerability, and potential damage levels through risk assessment. Seismic risk is calculated by convolving hazard probabilities, vulnerability, and economic values.
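In its simplest discrete form, that convolution can be written as (a generic formulation, not necessarily the exact one used in the Huánuco study)

$$ \mathrm{EAL} = \sum_{j} C_j \sum_{i} P_a(\mathrm{IM} = im_i)\; \mathbb{E}[\mathrm{DR} \mid im_i] , $$

where $P_a$ is the annual probability of each ground-motion intensity level from the hazard assessment, $\mathbb{E}[\mathrm{DR} \mid im]$ is the mean damage ratio from the vulnerability functions, $C_j$ is the economic (replacement) value of asset $j$, and EAL is the expected annual loss.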
Seismic Modeling ASEG 082001 (Andrew Long)
This document discusses tools for modeling elastic wave propagation to aid in seismic survey planning. It summarizes three main modeling techniques: recursive reflectivity methods, ray tracing methods, and full wavefield methods using finite-differencing. Ray tracing is useful for optimizing survey geometry but not reflectivity studies, while reflectivity and finite-difference methods model full wavefields and are better for amplitude studies like AVO. Integrating these modeling tools with real data and rock physics analysis allows comprehensive understanding of wave propagation for effective survey planning addressing all acquisition parameters and seismic phenomena.
This document summarizes a new methodology for probabilistic seismic hazard analysis (PSHA) that addresses common problems in the field. The methodology incorporates paleoearthquake data, accounts for uncertainties in earthquake parameters, relaxes assumptions about seismicity models, and does not require delineating source zones. It estimates hazard through peak ground acceleration and spectral acceleration curves. The methodology was applied to produce seismic hazard maps for South and Sub-Saharan Africa showing 10% probabilities of exceedance in 50 years. Computer codes implementing the methodology are available from the author.
The document summarizes research using remote sensing data and quantitative analysis to identify and characterize alluvial fans. Satellite imagery was used to calculate surface roughness as a proxy for distinguishing alluvial areas. Digital elevation models from SRTM data were analyzed to delineate geometric parameters of landforms. A fuzzy logic model populated with roughness, elevation, and curvature data was able to classify terrain into categories corresponding to different parts of alluvial fans. The method provided initial identification and spatial extent of alluvial fans while also assigning fuzzy membership values.
The document summarizes the objectives and methodology of the SHARE project, which aims to harmonize seismic hazard assessment across Europe and the Mediterranean region. The project develops a community-based seismic hazard model through harmonizing data, modeling approaches, engineering requirements, and other factors. It presents several novel source models at the regional scale and uses a logic tree to incorporate epistemic uncertainty in ground motion predictions. Quality assurance is performed and the results will be disseminated by November 2012 to create a new reference hazard model for the region.
Probabilistic Seismic Hazard Analysis Of Dehradun City, Uttrakhand (ijceronline)
Dehradun is a very old city and a rapidly growing urban area located in a valley at the foothills of the Garhwal Himalayas. Dehradun city and the adjoining region in the western Himalayas lie in a very active seismic stretch of the Himalayan belt, extending from the Pamir-Hindukush to the Arakan ranges in Burma. According to the seismic zoning map of India, Dehradun city lies in Zone 4 with an expected MSK intensity of 8. Dehradun city is located in the vicinity of twenty-four independent seismic source zones, which in reality are active faults. This creates uncertainties in the size, location, and rate of recurrence of earthquakes. Probabilistic seismic hazard analysis provides a framework in which these uncertainties can be identified, quantified, and combined in a rational manner to provide a more complete picture of the seismic hazard. This study presents a PSHA of Dehradun city using the attenuation relationship given by Cornell et al. (1979) in order to determine the various levels of earthquake-caused ground motion that will be exceeded in a given future time period.
Flight Dynamics Software Presentation Part I Version 5 (Antonios Arkas)
This document describes an orbit determination simulator and its key features:
- It uses a weighted least-squares estimator to process range and angular tracking measurements from multiple Earth stations to determine orbital state. It can estimate parameters like reflectivity coefficient, ballistic coefficient, and antenna biases.
- It provides outputs like the determined orbit, validity metrics, covariance analysis, residuals graphs, and confidence ellipsoids. It can also propagate determined state covariance over time.
- The simulator was validated against another flight dynamics software by comparing results from processing real tracking data. Determined states and other parameters showed close agreement.
- A consider-covariance analysis is performed to assess the impact of neglected parameters such as antenna biases. This is done through formal
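A minimal sketch of the batch weighted least-squares correction underlying such an estimator (generic textbook form; the arrays and names are illustrative, not the simulator's actual interface):

```python
import numpy as np

def wls_correction(H, residuals, meas_sigma):
    """One batch weighted least-squares correction to the estimated state.

    H          : (m, n) partials of the measurements w.r.t. the state
    residuals  : (m,) observed-minus-computed measurements
    meas_sigma : (m,) 1-sigma measurement noise (range, angles, ...)
    Returns the state correction dx and its covariance P.
    """
    W = np.diag(1.0 / np.asarray(meas_sigma) ** 2)   # weight matrix
    N = H.T @ W @ H                                  # normal matrix
    P = np.linalg.inv(N)                             # state covariance
    dx = P @ (H.T @ W @ residuals)                   # state correction
    return dx, P
```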
Final project for Geo-Engineering Techniques for Unstable Slopes (Alireza Babaee)
The document summarizes geological and geophysical studies of an unstable slope in northern Lecco, Italy. Geological studies included characterizing rock types, discontinuities, strength properties, and slope stability analyses. Photogrammetry and laser scanning were used to generate point clouds and orthophotos of the slope. Geophysical studies involved ground penetrating radar to map subsurface discontinuities in 3D. Together these integrated techniques characterized the rock mass and identified planar sliding along discontinuity set A as the main slope failure mechanism.
This document summarizes an integrated probabilistic risk assessment for Turkey. It presents models for seismic hazard, exposure, physical vulnerability, and economic loss for Turkey's major cities. Seismic hazard was modeled using ground motion prediction equations in OpenQuake. Building exposure was captured from census data. Fragility curves related ground shaking to building damage. Combined models calculated physical risk and economic loss. A socio-economic vulnerability index incorporated demographic indicators. Integrating physical risk and vulnerability produced an overall integrated risk assessment for cities like Istanbul and Van.
The document describes the GEM Foundation's efforts to create a centralized database of global seismic hazard models using common data formats and open-source software. This will allow models to be more easily compared, reproduced and inspected. It will also facilitate combining models and generating new data. Currently the database includes major models from regions around the world. Quality assurance testing has revealed some differences between models when reproduced, calling for further investigation.
The document discusses the development of the GEM Vulnerability Database. The database contains over 750 vulnerability models including fragility functions, vulnerability functions, damage-to-loss functions, and capacity curves. These models describe the probability of damage or loss given various ground motion intensity measures. The database facilitates modeling, comparison of functions, and sharing of results with the scientific community. It contains functions for 37 countries/regions.
The document discusses the OpenQuake-engine software for seismic risk assessment. It can perform probabilistic and scenario-based hazard and risk calculations considering various uncertainties. Different calculators within OpenQuake allow scenario risk assessment, scenario damage assessment, probabilistic event-based risk analysis, and benefit-cost analysis of retrofitting options. The software is open source and can be run on single computers or cloud platforms.
This document discusses an integrated risk modeling toolkit and database for earthquake risk assessment. It presents frameworks for integrated risk assessment and modeling social and economic vulnerability. Methods are described for selecting indicators, standardizing data, conducting statistical analysis, weighting factors, and linking results to physical risk estimates. The toolkit allows incorporation of data on populations, economy, infrastructure, and other factors. Areas for further improvement include accounting for uncertainties, qualitative analysis, and application to specific use cases.
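A hedged sketch of the indicator pipeline described above, i.e. min-max standardization followed by a weighted aggregation into a composite index; the indicator names, values, and weights are invented for illustration:

```python
import numpy as np

def minmax(x):
    """Rescale an indicator to the [0, 1] range."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def composite_index(indicators, weights):
    """Weighted sum of standardized indicators -> composite vulnerability index."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    standardized = np.column_stack([minmax(v) for v in indicators])
    return standardized @ w

# Made-up indicators for three districts:
pct_elderly  = [8.0, 15.0, 11.0]
unemployment = [4.5, 9.0, 6.0]
pct_informal = [10.0, 40.0, 25.0]
print(composite_index([pct_elderly, unemployment, pct_informal], [0.3, 0.3, 0.4]))
```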
Social Vulnerability Datasets through the OpenQuake Platform and Description of a Case-Scenario of Integrated Risk and Resilience using OpenQuake Tools.
This document discusses city scenario applications of the EMME (Earthquake Model of the Middle East) project. Seven cities - Mashhad, Iran; Gulshan-Karachi, Pakistan; Irbid, Jordan; Tbilisi, Georgia; Yerevan, Armenia; and Tyr City, Lebanon - were selected for deterministic seismic risk assessments involving specified earthquake scenarios. For each city, information on building inventories, site conditions, vulnerability, and expected damage distributions from scenario earthquakes is presented. The document concludes that the city scenario activities provided valuable risk information for local municipalities that could be updated over time.
The document discusses the South America Integrated Risk Assessment (SARA) project, which aims to develop risk models for 13 countries in South America with a total population of over 402 million people. The project focuses on developing exposure and vulnerability models for major cities in the region with informal construction. Several countries, including Colombia, Chile, Ecuador, Peru, and Venezuela, plan to create detailed exposure models and fragility functions for major cities to analyze risk and inform mitigation efforts.
Vitor Silva of the GEM Foundation in Italy analyzed the costs and benefits of retrofitting buildings in Nepal. The analysis considered 2221 locations, 9144 assets across 5 categories, and an area of around 140,000 square kilometers. Models were used to calculate expected damage and losses from earthquakes of different magnitudes, and to compare the annual losses expected with and without retrofitting to determine the benefit-cost ratio of retrofitting. Maps of seismic hazard and optimal retrofit designs were also produced to inform decision making.
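The benefit-cost comparison can be sketched as follows (a generic formulation with made-up numbers, not GEM's actual figures): the benefit is the discounted stream of avoided average annual losses over the retrofit's remaining life, divided by the up-front retrofit cost.

```python
def benefit_cost_ratio(aal_before, aal_after, retrofit_cost, years, discount_rate):
    """Discounted avoided annual losses divided by the up-front retrofit cost."""
    avoided = aal_before - aal_after
    # Present value of an annuity of `avoided` over `years` at `discount_rate`.
    pv_factor = (1 - (1 + discount_rate) ** -years) / discount_rate
    return avoided * pv_factor / retrofit_cost

# Illustrative numbers only: a ratio above 1 favours retrofitting.
print(benefit_cost_ratio(aal_before=12_000, aal_after=4_000,
                         retrofit_cost=60_000, years=50, discount_rate=0.05))
```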
The EMME (Earthquake Model of the Middle East) project developed a probabilistic seismic hazard model and seismic risk assessments for multiple countries in the Middle East region from 2009-2014. The project involved compiling strong motion data, developing ground motion prediction equations, performing probabilistic seismic hazard computations, building inventories and vulnerability assessments, and validating models with past earthquake damage observations. Key to the success of the EMME project was obtaining a high level of contribution from partner countries and basing all aspects of the model on local data and expertise.
- Expert elicitation was used to develop fragility functions characterizing building vulnerability to earthquakes around the world. Thirteen experts evaluated vulnerability for generic building types in eight countries, and twelve US experts and one Canadian expert evaluated selected building types in the US.
- Cooke's method was used to score experts based on their accuracy on seed questions and assign weights to their responses on target questions. This allowed fragility curves to be developed accounting for expert uncertainties.
- The exercises generated over 50 new fragility functions for use in earthquake modeling, providing critical data where empirical models are lacking. Further research is needed to better understand the expert scoring approach.
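In Cooke's classical model, an expert's weight is (roughly) the product of a statistical calibration score and an information score computed from the seed questions, set to zero below a calibration cutoff and then renormalized. A minimal sketch under that reading, with the per-expert scores taken as given rather than computed from seed data:

```python
import numpy as np

def cooke_weights(calibration, information, alpha=0.05):
    """Performance-based expert weights: calibration * information, with a cutoff.

    calibration : per-expert calibration scores on the seed questions
    information : per-expert information scores (relative to a background measure)
    alpha       : experts calibrated below alpha receive zero weight
    """
    c = np.asarray(calibration, dtype=float)
    i = np.asarray(information, dtype=float)
    raw = np.where(c >= alpha, c * i, 0.0)
    return raw / raw.sum()

print(cooke_weights([0.40, 0.02, 0.15], [1.8, 2.5, 1.2]))  # made-up scores
```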
The document discusses earthquake risk in Nepal and collaboration between the National Society for Earthquake Technology-Nepal (NSET) and the Global Earthquake Model (GEM). It notes that Nepal sits at a plate boundary and faces significant earthquake hazards. Over 60% of buildings in Kathmandu could be destroyed by a major quake, leaving over 1.5 million homeless. While NSET has conducted risk assessments and education, more remains to be done. The document calls for increased collaboration between NSET and GEM to further research, training, and risk reduction efforts.
This document presents a toolkit for integrated risk assessment that combines physical risk and social vulnerability. It describes representing concepts of social vulnerability and integrated risk through data and various statistical and expert opinion approaches. The document then discusses operationalizing concepts of social vulnerability and integrated risk through the development of an open-source integrated risk modeling toolkit and SVIR data viewer. Finally, it introduces a self-evaluation tool for urban earthquake resilience.
Geoscience Australia is working with Papua New Guinea to develop earthquake scenarios and a national seismic hazard map using open-source GEM tools. Under an agreement with DFAT, Geoscience Australia is strengthening Papua New Guinea's capacity to create earthquake scenarios and develop the first national seismic hazard map. The seismic hazard assessment uses a source model with 24 area sources, 2 complex faults, 53,186 point sources, and 4 tectonic regimes. Probabilistic seismic hazard maps of PNG were created using the OpenQuake engine and 11 ground motion prediction equations.
This document summarizes an end-to-end risk model developed for Pavia, Italy using GEM (Global Earthquake Model) tools. Over 1200 buildings in Pavia were surveyed to develop a detailed exposure dataset. Sixteen building typologies were defined and fragility and vulnerability functions from literature were used. A probabilistic risk calculation was run using a hazard model from SHARE and the exposure model. This provided loss curves, maps and an estimated annual average loss around 0.6% of total replacement cost.
The document discusses the Global Earthquake Model (GEM), an organization that works to advance scientific understanding of earthquake risk. It has developed global databases on earthquakes, faults, exposure, and vulnerability, and provides open-source software and tools to facilitate risk assessment. GEM engages participants from public organizations, private companies, and other partners to collaboratively develop trusted science and transparent models and data. The organization aims to help improve risk management and inform decision-making worldwide.
The aim of GED4GEM is to build a comprehensive, multi-scale and statistically accurate database of population and buildings, to assess the physical and economic exposure of a given area to earthquakes.
The binding of cosmological structures by massless topological defects (Sérgio Sacani)
Assuming spherical symmetry and weak field, it is shown that if one solves the Poisson equation or the Einstein field equations sourced by a topological defect, i.e. a singularity of a very specific form, the result is a localized gravitational field capable of driving flat rotation (i.e. Keplerian circular orbits at a constant speed for all radii) of test masses on a thin spherical shell without any underlying mass. Moreover, a large-scale structure which exploits this solution by assembling concentrically a number of such topological defects can establish a flat stellar or galactic rotation curve, and can also deflect light in the same manner as an equipotential (isothermal) sphere. Thus, the need for dark matter or modified gravity theory is mitigated, at least in part.
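For context on the isothermal-sphere comparison (standard textbook relations, not the paper's defect solution): a spherically symmetric potential of the form $\Phi(r) = v_0^2 \ln r$ gives circular orbits with speed

$$ v_c(r) = \sqrt{r\,\frac{\mathrm{d}\Phi}{\mathrm{d}r}} = v_0 \quad \text{for all } r , $$

i.e. a flat rotation curve, and a singular isothermal sphere deflects light through an angle that is independent of the impact parameter; these are the behaviours the concentric assembly of defects is argued to reproduce without any underlying mass.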
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub... (Sérgio Sacani)
Context. The observation of several L-band emission sources in the S cluster has led to a rich discussion of their nature. However, a definitive answer to the classification of the dusty objects requires an explanation for the detection of compact Doppler-shifted Brγ emission. The ionized hydrogen in combination with the observation of mid-infrared L-band continuum emission suggests that most of these sources are embedded in a dusty envelope. These embedded sources are part of the S-cluster, and their relationship to the S-stars is still under debate. To date, the question of the origin of these two populations has been vague, although all explanations favor migration processes for the individual cluster members. Aims. This work revisits the S-cluster and its dusty members orbiting the supermassive black hole SgrA* on bound Keplerian orbits from a kinematic perspective. The aim is to explore the Keplerian parameters for patterns that might imply a nonrandom distribution of the sample. Additionally, various analytical aspects are considered to address the nature of the dusty sources. Methods. Based on the photometric analysis, we estimated the individual H−K and K−L colors for the source sample and compared the results to known cluster members. The classification revealed a noticeable contrast between the S-stars and the dusty sources. To fit the flux-density distribution, we utilized the radiative transfer code HYPERION and implemented a young stellar object Class I model. We obtained the position angle from the Keplerian fit results; additionally, we analyzed the distribution of the inclinations and the longitudes of the ascending node. Results. The colors of the dusty sources suggest a stellar nature consistent with the spectral energy distribution in the near- and mid-infrared domains. Furthermore, the evaporation timescales of dusty and gaseous clumps in the vicinity of SgrA* are much shorter (2 yr) than the epochs covered by the observations (≈15 yr). In addition to the strong evidence for the stellar classification of the D-sources, we also find a clear disk-like pattern following the arrangements of S-stars proposed in the literature. Furthermore, we find a global intrinsic inclination for all dusty sources of 60 ± 20°, implying a common formation process. Conclusions. The pattern of the dusty sources manifested in the distribution of the position angles, inclinations, and longitudes of the ascending node strongly suggests two different scenarios: the main-sequence stars and the dusty stellar S-cluster sources share a common formation history or migrated with a similar formation channel in the vicinity of SgrA*. Alternatively, the gravitational influence of SgrA* in combination with a massive perturber, such as a putative intermediate mass black hole in the IRS 13 cluster, forces the dusty objects and S-stars to follow a particular orbital arrangement. Key words: stars: black holes – stars: formation – Galaxy: center – galaxies: star formation
PPT on Alternate Wetting and Drying, presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' on April 22, 2024.
ESA/ACT Science Coffee: Diego Blas - Gravitational wave detection with orbita... (Advanced-Concepts-Team)
Presentation in the Science Coffee of the Advanced Concepts Team of the European Space Agency on 07.06.2024.
Speaker: Diego Blas (IFAE/ICREA)
Title: Gravitational wave detection with orbital motion of Moon and artificial satellites
Abstract:
In this talk I will describe some recent ideas to find gravitational waves from supermassive black holes or of primordial origin by studying their secular effect on the orbital motion of the Moon or satellites that are laser ranged.
Sexuality - Issues, Attitude and Behaviour - Applied Social Psychology - Psyc... (PsychoTech Services)
A proprietary approach developed by bringing together the best of learning theories from Psychology, design principles from the world of visualization, and pedagogical methods from over a decade of training experience, that enables you to: Learn better, faster!
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf (Selcen Ozturkcan)
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
JAMES WEBB STUDY THE MASSIVE BLACK HOLE SEEDSSérgio Sacani
The pathway(s) to seeding the massive black holes (MBHs) that exist at the heart of galaxies in the present and distant Universe remains an unsolved problem. Here we categorise, describe and quantitatively discuss the formation pathways of both light and heavy seeds. We emphasise that the most recent computational models suggest that rather than a bimodal-like mass spectrum between light and heavy seeds with light at one end and heavy at the other that instead a continuum exists. Light seeds being more ubiquitous and the heavier seeds becoming less and less abundant due the rarer environmental conditions required for their formation. We therefore examine the different mechanisms that give rise to different seed mass spectrums. We show how and why the mechanisms that produce the heaviest seeds are also among the rarest events in the Universe and are hence extremely unlikely to be the seeds for the vast majority of the MBH population. We quantify, within the limits of the current large uncertainties in the seeding processes, the expected number densities of the seed mass spectrum. We argue that light seeds must be at least 103 to 105 times more numerous than heavy seeds to explain the MBH population as a whole. Based on our current understanding of the seed population this makes heavy seeds (Mseed > 103 M⊙) a significantly more likely pathway given that heavy seeds have an abundance pattern than is close to and likely in excess of 10−4 compared to light seeds. Finally, we examine the current state-of-the-art in numerical calculations and recent observations and plot a path forward for near-future advances in both domains.
EMME project_OQRelease
2. Hazard Modelers: Laurentiu DANCIU, Karin SESETYAN, and Mine B. DEMIRCIOGLU
Project Coordinators: Domenico GIARDINI and Mustafa ERDIK
EMME Consortium partners contributing to the hazard components:
METU, SAU (TR), IIEES (IR), AUB (LB), YU (JO),
Upesh (PK), IJSU (GE), SCI (ARM), ANAS (AZ)
EMME-HAZ-2014: SEISMIC HAZARD MODEL and RESULTS for THE MIDDLE EAST REGION
3. SEISMIC HAZARD COMPONENTS
Our major aim was to build a regional consensus model, homogenized across national boundaries.
5. All steps of the seismic hazard assessment have to be:
– Validated
– Benchmarked
– Reproducible
– Standardized
– Inter-Comparable
– Testable
6. EQ Catalog
Harmonized in terms of Mw
Total: 27,174 events
Historical part (pre-1900) plus early and modern instrumental parts (to ~2006)
Declustering method: Grunthal (1985)
After declustering: 10,524 events
18 completeness super-zones
Maps shown: EMME14 catalogue (events with Mw >= 6.00) and the completeness super-zones
(a generic window-declustering sketch follows below)
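To make the declustering step concrete, here is a minimal Python sketch of magnitude-dependent space-time window declustering, the family of methods to which Grunthal (1985) belongs. The window functions, the flat-earth distance, and every coefficient below are illustrative placeholders, not the actual Grunthal (1985) windows used for the EMME14 catalogue.

```python
import numpy as np

def time_window_days(mag):
    """Illustrative time window (days), growing with magnitude (placeholder values)."""
    return 10.0 ** (0.5 * mag - 0.5)

def distance_window_km(mag):
    """Illustrative distance window (km), growing with magnitude (placeholder values)."""
    return 10.0 ** (0.12 * mag + 1.0)

def decluster(times_days, lons, lats, mags):
    """Return a boolean mask: True for mainshocks, False for clustered events."""
    times_days, lons, lats, mags = map(np.asarray, (times_days, lons, lats, mags))
    mainshock = np.ones(mags.size, dtype=bool)
    km_per_deg = 111.2  # flat-earth approximation, adequate at window-sized distances
    for i in np.argsort(mags)[::-1]:        # process the largest events first
        if not mainshock[i]:
            continue                        # already absorbed by a larger event
        dt = np.abs(times_days - times_days[i])
        dx = (lons - lons[i]) * km_per_deg * np.cos(np.radians(lats[i]))
        dy = (lats - lats[i]) * km_per_deg
        in_window = (dt <= time_window_days(mags[i])) & \
                    (np.hypot(dx, dy) <= distance_window_km(mags[i]))
        in_window[i] = False                # keep the triggering event itself
        # flag smaller events inside the space-time window as fore-/aftershocks
        mainshock &= ~(in_window & (mags <= mags[i]))
    return mainshock
```

Applied to the harmonized catalogue with the actual Grunthal windows (rather than these placeholders), a mask of this kind is what reduces the 27,174 events to the 10,524 declustered events quoted above.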
7. Two fully independent source zonation models:
– Area source model
• Active shallow and stable continental areal sources
• Subduction interface modeled as a complex fault
• Deep areal sources
• All activity rates computed from seismicity
– Fault source and background seismicity model
• Fault sources in 3D
• Background seismicity
• Subduction interface modeled as a complex fault
• Deep seismicity
• Fault activity rates computed from slip rates
8. Maximum Magnitude
The upper bound of the earthquake recurrence (frequency-magnitude) curve.
Maximum magnitude assessment (per super-zone) is based on:
– The historical seismicity record
– Location uncertainties
– Analogies to similar tectonic regions
– An added increment of 0.30 magnitude units
(a minimal numerical illustration follows below)
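As a rough numerical illustration of the increment branch of this assessment, the snippet below takes the largest observed magnitude in a super-zone and adds the 0.30 increment quoted above. The optional tectonic_floor argument, standing in for judgment-based analogies to other tectonic regions, is a hypothetical addition for illustration, not an EMME rule.

```python
def mmax_from_observed(observed_mags, increment=0.30, tectonic_floor=None):
    """Upper-bound magnitude: largest observed event plus a fixed increment.

    tectonic_floor is a hypothetical minimum a modeller might impose from
    analogies to similar tectonic regions (illustrative only).
    """
    mmax = max(observed_mags) + increment
    if tectonic_floor is not None:
        mmax = max(mmax, tectonic_floor)
    return round(mmax, 2)

# Example: a super-zone whose largest observed event is Mw 7.4
print(mmax_from_observed([5.8, 6.9, 7.4]))  # -> 7.7
```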
10. Area Source Model
Classical area source zones based on tectonic findings, their correlation, and up-to-date seismicity
Derived from seismicity patterns, ensuring that the zonation adequately reflects these patterns
Includes the surface projection of identified active faults (capable of generating earthquakes)
12. Area Source Model
EMCA source model integration and harmonization
New tectonic regionalization: stable continental regions (shown in yellow on the map)
13. Source Characterization
Homogeneous, declustered catalogue
Completeness defined for the 18 super-zones
Maximum likelihood approach (Weichert 1984)
– Truncated Gutenberg-Richter magnitude-frequency distribution
• 10^a: the annual number of events with magnitude greater than or equal to zero
• b: the Gutenberg-Richter slope (b-value)
Truncated at each source's assigned maximum magnitude
For each source, three magnitude-frequency distributions were derived
A Matlab* toolbox was developed for this purpose (a sketch of the truncated Gutenberg-Richter rates follows below)
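To make the recurrence parameters concrete, here is a minimal Python sketch of one common parameterization of the doubly truncated Gutenberg-Richter model: the annual rate of events with magnitude at or above m, given the a- and b-values estimated per source and the assigned maximum magnitude. It illustrates the standard truncated-GR form only; it is not the EMME Matlab toolbox, which may use a different but equivalent formulation.

```python
def truncated_gr_rate(m, a, b, m_max, m_min=0.0):
    """Annual rate of events with magnitude >= m for a truncated
    Gutenberg-Richter distribution.

    a      : 10**a is the annual number of events with magnitude >= m_min
             (the slide defines it for m_min = 0)
    b      : Gutenberg-Richter slope (b-value)
    m_max  : assigned maximum magnitude (rate is zero above it)
    """
    if m >= m_max:
        return 0.0
    m = max(m, m_min)
    total = 10.0 ** a  # annual rate of all events above m_min
    # fraction of that rate falling between m and m_max
    num = 10.0 ** (-b * (m - m_min)) - 10.0 ** (-b * (m_max - m_min))
    den = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return total * num / den

# Example: a = 3.5, b = 1.0, Mmax = 7.5 -> annual rate of M >= 6.0 events
print(truncated_gr_rate(6.0, a=3.5, b=1.0, m_max=7.5))  # ~3e-3 events/yr
```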
19. Fault source model derived from the fault database collected within WP2
– Total: 3,397 fault segments, with a cumulative length of 91,551 km
EMME Faults Dataset
20. Fault Sources
Criteria for selecting “capable” (active) faults for use in the hazard assessment:
– Identified active faults (capable of generating earthquakes), e.g. the Northern Anatolian faults, the Marmara faults, and the Zagros transform faults
– Slip rate of at least 0.10 mm/year (1 m per 10,000 years; Neogene)
– Maximum magnitude of at least 6.00
– Fully parameterized:
• Geometry
• Slip rates
– Confidence classes:
• Class A: complete information provided by the compiler
• Class B: partial information provided by the compiler
• Class C: limited information provided
• Class D: only the top trace available
(a selection-filter sketch follows below)
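To show how the selection criteria above might be applied programmatically, the snippet below filters a hypothetical fault-segment table by the slip-rate threshold, the maximum-magnitude threshold, and an accepted set of confidence classes. The field names and the exclusion of Class D are assumptions for illustration, not the EMME database schema or rules.

```python
# Hypothetical fault records; field names are illustrative, not the EMME schema.
faults = [
    {"name": "segment_001", "slip_rate_mm_yr": 2.40, "mmax": 7.4, "confidence": "A"},
    {"name": "segment_002", "slip_rate_mm_yr": 0.05, "mmax": 6.8, "confidence": "B"},
    {"name": "segment_003", "slip_rate_mm_yr": 0.30, "mmax": 5.8, "confidence": "C"},
    {"name": "segment_004", "slip_rate_mm_yr": 1.10, "mmax": 7.0, "confidence": "D"},
]

MIN_SLIP_RATE = 0.10                 # mm/year, as on the slide
MIN_MMAX = 6.00                      # as on the slide
ACCEPTED_CLASSES = {"A", "B", "C"}   # assumption: top-trace-only faults excluded

capable = [
    f for f in faults
    if f["slip_rate_mm_yr"] >= MIN_SLIP_RATE
    and f["mmax"] >= MIN_MMAX
    and f["confidence"] in ACCEPTED_CLASSES
]
print([f["name"] for f in capable])  # -> ['segment_001']
```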
22. Fault Source Model
• A 15 km buffer zone is drawn around the surface projection of each fault source
• Seismicity with M >= 6.0 inside the buffer zones is assigned to the fault sources
• Seismicity with M < 6.0 inside the buffer zones is modeled as smoothed seismicity
• Smoothed background seismicity is used outside the buffer zones
23. Fault Source Model
Fault activity rates derived from:
• Slip rate
• Fault length / aspect ratio
• Maximum magnitude
• The Anderson & Luco (1983) recurrence model 2
• b-value taken from the completeness super-zones
(a simplified slip-rate-to-recurrence sketch follows below)
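A fault's slip rate constrains its long-term seismic moment rate, which the recurrence model then distributes over a magnitude-frequency distribution. The sketch below shows only the simplest moment-balance limit, in which the whole moment rate is released in events of the maximum magnitude; the EMME model instead used the Anderson & Luco (1983) recurrence model 2, which spreads the moment over a full distribution of magnitudes. The shear modulus and fault geometry values are assumptions for illustration.

```python
# Simplified moment-balance sketch: recurrence of Mmax events from slip rate.
# NOT the Anderson & Luco (1983) model 2 used in EMME; illustration only.

MU = 3.0e10  # shear modulus in Pa (assumed typical crustal value)

def moment_rate(length_km, width_km, slip_rate_mm_yr):
    """Seismic moment rate in N*m per year: mu * fault area * slip rate."""
    area_m2 = (length_km * 1e3) * (width_km * 1e3)
    slip_m_yr = slip_rate_mm_yr * 1e-3
    return MU * area_m2 * slip_m_yr

def moment_of_magnitude(mw):
    """Seismic moment (N*m) from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10.0 ** (1.5 * mw + 9.05)

def mmax_event_rate(length_km, width_km, slip_rate_mm_yr, mmax):
    """Annual rate of Mmax events if they released the entire moment rate."""
    return moment_rate(length_km, width_km, slip_rate_mm_yr) / moment_of_magnitude(mmax)

# Example: a 100 km x 15 km fault slipping at 2 mm/yr, Mmax = 7.5
rate = mmax_event_rate(100.0, 15.0, 2.0, 7.5)
print(f"{rate:.5f} events/yr (~{1.0 / rate:.0f} yr recurrence)")
```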
36. Hazard computed for the following spectral ordinates (see the naming note after this list):
• PGA
• SA (T=0.1 s)
• SA (T=0.15 s)
• SA (T=0.2 s)
• SA (T=0.25 s)
• SA (T=0.3 s)
• SA (T=0.5 s)
• SA (T=0.75 s)
• SA (T=1.0 s)
• SA (T=2.0 s)
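For reference, these spectral ordinates map onto the "PGA" / "SA(period)" intensity measure type strings used by OpenQuake. The snippet below simply builds that list of names; the surrounding job-configuration keys are not shown, so treat it as a naming illustration rather than a ready-to-run OpenQuake input.

```python
# Intensity measure types for the spectral ordinates listed above,
# written in the "PGA" / "SA(T)" string convention used by OpenQuake.
periods = [0.1, 0.15, 0.2, 0.25, 0.3, 0.5, 0.75, 1.0, 2.0]
imts = ["PGA"] + [f"SA({t})" for t in periods]
print(imts)
# ['PGA', 'SA(0.1)', 'SA(0.15)', 'SA(0.2)', 'SA(0.25)', 'SA(0.3)',
#  'SA(0.5)', 'SA(0.75)', 'SA(1.0)', 'SA(2.0)']
```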
41. Summary
• Building a regional seismic hazard model is a collective effort
• The aim is to generate an up-to-date, flexible, and scalable database that permits continuous updating, refinement, and analysis.
• Data will be parameterized and entered into the database in a specified format.
43. Except where otherwise noted, this work is licensed under:
creativecommons.org/licenses/by-nc-nd/4.0/
Please attribute to the GEM Foundation with a link to www.globalquakemodel.org