The NGA-West 2 project expands the existing NGA ground motion database to include over 8,600 strong motion records from 334 shallow crustal earthquakes recorded worldwide between 2003 and 2011, more than doubling the size of the original NGA database. Extensive new metadata on earthquake sources, propagation paths, and site conditions was collected. In addition, all time series were reprocessed using a uniform methodology, and new orientation-independent response spectra were calculated. The updated database will be used by researchers to develop revised ground motion prediction equations as part of the NGA-West 2 project.
This document describes the PEER NGA-West2 ground motion database, which expands the existing NGA database to include over 21,000 additional ground motion records from 600 small-to-moderate California earthquakes as well as data from major international earthquakes since 2003. The database contains seismological metadata, processed time series, and response spectra for magnitudes ranging from 3 to 7.9 and distances from 0.05 to 1533 km. It more than doubles the size of the previous NGA database for moderate-to-large earthquakes and will be used to update ground motion prediction equations.
This document summarizes the calibration of the broadband photometric system of the RCT 1.3-meter Robotic Telescope. It finds that the linear color transformations and extinction corrections are consistent with those of similar KPNO facilities, with a photometric precision of 10% at 1 sigma. Several instrumental error sources related to engineering and maintenance issues at the new robotic facility were identified and likely contributed to the overall uncertainty. A preliminary verification showed the calibration solution is robust, perhaps to a higher precision than the initial calibration indicated. The RCT has been conducting routine science operations since 2009.
In this deck from the 2018 HPC User Forum in Tucson, Christine Goulet from the Southern California Earthquake Center presents: HPC Use for Earthquake Research.
"The Southern California Earthquake Center (SCEC) was founded as a Science & Technology Center in 1991, with joint funding by the NSF and the U. S. Geological Survey. SCEC coordinates fundamental research on earthquake processes using Southern California as its principal natural laboratory. This research program is investigator-driven and supports core research and education in seismology, tectonic geodesy, earthquake geology, and computational science. The SCEC community advances earthquake system science through three basic activities: (a) gathering information from seismic and geodetic sensors, geologic field observations, and laboratory experiments; (b) synthesizing knowledge of earthquake phenomena through physics-based modeling, including system-level hazard modeling; and communicating our understanding of seismic hazards to reduce earthquake risk and promote community resilience."
Watch the video: https://wp.me/p3RLHQ-imT
Learn more: https://www.scec.org/about
and
http://hpcuserforum.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
This document proposes a mission to send a probe to 1000 AU within 50 years to explore the very local interstellar medium. The probe would use a gravity assist at Jupiter to cancel its angular momentum, fall to within 4 solar radii of the Sun for a high-speed propulsion burn, and reach speeds of 20 AU/year. Required technologies include high-Isp propulsion, thermal shields, long-life electronics, and autonomous operation. The proposed concept uses solar thermal propulsion with liquid hydrogen, carried on an Atlas V launch vehicle. The probe would perform in situ measurements of the interstellar medium and escape the heliosphere to study its boundary regions.
Knapp & Wilkins 2018 - Gridded Satellite (GridSat) GOES and CONUS data-anno... – Hiram Abif Meza Landero
This document describes the Gridded Satellite (GridSat) data, which provides GOES satellite data in a modern gridded format. The GridSat data undergoes four processing steps: (1) temporal resampling to produce evenly spaced time steps, (2) spatial remapping to produce gridded data with even spacing, (3) calibrating the original satellite measurements and storing them as brightness temperatures or reflectance, and (4) calculating additional spatial variability metrics. The GridSat data is provided over two domains - GridSat-GOES covers the Western Hemisphere hourly, while GridSat-CONUS covers the contiguous US every 15 minutes.
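Viewed as code, the four steps above form a small pipeline. A minimal Python sketch, assuming nearest-neighbour resampling and remapping and a linear calibration (the function names and SciPy-based choices are illustrative assumptions, not the actual GridSat implementation):

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import generic_filter

def temporal_resample(frames, times, grid_times):
    """Step 1: for each evenly spaced target time, take the nearest available frame."""
    return [frames[np.argmin(np.abs(times - t))] for t in grid_times]

def remap_to_grid(swath_vals, swath_lat, swath_lon, grid_lat, grid_lon):
    """Step 2: nearest-neighbour remap of satellite pixels onto an even lat/lon grid."""
    pts = np.column_stack([swath_lat.ravel(), swath_lon.ravel()])
    return griddata(pts, swath_vals.ravel(), (grid_lat, grid_lon), method="nearest")

def calibrate(counts, slope, offset):
    """Step 3: convert raw counts to brightness temperature or reflectance."""
    return slope * counts + offset          # linear calibration assumed

def spatial_variability(field, size=3):
    """Step 4: local standard deviation as an additional spatial-variability metric."""
    return generic_filter(field, np.std, size=size)
```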
This document discusses the concept of global constellations of stratospheric satellites (StratoSats) maintained by trajectory control systems. It proposes maintaining tens to hundreds of small, long-duration balloons at an altitude of 35 km to provide continuous, global earth observations. Key points discussed include StratoSat systems design, promising earth science missions like measuring the Earth's radiation budget, and potential demonstration missions to validate the concept like a hurricane intercept mission or radiometer calibration experiment.
NOAA does an excellent job of generating and disseminating data to meet its primary mission of protecting life and property. Specifically, NOAA's mission is: 1. to understand and predict changes in climate, weather, oceans, and coasts; 2. to share that knowledge and information with others; and 3. to conserve and manage coastal and marine ecosystems and resources. Beyond that mission, there is an unrealized opportunity to exploit NOAA's vast data holdings for research and profit. Much of the data is hidden deep in archives behind community-specific access portals. Modern technologies allow new methods of exposing more data to wider audiences in order to stimulate innovation and discovery. NOAA is currently experimenting with cloud technologies through the Big Data Partnership, making high-value data sets such as GOES East available on the cloud through cloud provider partners.
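As a concrete illustration of what "high value data sets on the cloud" can look like in practice, public open-data buckets of this kind are typically readable anonymously. A minimal boto3 sketch; the bucket name and key layout follow the AWS Open Data GOES archive and are assumptions here, not details stated above:

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous S3 client: public open-data buckets require no credentials.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# Bucket and prefix are assumed (AWS Open Data GOES-16 layout: product/year/day-of-year/hour).
resp = s3.list_objects_v2(Bucket="noaa-goes16",
                          Prefix="ABI-L1b-RadC/2019/001/00/",
                          MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```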
The document discusses using earth observation (EO) data to monitor freshwater quality and quantity. It provides an overview of current capabilities to derive water quality parameters like chlorophyll-a and suspended sediments from satellites. Methods are described to classify different optical water types and select the best algorithm for each type. Ongoing work includes developing a global lakes observatory to monitor 1,000 lakes using EO and integrating data from multiple platforms and sources. EO shows potential to improve freshwater monitoring for research and management.
Site-dependent Spectra: Ground Motion Records in Turkey – Ali Osman Öncel
This document discusses site-dependent ground motion spectra derived from 112 strong motion records from 57 earthquakes in Turkey between 1976 and 2003. The authors develop horizontal attenuation relationships and compare the derived spectral shapes to those in the Turkish and UBC seismic codes. They find that corner periods are consistent with the UBC, but the Turkish code yields wider constant spectral acceleration plateaus. The results allow generating site-, distance-, and magnitude-specific design spectra for probabilistic seismic hazard assessments in Turkey.
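Both codes share a plateau-plus-decay spectral shape; the difference noted above is essentially the width of the constant-acceleration plateau. A minimal sketch in the style of the UBC 1997 spectrum (the seismic coefficients Ca and Cv below are illustrative placeholders, not values from either code):

```python
import numpy as np

def design_spectrum(T, Ca=0.4, Cv=0.6):
    """UBC-1997-style elastic design spectrum (schematic)."""
    Ts = Cv / (2.5 * Ca)      # corner period: end of the constant-Sa plateau
    T0 = 0.2 * Ts             # start of the plateau
    T = np.asarray(T, dtype=float)
    return np.where(T < T0, Ca + 1.5 * Ca * T / T0,   # linear ramp from Ca to 2.5*Ca
           np.where(T <= Ts, 2.5 * Ca,                # constant-acceleration plateau
                    Cv / T))                          # 1/T decay beyond the corner period

periods = np.linspace(0.01, 4.0, 200)
sa = design_spectrum(periods)   # a wider plateau (larger Ts) mimics the Turkish code
```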
This is a proposal I submitted to the USGS Earthquake Hazards Program during my stay in Canada. It was not successful, as I was informed that the research area should be focused on the San Andreas Fault.
This study develops empirical correlations between cumulative absolute velocity (CAV) and spectral accelerations (Sa) using ground motion records from the NGA database. CAV-Sa correlations are influenced by rupture distance and presence of velocity pulses. Piecewise linear fitting equations are provided to quantify the correlations for various periods from 0.01 to 10 seconds. The correlations provide a useful way to characterize the joint occurrence of CAV and Sa, which can be applied in ground motion selection.
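A minimal sketch of how piecewise-linear correlation equations of this kind are evaluated; the breakpoint periods and correlation values below are placeholders, not the study's fitted coefficients:

```python
import numpy as np

def cav_sa_correlation(T, breakpoints, values):
    """Piecewise-linear interpolation of rho(CAV, Sa(T)) in log-period."""
    return np.interp(np.log(T), np.log(breakpoints), values)

# Hypothetical correlation estimates at a few anchor periods (0.01-10 s):
T_anchor = np.array([0.01, 0.1, 1.0, 10.0])
rho_anchor = np.array([0.70, 0.60, 0.45, 0.30])
rho = cav_sa_correlation(0.5, T_anchor, rho_anchor)   # correlation at T = 0.5 s
```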
1) Satellite altimetry data from missions like TOPEX/Poseidon can be used to forecast water elevations in the Brahmaputra and Ganges rivers up to 10 days in advance with reasonable accuracy.
2) The upcoming SWOT mission is expected to improve upon these forecasts by providing higher precision water elevation data as well as capturing water extent, which will help with flood inundation forecasts.
3) SWOT data can also be used to monitor water levels in important reservoirs in the Indus River basin, though layover effects from the surrounding terrain need further study.
The document analyzes temperature data collected hourly from 1992 to the present at multiple depths at three coastal sites around Santa Catalina Island, California. It finds that the sites share similar seasonal temperature trends, with the hottest months from July to September and the coldest from December to April. However, the East End and Little Harbor sites experienced greater variability, likely due to stronger ocean currents. The high-resolution temperature data set will be publicly available and provides important context for biological research in the region.
This document summarizes the Geologic Time Scale 2004, which provides an updated framework for understanding Earth's history by integrating stratigraphic and chronometric data. Major developments since 1989 include refined international stratigraphic units, new high-precision dating techniques, and statistical methods. The construction of GTS2004 incorporated different techniques depending on data availability and involved specialists from various fields. Anticipated advances by 2008 include formally defining all Phanerozoic boundaries and improving dating and stratigraphy of certain intervals.
The document discusses the role of science and operations in developing the James Webb Space Telescope mission. It describes the science goals that JWST aims to address, including detecting the first galaxies and studying star and planet formation. It outlines the key instruments onboard and discusses how STScI will manage science operations and the ground system. STScI has provided input during development to optimize science return and operations efficiency. Challenges include balancing momentum management with stray light avoidance and ensuring sufficient early funding.
This report analyzed oceanographic patterns of sea surface temperature (SST) and chlorophyll concentration using remotely sensed data at global and regional scales. At the global scale, SST and chlorophyll distributions followed expected patterns driven by ocean currents and upwelling/downwelling zones. Regionally along the Australian East Coast, SST and chlorophyll patterns revealed the influence of the East Australian Current, with seasonal and interannual variability observed. Comparison of remote sensing and in situ data showed good agreement for SST near the surface but limitations for subsurface measurements, highlighting the need for calibration and depth profiling through field work.
Greetings all,
Several datasets are, or will soon be, available to improve most aspects of operational forecasting: ocean dynamics modeling and assimilation efficiency, which now aims to optimize the combination of in situ temperature/salinity profiles, drifter velocities, and sea surface height deduced from altimeter data together with the GRACE or future GOCE geoid, but also the strengthening of forecasting systems' applications, such as climate monitoring. For all these issues, optimal use of ocean data, which are always too sparse and too few, is mandatory.
Such studies are at the heart of this Newsletter issue. It begins with a review by Rio M.H. and Hernandez F. of the GOCE mission, dedicated to resolving and documenting the shortest scales of the Earth's gravity field; the GOCE satellite is due to fly in December 2007. In the next article, Guinéhut S. and Larnicol G. investigate the influence of in situ temperature profile sampling on the estimation of thermosteric sea level; they show that the impact is not negligible and can introduce large errors into the estimation. In the second article, Benkiran M. and Greiner E. evaluate the benefits of assimilating drifter velocities in the Mercator Océan 1/3° Tropical and North Atlantic operational system; a description of the assimilation scheme upgrade to take velocity control into account is given. Castruccio F. et al. describe in the third article the performance of an improved MDT reference for altimetric data assimilation, concentrating their study on the Tropical Pacific Ocean. Finally, the Newsletter closes with the article by Benkiran M., in which, based on the 1/3° Mercator system, the impact of several altimeters' data on assimilation performance is assessed.
Have a good read
Editorial – Jan/Feb/Mar 2013 – Impact of the loss/addition of satellite altimetry on operational products
Greetings all,
This issue is dedicated to the study of the impact of the loss or addition of satellite altimetry on operational products and systems.
The first news feature, by Larnicol et al., presents the GODAE OceanView Observing System Evaluation Task Team, whose primary objective is to support observational agencies by demonstrating the impact of observations on operational forecast and reanalysis systems. Its secondary objective is to improve the performance of operational ocean forecast systems.
The second paper, by Labroue et al., reminds us of the main 2012 events within the satellite altimetry constellation. For the past two decades, we have become used to taking for granted the presence of several satellites flying together. The loss of Envisat in April 2012 and the decision to put Jason-1 on its end-of-life orbit are a crude reminder of the constellation's fragility. Hence, during 2012, the DUACS and MyOcean Sea Level TAC teams contributed to securing the altimetry component within the framework of operational oceanography.
The third paper, by Labroue et al., displays the potential offered by CryoSat-2 for the mesoscale signal. The added value brought by CryoSat-2 as a complement to the existing altimetry constellation is discussed, as well as how CryoSat-2 could contribute to securing the altimetry constellation, and thus operational oceanography. The CryoSat-2 mission has been included in the Near Real Time Sea Level system since February 2012 and was added to the Delayed Time system in April 2012.
The fourth paper, by Remy et al., addresses the impact of the change in the satellite constellation on the French Mercator Ocean analysis and forecasting systems. The impact on the real-time products of the loss of the Envisat and Jason-1 along-track Sea Level Anomaly data at the beginning of 2012 is studied. A dedicated set of Observing System Experiments (OSEs) is performed and preliminary results are shown. An OSE involves running a copy of an existing assimilation run in which some observations are excluded; the difference between this run and the original run assimilating all the observations allows a detailed assessment of the impact those observations have on the assimilation system.
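The OSE logic can be illustrated with a toy scalar assimilation in which two observations are withheld; the estimator, error variances, and observation counts are all illustrative assumptions:

```python
import numpy as np

def analysis(xb, B, obs, R):
    """Toy sequential scalar assimilation (Kalman-style blending of background and obs)."""
    x, var = xb, B
    for y in obs:
        K = var / (var + R)      # gain for one scalar observation
        x = x + K * (y - x)
        var = (1 - K) * var      # analysis error variance shrinks with each obs
    return x

truth = 1.0
rng = np.random.default_rng(0)
obs_all = truth + 0.1 * rng.standard_normal(5)

x_full = analysis(xb=0.0, B=1.0, obs=obs_all, R=0.01)
x_denied = analysis(xb=0.0, B=1.0, obs=obs_all[:-2], R=0.01)  # withhold two obs
print(x_full - x_denied)   # full-minus-denied difference = impact of the withheld obs
```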
Finally, the fifth paper, by Lea et al., shows a number of Observing System Experiments (OSEs) assessing the impact of the observing network on FOAM, the UK Met Office's ocean assimilation and forecasting system, as part of GODAE OceanView. A parallel version of the FOAM operational system was run during April 2011, withholding Jason-2 altimeter observations. Withholding Jason-2 removed 43% of the altimeter data and resulted in a 4% increase in the RMS SSH observation-minus-background differences, small-scale changes of around ±2 °C in 100 m temperature, and changes of around ±0.2 psu in surface salinity.
We will meet again in April 2013 for a j
Computational Training and Data Literacy for Domain Scientists – Joshua Bloom
This document discusses training domain scientists in computational and data skills. It notes the increasing amount of data in fields like astronomy and challenges of traditional approaches. It advocates teaching skills like statistics, machine learning, and programming. Examples are given of bootcamps, seminars and degree programs in these areas at UC Berkeley taught by CS and statistics faculty. Challenges discussed include fitting such training into formal curricula and ensuring participation from underrepresented groups. The creation of collaborative spaces is proposed to better connect domain scientists with methodological experts to help scientists address the growing role of data in their fields.
The document discusses GEO Grid's activities during the 2011 Tohoku earthquake and tsunami in Japan, including providing satellite imagery, hazard maps, and geological data through online portals and services. It describes how GEO Grid established a disaster task force to process and deliver satellite data from NASA, JAXA, and other sources to support response and recovery efforts. Key services and data included satellite imagery of damage areas from ASTER, crustal deformation maps from PALSAR interferometry, and shaking maps from the QuiQuake system.
WE2.L10 - NASA's Evolving Approaches to Maximizing Applications Return from o... – grssieee
1. NASA is working to maximize the societal benefits and applications return from its Earth observing satellites by focusing more on applications and engaging users early in the design process.
2. NASA conducts applications workshops for individual missions and holds cross-agency workshops to understand user needs and develop partnerships to enable applications of satellite data.
3. NASA is working to transition from focusing solely on science requirements to also considering capabilities for applications through adjustments to satellite design and partnerships with other agencies and users.
WE1.L10 - USE OF NASA DATA IN THE JOINT CENTER FOR SATELLITE DATA ASSIMILATION – grssieee
The document discusses the use of NASA satellite data in weather and environmental analysis by the Joint Center for Satellite Data Assimilation (JCSDA). The JCSDA is an interagency partnership that works to improve forecast models through better use of satellite observations. It assimilates many NASA sensors operationally, including MODIS, AIRS, and Jason altimetry, and is working to prepare other sensors like SMAP for assimilation testing. Highlights are presented on atmospheric, ocean, and land data assimilation using NASA data to improve analysis and forecasts.
WE2.L10.4: OPERATION ICEBRIDGE: USING INSTRUMENTED AIRCRAFT TO BRIDGE THE OBS... – grssieee
Operation IceBridge uses instrumented aircraft to collect data on polar ice sheets, ice shelves, and sea ice between NASA's ICESat satellite missions. It produces a 17-year dataset measuring elevation changes using laser altimetry. In addition to elevation data, IceBridge collects the most comprehensive set of instruments to provide a 3D view of polar regions, including instruments measuring ice thickness, snow depth, and water depth. It is the largest airborne survey ever conducted of Earth's polar ice.
The document summarizes a study that measured the mass of the exoplanet Kepler-78b using radial velocity measurements from the HARPS-N spectrograph. The study found Kepler-78b has a mass of 1.86 Earth masses and a density of 5.57 grams per cubic centimeter, similar to Earth. This makes Kepler-78b the smallest exoplanet with accurately measured mass and radius, and the most similar to Earth in terms of its mass, radius, and density.
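As a quick consistency check, the quoted mass and density imply a radius close to 1.2 Earth radii (the radius itself is not quoted above; Earth's mass and radius below are standard values):

```python
import math

M_earth_g = 5.972e27            # Earth mass, grams
R_earth_cm = 6.371e8            # Earth radius, centimetres

mass = 1.86 * M_earth_g         # quoted mass of Kepler-78b
density = 5.57                  # quoted bulk density, g/cm^3

volume = mass / density                         # V = M / rho
radius = (3 * volume / (4 * math.pi)) ** (1/3)  # sphere: V = (4/3) pi R^3
print(radius / R_earth_cm)                      # ~1.23 Earth radii
```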
Multicomponent Seismic - A Measure of Full-wave Motion – Robert Stewart
This document provides an overview of multicomponent seismic exploration and its value. It discusses how multicomponent seismic aims to fully record vibrations in the earth using multiple sensors to enhance traditional P-wave data and create S-wave and surface wave images. Of the additional wave types, converted waves (P-to-S on reflection) have found the most use in resource exploration by imaging below gas and discriminating lithology. The document outlines the history and improvements in multicomponent acquisition methods and processing, and highlights increasing commercial applications and case studies demonstrating its value. It concludes by discussing ongoing areas for further advancing multicomponent seismic methods and applications.
Hydraulic fracturing stimulation designs are moving toward tighter-spaced clusters, longer stage lengths, and larger proppant volumes. However, effectively evaluating hydraulic fracturing stimulation efficiency remains a challenge. Distributed fiber optic sensing, which includes DAS and DTS, can continuously monitor the hydraulic fracturing stimulation downhole and be compared with other monitoring technologies such as microseismic.
The DAS and DTS data, when integrated with the microseismic, highlight processes relevant to the completion design and allow for a better understanding and interpretation of each dataset.
This paper outlines a workflow to improve the processing and interpretation of DAS and DTS data, from which an estimate of the slurry distribution can be made. These methods are demonstrated for a horizontal Wolfcamp well in the Permian Basin. Here we compare key aspects of the microseismic, DAS, and DTS results in several fracture stages to understand the downhole geomechanical processes. To interpret the DTS data, a thermal model is developed to simulate the temperature behavior after pumping has ceased; a slurry distribution is obtained by matching the simulated temperature with the temperature measured by DTS. In addition, the DAS signal is studied in the frequency domain: the dominant frequencies, mostly related to fluid flow, are identified and the background noise is reduced. This time-frequency analysis enhances the ability to monitor and optimize well treatments.
After reducing the background noise, the acoustic intensity is correlated to the slurry distribution. The fluid distribution data from DAS and DTS are compared with the microseismic and near-field strain to better understand the completion processes. We utilize fiber optic microseismic and compare it to conventional microseismic.
Finally, we highlight the dynamics of the strain and microseismic signature as fluid moves from an offset well completion into the previously stimulated fiber well, to better understand the reservoir and the far-field effects of the completion.
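A minimal sketch of the time-frequency step described above: band-limit a DAS trace to an assumed flow-related band and use the remaining band energy as an acoustic-intensity proxy (the sampling rate, band edges, and synthetic signal are all illustrative assumptions):

```python
import numpy as np

fs = 1000.0                                  # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
trace = np.sin(2 * np.pi * 30 * t) + 0.5 * np.random.randn(t.size)  # synthetic DAS channel

spec = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(trace.size, d=1 / fs)
band = (freqs >= 20) & (freqs <= 60)         # keep only the assumed flow-related band
spec[~band] = 0.0
filtered = np.fft.irfft(spec, n=trace.size)

intensity = np.sum(filtered ** 2)            # band energy, to be correlated with slurry volume
```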
Considerations on the collection of data from bio-argo floats across sampling... – SeaBirdScientific
Ian D. Walsh, Ph.D., Joel Reiter, Dan Quittman, David J. Murphy, and Thomas O. Mitchell, Ph.D., Sea-Bird Scientific. GAIC 2015 Meeting, Galway, Ireland, 14–18 Sept. 2015.
ABSTRACT
The flexibility of the current generation of float sensor packages provides an opportunity to craft mission-specific sampling schemes that balance the collection of data for specific sampling goals with the practicalities of float operation. Autonomous floats operate within constraints of battery life and data transfer rates.
For simplicity of data transfer and handling, most float data sets are transmitted after binning on pressure. Within a given pressure bin different instruments will be sampling within a particular defined sequence. A sampling sequence should be balanced towards minimizing energy consumption while maximizing data accuracy of each instrument. As the number of sensors increases and the breadth of mission parameters expands it becomes more difficult to optimize data sequencing and reporting.
We consider methods to reduce the size of the problem by setting rules for sequence development and test those rules relative to field data. We examine a set of data from a float that was equipped with internal memory that captured the full set of sample data taken during the profiling mission.
Comparing the 'raw' data with the transmitted data, we examine the variance around the transmitted values and discuss the impact of data sequencing on the data.
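A minimal sketch of the pressure-binning step discussed above (bin width, averaging rule, and the synthetic profile are illustrative assumptions, not the vendor's actual scheme):

```python
import numpy as np

def bin_on_pressure(pressure, values, edges):
    """Average raw samples into pressure bins, as floats do before transmission."""
    idx = np.digitize(pressure, edges)
    return np.array([values[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(1, len(edges))])

# Synthetic 'internal memory' profile vs. what would be transmitted:
p = np.sort(np.random.uniform(0, 2000, 500))               # pressure, dbar
temp = 20 * np.exp(-p / 400) + 0.05 * np.random.randn(p.size)
edges = np.arange(0, 2001, 50)                             # assumed 50 dbar bins
transmitted = bin_on_pressure(p, temp, edges)
# The spread of raw samples around each binned value is the variance the abstract examines.
```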
Metadata synchronisation with GeoNetwork - a user's perspective – ARDC
Metadata synchronisation with GeoNetwork - a user's perspective: making metadata great again.
Presented at the ANDS facilitated GeoNetwork Community of Practice on April 3rd, 2017 in Canberra.
Toward Real-Time Analysis of Large Data Volumes for Diffraction Studies by Ma... – EarthCube
Talk at the EarthCube End-User Domain Workshop for Rock Deformation and Mineral Physics Research.
By Martin Kunz, Lawrence Berkeley National Laboratory
This document compares wind and wave data from two sources, the BMKG Cilacap land station and NOAA Pangandaran satellite recordings, at Bojong Salawe Beach, Indonesia. The data were downscaled from 4 recordings per day to 24 using empirical and linear equations. Analysis found similarities in wind direction and wave direction before and after downscaling, validating the downscaling methods. However, significant differences were found in wave height between the two sources: BMKG data showed unpredictable maximum wave heights from year to year, while NOAA data followed a regular pattern, which is important for construction planning.
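The 4-to-24-recordings-per-day downscaling amounts to filling in sub-daily values; a minimal sketch using linear interpolation (the observation hours and wind speeds are hypothetical, and the study's empirical equations are not reproduced here):

```python
import numpy as np

hours_obs = np.array([0, 6, 12, 18])       # four recordings per day
wind = np.array([4.2, 6.1, 7.3, 5.0])      # hypothetical wind speeds, m/s

hours_all = np.arange(24)                  # target: 24 hourly values
# Wrap the first observation to hour 24 so interpolation covers the whole day.
wind_hourly = np.interp(hours_all,
                        np.append(hours_obs, 24),
                        np.append(wind, wind[0]))
```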
Ecological Society of America Workshop on Incentives for Data Sharing – Tom Moritz
The document describes sap flow data collected from multiple branches of manzanita shrubs from December 2007 to July 2008 using heat dissipation probes. The data was collected every 5 minutes using a datalogger and includes measurements of sap flow from 13 different branches as well as date/time stamps and other metadata. The data is intended to correlate physiological activity with below-ground measures of root growth and CO2 production.
FASTNET is a project that brings together real-time and historical aerosol data sets into several web-based consoles to aid in analysis of aerosol events. It synthesizes data from various sources, like monitors, satellites, and models, to provide a broader characterization of events than individual data sets alone. The consoles allow users to browse and analyze data to understand features and potential causes of aerosol episodes. FASTNET is freely accessible online and its products have the potential to provide context to air quality data.
The document summarizes an empirical ground motion model developed as part of the PEER Next Generation Attenuation (NGA) project. Key points:
- The model predicts peak ground acceleration, velocity, displacement, and response spectra for shallow crustal earthquakes in active tectonic regions.
- It is based on over 1,500 recordings from 64 earthquakes ranging in magnitude from 4.3 to 7.9 and distances from 0.1 to 199 km.
- The model accounts for magnitude, distance, faulting style, depth, directivity, site conditions, and variability between events and recordings.
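Models of this kind are typically written as a sum of magnitude-scaling, geometric-spreading, and site terms. A schematic sketch; the functional form and every coefficient below are illustrative placeholders, not the NGA model's fitted values:

```python
import numpy as np

def gmpe_ln_sa(M, R_rup, Vs30, c=(1.0, 0.5, -0.1, -1.2, 5.0, -0.4)):
    """Schematic GMPE: ln(Sa) from magnitude, rupture distance, and site stiffness."""
    c0, c1, c2, c3, h, c5 = c
    return (c0 + c1 * (M - 6) + c2 * (M - 6) ** 2          # magnitude scaling
            + c3 * np.log(np.sqrt(R_rup ** 2 + h ** 2))    # geometric spreading (h = fictitious depth)
            + c5 * np.log(Vs30 / 760.0))                   # site term vs. 760 m/s reference

sa_median = np.exp(gmpe_ln_sa(7.0, 20.0, 400.0))   # median Sa for M7 at 20 km on stiff soil
```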
TGS has built an extensive database of digital well logs and related data from Russia over many years. Its well packages include scanned log images where available, standardized LAS files, and processed LAS+ files ready for interpretation. The database contains about 24,000 wells covering the major Russian basins, including over 21,000 from West Siberia. In addition, TGS maintains two digital databases with information on around 55,000 and 50,000 Russian wells, respectively, plus seismic and license data.
79 Roger N. Anderson - 6826483 - Petroleum reservoir simulation and charact... – Mello_Patent_Registry
This patent describes a system for integrating multiple reservoir simulation software applications into a single interface. The system provides a workflow that allows data to be passed between applications, outputs to be optimized by reconciling inconsistencies, and historical analysis of assumptions and results. This facilitates remote collaborative reservoir analysis using disparate tools while reducing errors from incompatible data formats or conclusions between applications.
This document summarizes a Computer Physics Communications article about the conditions database system for the COMPASS experiment. The key points are:
1) COMPASS integrated a conditions database system to manage time-dependent detector conditions, calibration, and geometry alignment information using software from CERN.
2) The conditions database consists of administration tools, a data handling library, and software to transfer data from detector controls to the database.
3) Performance tests on the COMPASS computing farm showed the conditions database system was able to efficiently manage the large volumes of time-dependent experimental data needed for the COMPASS experiment.
2005-01-08 MANE-VU Status Report on CATT and FASTNET – Rudolf Husar
CATT and FASTNET are inter-RPO projects that provide tools for analyzing aerosol and trajectory data. They have been integrated into the DataFed.Net infrastructure, which provides a variety of web-based applications and data catalogs for accessing, viewing, analyzing, and interpreting fast, current, and slow aerosol and meteorological data. This document provides an overview of the types of data and analysis tools available through DataFed.Net for studying aerosol events and transport patterns.
Supercharging your Apache OODT deployments with the Process Control System – Chris Mattmann
The document discusses the Process Control System (PCS), a component of the Apache OODT framework. PCS provides capabilities for data management, pipeline processing, and resource management. It has been deployed for several NASA Earth science missions to automate processing and manage large volumes of science data. Customizing PCS for a new mission involves configuring servers, specifying product metadata and processing rules, and defining compute resource policies.
A robust data treatment approach for fuel cells system analysis – ISA Interchange
The document describes a robust approach for analyzing data from fuel cell stack testing. It addresses challenges in handling large amounts of data from multiple devices. The approach includes:
1) Developing an interface in Excel to automate data handling and analysis across Excel and MATLAB for improved efficiency.
2) Using dynamic time warping to synchronize data sequences and align them in time for better comparison (a minimal sketch follows this list).
3) Applying data reconciliation to optimally adjust measurements so they obey physical constraints like conservation of voltage sums, improving accuracy of analysis.
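A minimal sketch of the dynamic time warping step from item 2 (the voltage sequences are hypothetical):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two stack-voltage logs sampled by different devices at different rates:
v1 = np.array([48.0, 47.8, 47.1, 46.9, 47.0])
v2 = np.array([48.1, 47.9, 47.8, 47.0, 46.8, 47.1])
print(dtw_distance(v1, v2))
```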
2009 HEP Science Network Requirements Workshop Final Report – butest
The document summarizes the proceedings of a workshop organized by the Energy Sciences Network (ESnet) and the Office of High Energy Physics (HEP) to characterize the networking requirements of HEP science programs over the next 10 years. Key points discussed include:
- The HEP community has large, distributed data needs that will continue growing with projects like the LHC. More LHC Tier-3 sites and Tier-2 to Tier-2 traffic are anticipated.
- The two LHC Tier-1 sites in the US predict needing 40-50 Gbps capacity in 2-5 years and 100-200 Gbps in 5-10 years to support HEP traffic.
- There are
What Do Ground Motion Prediction Equations Tell Us? – Ali Osman Öncel
Ground motion prediction equations (GMPEs) provide simple equations to estimate ground motion levels based on magnitude, distance, site conditions, and other variables. While useful for engineering applications, GMPEs tell us little about ground motion variability near faults as they provide average motions from many recordings. Near-fault recordings of large earthquakes can provide more insight, showing variations in amplitude and polarization due to nonuniform fault slip, site effects, and fault zone effects. The dense network of stations recording the 2004 M6.0 Parkfield earthquake revealed less spatial variability for longer period ground motions compared to higher frequencies.
Summary of current radiometric calibration coefficients for Landsat MSS, TM, ETM+, and EO-1 ALI sensors
Gyanesh Chander (a), Brian L. Markham (b), Dennis L. Helder (c)
(a) SGT, Inc., contractor to the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center, Sioux Falls, SD 57198-0001, USA
(b) National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), Greenbelt, MD 20771, USA
(c) South Dakota State University (SDSU), Brookings, SD 57007, USA
Summary of current radiometric calibration coefficients for Landsat MSS, TM, ETM+,
and EO-1 ALI sensors
Gyanesh Chander a,⁎, Brian L. Markham b, Dennis L. Helder c
a SGT, Inc. 1 contractor to the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center, Sioux Falls, SD 57198-0001, USA
b National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), Greenbelt, MD 20771, USA
c South Dakota State University (SDSU), Brookings, SD 57007, USA
PEER NGA-West2 Database:
A Database of Ground Motions Recorded in
Shallow Crustal Earthquakes in Active Tectonic Regions
T.D. Ancheta & Y. Bozorgnia
Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA,
U.S.A.
R. Darragh & W.J. Silva
Pacific Engineering and Analysis, El Cerrito, CA, U.S.A.
B. Chiou
California Department of Transportation, Sacramento, CA, U.S.A.
J.P. Stewart
University of California, Los Angeles, CA, U.S.A.
D.M. Boore & R. Graves
U.S. Geological Survey, Menlo Park & Pasadena, CA, U.S.A.
N.A. Abrahamson
Pacific Gas & Electric Company, San Francisco, CA, U.S.A.
K.W. Campbell
EQECAT, Inc., Beaverton, OR, U.S.A.
I.M. Idriss
University of California, Davis, CA, U.S.A.
R.R. Youngs
AMEC Geomatrix, Oakland, CA, U.S.A.
G.M. Atkinson
University of Western Ontario, London, Ont. Canada
SUMMARY
The NGA-West 2 project database expands on the current PEER NGA ground-motion database to include
worldwide ground-motion data recorded from shallow crustal earthquakes in active tectonic regimes post 2003.
Since 2003, numerous well-recorded events have occurred worldwide, including the 2003 M6.6 Bam (Iran),
2004 M6 Parkfield (CA), 2008 M7.9 Wenchuan (China), 2009 M6.3 L’Aquila (Italy), 2010 M7 El Mayor-
Cucapah (CA and Mexico), 2010 M7 Darfield (NZ), 2011 M6.1 Christchurch (NZ), and several well-recorded
shallow crustal earthquakes in Japan, among other events. The NGA database has been extensively expanded to
include the recorded ground-motion data and metadata from these and other recent events. The updated strong-
motion database (NGA-West 2) currently includes 8611 three-component records from 334 shallow crustal
events. The updated database has a magnitude range of 3.4 to 7.9, and a distance range of 0.05 to 1533 km. The
estimated or measured time-averaged shear-wave velocity in the top 30 meters at the recording sites (i.e., Vs30)
ranges from 94 to 2100 m/sec. The NGA-West 2 database more than doubles the size of the previous NGA
database. The database includes uniformly processed time series as well as response spectral ordinates for 111
periods ranging from 0.01 to 20 seconds and 11 different damping ratios. Extensive metadata have also been
collected and added to the database. The database is currently being utilized by NGA researchers to update the
2008 ground-motion prediction equations (GMPEs).
Keywords: earthquake, strong motion, database, metadata
1. INTRODUCTION
The importance of a common high-quality ground-motion database was recognized in the NGA
project. Having project investigators use a common database fostered collaboration between ground
motion prediction equation (GMPE) development groups and made model-to-model comparisons
more meaningful. The scope of the NGA-West 2 database update is to add new ground-motion data, improved
metadata, and new supporting information to the NGA database to aid the update of the NGA-West GMPEs. A
major element of the improved database is the inclusion of significant shallow crustal events that occurred
after 2003, the cut-off date for events in the previous NGA database.
This paper begins with a brief description and overview of the NGA-West 2 database. The new events
included in the updated data set are highlighted. The four metadata categories (source, path, site, and
time series and spectra) are then described, focusing on the new features added during the NGA-West 2
project.
1.1. Database Overview
The NGA-West 2 database started with the PEER NGA database
(http://peer.berkeley.edu/peer_ground_motion_database) of ground-motion recordings from shallow
crustal events in active tectonic regions, which was completed in 2006 (Chiou et al., 2008). The NGA
database was at the time the largest uniformly processed set of ground motions available. The
NGA-West 2 database continues the data collection methodology of the previous database and now
includes records recorded prior to February 2011. The NGA database consists of a set of strong
motion records (text files) and metadata tables. The entire set of records was uniformly processed with
the PEER strong motion processing algorithm developed by Pacific Engineering and Analysis (PEA),
which is detailed below (Darragh et al., 2004). The metadata tables were developed under the direction of
different working groups, and the final information entered into the data tables went through a significant
review process. Each working group contained a panel of experts within its field of interest (e.g.,
seismologists for the Earthquake Source Working Group). A summary flat file was created from the
records and metadata tables; it contains the key information used by the individual NGA-West 2
GMPE model developers. In addition to the three-component as-recorded spectra, a new orientation-
independent rotated spectral measure called RotDnn (Boore, 2010) is provided. The available spectra are now
provided for periods of 0.01 to 20 seconds and for 11 different damping ratios ranging from 0.1 to 30%.
Figure 1. Map of the epicenter distribution of the 334 events. Open circles are events in the previous NGA
database and solid stars are events added in the NGA-West 2 database.
Currently the NGA-West 2 database contains 8611 multi-component records from 334 events. Figure
1 shows the distribution of the hypocenter locations and highlights the 178 events that were added.
Figure 2 shows a comparison of the magnitude versus closest-distance distribution between the records in the
NGA and NGA-West 2 databases. The new set contains roughly double the number of events and records of
the previous database.
Figure 2. Magnitude-distance distribution of strong-motion records in the NGA-West 2 database.
1.2. Metadata Tables
As in the previous database, four metadata tables were created by different working groups: record
catalog, finite source table, a site table, and propagation path table. The data tables used in the NGA
project (Chiou et al. 2008) were used as the start of the current tables. In addition to adding
information to the databases from additional station/events there was a significant review of existing
metadata. New metadata (i.e. Rx) added to each data table are described below. Additionally,
improvements have been made toward uniformity and transparency in metadata collection and
estimation. A subset of the pertinent information in all the tables are summarized in the database flat
file which is used by the various working groups in the NGA-West 2 project.
The record catalog is a list of the strong motion recordings included in the database. It also contains
the spectra, intensity measures (PGA, PGV, and PGD), and the selected filter corner frequencies. Each
record is given a unique record sequence number (RSN) as it is added. The earthquake source table
contains earthquake source information for the selected events, such as magnitude, hypocenter
location, finite fault dimensions (as available), and seismic moment. Events are given a unique
earthquake ID number (EQID) as they are added. The site table is a collection of site information for
each recording station, such as station location, Vs30, codes indicating how Vs30 was established, and
the various proxies used for Vs30 estimation. A station is given a unique station sequence number
(SSN) as it is added. The three ID numbers (RSN, SSN, and EQID) link the tables to one another and
aided in the creation of the final flat file.
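To make the linkage concrete, the sketch below joins toy versions of the three tables on these ID
numbers using pandas. This is an illustration only: the table layouts and column names here are
hypothetical, not the actual NGA-West 2 schema.

    # Minimal sketch of linking the record, event, and site tables by their
    # ID numbers (RSN, EQID, SSN). All names and values are illustrative.
    import pandas as pd

    records = pd.DataFrame({"RSN": [1, 2], "EQID": [10, 10], "SSN": [100, 101],
                            "PGA_g": [0.21, 0.08]})
    events = pd.DataFrame({"EQID": [10], "M": [6.3], "hyp_depth_km": [9.0]})
    sites = pd.DataFrame({"SSN": [100, 101], "Vs30_mps": [360.0, 760.0]})

    # Each record row picks up its source-event and station metadata, which
    # mirrors how the summary flat file merges a subset of the tables.
    flat = (records.merge(events, on="EQID", how="left")
                   .merge(sites, on="SSN", how="left"))
    print(flat)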
The following sections give an overview of the added metadata, improvements to the metadata and record
processing, and a review of the new database.
1.3. NGA-West 2 Flat File
The NGA-West 2 flat file serves as a single file of key metadata and ground-motion parameters to be
used by the NGA developers in regression analysis. It is formed by merging a subset of the information
within all four tables. The flat file now contains more than 130 columns of metadata, plus PGA, PGV, PGD,
and pseudo-absolute spectral acceleration at 111 periods. A total of 55 flat files are created for spectral
acceleration: one for each of the three as-recorded components, RotDnn (described in Section 2.2), and
GMRotI50, at each of the 11 damping ratios (5 spectral measures x 11 damping ratios = 55 files).
2. TIME SERIES AND SPECTRA
As in the previous NGA database, Pacific Engineering and Analysis (PEA) collected digitized but
otherwise unprocessed accelerograms from various agencies around the world and uniformly
processed the raw accelerograms. The PEER processing procedure, including instrument correction,
bandpass filtering (removal of unwanted noise), and baseline correction, is described in Darragh et al.
(2004). A major change from that procedure is the systematic use of an acausal Butterworth filter,
whereas previously a causal filter was the preferred type. Records were also re-evaluated to
extend their usable frequency range (and hence reprocessed), to identify late triggers, to align them in
absolute time, and to identify potentially co-located instruments. The processed time series were used to
calculate various types of response spectra, as described in Section 2.2.
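As a rough illustration of the acausal filtering step, the sketch below applies a zero-phase
Butterworth band-pass with scipy. The corner frequencies and filter order are placeholders; in the
actual processing they are selected record by record based on the signal and noise characteristics.

    # Zero-phase (acausal) Butterworth band-pass of an acceleration trace.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def acausal_bandpass(acc, dt, f_low=0.1, f_high=25.0, order=4):
        """Filter forward and backward so no phase distortion is introduced."""
        nyquist = 0.5 / dt
        b, a = butter(order, [f_low / nyquist, f_high / nyquist], btype="band")
        # filtfilt applies the filter twice (forward, then time-reversed),
        # which is what makes the overall response acausal and zero-phase.
        return filtfilt(b, a, acc)

    # Example on a synthetic record sampled at 100 samples per second.
    dt = 0.01
    t = np.arange(0.0, 40.0, dt)
    acc = (np.sin(2 * np.pi * 1.0 * t)
           + 0.05 * np.random.default_rng(0).standard_normal(t.size))
    acc_filtered = acausal_bandpass(acc, dt)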
2.1. Time Series Processing and Station Flags
To improve the quality of the accelerograms and intensity measures included in the database, a
significant effort was made toward uniformity in the data processing and toward evaluating the
completeness of each recorded event. More than 1600 records that were originally passed through
without correction for the original NGA database had their raw accelerograms collected and processed
using the PEER methodology. Stations in the previous database that were co-located with another
station were removed from the NGA-West 2 database. The entire record set was evaluated for late P-
or S-triggers (i.e., missing portions of the P-wave or S-wave), and records with portions of the P-wave
or S-wave lost due to a late trigger were flagged.
2.2. RotDnn Spectra
The NGA-West 2 project has selected an orientation-independent spectral measure, called RotDnn
(Boore, 2010), that is not calculated from the geometric mean of the two horizontal components. The
'D' stands for a period-dependent rotation angle, and 'nn' is the fractile of the rotated spectra sorted by
amplitude. Therefore, the rotated spectra over a range of periods will have a non-uniform set of
rotation angles. RotDnn spans the spectral amplitudes from the smallest to the largest over all rotation
angles. The median amplitude over all non-redundant angles, RotD50, will be used in the development
of the updated ground motion prediction equations (GMPEs). A comparison of RotD50 to GMRotI50
by Boore (2010) indicates a slight increase in the overall variation in RotD50.
The period-dependent minimum and maximum rotated spectral amplitudes, RotD00 and RotD100, will
be utilized by the Directionality and Directivity working groups to develop equations to convert
RotD50 to RotD100 (the maximum spectral amplitude) and to develop new directivity equations.
The RotDnn values were calculated using the program nga2psa_rot_gmrot.for (personal comm.,
Boore). Reasons for not providing rotated spectra were: a missing horizontal-component azimuth, a
missing horizontal component, or misalignment of the two horizontal components in absolute time.
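The sketch below illustrates the RotDnn idea for a single scalar intensity measure, the peak of the
rotated acceleration trace, rather than the oscillator response at each of the 111 periods computed by
the actual program; the synthetic input motions are placeholders.

    # RotDnn for peak acceleration: rotate the two horizontal components
    # through all non-redundant angles (0-179 degrees), take the peak of
    # each rotated trace, and read fractiles off the sorted peaks.
    import numpy as np

    def rotd(acc1, acc2, nn):
        """nn-th percentile over rotation angles of max |rotated acceleration|."""
        angles = np.deg2rad(np.arange(180.0))
        peaks = [np.max(np.abs(acc1 * np.cos(a) + acc2 * np.sin(a)))
                 for a in angles]
        return np.percentile(peaks, nn)

    rng = np.random.default_rng(0)
    a1, a2 = rng.standard_normal(4000), rng.standard_normal(4000)  # placeholder components
    rotd00, rotd50, rotd100 = (rotd(a1, a2, nn) for nn in (0, 50, 100))

Reading the 0th, 50th, and 100th percentiles of the same set of rotated peaks yields RotD00, RotD50,
and RotD100 in one pass, which is why the three measures are naturally produced together.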
3. FINITE SOURCE MODELS
The 173 added earthquakes were given source parameters based on the available studies of each
event. For large, well-recorded events, multiple finite fault models were available and reviewed. As in
the previous database, the method of selecting the preferred geometry was based on the type of data
used in the inversion (e.g., GPS, geodetic, teleseismic, and strong-motion data). For events with
multiple solutions, the fault model that used strong-motion data in the inversion was preferred. The
preferred finite fault model went through several iterations of review by the Earthquake Source
Working Group, which in some cases involved the review of reports that were still in press.
Trimming the areal extent of the finite fault plane of newly added earthquakes was performed using the
methodology of the NGA project. An example application of finite fault model selection and
fault trimming is provided in Stewart et al. (2012a) for one of the events in the database (L'Aquila,
Italy).
4. PATH METRICS
Metadata entered into the propagation path table include various distance measures, a hanging-wall
indicator, radiation pattern coefficients, and directivity parameters. Two new distance measures have
been added, Rx and Ry, which are related to the generalized coordinates T and U defined in Appendix A
of Spudich and Chiou (2008). Furthermore, the directivity parameters have been expanded to include
multiple directivity models that are being developed within the NGA-West 2 project. The set of
directivity models considered is intended to aid the direct inclusion of directivity effects in the
updated NGA GMPEs.
In the previous NGA database, distance metrics such as the Joyner-Boore distance and the distance to
the closest point on the fault rupture were missing when a finite fault model was not available. The
NGA-West 2 developers decided to adopt a method that simulates finite fault planes for events
without a finite fault model but with minimal information: hypocenter, magnitude, and fault plane
solution (or style of faulting). The goal of the simulation routine is to obtain an approximate fault
geometry that may be used to compute distance metrics and a few other path metadata that require
knowledge of the finite fault geometry. The methodology is briefly described here; it is detailed in
Youngs (2006), which was reproduced as Appendix B of Chiou and Youngs (2008). In the simulation
routine, the missing fault plane information is simulated by random sampling of pertinent probabilistic
distributions of fault area, fault aspect ratio, and hypocenter position on the fault plane. With this
random sampling, the routine generates a set of 101 random fault planes that are spun and slid in space
but locked to the given hypocenter location. For each recording site, the median of the closest
distances to the 101 simulated planes is estimated; the selected fault plane is the single simulated plane
that best fits this set of median closest distances.
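A heavily simplified sketch of this idea is given below for a single site and a vertical fault plane; the
sampling distributions are placeholder assumptions standing in for the calibrated distributions in
Youngs (2006), and a real implementation would handle arbitrary dip and keep the plane below the
free surface.

    # Monte Carlo sketch: 101 fault planes "spun and slid" about a fixed
    # hypocenter; adopt the plane whose closest distance to the site best
    # matches the median. Vertical fault; placeholder distributions.
    import numpy as np

    rng = np.random.default_rng(1)
    mag = 6.5
    hyp = np.array([0.0, 0.0, 10.0])    # hypocenter (x, y, depth), km
    site = np.array([20.0, 5.0, 0.0])   # recording site at the surface

    def simulated_rrup():
        # Placeholder magnitude-area scaling and aspect-ratio distribution.
        area = 10.0 ** (mag - 4.0 + 0.2 * rng.standard_normal())  # km^2
        aspect = max(1.0, rng.lognormal(0.4, 0.3))                # length/width
        width = np.sqrt(area / aspect)
        length = aspect * width
        strike = rng.uniform(0.0, 2.0 * np.pi)                    # "spun"
        fx, fz = rng.uniform(0.0, 1.0, size=2)                    # "slid": hypocenter position on plane
        u = np.array([np.sin(strike), np.cos(strike), 0.0])       # along strike
        d = np.array([0.0, 0.0, 1.0])                             # down dip (vertical plane)
        corner = hyp - fx * length * u - fz * width * d           # plane contains the hypocenter
        # Discretize the plane and take the closest-point distance to the site.
        s = np.linspace(0.0, length, 60)[:, None, None]
        w = np.linspace(0.0, width, 30)[None, :, None]
        points = corner + s * u + w * d
        return np.linalg.norm(points - site, axis=-1).min()

    rrups = np.array([simulated_rrup() for _ in range(101)])
    # Adopt the plane whose closest distance best matches the median.
    adopted_rrup = rrups[np.argmin(np.abs(rrups - np.median(rrups)))]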
5. SITE DATABASE
The site database is a collection of station information for the stations included in the NGA-West 2
database. The metadata collected for each station include station identification, station location,
instrument type and housing (GMX letters), recommended values of Vs30 (time-averaged shear-wave
velocity to 30-meter depth), codes indicating the basis for the recommended Vs30, the various proxies
used for Vs30 estimation (slope, terrain, geology, and geotechnical categories; details in Stewart et al.,
2012b), and isosurface depths (i.e., depths to shear-wave velocity horizons from various 3D velocity
models). The site database started with the site table from the NGA project, and a major effort was
made to update and improve the completeness of the information, especially with respect to Vs30 and
isosurface-depth information.
The assignment of Vs30 to a site follows these general protocols (a computational sketch of the first
two protocols follows the list):
1. When a Vs profile is available to a profile depth (zp) > 30 m, Vs30 is computed directly from the profile.
2. When a Vs profile is available to a profile depth of 10 < zp < 30 m, Vsz is computed to depth zp
and Vs30 is estimated from Vsz using available correlations (Boore, 2004; Boore et al., 2011).
3. When no profile is available but a regionally calibrated proxy-based relationship between
surface geology and Vs30 exists, that relationship is used. The utilized relationships apply
to California and Italy, as described in Stewart et al. (2012b).
4. Other proxies are used when the above conditions are not met, including slope-, terrain-, and
geotechnical-category-based proxies. In Taiwan, elevation-based proxies are also considered.
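The sketch below illustrates the computation behind the first two protocols: the time-averaged
velocity to depth z is z divided by the shear-wave travel time through the layers above z. The layer
values are invented, and the published Vsz-to-Vs30 correlations are referenced but not reproduced.

    # Time-averaged shear-wave velocity: Vsz = z / (travel time down to z).
    import numpy as np

    def vs_avg(thick_m, vel_mps, z):
        """Time-averaged Vs over the top z meters of a layered profile."""
        tops = np.concatenate(([0.0], np.cumsum(thick_m)[:-1]))
        travel_time = 0.0
        for top, h, v in zip(tops, thick_m, vel_mps):
            if top >= z:
                break
            travel_time += min(h, z - top) / v   # only the part above z counts
        return z / travel_time

    # Protocol 1: the profile reaches 30 m, so Vs30 comes directly from it.
    vs30 = vs_avg([4.0, 8.0, 18.0], [180.0, 350.0, 600.0], 30.0)  # ~400 m/s

    # Protocol 2: the profile stops at zp = 12 m; compute Vs12 here, then map
    # it to Vs30 with the correlations of Boore (2004) and Boore et al. (2011).
    vs12 = vs_avg([4.0, 8.0], [180.0, 350.0], 12.0)               # ~266 m/s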
The values of Vs30 are assigned uncertainties in the site database. For the case of proxy-based
estimates, the uncertainties are derived from data analysis as described in Stewart et al. (2012b), and
tend to be relatively high for rock sites as compared to soil sites.
Depths to shear-wave velocity horizons (1.0, 1.5, and 2.5 km/s) were collected from 3D velocity models
and boring measurements for basins in northern and southern California, Japan, and Taiwan. Depths
were extracted from the CVM-S4 and CVM-H11.9 models for sites in southern California, and from an
updated version of the model described by Boatwright et al. (2004) for sites in northern California. For
sites in Japan, depths were extracted from the Japan Seismic Hazard Information Station.
ACKNOWLEDGEMENTS
This study was sponsored by the Pacific Earthquake Engineering Research Center (PEER) and funded by the
California Earthquake Authority, the California Department of Transportation, and the Pacific Gas & Electric
Company. Any opinions, findings, and conclusions or recommendations expressed in this material are those of
the authors and do not necessarily reflect those of the sponsoring agencies.
REFERENCES
Boatwright, J., Blair, L., Catchings, R., Goldman, M., Perosi, F., and Steedman, C. (2004). Using twelve years
of USGS refraction lines to calibrate the Brocher and others (1997) 3D velocity model of the Bay Area.
U.S. Geological Survey Open-File Report 2004-1282.
Boore, D.M. (2004). Estimating Vs30 (or NEHRP site classes) from shallow velocity models (depth < 30 m).
Bull. Seism. Soc. Am., 94, 591-597.
Boore, D. M. (2010). Orientation-independent, nongeometric-mean measures of seismic intensity from two
horizontal components of motion. Bull. Seismol. Soc. Am. 100, 1830-1835.
Boore, D.M., Thompson, E.M. and Cadet, H. (2011). Regional correlations of Vs30 and velocities averaged over
depths less than and greater than 30 m. Bull. Seism. Soc. Am., 101, 3046-3059.
Boore, D. M., Watson-Lamprey, J., Abrahamson, N. A. (2006). Orientation-independent measures of ground
motion. Bull. Seismol. Soc. Am. 96, 1502-1511.
Chiou, B., Darragh, R., Gregor, N. (2008). NGA project strong-motion database. Earthquake Spectra 24, 23-44.
Chiou, B., and Youngs, R.R. (2008). NGA model for average horizontal component of peak ground motion and
response spectra. PEER report.
Darragh, R., Silva, W.J., Gregor, N. (2004). Strong motion record processing procedures for the PEER center.
Proceedings of the COSMOS Workshop on Strong-Motion Record Processing, 1-12.
Spudich, P., Chiou, B. (2008). Directivity in NGA earthquake ground motions: analysis using isochrone theory.
Earthquake Spectra. 24, 279-298.
Stewart, J.P., Lanzo, G., Pagliaroli, A., Scasserra, G., Di Capua, G., Peppoloni, S., Darragh, R., and Gregor, N.
(2012a). Ground motion recordings from the Mw 6.3 2009 L'Aquila earthquake in Italy and their
engineering implications. Earthquake Spectra, 28:1, 317-345.
Stewart, J.P., Seyhan, E., Boore, D.M., Campbell, K.W., Erdik, M., Silva, W.J., Di Alessandro, C., and
Bozorgnia, Y. (2012b). Site effects in parametric ground motion models for the GEM-PEER global GMPEs
project. 15th WCEE, Portugal. Submitted.