Summer Studentships 2013-2014
Final Project Report
Project Details
Student Name: Roydon Nutsford
Name of supervisor(s): Kasper van Wijk
Supervisor department: Physics
Project title:
Seismic Investigation of the Neal Hot Springs Geothermal Area
Supervisor letter included: Yes ☐ No ☐
Brief statement on how the studentship has contributed to your career development
(1 page limit):
Participation in this studentship has allowed me to further improve my skills in the field of geophysics. I have gained a true appreciation for the purpose of research and how it can make a real impact on the field, and on a career. Part of the studentship involved learning to write Python code from the ground up. Proficiency in this language, which is still emerging in the geophysics workforce, will give me a competitive edge, and the ability to use computers to analyse data will propel me in a career where staying current is important in ever-changing fields.

Through the course of the research I was exposed to real earthquake databases, which taught me to read, comprehend and make use of large data networks from all over the world. This experience builds key skills that will be vital in the workplace.

Finally, the studentship program has reinforced the universal requirement to present scientific data and findings in an interesting, informative and appropriate way, so that readers can best understand them.
Research abstract – not more than 250 words:
The aim of the research project is to detect small local earthquakes that are potentially covered or disguised by background noise. The test location is Neal Hot Springs, an active geothermal site that supplies a 23 MW geothermal electric power plant near Vale, Oregon.

In May 2011, 10 temporary seismic stations were set up to record the Earth's surface movements over 20 months. Numerical cross correlation is used to identify similar waveforms. The goal is to locate all of the local earthquakes, and then differentiate between geothermal seismic activity and natural seismic activity in the area.

Python code was used to scan the seismic data against nearby 'template' earthquake events found in a previous study (Shaltry, 2013) by Daniel Shaltry, a student at Boise State University. The program returned coefficients that quantify the similarity of the waveforms. The method relies on the pressure (P), Love and Rayleigh waves of earthquakes producing similar waveforms when the earthquakes originate from similar locations. With three earthquakes already visually identified by Shaltry, the program identified nine more. These can be studied further to deduce whether they are a product of surrounding active local faults, or the result of the geothermal exploration.
Summary of research and its significance (1 page limit):
The geophysics group from Boise State University set up seismic stations and recorded data in the Neal Hot Springs area, near Vale, Oregon (Colwell, 2012). From their study they identified three local earthquakes. In this research I used computer analysis to gain higher resolution from the data in order to locate more earthquakes. Extracting the data from the Incorporated Research Institutions for Seismology (IRIS) database, I used the ObsPy toolbox to compare the earthquake waveforms against known seismic events within the local vicinity, found in the previous study. The hypothesis that earthquakes originating in close proximity to one another will produce similar waveforms was tested by scanning the data for high cross correlation values using Austin Holland's xcorr code (Holland, 2013). If the cross correlation coefficient was 0.7 or higher, and occurred on at least 5 stations, I classified it as an event.

With this in mind, the goal was to locate more small earthquakes in order to interpret the activity of local faults and assess whether Neal Hot Springs or the power generation plant increases local seismicity. The correlation code does not match earthquakes at large distances, because a combination of different earthquake mechanisms and lateral variability in the Earth deforms the waveform to the point where the coefficient is no longer high enough. The significance of this study lies in testing the previous results with new technology, and in trying to improve the ability to distinguish small earthquakes from ambient noise. The cross correlation process allows the waveforms to be evaluated, giving information about the subsurface geology in the area and how the earth material affects earthquake waveforms. Further research involves studying the distance between stations and how much the waveform is deformed over that distance; in particular, the effect of high groundwater content, as in a geothermal region.
Research report - Aims, methods, results & discussion (5 page limit):
Aim:
To test the previous hypothesis with a new technology and methodology, to determine the local seismic activity of the Neal Hot Springs region with respect to earthquakes originating from local faults, and to investigate the waveform variability of earthquakes in, and passing through, the area.
Method:
For this research project the raw data were produced by students of Boise State University, under the supervision of Dr van Wijk, using 10 stations, each with an L-22 seismometer, data acquisition system, GPS, solar panel and battery. The stations ran continuously from May 2011 until November 2012 and recorded the vertical, northward and eastward components of ground motion at a rate of 250 samples per second. The data were sent to the Incorporated Research Institutions for Seismology (IRIS) database, from which they were read into Python 2.6.7 code, with the ObsPy 0.9 toolbox handling the stream data. Boise State University student Daniel Shaltry had found three low-magnitude earthquakes in close proximity to the stations. I used these earthquakes as template events, extracting the first 5 seconds of each to capture the P- and S-wave arrivals. The next section of the code looped through the time between May 2011 and November 2012. Xcorr, a cross correlation code written by Austin Holland, was then used to compare the template events with the trace data; each individual template was correlated with each individual station using parallel computing. A correlation coefficient of at least 0.7 on five or more stations was required for a detection to be considered an earthquake event. High correlations occurred where the trace-data waveforms closely matched those of the template earthquakes, meaning they are likely of the same origin and therefore allowing stronger interpretations to be drawn.
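To make the workflow concrete, the following is a minimal sketch of one template-versus-window comparison using a recent ObsPy release (the project itself used ObsPy 0.9 and Austin Holland's separate xcorr code, which differ in detail). The station code, channel code, template time and filter band are placeholders, not values from the actual deployment.

```python
# Minimal template-matching sketch with ObsPy; illustrative only, not the
# study's actual code. Station/channel/time/filter values are assumptions.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client
from obspy.signal.cross_correlation import correlate, xcorr_max

client = Client("IRIS")
t0 = UTCDateTime("2011-06-06T12:46:00")  # hypothetical template event time

# Cut a 5 s template capturing the P- and S-wave arrivals, as in the method.
template = client.get_waveforms("XN", "ST01", "*", "EHZ", t0, t0 + 5)
template.filter("bandpass", freqmin=2.0, freqmax=20.0)  # assumed pass band

# Scan one two-minute window of continuous data, filtered identically.
window = client.get_waveforms("XN", "ST01", "*", "EHZ", t0 - 60, t0 + 60)
window.filter("bandpass", freqmin=2.0, freqmax=20.0)

# Slide the template across the window and find the best match.
max_shift = window[0].stats.npts - template[0].stats.npts
cc = correlate(window[0].data, template[0].data, max_shift)
lag, coeff = xcorr_max(cc)
print("peak correlation %.2f at lag %d samples" % (coeff, lag))
```

In the actual study this comparison was repeated for every template, every station and every two-minute window over the recording period, in parallel.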
Results:
The raw data were retrieved from IRIS, and the three earthquakes spotted by Daniel Shaltry were found. The raw data contained 10 traces, each recording for 20 months and corresponding to one recording station. A small Python script was required to trim the data down to a short template. The template captures the P-wave and S-wave arrival times of the most representative trace, along with the first few seconds of vigorous shaking.
Original template
(Figure 1): Raw data showing the first 6 seconds of an earthquake waveform. The wave arrivals are clear, but there is strong noise.
The original template has distinct wave arrivals; however, the true waveform is hidden under considerable noise. The waveform was smoothed by reducing the noise through a series of filters, which cleaned the raw data as seen in Figure 2 below. This produced a manageable template for the algorithm, and the correlation code acting on the cleaned data achieved consistently higher correlation coefficients. To maximise the chance of finding earthquakes I created filtered templates from all three known waveforms, using a cleaning sequence like the one sketched below.
Filtered template – Earthquake 1
(Figure 2): Filtered template of waveform 1, ready to be cross-correlated against the data set. It displays a much smoother representation of the waveform.
Filtered template – Earthquake 2
(Figure 3): Filtered template of waveform 2, ready to be cross-correlated against the data set. Using multiple waveforms from within the area increases the potential hits with the code.
Filtered template – Earthquake 3
(Figure 4): Filtered template of waveform 3, ready to be cross-correlated against the data set.
The above templates were used to scan the data from 1 May 2011 until 1 November 2011. For consistency, the same filters applied to the templates were also applied to the raw data. The correlation code takes two parameters that control the output it creates; these can be set to alert when earthquakes of any required similarity are found. In my program I defined an event as a correlation value of at least 0.7 across at least 5 stations. This allows only similar waveforms to trigger an alert, and nearly eliminates noise, since correlations are required on five stations. The program loops over two-minute intervals, which opens a window where correlations can occur at any time. To minimise the probability of random correlations meeting the 5-station requirement, I set the required correlation at 0.7, a degree of waveform similarity high enough that it does not occur commonly.
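As a sketch of this declaration rule only (the station codes and coefficients below are invented for illustration), the 0.7/five-station test reduces to a simple count over each two-minute window:

```python
# Declare an event when at least MIN_STATIONS stations reach THRESHOLD within
# the same two-minute window. Station codes and values here are invented.
THRESHOLD = 0.7
MIN_STATIONS = 5

peak_cc = {"ST01": 0.82, "ST02": 0.74, "ST03": 0.55, "ST04": 0.71,
           "ST05": 0.78, "ST06": 0.31, "ST07": 0.73, "ST08": 0.69,
           "ST09": 0.44, "ST10": 0.90}

n_hits = sum(1 for c in peak_cc.values() if c >= THRESHOLD)
if n_hits >= MIN_STATIONS:
    print("event declared: %d of %d stations >= %.1f"
          % (n_hits, len(peak_cc), THRESHOLD))
```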
Known earthquake events were picked up by the code during testing, which demonstrated its proficiency. From the XN data set, Figure 5 displays the 3rd earthquake, including the data from all 10 station traces and showing distinctive earthquake waveforms. This earthquake was approximately 13 km away, a distance that can be estimated from the wave arrival times at the different station locations.
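One common way to obtain such an estimate is from the S-minus-P arrival-time difference at a single station; the velocities and time below are generic assumed values, not ones reported in the study.

```python
# Rough hypocentral distance from the S-P time: d = dt * vp*vs / (vp - vs).
# The velocities and S-P time are assumed values for illustration only.
vp, vs = 6.0, 3.5            # generic crustal P and S velocities (km/s)
dt_sp = 1.55                 # hypothetical S-P arrival-time difference (s)
distance = dt_sp * vp * vs / (vp - vs)
print("distance ~ %.1f km" % distance)   # ~13 km with these values
```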
3rd known event across all 10 stations
(Figure 5): Two-minute loop captured by the Python code, showing the arrival times of earthquake 3 at each station. The amplitude and arrival time vary; however, the waveform remains relatively consistent. This event was picked up on 8 stations. Station 10 recorded a correlation of 1, as it is the direct trace of the template event. Two of the remaining stations recorded above 0.9, while 5 others recorded above 0.8.
Event detection count
(Figure 6): Times of each event detected by the Python code; the detections are tabulated below.
Discussion:
The data produced 9 potential earthquake events over four months. The parameters imposed on the code were strict, allowing only times with a high probability of containing an earthquake to be detected. There are no previously sighted events in this period, so the nine potential events suggest that there could in fact be more seismic activity in the area than previously expected. As the events were not noticed by the previous study, the earthquakes are evidently small and not clearly visible to the naked eye.
Month     | Number of events | Loop times
June      | 5                | 2011-06-06T12:46, 2011-06-06T21:20, 2011-06-17T18:12, 2011-06-17T18:16, 2011-06-18T06:16
July      | 1                | 2011-08-14T00:00
August    | 0                |
September | 3                | 2011-09-25T23:22, 2011-09-25T23:24, 2011-09-26T00:36
The code has not yet been passed over the second period of data due to various problems. The standard deviation algorithm was the biggest slowdown in the code, as it used up large amounts of computer memory. This forced the data to be fed into the code in short two-minute intervals, reducing efficiency and resulting in slow data processing. Another challenge was that the data were sourced from Washington, which occasionally took too long and caused the code to crash. This made running any extended duration of data infeasible.
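One standard remedy for this kind of bottleneck (not the approach actually used here) is to compute the sliding-window statistics from cumulative sums, which makes the cost independent of the window length:

```python
# Sliding standard deviation via cumulative sums: O(N) instead of O(N * W).
# A sketch of one possible fix, not the code used in the study.
import numpy as np

def rolling_std(x, w):
    """Standard deviation of every length-w window of x."""
    x = np.asarray(x, dtype=float)
    s1 = np.cumsum(np.insert(x, 0, 0.0))      # running sum of x
    s2 = np.cumsum(np.insert(x * x, 0, 0.0))  # running sum of x squared
    mean = (s1[w:] - s1[:-w]) / w
    var = (s2[w:] - s2[:-w]) / w - mean ** 2
    return np.sqrt(np.maximum(var, 0.0))      # clip rounding-error negatives
```

With the denominator of the normalised cross correlation computed this way, the record could in principle be scanned in long contiguous chunks rather than two-minute slices.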
Future work is to continue processing the data over the entire recording period. This will give data from before and after the start of operation of the geothermal power station. Understanding the seismic activity will allow conclusions to be drawn about the production of artificial seismicity by the activities of the power station. Further investigation is required to confirm whether the detected events are in fact earthquakes, as it is possible that random noise occurred at five stations within a two-minute loop. A manual check will not confirm an event, as it is likely to be unreadable by the naked eye. With the entire data set, the locations of individual events can be triangulated to pinpoint their origins. This will provide evidence as to which parts of the region have been most active, and raise the question of why.

Another aspect I would like to carry further is waveform deformation with distance through different surface compositions. This can be done by analysing the correlation coefficient at different linear distances from the source. If distinct changes occur, it could even have relevance to fault mapping or structural geologic mapping.
Conclusion
The summer project has allowed me to learn Python and more about seismology. I have applied a complex code to real-world data gathered in a geothermal region. The results were generated after scanning 4 months of data to check for earthquakes of similar origin. The code recorded 9 potential events over this period, and further analysis is required to cover the entire research period. Once these events have been confirmed as natural seismicity rather than artificial noise, further conclusions can be drawn about the seismic activity in the area.
References (1 page limit):
Holland, A. (2013). Earthquakes triggered by hydraulic fracturing in south-central Oklahoma. Bulletin of the Seismological Society of America, 103, 1784-1792.

Rodgers, J. L., & Nicewander, W. A. (1988). Thirteen ways to look at the correlation coefficient. The American Statistician, 42(1), 59-66.

Shaltry, D., Colwell, C., Liberty, L., & van Wijk, K. (2013). Seismic Investigation of the Neal Hot Springs Geothermal Area.

Colwell, C., et al. (2012). Integrated Geophysical Exploration of a Known Geothermal Resource: Neal Hot Springs.

van Wijk, K., Channel, T., Viskupic, K., & Smith, M. L. (2013). Teaching Geophysics with a Vertical-Component Seismometer. 51, 552. doi:10.1119/1.4830072

Web resources:
http://www.iris.edu/hq/
http://www.codecademy.com/
http://docs.obspy.org/
https://github.com/obspy/obspy/wiki