This document provides an overview of gravity and seismic geophysical exploration methods. It begins with an introduction to gravity, its units of measurement, and the factors that cause gravity variations. It then discusses gravity data acquisition, processing steps such as tidal and elevation corrections to derive anomaly maps, and interpretation. For seismic exploration, it describes data acquisition using common midpoint gathers and factors such as fold, followed by processing steps such as normal moveout correction and stacking to improve signal-to-noise ratio and imaging resolution. It concludes with discussions of filtering, migration, and how these improve subsurface representations.
Definition
Geophysics is the application of the methods of physics to the study of the Earth. In another sense, it is a branch of natural science concerned with the physical processes and physical properties of the Earth and its surrounding space environment, and with the use of quantitative methods for their analysis. It involves the application of physical theories and measurements to discover the properties and processes of the Earth.
This presentation covers the gravity method, its anomalies, its reductions, and its applications. The gravity method is commonly used in geology, specifically in geophysics.
3. INTRODUCTION
WHAT IS GRAVITY?
standard gravity g = 9.80665 m/s² ≈ 9.81 m/s²
gravity unit, gu = micrometre per second squared, μm/s²
c.g.s. unit of gravity, the milligal:
1 mGal = 10⁻³ Gal = 10⁻³ cm s⁻² = 10 gu
4. Gravitational Acceleration
The gravity method is based on two laws derived by Sir Isaac Newton, which he described in Philosophiæ Naturalis Principia Mathematica (July 1687):
• Universal Law of Gravitation
• Second Law of Motion
Units
standard gravity g = 9.80665 m/s² ≈ 9.81 m/s²
gravity unit, gu = micrometre per second squared, μm/s²
c.g.s. unit of gravity, the milligal: 1 mGal = 10⁻³ Gal = 10⁻³ cm s⁻² = 10 gu
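As a quick sanity check on these unit relations, a minimal sketch (the constant and function names are illustrative, not from the slides):

```python
# Unit conversions for gravity measurements.
# 1 Gal = 1 cm/s^2; 1 mGal = 1e-3 Gal = 1e-5 m/s^2;
# 1 gu = 1 um/s^2 = 1e-6 m/s^2, so 1 mGal = 10 gu.

STANDARD_GRAVITY_MS2 = 9.80665  # standard gravity in m/s^2

def ms2_to_mgal(a_ms2):
    """Convert an acceleration in m/s^2 to milligals."""
    return a_ms2 / 1e-5

def mgal_to_gu(a_mgal):
    """Convert milligals to gravity units (um/s^2)."""
    return a_mgal * 10.0

print(ms2_to_mgal(STANDARD_GRAVITY_MS2))  # standard gravity ~ 980665 mGal
print(mgal_to_gu(1.0))                    # 1 mGal = 10 gu
```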
5. Gravity variations over Earth’s surface
Variations are due to:
(A) Variations in gravity with latitude
1. Shape of the Earth (gE < gP)
2. Variations in mass distribution (gE > gP)
3. Rotation of the Earth (gE < gP)
(B) Topography (elevation): the Earth is not smooth
(C) Heterogeneities within the Earth (local geology)
1. Local variations in mass distribution
2. Observable changes in surface g
3. The relevant material property is density
7. Gravity Exploration
Methodology
1. Planning of gravity survey.
2. Data Acquisition
a. Gravity measurement
b. Elevation determination
c. Establishment of base station
d. Profile layout
3. Data Processing (Reduction)
4. Interpretation
5. Gravity modeling
8. Gravity Data Processing (Reduction)
• What do you do with the gravity data you have collected?
• A series of processing steps removes unwanted signals from the data (i.e., gravity variations that are not caused by local density variations):
1. Temporal corrections
a. Tidal effects
b. Instrument drift
2. Spatial corrections
a. Latitude
b. Elevation
9. 1. Tidal effects – the gravitational pull of the Sun and Moon causes bulges in the Earth’s surface: ocean tides (water) and Earth tides (land). These effects are well understood and can be accurately corrected.
2. Instrument drift – due to temperature-induced changes in the spring and temporal changes in the elastic properties of the spring.
10. Latitude correction
• The theoretical value of gravity as a function of latitude θ is given by the Geodetic Reference System 1967 (GRS67) equation (in mGal):
g(θ) = 978031.846 (1 + 0.0053024 sin²θ − 0.0000058 sin²2θ)
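The GRS67 formula can be evaluated directly; the function name below is illustrative:

```python
import math

def grs67_gravity_mgal(lat_deg):
    """Theoretical (normal) gravity in mGal from the GRS67 formula."""
    t = math.radians(lat_deg)
    return 978031.846 * (1 + 0.0053024 * math.sin(t)**2
                           - 0.0000058 * math.sin(2 * t)**2)

# Gravity increases from equator to pole:
print(grs67_gravity_mgal(0.0))    # 978031.846 mGal at the equator
print(grs67_gravity_mgal(90.0))   # ~983217.8 mGal at the poles
```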
11. Elevation corrections
• g depends on the distance from the centre of the Earth (r):
Free air correction: F.A.C. = 0.3086 Δh (mGal, with Δh in metres)
Bouguer correction: B.C. = 0.00004193 ρ Δh (mGal, with ρ in kg/m³ and Δh in metres)
Summary of elevation corrections:
Site elevation            F.A.C.     B.C.
Below reference level     Subtract   Add
Above reference level     Add        Subtract
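A minimal sketch of both elevation corrections, assuming Δh in metres and ρ in kg/m³; the 2670 kg/m³ default is a common crustal-density assumption, not a value stated on the slides:

```python
# Elevation corrections from the slide formulas.
# FAC = 0.3086 * dh (mGal); BC = 0.00004193 * rho * dh (mGal).
# For a station ABOVE the reference level the FAC is added and the
# BC subtracted; below the reference level the signs flip.

def free_air_correction(dh_m):
    """Free air correction in mGal for an elevation difference dh_m (m)."""
    return 0.3086 * dh_m

def bouguer_correction(dh_m, rho_kg_m3=2670.0):
    """Bouguer correction in mGal; 2670 kg/m^3 is an assumed crustal density."""
    return 0.00004193 * rho_kg_m3 * dh_m

dh = 100.0                          # station 100 m above the reference level
fac = free_air_correction(dh)       # 30.86 mGal, added
bc = bouguer_correction(dh)         # ~11.2 mGal, subtracted
print(fac, bc)
```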
12. Free Air Anomaly Determination
• The free air anomaly is the difference between observed gravity and theoretical gravity after applying the necessary corrections to observed gravity:
F.A.A. = (g_obs ± D.C. ± F.A.C.) − theoretical value of g
Bouguer Anomaly Determination
• The Bouguer anomaly is the difference between observed gravity and theoretical gravity after applying the necessary corrections to observed gravity:
B.A. = (g_obs ± D.C. ± F.A.C. ± B.C. ± T.C.) − theoretical value of g
(D.C. = drift correction, F.A.C. = free air correction, B.C. = Bouguer correction, T.C. = terrain correction)
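The anomaly formulas can be sketched as code; the station values below are hypothetical, and the drift and terrain corrections are treated as precomputed inputs:

```python
# Free-air and Bouguer anomalies (all quantities in mGal).
# Signs assume a station above the reference level: FAC added, BC subtracted.

def free_air_anomaly(g_obs, drift_corr, fac, g_theoretical):
    return (g_obs + drift_corr + fac) - g_theoretical

def bouguer_anomaly(g_obs, drift_corr, fac, bc, terrain_corr, g_theoretical):
    return (g_obs + drift_corr + fac - bc + terrain_corr) - g_theoretical

# Hypothetical station values (mGal):
g_obs, drift, fac, bc, tc, g_th = 978050.0, 0.2, 30.86, 11.20, 0.5, 978040.0
print(free_air_anomaly(g_obs, drift, fac, g_th))         # 41.06 mGal
print(bouguer_anomaly(g_obs, drift, fac, bc, tc, g_th))  # 30.36 mGal
```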
15. Seismic Exploration
1. Planning of the seismic survey
2. Data Acquisition
Seismic data are recorded in the field:
a. on magnetic tape
b. on cartridge
c. nowadays, on disk or optical media (CDs)
3. Data Processing (Reduction)
4. Interpretation
5. Seismic modeling
16. Seismic Acquisition
Acquisition involves many different receiver configurations, including laying geophones or seismometers on the surface of the Earth or seafloor, towing hydrophones behind a marine seismic vessel, suspending hydrophones vertically in the sea, or placing geophones in a wellbore (as in a vertical seismic profile) to record the seismic signal.
A source, such as a vibrator unit, dynamite shot, or an air gun, generates acoustic or elastic vibrations that travel into the Earth, pass through strata with different seismic responses and filtering effects, and return to the surface to be recorded as seismic data.
32. Common Midpoint Method (CMP Method)
[Figure: midpoints sampled by successive shots and receivers along a line]
Fold (or multiplicity) is the number of times that the same midpoint is sampled by different shots and different receivers; equivalently, it is the number of reflections recorded from one common depth point.
33. Common Midpoint Method (CMP Method)
[Figure: fold build-up along the line]
Maximum fold is achieved after the 6th shot.
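The midpoint counting described above can be simulated: each shot-receiver pair contributes a trace at midpoint (shot + receiver)/2, and the fold at a midpoint is how many pairs share it. The roll-along geometry below (shots every 2 stations, a 12-channel off-end spread) is hypothetical, not the one in the slide figure:

```python
from collections import Counter

# Count CMP fold for a 2D roll-along line. Station positions are integers;
# each shot records into a fixed off-end spread of receivers.

def cmp_fold(shots, receivers_for_shot):
    fold = Counter()
    for s in shots:
        for r in receivers_for_shot(s):
            fold[(s + r) / 2] += 1  # midpoint of the shot-receiver pair
    return fold

shots = range(0, 40, 2)  # shots every 2 stations
fold = cmp_fold(shots, lambda s: range(s + 2, s + 14))  # 12 receivers off-end
print(max(fold.values()))  # maximum fold reached in the middle of the line
```

With 12 channels, unit station/group spacing, and shot spacing 2, this agrees with the nominal-fold formula on the next slide: F = (1/2)(12)(1)(1/2) = 3.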
34. Fold
The fold of a recording geometry is determined by:
• the number of recorded channels
• the number of station intervals between geophone groups and source points
• the nominal fold and stacking density of the resulting CDP section
The nominal fold of a given geometry can be derived from the equation:
F = (1/2) · C · (Δx/Δg) · (Δx/Δs)
where F = fold
C = number of recording channels
Δx = spacing between stations
Δg = spacing between geophone groups
Δs = spacing between source points
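The nominal-fold equation can be evaluated directly; the 48-channel, 25 m station / 50 m shot geometry below is only an example:

```python
# Nominal fold from the slide's formula: F = (1/2) * C * (dx/dg) * (dx/ds).

def nominal_fold(channels, dx, dg, ds):
    """channels: recording channels; dx: station spacing;
    dg: geophone-group spacing; ds: source-point spacing."""
    return 0.5 * channels * (dx / dg) * (dx / ds)

# Example: 48 channels, stations and groups every 25 m, shots every 50 m
print(nominal_fold(48, dx=25.0, dg=25.0, ds=50.0))  # 12.0 (12-fold coverage)
```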
35. Seismic Data Processing
Processing seismic data consists of applying a sequence of computer programs, each designed to achieve one step along the path from field tape to record section. Ten to twenty programs are usually used in a processing sequence, chosen carefully from a library of several hundred programs. In each processing software library there may be several programs designed to produce the same effect by different approaches. For example, one program may operate in the time domain while another works in the frequency domain, yet they may yield similar results.
36. Objectives
• To improve the signal-to-noise ratio: isolate the wanted signals, with reflections separated from multiples and surface waves
• To obtain higher resolution by adapting the waveform of the signals
• To obtain a realistic image by geometrical correction
• To obtain information about the subsurface: a result close to a geological section
37. Geometric Corrections
A seismic trace on a field monitor shows reflected energy bursts from subsurface rock-layer interfaces. We measure the travel times from the source down to a reflector and back to a geophone, and use them together with average velocity to compute depths to the various reflectors. However, before we use these reflected energy bursts and their travel times, we must apply several corrections to compensate for geometric effects. These corrections include:
1. Static corrections
2. Dynamic corrections
38. Static Correction
Often called statics: a bulk shift of a seismic trace in time during seismic processing. Removing near-surface effects requires two corrections:
1. A weathering correction
2. An elevation correction
A common static correction is the weathering correction, which compensates for a layer of low-seismic-velocity material near the surface of the Earth. Other corrections compensate for differences in topography and differences in the elevations of sources and receivers (datum correction or elevation correction).
39. Weathering Correction
A weathering correction replaces the actual travel time through the weathered layer with a computed travel time. It is a method of compensating for delays in seismic reflection or refraction times induced by low-velocity layers, such as the weathered layer near the Earth's surface. The weathered layer varies in thickness; if its thickness is not known, it can be determined by one or more uphole surveys.
41. • By definition, the weathering correction for the emerging wave path in the figure is:
WC = −(dw/Vw) + (dw/Vc)
or
WC = −dw (Vc − Vw) / (Vw Vc)
Since Vc > Vw, the weathering correction is always negative.
42. Vc is known as the correction velocity. Vc may be:
• known from previous experience,
• measured by uphole surveys, or
• determined from first arrivals refracted along the base of the weathered layer.
The normal routine is to compute a weathering correction at every shot hole and to estimate corrections for other stations by interpolation.
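A minimal sketch of the weathering correction WC = −dw (Vc − Vw)/(Vw Vc) from the previous slide; the layer thickness and velocities below are hypothetical:

```python
# Weathering (static) correction.
# dw: thickness of the weathered layer (m); Vw: its (low) velocity (m/s);
# Vc: the correction velocity of the consolidated layer below (m/s).

def weathering_correction(dw, vw, vc):
    """Static correction in seconds; negative whenever vc > vw."""
    return -dw * (vc - vw) / (vw * vc)

# Hypothetical values: 10 m of weathered material at 500 m/s over 2000 m/s rock
wc = weathering_correction(10.0, 500.0, 2000.0)
print(wc)  # -0.015 s: the trace is shifted earlier by 15 ms
```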
46. Dynamic Correction
One of the steps of processing the data is to rearrange the traces into CDP gathers (see figure). The traces from different records that correspond to the same depth-point location are collected together into a single record. The traces are normally arranged within this gather record in order of increasing offset distance. The reflected signals from a single horizontal interface then align along a hyperbola, as shown in the figure.
48. Normal Moveout (NMO)
The term normal moveout (NMO) means the variation in reflection arrival time with offset distance from source to receiver. Before stacking, the traces must be shifted back into alignment by the NMO correction. A reflection typically arrives first at the receiver nearest the source; the offset between the source and the other receivers induces a delay in the arrival time of a reflection from a horizontal surface at depth. A plot of arrival times versus offset has a hyperbolic shape. The moveout correction is a time correction applied at each offset.
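For a horizontal reflector the hyperbola described above is commonly written t(x) = sqrt(t0² + x²/V²), where t0 is the zero-offset time and V the velocity; a sketch with hypothetical values:

```python
import math

# NMO for a horizontal reflector: two-way travel time at offset x is
# t(x) = sqrt(t0^2 + (x/V)^2), and the moveout correction removes
# t(x) - t0 from each offset so all traces align at t0.

def travel_time(t0, offset, v):
    return math.sqrt(t0**2 + (offset / v)**2)

def nmo_correction(t0, offset, v):
    return travel_time(t0, offset, v) - t0

t0, v = 1.0, 2000.0  # 1 s zero-offset time, 2000 m/s velocity (hypothetical)
for x in (0.0, 500.0, 1000.0):
    print(x, nmo_correction(t0, x, v))
# the correction is zero at zero offset and grows with offset
```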
50. Stacking
The common reflecting point on a reflector, or the halfway point when a wave travels from a source to a reflector to a receiver, is shared by numerous source-receiver locations. Moveout corrections and stacking, or summing of traces, exploit this redundancy in the data to improve the signal-to-noise ratio.
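Stacking can be illustrated with synthetic traces: a shared reflection "event" plus independent Gaussian noise on each trace, averaged across the gather (all numbers below are hypothetical):

```python
import random

# Traces that share a midpoint carry the same reflection signal plus
# independent noise; averaging N of them keeps the signal and reduces
# the noise, improving signal-to-noise ratio (roughly by sqrt(N)).

def stack(traces):
    """Average a list of equal-length traces sample by sample."""
    n = len(traces)
    return [sum(samples) / n for samples in zip(*traces)]

random.seed(0)
signal = [0.0] * 50 + [1.0] * 5 + [0.0] * 45        # one reflection "event"
traces = [[s + random.gauss(0, 0.5) for s in signal]  # 24-fold gather
          for _ in range(24)]
stacked = stack(traces)
# the event near sample 50 stands out in `stacked` far more clearly
# than in any single noisy trace
```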
52. Filtering of Seismic Data
Seismic data in general contain noise signals along with seismic reflection signals. These noise signals interfere with the interpretation of the seismic signals and degrade the quality of the subsurface images that can be obtained by further processing. It is therefore very desirable to suppress the noise present in the recorded data before processing it for imaging.
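As a toy illustration of noise suppression (not a production seismic filter, which would use band-pass, f-k, or deconvolution filters), a simple moving average acting as a low-pass filter:

```python
# A moving average attenuates high-frequency noise while keeping the
# smoother, lower-frequency reflection signal.

def moving_average(trace, width=5):
    """Smooth a trace with a centred moving-average window."""
    half = width // 2
    out = []
    for i in range(len(trace)):
        window = trace[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

# Synthetic trace: alternating +/-0.3 "noise" plus a reflection event
noisy = [(-1) ** i * 0.3 + (1.0 if 20 <= i < 25 else 0.0) for i in range(60)]
smooth = moving_average(noisy)
# the alternating noise is strongly attenuated, while the reflection
# event between samples 20 and 25 survives
```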
54. Migration
Seismic migration is the process by which seismic events are geometrically re-located, in either space or time, to the location at which the event occurred in the subsurface rather than the location at which it was recorded at the surface, thereby creating a more accurate image of the subsurface. This process is necessary to overcome the limitations of geophysical methods imposed by areas of complex geology, such as faults, salt bodies, and folding.