M-OSRP's research objective is to identify and address outstanding seismic processing challenges. Their goal is to develop new processing methods that provide additional options when current methods are insufficient or fail. Their focus is on making technically challenging exploration and production areas more feasible. Seismic processing challenges often arise when methods' assumptions are violated, such as inadequate data or subsurface information. M-OSRP decides whether to develop new methods/approaches to satisfy assumptions, or new methods without those assumptions. Their current focus is on developing fundamentally new concepts to address the major challenge of removing multiples that interfere with primaries without damaging the primaries.
2018 National Tanks Conference & Exposition: HRSC Data VisualizationAntea Group
Two of our High-Resolution Site Characterization (HRSC) Data Visualization posters featured at the 2018 NTC Conference in Louisville, KY.
1. Using Data Management and 3-Dimensional Data Visualization to Generate More Complete Conceptual Site Models and Streamline Site Closure
2. High-Resolution Site Characterization (HRSC) and 3-Dimensional Data Visualization for a Fractured Rock Site: A Path to Streamlined Closure
This article provides an alternative perspective on full-waveform inversion (FWI) methods, arguing that current FWI approaches are indirect model-matching methods rather than direct inversion. The article aims to 1) caution against claims that FWI is a final solution, 2) propose a direct inverse approach, and 3) provide an overview comparing indirect FWI and direct inversion methods. Indirect FWI methods incorrectly use modeling equations run backwards rather than true direct inversion, limiting the understanding and effectiveness of the solutions.
Wavelet estimation for a multidimensional acoustic or elastic earthArthur Weglein
A new and general wave theoretical wavelet estimation
method is derived. Knowing the seismic wavelet
is important both for processing seismic data and for
modeling the seismic response. To obtain the wavelet,
both statistical (e.g., Wiener-Levinson) and deterministic
(matching surface seismic to well-log data) methods
are generally used. In the marine case, a far-field
signature is often obtained with a deep-towed hydrophone.
The statistical methods do not allow obtaining
the phase of the wavelet, whereas the deterministic
method obviously requires data from a well. The
deep-towed hydrophone requires that the water be
deep enough for the hydrophone to be in the far field
and in addition that the reflections from the water
bottom and structure do not corrupt the measured
wavelet. None of the methods address the source
array pattern, which is important for amplitude-versus-
offset (AVO) studies.
The document discusses (1) a new business model and IBM proposal for cost-effective seismic processing through cloud computing, (2) an invitation to participate in a pilot project using IBM cloud computing for multiple attenuation, (3) speedups of internal multiple attenuation achieved by ConocoPhillips, (4) upcoming deliverables from M-OSRP that address industry challenges and will benefit from IBM's cloud computing, and (5) an invitation to an upcoming technical review meeting.
The document discusses the inverse scattering series (ISS) approach for eliminating internal multiples in land seismic data. ISS methods do not require information about the subsurface and can predict the traveltimes and amplitudes of all internal multiples. The authors apply 1D and 1.5D ISS algorithms to synthetic and field data from Saudi Arabia and achieve encouraging results, successfully attenuating most internal multiples while preserving primaries. However, further work is needed to more accurately predict multiples to handle interference between primaries and multiples.
The inverse scattering series for tasks associated with primaries: direct non...Arthur Weglein
The inverse scattering series for tasks associated with primaries: direct non-linear inversion of 1D elastic media. In this paper, research on direct inversion for two pa-
rameter acoustic media (Zhang and Weglein, 2005) is
extended to the three parameter elastic case. We present
the first set of direct non-linear inversion equations for
1D elastic media (i.e., depth varying P-velocity, shear
velocity and density). The terms for moving mislocated
reflectors are shown to be separable from amplitude
correction terms. Although in principle this direct
inversion approach requires all four components of elastic
data, synthetic tests indicate that consistent value-added
results may be achieved given only ˆDPP measurements.
We can reasonably infer that further value would derive
from actually measuring ˆDPP , ˆD PS, ˆDSP and ˆDSS as
the method requires. The method is direct with neither
a model matching nor cost function minimization.
Internal multiple attenuation using inverse scattering: Results from prestack 1 & 2D acoustic and
elastic synthetics
R. T. Coates*, Schlumberger Cambridge Research, A. B. Weglein, Arco Exploration and Production Technology
Summary
The attenuation of internal multiples in a multidimensional
earth is an important and longstanding problem in exploration
seismics. In this paper we report the results of applying
an attenuation algorithm based on the inverse scattering
series to synthetic prestack data sets generated in on
and two dimensional earth models. The attenuation algorithm
requires no information about the subsurface structure
or the velocity field. However, detailed information about
the source wavelet is a prerequisite. An attractive feature of:
the attenuation algorithm is the preservation of the amplitude
(and phase) of primary events in the data; thus allowing for
subsequent AVO and other true amplitude processing.
2018 National Tanks Conference & Exposition: HRSC Data VisualizationAntea Group
Two of our High-Resolution Site Characterization (HRSC) Data Visualization posters featured at the 2018 NTC Conference in Louisville, KY.
1. Using Data Management and 3-Dimensional Data Visualization to Generate More Complete Conceptual Site Models and Streamline Site Closure
2. High-Resolution Site Characterization (HRSC) and 3-Dimensional Data Visualization for a Fractured Rock Site: A Path to Streamlined Closure
This article provides an alternative perspective on full-waveform inversion (FWI) methods, arguing that current FWI approaches are indirect model-matching methods rather than direct inversion. The article aims to 1) caution against claims that FWI is a final solution, 2) propose a direct inverse approach, and 3) provide an overview comparing indirect FWI and direct inversion methods. Indirect FWI methods incorrectly use modeling equations run backwards rather than true direct inversion, limiting the understanding and effectiveness of the solutions.
Wavelet estimation for a multidimensional acoustic or elastic earthArthur Weglein
A new and general wave theoretical wavelet estimation
method is derived. Knowing the seismic wavelet
is important both for processing seismic data and for
modeling the seismic response. To obtain the wavelet,
both statistical (e.g., Wiener-Levinson) and deterministic
(matching surface seismic to well-log data) methods
are generally used. In the marine case, a far-field
signature is often obtained with a deep-towed hydrophone.
The statistical methods do not allow obtaining
the phase of the wavelet, whereas the deterministic
method obviously requires data from a well. The
deep-towed hydrophone requires that the water be
deep enough for the hydrophone to be in the far field
and in addition that the reflections from the water
bottom and structure do not corrupt the measured
wavelet. None of the methods address the source
array pattern, which is important for amplitude-versus-
offset (AVO) studies.
The document discusses (1) a new business model and IBM proposal for cost-effective seismic processing through cloud computing, (2) an invitation to participate in a pilot project using IBM cloud computing for multiple attenuation, (3) speedups of internal multiple attenuation achieved by ConocoPhillips, (4) upcoming deliverables from M-OSRP that address industry challenges and will benefit from IBM's cloud computing, and (5) an invitation to an upcoming technical review meeting.
The document discusses the inverse scattering series (ISS) approach for eliminating internal multiples in land seismic data. ISS methods do not require information about the subsurface and can predict the traveltimes and amplitudes of all internal multiples. The authors apply 1D and 1.5D ISS algorithms to synthetic and field data from Saudi Arabia and achieve encouraging results, successfully attenuating most internal multiples while preserving primaries. However, further work is needed to more accurately predict multiples to handle interference between primaries and multiples.
The inverse scattering series for tasks associated with primaries: direct non...Arthur Weglein
The inverse scattering series for tasks associated with primaries: direct non-linear inversion of 1D elastic media. In this paper, research on direct inversion for two pa-
rameter acoustic media (Zhang and Weglein, 2005) is
extended to the three parameter elastic case. We present
the first set of direct non-linear inversion equations for
1D elastic media (i.e., depth varying P-velocity, shear
velocity and density). The terms for moving mislocated
reflectors are shown to be separable from amplitude
correction terms. Although in principle this direct
inversion approach requires all four components of elastic
data, synthetic tests indicate that consistent value-added
results may be achieved given only ˆDPP measurements.
We can reasonably infer that further value would derive
from actually measuring ˆDPP , ˆD PS, ˆDSP and ˆDSS as
the method requires. The method is direct with neither
a model matching nor cost function minimization.
Internal multiple attenuation using inverse scattering: Results from prestack 1 & 2D acoustic and
elastic synthetics
R. T. Coates*, Schlumberger Cambridge Research, A. B. Weglein, Arco Exploration and Production Technology
Summary
The attenuation of internal multiples in a multidimensional
earth is an important and longstanding problem in exploration
seismics. In this paper we report the results of applying
an attenuation algorithm based on the inverse scattering
series to synthetic prestack data sets generated in on
and two dimensional earth models. The attenuation algorithm
requires no information about the subsurface structure
or the velocity field. However, detailed information about
the source wavelet is a prerequisite. An attractive feature of:
the attenuation algorithm is the preservation of the amplitude
(and phase) of primary events in the data; thus allowing for
subsequent AVO and other true amplitude processing.
Inverse Scattering Series & Seismic Exploration - Topical Review by Arthur We...Arthur Weglein
Topical Review on the "Inverse Scattering Series & Seismic Exploration" - Seismic research by Arthur Weglein, Director M-OSRP & Professor Department of Physic University of Houston
This document provides an overview of multiple attenuation methods in seismic processing. It discusses two broad categories of methods: 1) filtering methods that exploit differences between primaries and multiples, and 2) prediction and subtraction methods that model wave propagation to predict and remove multiples. Specific filtering and prediction/subtraction techniques are examined, along with their assumptions, effectiveness, and limitations. The document also explores challenges in attenuating internal multiples and developments that could improve current methods.
The document describes the process of updating the 3D geological model and mineral resource estimate for a gold deposit in Mali using new drilling data. Key steps in the process include:
1) Refining the geological model using new drilling data to better capture the structural complexity and geology of the deposit.
2) Updating the grade estimates and spatial statistics based on the refined geological model to improve understanding of grade distribution.
3) Applying the updated geological model and grade data to re-evaluate the potential for additional oxide resources near the existing gold deposit, with the aim of extending the mine life.
Terra is a listed company, which was founded in 2005, although the science that Terra uses goes back more than 35 years. Terra has over 500 years of Geotechnical, Exploration, Development, and Production combined experience.
Terra is represented in Indonesia by: PT. Indonesia – International Energy/Exploration Solution Partners which is a registered Indonesian owned company.
Terra is a powerful exploration solution that significantly improves the exploration success rate, reduces time from years to months, and when incorporated into a project, reduces wasteful spending on expensive seismic and drilling into non-prospective areas.
The Terra technologies address such challenges as:
* Large area high-grading and low-grading;
* Lack of subsurface data;
* Complex geology;
* Stratigraphic plays where seismic is not effective
* Lack, absence or poor quality of seismic data;
* Detection of subsurface structures, identifying high quality prospects which merit further exploration work and in many cases the presence of hydrocarbons and
minerals;
In General, the Terra technologies are superior tools for:
* Generating prospects in under-explored areas;
* Remotely or noninvasively delineate hydrocarbon and mineral anomalies and structures;
The document discusses the past, present, and future of seismic analysis techniques including seismic stratigraphy and geomorphology. It notes that integrating seismic stratigraphy and geomorphology provides the most robust geological interpretations. It describes how 3D seismic data and advances in computing/visualization have improved analysis. The document predicts that machine learning techniques will increasingly integrate seismic data types and attributes to better identify hydrocarbon traps. Handling "big data" from multiple sources is a challenge, but integrating rock physics can improve seismic reservoir description.
EVALUATING TRANSFER LEARNING APPROACHES FOR IMAGE INFORMATION.pptxgrssieee
This document evaluates transfer learning approaches for image information mining applications. It discusses using transfer learning to help with tasks like disaster response, change detection, and land use/land cover classification when only small amounts of labeled training data are available for new domains or tasks. The methodology adapts a weighted least squares support vector machine approach to transfer knowledge between related classes. Experimental results on pre- and post-hurricane Landsat imagery show this approach provides better classification accuracy than traditional methods when labeling only a small number of samples for new target classes. More research is still needed to fully understand knowledge transferability and avoid negative transfer effects.
PetroSync - Advanced Seismic Data Acquisition and ProcessingPetroSync
This document provides information about an upcoming training course on advanced seismic data acquisition and processing. The 5-day course will cover key principles in data acquisition, survey design and execution, data preparation and pre-processing, advanced data processing techniques, and data interpretation. It will provide a holistic look at seismic acquisition and processing workflows and include case studies. The course aims to help participants implement optimal workflows, ensure quality control, and make business decisions regarding seismic surveys. It is intended for exploration geophysicists, interpreters, managers, and others involved in seismic data acquisition and processing.
ROBUST OPTIMIZATION FOR RCPSP UNDER UNCERTAINTYijseajournal
The aim of the present article is to optimize the robustness objective for the Resource-Constrained Project
scheduling Problem (RCPSP) dealing with activity duration uncertainty. The studied robustness consists in
minimizing the worst-case performance, referred to as the min-max robustness objective, among a set of
initial scenarios. We propose an enhanced GRASP approach as a solution to the given scenario-based
robust model. This approach is based on different priority rules in the construction phase and a forwardbackward
heuristic in the improvement phase. We investigate two different benchmark data sets, the
Patterson set and the PSPLIB J30 set. Experiments show that the proposed enhanced GRASP outperforms
the basic procedure, and a based-evolutionary algorithm, in robustness optimization.
Seismic Applications Throughout the Life of the Reservoir
(C) July 2002 Oilfield Review
Projects: Seismic Reservoir Characterizationusing avo inversion for reservoir characterization
Operators are getting more from their reservoirs by combining high-quality seismic
images with conventional reservoir data. Asset teams use this calibrated seismic
information to gain detailed knowledge of reservoir properties, allowing them to
reduce risk at every stage in the life of their prospects.
Trine Alsos
Alfhild Eide
Statoil
Trondheim, Norway
Donatella Astratti
Stephen Pickering
Gatwick, England
Marcelo Benabentos
Nader Dutta
Subhashis Mallick
George Schultz
Houston, Texas, USA
Lennert den Boer
Calgary, Alberta, Canada
Michael Livingstone
Aberdeen, Scotland
Michael Nickel
Lars Sønneland
Stavanger, Norway
Juergen Schlaf
Phillips Petroleum Company
Stavanger, Norway
Pascal Schoepfer
Petroleum Development Oman
Muscat, Sultanate of Oman
Mario Sigismondi
Juan Carlos Soldo
Pecom Energía de Pérez Companc SA
Neuquén, Argentina
Lars Kristian Strønen
Statoil
Bergen, Norway
For help in preparation of this article, thanks to Mike
Bahorich, Apache Corporation, Houston, Texas, USA; Lee
Bell, Laurence Darmon, Olav Holberg, John Waggoner and
Bob Will, Houston, Texas; Phil Christie, Cambridge, England;
Doug Evans, Malcolm Francis, Michael French, Bob
Godfrey, Kim Hughes and Stephen McHugo, Gatwick,
England; and Ray Pratt, Amerada Hess, Olso, Norway.
ECLIPSE, FrontSim, MultiWave Array and RFT (Repeat
Formation Tester) are marks of Schlumberger.
Nuclear emergency response and Big Data technologiesBigData_Europe
This document discusses using big data technologies to improve nuclear emergency response. It describes how real-time systems currently use deterministic modeling to simulate radiological situations and consequences of countermeasures. Ensemble modeling is proposed to better account for input uncertainties. Existing scenarios could be further analyzed and accessed through web tools to support decision making. Case-based reasoning is presented as an approach to integrate historical information and suggest emergency strategies. An analytical platform demonstrates retrieving similar historical cases and reusing or adapting their solutions to support response to new nuclear events.
1) Many African oil operators are focusing on improving recovery rates and extending the life of existing fields due to fluctuations in oil prices and declining drilling activity. Cost containment has also become a key focus.
2) 3D reservoir modeling is playing an important role in development decisions and mapping reservoir behavior to optimize field lifecycles and production. Challenges include seismic interpretation, rapid model updating, and strengthening reservoir modeling skills.
3) New technologies like model-driven interpretation and multi-realization workflows allow for faster, more accurate incorporation of data and uncertainties into integrated models, facilitating improved field development planning and production forecasts. This contributes to extending field life in Africa.
The document discusses a project called Space Evaders that aims to prevent collisions between spacecraft and debris in space. Their team is developing methods to analyze data and find ways to prevent debris-causing collisions, which could eventually make major orbital regions unusable. If collisions and debris continue to increase unchecked, it could lead to a dangerous proliferation of collisions known as a Kessler cascade. The team's goal is to use data-driven approaches to help avoid this scenario and keep space accessible for future use.
This document discusses the importance of data science and management. It covers how developing a data management plan can help researchers meet grant requirements, increase efficiency and visibility of research, and facilitate new discoveries. Interoperability of data through standards and vocabularies allows data to be discoverable and usable by others. Provenance tracking provides documentation of the research process to improve transparency and credibility. The emerging era of open science enables more collaborative practices through social media and altmetrics, which provide broader measures of research impact.
Dear Colleagues,
Call for papers for another Machine Learning special issue of SEG/AAPG Journal of Interpretation focusing on the Seismic Data Analysis has been announced.
We look forward to your contribution.
Vikram Jayaram
Special Section Editor
Interpretation
The document discusses how Landmark's DecisionSpace platform provides a unified system for analyzing unconventional oil and gas assets. It includes Dynamic Frameworks to Fill workflow technology which automatically updates a 3D subsurface model as new well and seismic data is acquired, allowing operators to more precisely target sweet spots. This helps operators plan drilling campaigns and well placements to increase reservoir contact and reduce costs.
This document discusses using particle swarm optimization (PSO) to design optimal close-range photogrammetry networks. PSO is introduced as a heuristic optimization algorithm inspired by bird flocking behavior that can be used to solve complex optimization problems. The document then provides an overview of close-range photogrammetry network design and the four design stages. It explains that PSO will be used to optimize the first stage of determining optimal camera station positions. Mathematical models of PSO for close-range photogrammetry network design are developed. Experimental tests are carried out to develop a PSO algorithm that can determine optimum camera positions and evaluate the accuracy of the developed network.
The Inverse Scattering Series (ISS) is a direct inversion method
for a multidimensional acoustic, elastic and anelastic earth. It
communicates that all inversion processing goals are able to
be achieved directly and without any subsurface information.
This task is reached through a task-specific subseries of the
ISS. Using primaries in the data as subevents of the first-order
internal multiples, the leading-order attenuator can predict the
time of all the first-order internal multiples and is able to attenuate
them.
However, the ISS internal multiple attenuation algorithm can
be a computationally demanding method specially in a complex
earth. By using an approach that is based on two angular
quantities and that was proposed in Terenghi et al. (2012), the
cost of the algorithm can be controlled. The idea is to use the
two angles as key-control parameters, by limiting their variation,
to disregard some calculated contributions of the algorithm
that are negligible. Moreover, the range of integration
can be chosen as a compromise of the required degree of accuracy
and the computational time saving.
This time-saving approach is presented
Inverse Scattering Series & Seismic Exploration - Topical Review by Arthur We...Arthur Weglein
Topical Review on the "Inverse Scattering Series & Seismic Exploration" - Seismic research by Arthur Weglein, Director M-OSRP & Professor Department of Physic University of Houston
This document provides an overview of multiple attenuation methods in seismic processing. It discusses two broad categories of methods: 1) filtering methods that exploit differences between primaries and multiples, and 2) prediction and subtraction methods that model wave propagation to predict and remove multiples. Specific filtering and prediction/subtraction techniques are examined, along with their assumptions, effectiveness, and limitations. The document also explores challenges in attenuating internal multiples and developments that could improve current methods.
The document describes the process of updating the 3D geological model and mineral resource estimate for a gold deposit in Mali using new drilling data. Key steps in the process include:
1) Refining the geological model using new drilling data to better capture the structural complexity and geology of the deposit.
2) Updating the grade estimates and spatial statistics based on the refined geological model to improve understanding of grade distribution.
3) Applying the updated geological model and grade data to re-evaluate the potential for additional oxide resources near the existing gold deposit, with the aim of extending the mine life.
Terra is a listed company, which was founded in 2005, although the science that Terra uses goes back more than 35 years. Terra has over 500 years of Geotechnical, Exploration, Development, and Production combined experience.
Terra is represented in Indonesia by: PT. Indonesia – International Energy/Exploration Solution Partners which is a registered Indonesian owned company.
Terra is a powerful exploration solution that significantly improves the exploration success rate, reduces time from years to months, and when incorporated into a project, reduces wasteful spending on expensive seismic and drilling into non-prospective areas.
The Terra technologies address such challenges as:
* Large area high-grading and low-grading;
* Lack of subsurface data;
* Complex geology;
* Stratigraphic plays where seismic is not effective
* Lack, absence or poor quality of seismic data;
* Detection of subsurface structures, identifying high quality prospects which merit further exploration work and in many cases the presence of hydrocarbons and
minerals;
In General, the Terra technologies are superior tools for:
* Generating prospects in under-explored areas;
* Remotely or noninvasively delineate hydrocarbon and mineral anomalies and structures;
The document discusses the past, present, and future of seismic analysis techniques including seismic stratigraphy and geomorphology. It notes that integrating seismic stratigraphy and geomorphology provides the most robust geological interpretations. It describes how 3D seismic data and advances in computing/visualization have improved analysis. The document predicts that machine learning techniques will increasingly integrate seismic data types and attributes to better identify hydrocarbon traps. Handling "big data" from multiple sources is a challenge, but integrating rock physics can improve seismic reservoir description.
EVALUATING TRANSFER LEARNING APPROACHES FOR IMAGE INFORMATION.pptxgrssieee
This document evaluates transfer learning approaches for image information mining applications. It discusses using transfer learning to help with tasks like disaster response, change detection, and land use/land cover classification when only small amounts of labeled training data are available for new domains or tasks. The methodology adapts a weighted least squares support vector machine approach to transfer knowledge between related classes. Experimental results on pre- and post-hurricane Landsat imagery show this approach provides better classification accuracy than traditional methods when labeling only a small number of samples for new target classes. More research is still needed to fully understand knowledge transferability and avoid negative transfer effects.
PetroSync - Advanced Seismic Data Acquisition and ProcessingPetroSync
This document provides information about an upcoming training course on advanced seismic data acquisition and processing. The 5-day course will cover key principles in data acquisition, survey design and execution, data preparation and pre-processing, advanced data processing techniques, and data interpretation. It will provide a holistic look at seismic acquisition and processing workflows and include case studies. The course aims to help participants implement optimal workflows, ensure quality control, and make business decisions regarding seismic surveys. It is intended for exploration geophysicists, interpreters, managers, and others involved in seismic data acquisition and processing.
ROBUST OPTIMIZATION FOR RCPSP UNDER UNCERTAINTYijseajournal
The aim of the present article is to optimize the robustness objective for the Resource-Constrained Project
scheduling Problem (RCPSP) dealing with activity duration uncertainty. The studied robustness consists in
minimizing the worst-case performance, referred to as the min-max robustness objective, among a set of
initial scenarios. We propose an enhanced GRASP approach as a solution to the given scenario-based
robust model. This approach is based on different priority rules in the construction phase and a forwardbackward
heuristic in the improvement phase. We investigate two different benchmark data sets, the
Patterson set and the PSPLIB J30 set. Experiments show that the proposed enhanced GRASP outperforms
the basic procedure, and a based-evolutionary algorithm, in robustness optimization.
Seismic Applications Throughout the Life of the Reservoir
(C) July 2002 Oilfield Review
Projects: Seismic Reservoir Characterizationusing avo inversion for reservoir characterization
Operators are getting more from their reservoirs by combining high-quality seismic
images with conventional reservoir data. Asset teams use this calibrated seismic
information to gain detailed knowledge of reservoir properties, allowing them to
reduce risk at every stage in the life of their prospects.
Trine Alsos
Alfhild Eide
Statoil
Trondheim, Norway
Donatella Astratti
Stephen Pickering
Gatwick, England
Marcelo Benabentos
Nader Dutta
Subhashis Mallick
George Schultz
Houston, Texas, USA
Lennert den Boer
Calgary, Alberta, Canada
Michael Livingstone
Aberdeen, Scotland
Michael Nickel
Lars Sønneland
Stavanger, Norway
Juergen Schlaf
Phillips Petroleum Company
Stavanger, Norway
Pascal Schoepfer
Petroleum Development Oman
Muscat, Sultanate of Oman
Mario Sigismondi
Juan Carlos Soldo
Pecom Energía de Pérez Companc SA
Neuquén, Argentina
Lars Kristian Strønen
Statoil
Bergen, Norway
For help in preparation of this article, thanks to Mike
Bahorich, Apache Corporation, Houston, Texas, USA; Lee
Bell, Laurence Darmon, Olav Holberg, John Waggoner and
Bob Will, Houston, Texas; Phil Christie, Cambridge, England;
Doug Evans, Malcolm Francis, Michael French, Bob
Godfrey, Kim Hughes and Stephen McHugo, Gatwick,
England; and Ray Pratt, Amerada Hess, Olso, Norway.
ECLIPSE, FrontSim, MultiWave Array and RFT (Repeat
Formation Tester) are marks of Schlumberger.
Nuclear emergency response and Big Data technologiesBigData_Europe
This document discusses using big data technologies to improve nuclear emergency response. It describes how real-time systems currently use deterministic modeling to simulate radiological situations and consequences of countermeasures. Ensemble modeling is proposed to better account for input uncertainties. Existing scenarios could be further analyzed and accessed through web tools to support decision making. Case-based reasoning is presented as an approach to integrate historical information and suggest emergency strategies. An analytical platform demonstrates retrieving similar historical cases and reusing or adapting their solutions to support response to new nuclear events.
1) Many African oil operators are focusing on improving recovery rates and extending the life of existing fields due to fluctuations in oil prices and declining drilling activity. Cost containment has also become a key focus.
2) 3D reservoir modeling is playing an important role in development decisions and mapping reservoir behavior to optimize field lifecycles and production. Challenges include seismic interpretation, rapid model updating, and strengthening reservoir modeling skills.
3) New technologies like model-driven interpretation and multi-realization workflows allow for faster, more accurate incorporation of data and uncertainties into integrated models, facilitating improved field development planning and production forecasts. This contributes to extending field life in Africa.
The document discusses a project called Space Evaders that aims to prevent collisions between spacecraft and debris in space. Their team is developing methods to analyze data and find ways to prevent debris-causing collisions, which could eventually make major orbital regions unusable. If collisions and debris continue to increase unchecked, it could lead to a dangerous proliferation of collisions known as a Kessler cascade. The team's goal is to use data-driven approaches to help avoid this scenario and keep space accessible for future use.
This document discusses the importance of data science and management. It covers how developing a data management plan can help researchers meet grant requirements, increase efficiency and visibility of research, and facilitate new discoveries. Interoperability of data through standards and vocabularies allows data to be discoverable and usable by others. Provenance tracking provides documentation of the research process to improve transparency and credibility. The emerging era of open science enables more collaborative practices through social media and altmetrics, which provide broader measures of research impact.
Dear Colleagues,
Call for papers for another Machine Learning special issue of SEG/AAPG Journal of Interpretation focusing on the Seismic Data Analysis has been announced.
We look forward to your contribution.
Vikram Jayaram
Special Section Editor
Interpretation
The document discusses how Landmark's DecisionSpace platform provides a unified system for analyzing unconventional oil and gas assets. It includes Dynamic Frameworks to Fill workflow technology which automatically updates a 3D subsurface model as new well and seismic data is acquired, allowing operators to more precisely target sweet spots. This helps operators plan drilling campaigns and well placements to increase reservoir contact and reduce costs.
This document discusses using particle swarm optimization (PSO) to design optimal close-range photogrammetry networks. PSO is introduced as a heuristic optimization algorithm inspired by bird flocking behavior that can be used to solve complex optimization problems. The document then provides an overview of close-range photogrammetry network design and the four design stages. It explains that PSO will be used to optimize the first stage of determining optimal camera station positions. Mathematical models of PSO for close-range photogrammetry network design are developed. Experimental tests are carried out to develop a PSO algorithm that can determine optimum camera positions and evaluate the accuracy of the developed network.
Similar to M-OSR Director Arthur Weglein - Annual report & introduction, 2014 (20)
The Inverse Scattering Series (ISS) is a direct inversion method
for a multidimensional acoustic, elastic and anelastic earth. It
communicates that all inversion processing goals are able to
be achieved directly and without any subsurface information.
This task is reached through a task-specific subseries of the
ISS. Using primaries in the data as subevents of the first-order
internal multiples, the leading-order attenuator can predict the
time of all the first-order internal multiples and is able to attenuate
them.
However, the ISS internal multiple attenuation algorithm can
be a computationally demanding method specially in a complex
earth. By using an approach that is based on two angular
quantities and that was proposed in Terenghi et al. (2012), the
cost of the algorithm can be controlled. The idea is to use the
two angles as key-control parameters, by limiting their variation,
to disregard some calculated contributions of the algorithm
that are negligible. Moreover, the range of integration
can be chosen as a compromise of the required degree of accuracy
and the computational time saving.
This time-saving approach is presented
Inverse scattering series for multiple attenuation: An example with surface a...Arthur Weglein
A multiple attenuation method derived from an inverse scattering
series is described. The inversion series approach allows a
separation of multiple attenuation subseries from the full series.
The surface multiple attenuation subseries was described and illustrated
in Carvalho et al. (1991, 1992). The internal multiple
attenuation method consists of selecting the parts of the odd
terms that are associated with removing only multiply reflected
energy. The method, for both types of multiples, is multidimensional
and does not rely on periodicity or differential moveout,
nor does it require a model of the reflectors generating the multiples.
An example with internal and surface multiples will be
presented.
In this paper we present a multidimensional method for attenuating internal multiples that derives from
an inverse scattering series . The method doesn't depend on periodicity or differential moveout, nor does it
require a model for the multiple generating reflectors.
Summary
Methods for removal of free-surface and internal multiples have been developed from bath a feedback model approach and inverse scatterin g theory. White these two formulations derive from different mathematica) viewpoints,
the resulting algorithm s for free-surface multiple are very similar. By contrast , the feedback and inverse scattering
method for internal multiple are totally different and have different requirements for sub surface information or
interpretive intervention . The former removes all multiple related to a certain boundary with the a of a surface
integral along this boundary ; the alter wilt predict and attenuate a ll internal multiple a t the same time . In this paper, we continue our comparison study of these internal multiple attenuation method ; specifically , we examine two
different realizations of the feedback method and the inverse scattering technique .
Internal multiple attenuation using inverse scattering: Results from prestack...Arthur Weglein
The attenuation of internal multiples in a multidimensional
earth is an important and longstanding problem in exploration
seismics. In this paper we report the results of applying
an attenuation algorithm based on the inverse scattering
series to synthetic prestack data sets generated in on
and two dimensional earth models. The attenuation algorithm
requires no information about the subsurface structure
or the velocity field. However, detailed information about
the source wavelet is a prerequisite. An attractive feature of:
the attenuation algorithm is the preservation of the amplitude
(and phase) of primary events in the data; thus allowing for
subsequent AVO and other true amplitude processing.
Wavelet estimation for a multidimensional acoustic or elastic earth- Arthur W...Arthur Weglein
A new and general wave theoretical wavelet estimation
method is derived. Knowing the seismic wavelet
is important both for processing seismic data and for
modeling the seismic response. To obtain the wavelet,
both statistical (e.g., Wiener-Levinson) and deterministic
(matching surface seismic to well-log data) methods
are generally used. In the marine case, a far-field
signature is often obtained with a deep-towed hydrophone.
The statistical methods do not allow obtaining
the phase of the wavelet, whereas the deterministic
method obviously requires data from a well. The
deep-towed hydrophone requires that the water be
deep enough for the hydrophone to be in the far field
and in addition that the reflections from the water
bottom and structure do not corrupt the measured
wavelet. None of the methods address the source
array pattern, which is important for amplitude-versus-
offset (AVO) studies
The document analyzes the reference velocity sensitivity of an elastic internal multiple attenuation algorithm. It first provides background on internal multiples and inverse scattering series methods for their attenuation. It then presents a 1.5D layered earth model and uses it to show that the elastic internal multiple algorithm can correctly predict arrival times without requiring an accurate reference velocity model, similarly to the acoustic case. The analysis involves deriving expressions for predicted internal multiples in the frequency-wavenumber domain and showing they depend only on the traveltimes of the primary reflections involved, not the reference velocities.
This document summarizes finite difference modeling methods used at M-OSRP. It discusses:
1) The second order time and fourth order space finite difference schemes used to model acoustic wave propagation.
2) How boundary conditions like Dirichlet/Neumann generate strong spurious reflections that can mask true events.
3) The importance of accurate source fields for modeling - better source fields lead to more accurate linear inversions and the ability to observe phenomena like polarity reversals in modeled data.
Inverse scattering series for multiple attenuation: An example with surface a...Arthur Weglein
A multiple attenuation method derived from an inverse scattering
series is described. The inversion series approach allows a
separation of multiple attenuation subseries from the full series.
The surface multiple attenuation subseries was described and illustrated
in Carvalho et al. (1991, 1992). The internal multiple
attenuation method consists of selecting the parts of the odd
terms that are associated with removing only multiply reflected
energy. The method, for both types of multiples, is multidimensional
and does not rely on periodicity or differential moveout,
nor does it require a model of the reflectors generating the multiples.
An example with internal and surface multiples will be
presented.
Green's Theorem Deghostin Algorthm- Dr. Arthur B. WegleinArthur Weglein
The document discusses the benefits of exercise for mental health. Regular physical activity can help reduce anxiety and depression and improve mood and cognitive functioning. Exercise causes chemical changes in the brain that may help protect against mental illness and improve symptoms.
The document discusses the benefits of exercise for mental health. Regular physical activity can help reduce anxiety and depression and improve mood and cognitive functioning. Exercise boosts blood flow and levels of neurotransmitters and endorphins which elevate and stabilize mood.
All of the perturbative approaches to multidimensional wave
equation processing. for example. wave equation migration (see,
e.g., Claerbout, 1971; French, 1975: Schneider, 1978; Stolt, 1978;
Sattlegger et al, 1980), or Born approximation inversion (see,
e.g., Cohen and Bleistein, 1979; Raz, 1981: Clayton and Stolt,
1981) require some input velocity information. In the Born approximation
to inversion, a reference or background velocity is
chosena nd a perturbationa boutt his velocity is determined.S imilarly,
a velocity model is a required input to all wave equation
migration techniques.
The Inverse Source Problem in The Presence of External Sources- Dr. Arthur B....Arthur Weglein
This paper presents a brief review of the various integral equation formuiations that have been employed
for the inverse source problem for the inhomogeneous scalar Heimhoitz equation. It is shown that these
formulations apply only in cases where either the data are prescribed on a closed surface surrounding the
unknown source or where the unknown source lies entirely on one side of an open measurement surface.
A generalized integral equation is derived that applies to the more general case where unknown sources
can exist on both sides of an open measurement surface. This latter problem arises in geophysical remote
sensing and the derived integral equation offers an approach to this class of problems not offered by
currently employed techniques.
Direct non-linear inversion of multi-parameter 1D elastic media using the inv...Arthur Weglein
In this paper, we present the first non-linear direct target identification method and algorithm
for 1D elastic media (P velocity, shear velocity and density vary in depth) from the inverse
scattering series. Direct non-linear means that we provide explicit formulas that: (1) input data
and directly output changes in material properties, without the use or need for any indirect procedures
such as model matching, searching, optimization or other assumed aligned objectives or
proxies, and (2) the algorithms recognize and directly invert the intrinsic non-linear relationship
between changes in material properties and changes in the concomitant wave-field. The results
clearly demonstrate that, in order to achieve full elastic inversion, all four components of data
(ˆD PP , ˆDPS, ˆD SP and ˆDSS) are needed. The method assumes that only data and reference
medium properties are input, and terms in the inverse series for moving mislocated reflectors
resulting from the linear inverse term, are separated from amplitude correction terms. Although
in principle this direct inversion approach requires all four components of elastic data, synthetic
tests indicate that a consistent value-added result may be achieved given only ˆDPP measurements,
as long as the ˆD PP were used to approximately synthesize the ˆD PS, ˆDSP and ˆD SS
components. We can reasonably infer that further value would derive from actually measuring
ˆD
PP , ˆDPS, ˆDSP and ˆD SS as the method requires. For the case that all four components of
data are available, we give one consistent method to solve for all of the second terms (the first
terms beyond linear). The method’s nonlinearity and directness provides this unambiguous data
requirement message, and that unique clarity, and the explicit non-linear formulas casts doubts
and reasonable concerns for indirect methods, in general, and their assumed aligned goals, e.g.,
using model matching objectives, that would never recognize the fundamental inadequacy from
a basic physics point of view of using only PP data to perform elastic inversion. There are important
conceptual and practical implications for the link between data acquisition and target
identification goals and objectives.
Linear inversion of absorptive/dispersive wave field measurements: theory and...Arthur Weglein
The use of inverse scattering theory for the inversion of viscoacoustic wave field
measurements, namely for a set of parameters that includes Q, is by its nature very
different from most current approaches for Q estimation. In particular, it involves an
analysis of the angle- and frequency-dependence of amplitudes of viscoacoustic data
events, rather than the measurement of temporal changes in the spectral nature of
events. We consider the linear inversion for these parameters theoretically and with
synthetic tests. The output is expected to be useful in two ways: (1) on its own it
provides an approximate distribution of Q with depth, and (2) higher order terms in
the inverse scattering series as it would be developed for the viscoacoustic case would
take the linear inverse as input.
We will begin, following Innanen (2003) by casting and manipulating the linear
inversion problem to deal with absorption for a problem with arbitrary variation of
wavespeed and Q in depth, given a single shot record as input. Having done this, we
will numerically and analytically develop a simplified instance of the 1D problem. This
simplified case will be instructive in a number of ways, first of all in demonstrating
that this type of direct inversion technique relies on reflectivity, and has no interest in
or ability to analyse propagation effects as a means to estimate Q. Secondly, through
a set of examples of slightly increasing complexity, we will demonstrate how and where
the linear approximation causes more than the usual levels of error. We show how
these errors may be mitigated through use of specific frequencies in the input data,
or, alternatively, through a layer-stripping based, or bootstrap, correction. In either
case the linear results are encouraging, and suggest the viscoacoustic inverse Born
approximation may have value as a standalone inversion procedure.
বাংলাদেশের অর্থনৈতিক সমীক্ষা ২০২৪ [Bangladesh Economic Review 2024 Bangla.pdf] কম্পিউটার , ট্যাব ও স্মার্ট ফোন ভার্সন সহ সম্পূর্ণ বাংলা ই-বুক বা pdf বই " সুচিপত্র ...বুকমার্ক মেনু 🔖 ও হাইপার লিংক মেনু 📝👆 যুক্ত ..
আমাদের সবার জন্য খুব খুব গুরুত্বপূর্ণ একটি বই ..বিসিএস, ব্যাংক, ইউনিভার্সিটি ভর্তি ও যে কোন প্রতিযোগিতা মূলক পরীক্ষার জন্য এর খুব ইম্পরট্যান্ট একটি বিষয় ...তাছাড়া বাংলাদেশের সাম্প্রতিক যে কোন ডাটা বা তথ্য এই বইতে পাবেন ...
তাই একজন নাগরিক হিসাবে এই তথ্য গুলো আপনার জানা প্রয়োজন ...।
বিসিএস ও ব্যাংক এর লিখিত পরীক্ষা ...+এছাড়া মাধ্যমিক ও উচ্চমাধ্যমিকের স্টুডেন্টদের জন্য অনেক কাজে আসবে ...
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPRAHUL
This Dissertation explores the particular circumstances of Mirzapur, a region located in the
core of India. Mirzapur, with its varied terrains and abundant biodiversity, offers an optimal
environment for investigating the changes in vegetation cover dynamics. Our study utilizes
advanced technologies such as GIS (Geographic Information Systems) and Remote sensing to
analyze the transformations that have taken place over the course of a decade.
The complex relationship between human activities and the environment has been the focus
of extensive research and worry. As the global community grapples with swift urbanization,
population expansion, and economic progress, the effects on natural ecosystems are becoming
more evident. A crucial element of this impact is the alteration of vegetation cover, which plays a
significant role in maintaining the ecological equilibrium of our planet.Land serves as the foundation for all human activities and provides the necessary materials for
these activities. As the most crucial natural resource, its utilization by humans results in different
'Land uses,' which are determined by both human activities and the physical characteristics of the
land.
The utilization of land is impacted by human needs and environmental factors. In countries
like India, rapid population growth and the emphasis on extensive resource exploitation can lead
to significant land degradation, adversely affecting the region's land cover.
Therefore, human intervention has significantly influenced land use patterns over many
centuries, evolving its structure over time and space. In the present era, these changes have
accelerated due to factors such as agriculture and urbanization. Information regarding land use and
cover is essential for various planning and management tasks related to the Earth's surface,
providing crucial environmental data for scientific, resource management, policy purposes, and
diverse human activities.
Accurate understanding of land use and cover is imperative for the development planning
of any area. Consequently, a wide range of professionals, including earth system scientists, land
and water managers, and urban planners, are interested in obtaining data on land use and cover
changes, conversion trends, and other related patterns. The spatial dimensions of land use and
cover support policymakers and scientists in making well-informed decisions, as alterations in
these patterns indicate shifts in economic and social conditions. Monitoring such changes with the
help of Advanced technologies like Remote Sensing and Geographic Information Systems is
crucial for coordinated efforts across different administrative levels. Advanced technologies like
Remote Sensing and Geographic Information Systems
9
Changes in vegetation cover refer to variations in the distribution, composition, and overall
structure of plant communities across different temporal and spatial scales. These changes can
occur natural.
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
Walmart Business+ and Spark Good for Nonprofits.pdfTechSoup
"Learn about all the ways Walmart supports nonprofit organizations.
You will hear from Liz Willett, the Head of Nonprofits, and hear about what Walmart is doing to help nonprofits, including Walmart Business and Spark Good. Walmart Business+ is a new offer for nonprofits that offers discounts and also streamlines nonprofits order and expense tracking, saving time and money.
The webinar may also give some examples on how nonprofits can best leverage Walmart Business+.
The event will cover the following::
Walmart Business + (https://business.walmart.com/plus) is a new shopping experience for nonprofits, schools, and local business customers that connects an exclusive online shopping experience to stores. Benefits include free delivery and shipping, a 'Spend Analytics” feature, special discounts, deals and tax-exempt shopping.
Special TechSoup offer for a free 180 days membership, and up to $150 in discounts on eligible orders.
Spark Good (walmart.com/sparkgood) is a charitable platform that enables nonprofits to receive donations directly from customers and associates.
Answers about how you can do more with Walmart!"
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
M-OSR Director Arthur Weglein - Annual report & introduction, 2014
1. Introduction
Arthur B. Weglein
May 21, 2014
M-OSRP’s research objective and goal: identify and address seismic processing
challenges
The research objective of M-OSRP is to identify and address outstanding, pressing and prioritized
seismic exploration challenges. M-OSRP develops and delivers step-change improved capability to
the seismic processing tool-box. That goal delivers new options and processing methods that will
be successful and effective when the methods and choices within the current tool-box can and will
have difficulty and/or fail. What we provide will not always be the appropriate choice within the
toolbox; but what we provide increases the number and types of options, and allows new capability,
when needed. That new set of options then facilitates and supports exploring and producing in
offshore and on-shore areas and types of plays that are currently too technically challenging, high
risk, and unreasonable as investments and to help make those currently difficult or precluded E&P
areas, plays and opportunities more reasonable, accessible and manageable.
What’s behind seismic processing challenges? Why do we drill dry holes; why
do we drill suboptimal development wells? Why do seismic processing methods
have problems and fail?
Seismic processing methods are effective and successful when their assumptions and prerequisites are
satisfied. When assumptions behind processing methods are not satisfied, these methods can fail.
That, in turn, contributes to dry hole drilling and sub-optimal development well drilling decisions.
There are different categories of assumptions, among them are: (1) acquiring the required/ade-
quate seismic data, and (2) required preprocessing, subsurface information, interpretive intervention,
needed for methods to be effective.
The M-OSRP strategy and plan
The industry trend to more complicated and challenging offshore and on-shore plays demands
more sophisticated and effective seismic processing and imaging methods, with more physically
1
2. Introduction M-OSRP13-14
complete and realistic descriptions and concepts. Historically, at every stage of seismic processing
and imaging evolution and advancement there has been a concomitant greater demand on providing
more detailed and accurate subsurface information (for example, the evolution of seismic migration
from post-stack migration, to pre-stack time migration, to pre-stack depth migration, to pre-stack
migration-inversion, to RTM for heterogeneous anisotropic media) where each step was, at once,
both more complete, effective and more demanding.
The confluence of seismic methods that make increasing demands for detailed and accurate subsur-
face information, together with the industry trend to regions and plays where providing that level
of subsurface information is, in general, increasingly difficult to satisfy, and therefore represents a
significant challenge at the present time and for the foreseeable future.
The inability to adequately provide that accurate and detailed subsurface information is a con-
tributing factor to seismic processing, imaging and inversion breakdown and failure and subsequent
dry hole drilling. There are two ways to address that type of challenge: (1) remove the assumption
violation by finding, e.g., new methods and approaches for satisfying the prerequisites, data acqui-
sition assumptions, and accurate velocity models and other subsurface information, or (2) remove
the assumption, by developing new methods that do not make that assumption.
Methods have assumptions and requirements: How does M-OSRP decide whether
to advance seismic capability by: (a) developing new methods/approaches to
provide what a current method requires or (b) developing fundamentally new
methods that do not have those requirements/assumptions?
M-OSRP adopts one or the other of these attitudes and approaches for different assumptions,
requirements, steps, and links in the seismic processing chain that’s under stress and duress and is
in need of attention and improvement.
We describe here the basic strategy that M-OSRP adopts in deciding between: (a) finding a better
way of satisfying a current method’s assumptions and requirements, and (b) developing new and
more effective methods that do not require those assumptions.
If the method requires, e.g., adequate data collected or a better description of the seismic experi-
ment (e.g., the source signature and radiation pattern and deghosted data) then find an approach
and method to satisfy those requirements. However, if the method requires detailed subsurface in-
formation, or interpretive intervention, to be effective and successful then develop a direct method
that doesn’t require that subsurface information.
Why the aversion to providing subsurface information?
As the petroleum industry trend world-wide moves to ever more complex and challenging plays,
the ability to provide subsurface information has become (and will continue to become) an increas-
ingly serious impediment to effectiveness. That fact, explains the reasonableness and (in fact) the
necessity of seeking new methods that can be more effective than current capability, and without
subsurface information.
2
3. Introduction M-OSRP13-14
A central purpose of M-OSRP has been, and remains, to provide a consistent and comprehensive
set of seismic processing methods, for every link in the processing chain, that are direct and do not
require any knowledge of subsurface information.
The tools
The methods M-OSRP develops to satisfy prerequisites, e.g., (reference wave prediction separation,
source signature and radiation pattern, and source and receiver deghosting) benefit from forms
of wave field separation methods derived from Green’s theorem (the extinction theorem). Recent
papers on this subject can be found in
http://mosrp.uh.edu/news/recent-published-papers-deghosting
Those preprocessing methods do not require subsurface information. All ISS subseries require these
preprocessing prerequisites (provided by variants of Green’s theorem) to achieve ISS processing
objectives.
The inverse scattering series communicates that all processing objectives are achievable directly and
without subsurface information. Isolated task ISS subseries are identified that directly and without
subsurface information achieve free surface and internal multiple removal, depth imaging, target
identification and Q compensation without Q.
The combination of Green’s theorem preprocessing and ISS processing provides every link in the
processing chain with methods that are direct, and do not require subsurface information.
M-OSRP and Multiples
The early history of multiple removal methods required at every step a more detailed and accurate
velocity model (for example, stacking, to FK filters, to Radon, and high resolution Radon or 1D
earth and statistical assumptions concerning primaries and multiples (Deconvolution)). A sea-
change in multiple removal capability arrived with methods that to one extent or another didn’t
require subsurface information or interpretive intervention. Methods from the Delphi consortium
depended on either subsurface information and/or interpretive intervention, whereas the inverse
scattering series (ISS) methods from M-OSRP did not. That is the reason that the ISS internal
multiple attenuator has assumed a mainstream industry status and recognition as the high water
mark of internal multiple capability. There is ample evidence to support the latter claim, including
the 2013 SEG (Thursday, September 26, 2013) Internal Multiple Workshop where 9 of the 10
presenters showed ISS internal multiple results as the state of the art available capability for that
seismic internal multiple goal and objective. ISS internal multiple attenuation has become fully
mainstream.
Challenge we face
However, we also recognize, report and emphasize that the current industry portfolio/trend and
focus today (and for the foreseeable future) makes it clear that there is a large and significant
3
4. Introduction M-OSRP13-14
gap between the current challenge for the removal of free surface and internal multiples, where the
specific issues are that: (1) the multiple generators and the subsurface properties are ill-defined and
complex and (2) the multiple can too often be proximal to or interfering with primaries. That type of
challenge of removing multiples proximal to, and/or overlapping with, primaries (without damaging
primaries) is well beyond the collective capability of the petroleum industry, service companies and
academic research groups and consortia—in their ability to effectively address that issue today.
There is a need for new basic concepts, and fundamental theory development that will first need to
take place, and following that delivery the practical application issues will need to be addressed.
The plan
At the 2013 SEG, we proposed and described a three pronged strategy (please see the link and slides
below) to address that challenge, that M-OSRP will pursue—with the potential to provide the step-
change increased and necessary capability. That strategy and plan responds to this outstanding,
pressing and prioritized challenge. That level and magnitude of challenge (and the potential open-
ing and delineation of new petroleum reserves and the scale of the opportunity that overcoming
it represents) resides behind our commitment to develop and deliver fundamental new concepts,
algorithms with step change increased capability, and has returned multiple removal (from its being
viewed as a relatively mature subject and project that helps “pay the rent”) back to center stage
as a major research project and focus within the Mission-Oriented Seismic Research Program. We
feel that our background and experience gives us a good chance to develop, and to deliver, the next
level of required capability.
Below please find links for the 2013 SEG abstracts/posters/presentations and slides that relate to
this communication.
http://mosrp.uh.edu/events/event-news/seg-annual-meeting-2013
http://mosrp.uh.edu/news/seg-annual-meeting-2013
In the table of contents, we have divided the contributions to this Annual Report into categories:
(1) (a) Prerequisite satisfaction and ISS multiple removal, (b) case studies
(2) Beyond internal multiple attenuation: (a) Removing artifacts/spurious events, (b) internal mul-
tiple elimination
(3) ISS depth imaging without the velocity model: update
(4) Analysis and tests of amplitude information at the image point from asymptotic and wave
equation migration: implications for RTM.
2014 SEG Expanded Abstracts that relate to these topics can be found in the link:
http://mosrp.uh.edu/research/publications/seg-abstracts-2014
What is asymptotic and wave equation migration amplitude analysis with a
velocity model doing in an M-OSRP report? What are our objectives, what
have we contributed and what are our plans?
While deriving preprocessing methods (described above), and one-way-wave Stolt pre-stack wave
equation migration (WEM) from Green’s theorem, for our class at UH in Seismic Physics, it was
4
5. Introduction M-OSRP13-14
natural to consider how to derive a wave equation migration for two way propagating waves (for
RTM) from Green’s theorem. That produced the first WEM RTM.
Please see the link
http://mosrp.uh.edu/news/wave-equation-migration-RTM
Wave equation migration (WEM) consists of predicting a source and receiver experiment at depth,
and then a coincident source and receiver experiment at time equals zero. There are two other
original and classic migration imaging conditions, (1) the space and time coincidence of up and
down going waves and (2) the exploding reflector model.
These three migration/imaging conditions are not equivalent, with the latter two representing
asymptotic (Kirchhoff) approximations of the WEM concept and algorithms. There are several
advantages to WEM compared to asymptotic forms of migration: (1) a definitive answer as to
whether a subsurface point is an image point; (2) the amplitude analysis at the image point, and
(3) the ubiquitous wave coverage and illumination of WEM compared to asymptotic ray based
methods. Those advantages and differences between asymptotic and WEM are present for both one
way and two-way (RTM) methods.
The first definitive examination and comparison of WEM and asymptotic (Kirchhoff) migration, for
one-way-waves, and their respective amplitude information and analysis at the target is reported in
this Annual Report. The asymptotic approximate image is useful for structure maps, but doesn’t
provide an angle dependent reflection coefficient. WEM provides both structure and amplitude
information at the image point.
Asymptotic (Kirchhoff) approximate migration doesn’t provide an approximate
experiment of a predicted coincident source and receiver at time equals zero at
an image point.
Asymptotic migration loses that definiteness and meaning of WEM at the image point and provides
instead a fixed travel-time curve with “candidate” image points. The physical meaning and ampli-
tude benefits of the WEM experiment at the image point are lost in asymptotic migration, whether
for one way waves or for RTM (all current RTM methods are asymptotic migration).
This has important implications for all current RTM methods, which are asymptotic migration,
and suggests and encourages a look at the first WEM RTM that M-OSRP has pioneered, and will
keep developing. There are also implications for those looking at increased illumination by imaging
primaries and multiples separately, since all current approaches to that problem use asymptotic
RTM methods, which are intrinsically illumination challenged to begin with. Once again, that’s an
opportunity for WEM RTM.
M-OSRP doesn’t plan to be involved long term in the RTM business, because
it’s a velocity dependent method.
Every year those pursuing RTM methods describe their need for a yet more complicated and accurate
set of heterogeneous and anisotropic velocities. We have no special concept, or idea (or interest) to
5
6. Introduction M-OSRP13-14
compete in that “provide the velocity” arena, where many capable and resourceful and hard-working
researchers now reside. Our goal is to examine and define the potential and the added-value of WEM
RTM compared to asymptotic RTM, in some simple but meaningful two dimensional examples, and
then encourage others to pursue the further development, where their experience and expertise will
expedite that continued development, application and delivery. That’s our current plan for RTM.
ISS depth imaging without the velocity model
In this Report, HOIS, a ISS direct depth imaging algorithm without a velocity model is applied to
the Marmousi model.
The results are very encouraging and the computer run time is 30% longer than a single Stolt
pre-stack water speed migration.
Our prediction for the future is that ISS depth imaging will enter the seismic imaging tool box with
stand-alone imaging capability in the next 5-10 years. Our first published field data test of ISS
imaging on the Kristin North Sea data set demonstrated concept and method viability.
The technical steps needed between viability and providing a new and more effective seismic imag-
ing option within the seismic toolbox are understood, and will be taken. ISS depth imaging will
in the future have the same mainstream role for depth imaging under the most challenging and
daunting subsurface complexity that ISS multiple removal plays today, and will be the new and
necessary option available for exactly those same daunting circumstances. Those who understand
the derivations and logic behind the ISS free surface and internal multiple attenuation methods,
will have no problem with the logic that leads to ISS depth imaging. The ISS multiple removal
methods and the ISS depth imaging and inversion algorithms derive from the same single set of ISS
equations.
Summary/Focus
M-OSRP is pleased to see the mainstream usage and application of the ISS internal multiple atten-
uator.
However, the current challenges frequently faced with on-shore plays and complex off-shore plays to
remove multiples of different orders, and, proximal to, or interfering with, primaries, and without
damaging primaries, raises the bar on needed multiple removal capability. That new level of multiple
removal challenge is currently beyond the collective capability of the petroleum industry to address.
M-OSRP has a three pronged strategy and plan to address that challenge and to develop and deliver
an effective response. That is currently (and near term will remain) our central and principle focus
and resource allocation. That new algorithmic strength and capability developed within M-OSRP
will have a commensurate higher level of compute demand. Our collaboration/cooperation/part-
nership with IBM connects the “what to compute” for more effectiveness from M-OSRP with new
visions of “how to compute” from IBM. That combination of “what” and “how” are essential for
providing a new, more effective and relevant capability to our sponsors.
6
7. Introduction M-OSRP13-14
We plan to continue our research in WEM RTM (with a velocity model) and ISS direct imaging
(without a velocity model).
For the near future, these imaging projects, will move forward, but with somewhat less resource
allocation in comparison to the serious and unfinished business of removing multiples.
There are several reasons for this relative emphasis of projects within M-OSRP: (1) there are
numerous regions in the world (e.g., the Middle East, Central North Sea, . . . ) where without more
capability and effectiveness in removing multiples, making advances in imaging (that depend on
multiples having been removed) are irrelevant or useless; and (2) there is often a resistance among
researchers in seismic imaging to recognize and acknowledge an imaging challenge as existing, beyond
the one that they are currently able to address.
M-OSRP has a portfolio of projects that takes ownership of all the links in the seismic processing
chain. Some of our projects are embryonic, whereas others are more “mature” and help “pay the rent”.
M-OSRP is very aware that some of the concepts we develop and methods we pursue, and messages
we communicate, are at variance with the conventional, the consensus view and mainstream concepts
and thinking. To provide fundamental new step change capability requires the ability to be resolute
and to flourish and to succeed in the face of considerable doubt and skepticism.
No real advance in science ever occurred without running into issues and obsta-
cles from the “consensus view”
It wasn’t too long ago that a statement like “multiples can be predicted directly and without
subsurface information” was greeted with eyes rolling and disbelief. Now that idea and methodology
have become mainstream. Today, saying it will be possible to “directly depth image without a
velocity model” generally elicits that same incredulous response.
We are enormously fortunate and deeply grateful for your encouragement and support. Your support
and confidence allows us the opportunity to perform fundamental directed “mission-oriented” seismic
research that will continue to deliver effective responses to pressing and prioritized seismic processing
challenges.
Arthur B. Weglein
May, 2014
7