1. The document discusses various methods for building velocity models in Petrel software using well and seismic data, including average velocity, layer cake, and anisotropic approaches.
2. It provides steps for incorporating well velocity data, seismic stacking velocities, and time-depth curves to create interval velocity surfaces. Quality control steps are described for horizon and fault interpretation before velocity modeling.
3. Examples are given of different velocity modeling methods in Petrel and their applications, including calibrated co-kriging, trend modeling, and layer cake approaches. Metrics for evaluating velocity-model accuracy, such as well-tie errors, are also discussed.
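The conversion from seismic stacking velocities to interval velocities mentioned in these summaries is usually done with the Dix equation. A minimal sketch, using made-up velocity picks and not Petrel's actual implementation:

```python
import math

def dix_interval_velocity(t, v_rms):
    """Dix equation: interval velocity between picks i-1 and i from
    RMS (stacking) velocities v_rms (m/s) at two-way times t (s)."""
    v_int = []
    for i in range(1, len(t)):
        num = v_rms[i] ** 2 * t[i] - v_rms[i - 1] ** 2 * t[i - 1]
        v_int.append(math.sqrt(num / (t[i] - t[i - 1])))
    return v_int

# Hypothetical stacking-velocity picks at three horizons
times = [0.5, 1.0, 1.5]             # two-way time, s
v_stack = [2000.0, 2250.0, 2450.0]  # m/s
print(dix_interval_velocity(times, v_stack))
```

Note that interval velocities come out higher than the RMS picks that bracket them whenever velocity increases with depth.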
This document provides an overview of static reservoir modeling. It discusses grid definition and selection, structural modeling using seismic data interpretation, stratigraphic modeling using well log correlation, lithological modeling to define facies distributions, and petrophysical modeling to estimate properties. Structural modeling involves identifying tops, interpreting faults manually or automatically using ant tracking. Stratigraphic modeling uses sequence stratigraphy and well log correlation. Lithological modeling integrates sedimentology, facies classification, and 3D distribution. Petrophysical modeling can be deterministic or stochastic to interpolate properties between wells.
Seismic interpretation involves correlating seismic data features with geological elements to understand the subsurface. The goal is to map reservoirs, including their depth, thickness, and properties. This involves data processing, well calibration, horizon and fault tracking, and attribute analysis. Direct hydrocarbon indicators on seismic data can help identify potential reservoirs, but they require validation with amplitude-versus-offset analysis because of their limitations and the need for a supporting geological model.
This document outlines a simple seismic data processing workflow. It begins with acquiring field data and updating the geometry. Next steps include trace editing, amplitude recovery, and noise attenuation. Velocity analysis and normal moveout correction are then applied. Deconvolution and multiple attenuation are performed before migration. Post-migration involves stacking, filtering and amplitude scaling to produce the final processed seismic section. The goal of seismic processing is to produce high quality seismic data for geological interpretation and hydrocarbon exploration.
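The normal moveout (NMO) correction step in the workflow above flattens reflection hyperbolas in a common-midpoint gather using t(x) = sqrt(t0^2 + x^2/v^2). A minimal sketch with a single constant NMO velocity (real processing uses a picked velocity field and stretch muting):

```python
import numpy as np

def nmo_correct(gather, offsets, dt, v_nmo):
    """Flatten a CMP gather (samples x traces) with the hyperbolic
    NMO equation t(x) = sqrt(t0^2 + x^2 / v^2), trace by trace,
    using linear interpolation along each trace."""
    n_samp, _ = gather.shape
    t0 = np.arange(n_samp) * dt
    out = np.zeros_like(gather)
    for j, x in enumerate(offsets):
        tx = np.sqrt(t0 ** 2 + (x / v_nmo) ** 2)   # time to read from
        out[:, j] = np.interp(tx, t0, gather[:, j], left=0.0, right=0.0)
    return out
```

After correction, a reflection recorded at increasing times on far offsets lines up at its zero-offset time t0, ready for stacking.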
Role of Seismic Attributes in Petroleum Exploration_30May22.pptx
The document discusses seismic attributes which are measurable properties of seismic data computed through mathematical manipulation to highlight geological features. It describes how seismic waves are reflected and refracted and how this seismic response is recorded. The key types of seismic attributes discussed are amplitude, phase, frequency and complex trace attributes. Specific amplitude attributes like RMS amplitude and sweetness are explained. The document also covers applications of seismic attributes like direct hydrocarbon indication and limitations. Spectral decomposition and AVO/AVA analysis are also summarized.
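The RMS amplitude attribute named above is simply the root-mean-square of trace samples in a sliding window. A sketch for an odd window length (sweetness, also mentioned, would further divide this by the square root of instantaneous frequency):

```python
import numpy as np

def rms_amplitude(trace, win):
    """RMS amplitude in a centered sliding window of `win` samples
    (win must be odd); a common post-stack amplitude attribute."""
    pad = win // 2
    sq = np.pad(np.asarray(trace, dtype=float) ** 2, pad, mode="edge")
    kernel = np.ones(win) / win            # moving average of squares
    return np.sqrt(np.convolve(sq, kernel, mode="valid"))
```

High RMS amplitude over a reservoir interval is one of the classic screening attributes for bright spots.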
1) Seismic interpretation uses acoustic waves to image the subsurface by measuring the two-way travel time and amplitude of reflections. 2) A seismic source generates wavefronts that travel through the subsurface, reflecting or transmitting at interfaces between rock layers. 3) The amount of reflection depends on the relative difference in physical properties across interfaces, defined by reflection coefficients. Layers thinner than 1/4 the wavelength cannot be resolved individually.
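The reflection coefficient and the quarter-wavelength resolution limit in point 3 can be put into numbers. A sketch with hypothetical rock properties:

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient from acoustic
    impedances Z = rho * v: R = (Z2 - Z1) / (Z2 + Z1)."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

def tuning_thickness(velocity, dominant_freq):
    """Quarter-wavelength limit below which a layer's top and base
    reflections merge: lambda / 4 = v / (4 * f)."""
    return velocity / (4.0 * dominant_freq)

# Hypothetical shale over gas sand: a strong negative reflection
rc = reflection_coefficient(2.40, 2400.0, 2.10, 2000.0)
thk = tuning_thickness(2000.0, 30.0)   # resolution limit at 30 Hz
```

With these example numbers the coefficient is about -0.16 and the thinnest resolvable layer about 17 m.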
Seismic attributes are additional data obtained from seismic data that provide information beyond just structure. They are generated to enhance structural and stratigraphic features, locate misinterpretations, and provide information on lithology, facies or fluid content for reservoir characterization. Seismic attributes can be classified as physical attributes derived from pre-stack or post-stack data, geometric attributes like dip and curvature, or instantaneous attributes like frequency and phase. Case studies demonstrate how attributes are extracted and analyzed from 3D seismic volumes.
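The instantaneous attributes mentioned above come from the analytic (complex) trace, built with the Hilbert transform. A minimal sketch:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_attributes(trace, dt):
    """Envelope, phase, and frequency from the analytic signal,
    the core of complex-trace attribute analysis."""
    analytic = hilbert(trace)                      # trace + i * Hilbert(trace)
    envelope = np.abs(analytic)                    # reflection strength
    phase = np.unwrap(np.angle(analytic))          # instantaneous phase, rad
    freq = np.gradient(phase) / (2.0 * np.pi * dt) # instantaneous frequency, Hz
    return envelope, phase, freq
```

For a pure 25 Hz cosine the envelope is near 1 and the instantaneous frequency near 25 Hz away from the edges, which is a handy sanity check.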
The document describes seismic interpretation workflows, including conventional and unconventional techniques. Conventional techniques involve horizon interpretations, fault picking, and tying seismic data to well logs to understand subsurface geology. Unconventional techniques analyze seismic attribute variations like amplitudes to identify hydrocarbon indicators. The workflow includes generating synthetics from well logs, interpreting horizons on seismic sections, identifying structures like faults and gas chimneys, and determining direct hydrocarbon indicators.
3D Facies Modelling project using Petrel software. Msc Geology and Geophysics
Abstract
The Montserrat and Sant Llorenç del Munt fan-delta complexes developed during the Eocene in the Ebro basin. The depositional stratigraphic record of these fan deltas has been described as several transgressive and regressive composite sequences, each made up of several fundamental sequences. Each sequence set is in turn composed of five main facies belts: proximal alluvial fan, distal alluvial fan, delta front, carbonate platforms and prodelta.
Using outcrop data from three composite sequences (Sant Vicenç, Vilomara and Manresa), a 3D facies model was built. The key sequential traces of the studied area, georeferenced and digitized onto photorealistic terrain models, were the hard data used as input to reconstruct the main surfaces separating transgressive and regressive stacking patterns. Facies modelling was achieved using a geostatistical algorithm to define the stacking trend and the interfingering of adjacent facies belts, together with five paleogeographic maps to reproduce the paleogeometry of the facies belts within each systems tract.
The final model was checked against a real cross section and analysed to obtain information about the delta-front facies, which are the ones most likely to serve as reservoir analogues. According to the results, which include eight probability-of-occurrence maps, the transgressive sequence set of Vilomara contains the greatest accumulation of these facies, explained by its aggradational component.
Introduction Petrel Course (UAB-2014)
This course has been prepared as an introduction to Petrel software (Schlumberger, www.software.slb.com/products/platform/Pages/petrel.aspx), an application that allows the modeling and visualization of reservoirs from the exploration stage through production, integrating geological and geophysical data, geological modeling (structural and stratigraphic frameworks), well planning, and property modeling (petrophysical or petrological), among other possibilities.
The course focuses mainly on understanding and using workflows aimed at building geological models based on surface data (at the outcrop scale) but also on seismic data. The course contents are subdivided into 5 modules, each developed through a combination of short explanations and practical exercises.
The course lasts roughly 10 hours, divided into three sessions. The starting date will be in the first week of December.
The course is oriented mainly to PhD and master's students affiliated with the Geology Department of the UAB. For logistical reasons, the maximum number of places per session is 9. The course is free for Department members, but external participants will have to make a symbolic payment.
Those interested should send an e-mail to Dr. Griera (albert.griera@uab.cat).
The course will be taught by Marc Diviu (MSc in Geology and Geophysics of Reservoirs).
This document discusses seismic data processing workflows. It begins with an introduction and agenda. The general workflow includes reformatting, trace editing, geometry handling, amplitude recovery, noise attenuation through techniques like frequency and FK filtering, deconvolution, multiple removal, migration, velocity analysis, NMO correction, muting, stacking, and post-stack filtering and amplitude scaling to produce a final image for geological interpretation. The document emphasizes that the proper workflow selection depends on processing environment, targets, costs, and client preferences. It concludes with time for questions.
This document provides instructions for interpreting seismic horizons using both manual and automated techniques in Petrel. It discusses horizon interpretation workflows including inserting new horizons, identifying reflection events, using different autotracking methods like seeded 2D and 3D autotracking, editing interpreted horizons, and displaying and manipulating horizons. The document also reviews individual autotracking parameters and provides exercises for practicing horizon interpretation and editing in Petrel.
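The idea behind seeded autotracking can be shown with a toy tracker that follows the local amplitude maximum from trace to trace. This is only a sketch of the principle, not Petrel's algorithm, and the section and seed are hypothetical:

```python
import numpy as np

def seeded_autotrack(section, seed_trace, seed_sample, search=5):
    """Toy seeded 2-D autotracker on a (samples x traces) section:
    from a seed pick, follow the amplitude maximum within
    +/- `search` samples of the previous trace's pick."""
    n_samp, n_trc = section.shape
    picks = np.empty(n_trc, dtype=int)
    picks[seed_trace] = seed_sample
    for j in range(seed_trace + 1, n_trc):       # track to the right
        lo = max(picks[j - 1] - search, 0)
        hi = min(picks[j - 1] + search + 1, n_samp)
        picks[j] = lo + int(np.argmax(section[lo:hi, j]))
    for j in range(seed_trace - 1, -1, -1):      # track to the left
        lo = max(picks[j + 1] - search, 0)
        hi = min(picks[j + 1] + search + 1, n_samp)
        picks[j] = lo + int(np.argmax(section[lo:hi, j]))
    return picks
```

Real autotrackers add correlation-based quality thresholds and stop criteria, which is what the autotracking parameters reviewed in the document control.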
The analysis of all of the significant processes that formed a basin and deformed its sedimentary fill, from basin-scale processes (e.g., plate tectonics) to centimeter-scale processes (e.g., fracturing).
Using 3-D Seismic Attributes in Reservoir Characterization
The document discusses using 3D seismic attributes for reservoir characterization. It provides an overview of seismic reflection methods and defines seismic attributes as any measurement derived from seismic data. Common types of attributes are described including time, complex trace, window, Fourier and multi-trace attributes. The document gives examples of attributes like envelope, phase, frequency and coherence that can provide information on lithology, thickness, faults and fractures. Methods of interpreting attribute data from 3D volumes are outlined. The document concludes by providing examples of how attributes can be used for reservoir characterization tasks like fault interpretation and porosity estimation.
The document discusses various techniques for visualizing and manipulating seismic data in Petrel, including:
- Inserting random lines and polyline intersections to create arbitrary seismic intersections
- Tiling windows and linking cameras between windows to synchronize views
- Using ghost curves to compare seismic signal patterns across faults
- Overlaying seismic attributes and vintages to aid interpretation
- Adjusting settings like transparency and annotation to control seismic data display
- Browsing and managing seismic surveys using the Survey manager tool.
Reservoir Geophysics: Brian Russell Lecture 1
This document provides an introduction to AVO and pre-stack inversion methods. It begins with a brief history of seismic interpretation, from purely structural interpretation to identifying "bright spots" to direct hydrocarbon detection using AVO and pre-stack inversion. It then discusses how AVO response is closely linked to rock physics properties like P-wave velocity, S-wave velocity, and density. The key concepts of AVO modeling and attributes are introduced. Finally, it provides an overview of rock physics and fluid replacement modeling using equations like Biot-Gassmann to model velocity and density changes with fluid saturation.
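The Biot-Gassmann fluid replacement mentioned above predicts how the saturated bulk modulus, and hence velocity, changes with pore fluid. A sketch using hypothetical rock and fluid moduli:

```python
import math

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Gassmann's equation for the saturated bulk modulus (GPa):
    K_sat = K_dry + (1 - K_dry/K_min)^2 /
            (phi/K_fl + (1 - phi)/K_min - K_dry/K_min^2)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def vp_from_moduli(k, mu, rho):
    """P-wave velocity (m/s) from bulk and shear moduli (GPa) and
    density (g/cc); shear modulus is unaffected by the fluid."""
    return math.sqrt((k + 4.0 * mu / 3.0) * 1e9 / (rho * 1000.0))

# Hypothetical sand: K_dry = 5, quartz K_min = 36, phi = 0.25,
# brine K_fl = 2.8 vs gas K_fl = 0.05 (all GPa)
k_brine = gassmann_ksat(5.0, 36.0, 2.8, 0.25)
k_gas = gassmann_ksat(5.0, 36.0, 0.05, 0.25)
```

Replacing brine with gas drops the bulk modulus (and density) sharply, which is the rock-physics basis of the AVO and bright-spot responses the lecture describes.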
Seismic data interpretation aims to tell the geologic story contained in seismic data by correlating seismic features with known geological elements. The summary outlines key concepts including:
1. Reflection, velocity, P and S waves, polarity, phase, resolution and detectability which influence seismic interpretation.
2. Depositional environments, rock types, faults and folds are interpreted from seismic data to understand the subsurface petroleum system.
3. Structural and stratigraphic interpretation including seismic attributes, multi-attribute logging, direct hydrocarbon indicators, and AVO/impedance inversion are used to characterize reservoirs.
This document provides an overview of principles of seismic data processing. It discusses key concepts like seismic generation, data processing steps, velocity analysis, noise attenuation techniques, and common processing flows. The document is divided into multiple chapters that cover topics such as wave propagation, reflection coefficients, deconvolution, F-K transforms, and factors that affect seismic amplitudes. Specific noise types like swell noise are also explained and methods to attenuate them, such as using band-pass filters or amplitude/frequency filters, are described.
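Swell-noise attenuation with a band-pass filter, as described above, can be sketched with a zero-phase Butterworth filter. The corner frequencies and trace are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(trace, dt, f_lo, f_hi, order=4):
    """Zero-phase Butterworth band-pass; low-frequency swell noise
    (typically below ~10 Hz) falls under f_lo and is attenuated."""
    nyq = 0.5 / dt
    b, a = butter(order, [f_lo / nyq, f_hi / nyq], btype="band")
    return filtfilt(b, a, trace)   # forward-backward => no phase shift
```

Because filtfilt runs the filter forward and backward, reflection timing is preserved, which matters for any later interpretation.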
Seismic attribute analysis using complex trace analysis
The document discusses seismic attributes, which are measurements or properties obtained from seismic data that provide information about rock properties. It defines various types of attributes such as pre-stack, instantaneous, physical, and multi-trace attributes. The document also discusses the analysis of key seismic attributes like reflection strength, instantaneous phase and frequency through the use of complex trace analysis. Finally, it concludes that seismic attributes are important tools that help interpreters extract more information from seismic data for applications like hydrocarbon exploration and reservoir characterization.
Quantitative and Qualitative Seismic Interpretation of Seismic Data
This document discusses quantitative and qualitative seismic interpretation techniques used to analyze seismic data and map subsurface geology. It compares traditional qualitative techniques to more modern quantitative techniques. It then focuses on unconventional seismic interpretation techniques used for unconventional reservoirs with low permeability, including AVO analysis, seismic inversion, seismic attributes, and forward seismic modeling. These techniques can help identify tight gas, shale gas, and gas hydrate reservoirs that conventional methods cannot easily detect. The document provides details on how each technique works and its advantages.
What do you mean by seismic resolution?
Seismic resolution refers to the ability to differentiate between two seismic features. Vertical resolution is the ability to resolve two vertically stacked seismic horizons, and depends on factors like frequency and wavelength. Higher frequencies and shorter wavelengths improve vertical resolution but are attenuated at greater depths. Lateral resolution is the ability to resolve two horizontally separated features, and depends on factors like frequency, velocity, aperture, and whether pre-stack or post-stack migration is used. Both vertical and lateral resolution decrease with depth due to attenuation of higher frequencies. Increasing bandwidth, using phase rotation techniques, and migration can help enhance seismic resolution.
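The two resolution limits described above have simple textbook estimates: the quarter-wavelength rule for vertical resolution and the first Fresnel zone radius for (unmigrated) lateral resolution. A sketch with illustrative shallow and deep cases:

```python
import math

def vertical_resolution(v, f_dom):
    """Quarter-wavelength limit: lambda / 4 = v / (4 * f_dom), metres."""
    return v / (4.0 * f_dom)

def fresnel_radius(v, t_twt, f_dom):
    """First Fresnel zone radius, the pre-migration lateral
    resolution estimate: r = (v / 2) * sqrt(t / f), metres."""
    return (v / 2.0) * math.sqrt(t_twt / f_dom)

# Shallow: v = 2000 m/s, 50 Hz, 1 s TWT; deep: v = 4000 m/s, 20 Hz, 3 s
shallow_v, shallow_r = vertical_resolution(2000.0, 50.0), fresnel_radius(2000.0, 1.0, 50.0)
deep_v, deep_r = vertical_resolution(4000.0, 20.0), fresnel_radius(4000.0, 3.0, 20.0)
```

The numbers make the document's point concrete: both limits degrade with depth as velocity rises and high frequencies are attenuated, and migration collapses the Fresnel zone toward the quarter-wavelength scale.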
This document describes model-based seismic inversion performed on a 3D seismic dataset from the F3 block in the Dutch sector of the North Sea. The study area has complex geology from the Paleozoic to Cenozoic eras. The authors applied model-based inversion using Hampson-Russell software to determine lithology and fluid distributions in the target reservoir. They imported the 3D seismic cube, identified the target reservoir horizon, performed depth conversion and quality control, extracted the wavelet, built an initial model, ran the inversion, and analyzed the resulting acoustic impedance volume to characterize the subsurface.
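The forward step that model-based inversion iterates, computing a synthetic trace from an impedance model, is the convolutional model: reflectivity derived from acoustic impedance, convolved with the wavelet. A sketch (not the Hampson-Russell implementation), with a standard Ricker wavelet standing in for the extracted one:

```python
import numpy as np

def ricker(f, dt, length=0.128):
    """Zero-phase Ricker wavelet of dominant frequency f (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def forward_model(impedance, wavelet):
    """Convolutional synthetic: reflection coefficients from the
    acoustic impedance log, convolved with the wavelet."""
    ai = np.asarray(impedance, dtype=float)
    rc = (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])
    return np.convolve(rc, wavelet, mode="same")
```

Inversion perturbs the impedance model until this synthetic matches the observed trace, which is why wavelet extraction and the initial model get their own steps in the workflow.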
This document outlines the steps in a Petrel course, including loading seismic data, well data like trajectories and logs, creating synthetic seismograms, picking horizons in the time domain, applying seismic attributes, converting horizons to depth using well data, and exporting maps of depth surfaces. The horizon picking was noted to be for practice only.
This document discusses reservoir geophysics and geology. It begins with an introduction to geophysics, noting that most rocks are opaque so geophysics uses physics to obtain "geophysical images" of the subsurface based on properties like density, magnetism, conductivity, and velocity. It discusses using natural fields like gravity and magnetics to measure subsurface variations at a regional scale. Later sections discuss seismic reflection methods, potential field applications in mapping geology, and benefits of 3D seismic over 2D in providing better geological models. The document provides an overview of key concepts in reservoir geophysics and geology.
This document provides instructions for interpreting faults in Petrel. It describes how to manually pick faults on seismic lines, edit fault segments, move and reassign segments between faults, clean faults, and use restrict mode. The exercises section instructs the user to create a new fault interpretation folder, interpret faults on every 20th crossline between lines 400 and 420, highlight and assign fault sticks to individual fault planes, and name the faults. The overall document teaches how to perform a basic fault interpretation project in Petrel.
1) Geophysics uses remote sensing to determine subsurface conditions by analyzing seismic and radar signals that travel through and reflect off underground materials.
2) There are four main modes of signal propagation: vertical reflection, wide angle reflection, critical refraction, and direct waves. Precisely measuring the travel times of these signals allows subsurface structures to be interpreted.
3) Reflection seismology analyzes reflected signals to determine depth to interfaces by relating travel time, distance between source and receiver, and velocity, while refraction seismology uses travel times of critically refracted signals to determine shallow subsurface velocity structure.
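For the refraction case in point 3, the standard single-layer relation links refractor depth to the crossover distance where the head wave overtakes the direct wave. A sketch with hypothetical velocities:

```python
import math

def crossover_distance(v1, v2, h):
    """Offset where the critically refracted (head) wave arrives
    before the direct wave: x = 2 h * sqrt((v2 + v1) / (v2 - v1))."""
    return 2.0 * h * math.sqrt((v2 + v1) / (v2 - v1))

def depth_from_crossover(v1, v2, x_cross):
    """Invert the same relation to estimate depth to the refractor
    from a measured crossover distance."""
    return (x_cross / 2.0) * math.sqrt((v2 - v1) / (v2 + v1))

# Hypothetical 30 m layer at 1500 m/s over bedrock at 3000 m/s
x = crossover_distance(1500.0, 3000.0, 30.0)   # ~104 m
```

A survey spread must extend well beyond this crossover offset to record the refracted first arrivals the method depends on.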
Seismic attributes are being used more and more often in the reservoir characterization and interpretation processes. The new software and computer’s development allows today to generate a large number of surface and volume attributes. They proved to be very useful for the facies and reservoir properties distribution in the geological models, helping to improve their quality in the areas between the wells and areas without wells. The seismic attributes can help to better understand the stratigraphic and structural features, the sedimentation processes, lithology variations, etc. By improving the static geological models, the dynamic models are also improved, helping to better understand the reservoirs’ behavior during exploitation. As a result, the estimation of the recoverable hydrocarbon volumes becomes more reliable and the development strategies will become more successful.
The exploration team built a 3D geologic model from 2D seismic data in the Peruvian Andes to extract 2D velocity models for prestack depth migration, in order to impose additional geologic constraints and ensure consistency across lines. While effective for lines parallel to dip, models extracted from the 3D volume for oblique lines required adjustments to match reflector positions. The final depth migrated images showed improvements over time migration, minimizing velocity pull-up below structures through a geologically constrained 3D velocity model.
This document discusses various subsurface mapping techniques used in seismic interpretation, including:
- Seismic picking, fault interpretation, contouring, time structure mapping, velocity modeling, and depth conversion to analyze seismic data.
- Tools for basic seismic mapping including workstations to access large volumes of 2D and 3D seismic data.
- Methods for well tie, structural interpretation, 3D visualization, and attribute analysis of seismic volumes.
- Creating structure contour maps, depth structure maps, and 3D structural models from seismic horizon picks and velocity modeling.
- Using attributes like coherency to enhance fault and feature imaging at the limit of seismic resolution.
- Analyzing sequences, unconformities,
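The depth-conversion step in the mapping list above reduces, in its simplest average-velocity form, to z = v_avg * TWT / 2 at each map point. A sketch with hypothetical map values:

```python
def time_to_depth(twt_map, v_avg_map):
    """Convert a two-way-time structure map (s) to a depth map (m)
    with an average-velocity map: z = v_avg * twt / 2 per point."""
    return {pt: v_avg_map[pt] * t / 2.0 for pt, t in twt_map.items()}

# Two hypothetical map nodes: TWT picks and average velocities
depths = time_to_depth({"A": 2.0, "B": 1.5}, {"A": 2500.0, "B": 2200.0})
```

Production workflows use layered or geostatistically calibrated velocity models rather than a single average-velocity map, but the per-node arithmetic is the same.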
3D Facies Modelling project using Petrel software. Msc Geology and Geophysics
Abstract
The Montserrat and Sant Llorenç del Munt fan-delta complexes were developed during the Eocene in the Ebro basin. The depositional stratigraphic record of these fan deltas has been described as a made up by a several transgressive and regressive composite sequences each made up by several fundamental sequences. Each sequence set is in turn composed by five main facies belts: proximal alluvial fan, distal alluvial fan, delta front, carbonates platforms and prodelta.
Using outcrop data from three composite sequences (Sant Vicenç, Vilomara and Manresa), a 3D facies model was built. The key sequential traces of the studied area georeferenced and digitalized on to photorealistic terrain models, were the hard data used as input to reconstruct the main surfaces, which are separating transgressive and regressive stacking patterns. Regarding the facies modelling has been achieved using a geostatistical algorithm in order to define the stacking trend and the interfingerings of adjacent facies belts, and five paleogeographyc maps to reproduce the paleogeometry of the facies belts within each system tract.
The final model has been checked, using a real cross section, and analysed in order to obtain information about the Delta Front facies which are the ones susceptible to be analogous of a reservoir. Attending to the results including eight probability maps of occurrence, the transgressive sequence set of Vilomara is the greatest accumulation of these facies explained by its agradational component.
Introduction Petrel Course (UAB-2014)
This course has been prepared as an introduction of Petrel software (Schlumberger, www.software.slb.com/products/platform/Pages/petrel.aspx), an application which allows the modeling and visualization of reservoirs, since the exploration stage until production, integrating geological and geophysical data, geological modeling (structural and stratigraphic frameworks), well planning, or property modeling ( petrophysical or petrological) among other possibilities.
The course will be focused mainly in the understanding and utilization of workflows aimed to build geological models based on superficial data (at the outcrop scale) but also with seismic data. The course contents have been subdivided in 5 modules each one developed through the combination of short explanations and practical exercises.
The duration of the course covers more or less 10h divided in three sessions. The starting data will be in the first week of December.
This course will be oriented mainly for the PhD and master students ascribed at the Geologic department of the UAB. For logistic reasons the maximum number of places for each torn are 9. The course is free from the Department members but the external interested will have to make a symbolic payment.
Those interested send an e-mail to the Doctor Griera (albert.griera@uab.cat).
The course will be imparted by Marc Diviu (Msc. Geology and Geophysics of reservoirs).
This document discusses seismic data processing workflows. It begins with an introduction and agenda. The general workflow includes reformatting, trace editing, geometry handling, amplitude recovery, noise attenuation through techniques like frequency and FK filtering, deconvolution, multiple removal, migration, velocity analysis, NMO correction, muting, stacking, and post-stack filtering and amplitude scaling to produce a final image for geological interpretation. The document emphasizes that the proper workflow selection depends on processing environment, targets, costs, and client preferences. It concludes with time for questions.
This document provides instructions for interpreting seismic horizons using both manual and automated techniques in Petrel. It discusses horizon interpretation workflows including inserting new horizons, identifying reflection events, using different autotracking methods like seeded 2D and 3D autotracking, editing interpreted horizons, and displaying and manipulating horizons. The document also reviews individual autotracking parameters and provides exercises for practicing horizon interpretation and editing in Petrel.
The analysis of all of the significant processes that formed a basin and deformed its sedimentary fill from basin-scale processes (e.g., plate tectonics)
to centimeter-scale processes (e.g., fracturing)
Using 3-D Seismic Attributes in Reservoir Characterizationguest05b785
The document discusses using 3D seismic attributes for reservoir characterization. It provides an overview of seismic reflection methods and defines seismic attributes as any measurement derived from seismic data. Common types of attributes are described including time, complex trace, window, Fourier and multi-trace attributes. The document gives examples of attributes like envelope, phase, frequency and coherence that can provide information on lithology, thickness, faults and fractures. Methods of interpreting attribute data from 3D volumes are outlined. The document concludes by providing examples of how attributes can be used for reservoir characterization tasks like fault interpretation and porosity estimation.
The document discusses various techniques for visualizing and manipulating seismic data in Petrel, including:
- Inserting random lines and polyline intersections to create arbitrary seismic intersections
- Tiling windows and linking cameras between windows to synchronize views
- Using ghost curves to compare seismic signal patterns across faults
- Overlaying seismic attributes and vintages to aid interpretation
- Adjusting settings like transparency and annotation to control seismic data display
- Browsing and managing seismic surveys using the Survey manager tool.
Reservoir Geophysics : Brian Russell Lecture 1Ali Osman Öncel
This document provides an introduction to AVO and pre-stack inversion methods. It begins with a brief history of seismic interpretation, from purely structural interpretation to identifying "bright spots" to direct hydrocarbon detection using AVO and pre-stack inversion. It then discusses how AVO response is closely linked to rock physics properties like P-wave velocity, S-wave velocity, and density. The key concepts of AVO modeling and attributes are introduced. Finally, it provides an overview of rock physics and fluid replacement modeling using equations like Biot-Gassmann to model velocity and density changes with fluid saturation.
Seismic data interpretation aims to tell the geologic story contained in seismic data by correlating seismic features with known geological elements. The summary outlines key concepts including:
1. Reflection, velocity, P and S waves, polarity, phase, resolution and detectability which influence seismic interpretation.
2. Depositional environments, rock types, faults and folds are interpreted from seismic data to understand the subsurface petroleum system.
3. Structural and stratigraphic interpretation including seismic attributes, multi-attribute logging, direct hydrocarbon indicators, and AVO/impedance inversion are used to characterize reservoirs.
This document provides an overview of principles of seismic data processing. It discusses key concepts like seismic generation, data processing steps, velocity analysis, noise attenuation techniques, and common processing flows. The document is divided into multiple chapters that cover topics such as wave propagation, reflection coefficients, deconvolution, F-K transforms, and factors that affect seismic amplitudes. Specific noise types like swell noise are also explained and methods to attenuate them, such as using band-pass filters or amplitude/frequency filters, are described.
Seismic attribute analysis using complex trace analysisSomak Hajra
The document discusses seismic attributes, which are measurements or properties obtained from seismic data that provide information about rock properties. It defines various types of attributes such as pre-stack, instantaneous, physical, and multi-trace attributes. The document also discusses the analysis of key seismic attributes like reflection strength, instantaneous phase and frequency through the use of complex trace analysis. Finally, it concludes that seismic attributes are important tools that help interpreters extract more information from seismic data for applications like hydrocarbon exploration and reservoir characterization.
Quantitative and Qualitative Seismic Interpretation of Seismic Data (Haseeb Ahmed)
This document discusses quantitative and qualitative seismic interpretation techniques used to analyze seismic data and map subsurface geology. It compares traditional qualitative techniques to more modern quantitative techniques. It then focuses on unconventional seismic interpretation techniques used for unconventional reservoirs with low permeability, including AVO analysis, seismic inversion, seismic attributes, and forward seismic modeling. These techniques can help identify tight gas, shale gas, and gas hydrate reservoirs that conventional methods cannot easily detect. The document provides details on how each technique works and its advantages.
What do you mean by seismic resolution? (Haseeb Ahmed)
Seismic resolution refers to the ability to differentiate between two seismic features. Vertical resolution is the ability to resolve two vertically stacked seismic horizons, and depends on factors like frequency and wavelength. Higher frequencies and shorter wavelengths improve vertical resolution but are attenuated at greater depths. Lateral resolution is the ability to resolve two horizontally separated features, and depends on factors like frequency, velocity, aperture, and whether pre-stack or post-stack migration is used. Both vertical and lateral resolution decrease with depth due to attenuation of higher frequencies. Increasing bandwidth, using phase rotation techniques, and migration can help enhance seismic resolution.
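The frequency and wavelength dependence described above is usually quantified with the quarter-wavelength tuning criterion. A small sketch, assuming the common λ/4 rule of thumb with illustrative velocities and frequencies:

```python
def vertical_resolution_m(velocity_m_s, dominant_frequency_hz):
    """Quarter-wavelength tuning thickness: lambda / 4 = v / (4 * f)."""
    return velocity_m_s / (4.0 * dominant_frequency_hz)

# Shallow section: 2000 m/s with a 50 Hz dominant frequency
shallow = vertical_resolution_m(2000.0, 50.0)
# Deep section: faster rock, frequency attenuated to 20 Hz
deep = vertical_resolution_m(4000.0, 20.0)
print(shallow, deep)  # 10.0 m shallow vs 50.0 m deep
```

The example shows why resolution degrades with depth: velocity rises while the dominant frequency falls, so the resolvable bed thickness grows several-fold.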
This document describes model-based seismic inversion performed on a 3D seismic dataset from the F3 block in the Dutch sector of the North Sea. The study area has complex geology from the Paleozoic to Cenozoic eras. The authors applied model-based inversion using Hampson-Russell software to determine lithology and fluid distributions in the target reservoir. They imported the 3D seismic cube, identified the target reservoir horizon, performed depth conversion and quality control, extracted the wavelet, built an initial model, ran the inversion, and analyzed the resulting acoustic impedance volume to characterize the subsurface.
This document outlines the steps in a Petrel course, including loading seismic data, well data like trajectories and logs, creating synthetic seismograms, picking horizons in the time domain, applying seismic attributes, converting horizons to depth using well data, and exporting maps of depth surfaces. The horizon picking was noted to be for practice only.
This document discusses reservoir geophysics and geology. It begins with an introduction to geophysics, noting that most rocks are opaque so geophysics uses physics to obtain "geophysical images" of the subsurface based on properties like density, magnetism, conductivity, and velocity. It discusses using natural fields like gravity and magnetics to measure subsurface variations at a regional scale. Later sections discuss seismic reflection methods, potential field applications in mapping geology, and benefits of 3D seismic over 2D in providing better geological models. The document provides an overview of key concepts in reservoir geophysics and geology.
This document provides instructions for interpreting faults in Petrel. It describes how to manually pick faults on seismic lines, edit fault segments, move and reassign segments between faults, clean faults, and use restrict mode. The exercises section instructs the user to create a new fault interpretation folder, interpret faults on every 20th crossline between lines 400 and 420, highlight and assign fault sticks to individual fault planes, and name the faults. The overall document teaches how to perform a basic fault interpretation project in Petrel.
1) Geophysics uses remote sensing to determine subsurface conditions by analyzing seismic and radar signals that travel through and reflect off underground materials.
2) There are four main modes of signal propagation: vertical reflection, wide angle reflection, critical refraction, and direct waves. Precisely measuring the travel times of these signals allows subsurface structures to be interpreted.
3) Reflection seismology analyzes reflected signals to determine depth to interfaces by relating travel time, distance between source and receiver, and velocity, while refraction seismology uses travel times of critically refracted signals to determine shallow subsurface velocity structure.
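The travel-time relation in point 3 for a flat reflector is the familiar hyperbola linking travel time, offset, and velocity. A short sketch with illustrative values:

```python
import math

def reflection_traveltime(offset_m, depth_m, velocity_m_s):
    """Two-way reflection travel time over a flat interface:
    t(x) = sqrt(t0^2 + (x / v)^2), where t0 = 2 * depth / velocity."""
    t0 = 2.0 * depth_m / velocity_m_s
    return math.sqrt(t0 ** 2 + (offset_m / velocity_m_s) ** 2)

# Illustrative: 1000 m deep interface in a 2000 m/s medium
t_zero = reflection_traveltime(0.0, 1000.0, 2000.0)    # 1.0 s at zero offset
t_far = reflection_traveltime(1500.0, 1000.0, 2000.0)  # 1.25 s at 1500 m offset
print(t_zero, t_far)
```

Fitting this hyperbola to observed arrivals at several offsets is what lets velocity and depth be recovered from travel times.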
Seismic attributes are used more and more often in reservoir characterization and interpretation. Advances in software and computing now make it possible to generate a large number of surface and volume attributes. These have proved very useful for distributing facies and reservoir properties in geological models, improving model quality between wells and in areas without wells. Seismic attributes can help to better understand stratigraphic and structural features, sedimentation processes, lithology variations, etc. By improving the static geological models, the dynamic models are also improved, helping to better understand reservoir behavior during production. As a result, the estimation of recoverable hydrocarbon volumes becomes more reliable and development strategies become more successful.
The exploration team built a 3D geologic model from 2D seismic data in the Peruvian Andes to extract 2D velocity models for prestack depth migration, in order to impose additional geologic constraints and ensure consistency across lines. While effective for lines parallel to dip, models extracted from the 3D volume for oblique lines required adjustments to match reflector positions. The final depth migrated images showed improvements over time migration, minimizing velocity pull-up below structures through a geologically constrained 3D velocity model.
This document discusses various subsurface mapping techniques used in seismic interpretation, including:
- Seismic picking, fault interpretation, contouring, time structure mapping, velocity modeling, and depth conversion to analyze seismic data.
- Tools for basic seismic mapping including workstations to access large volumes of 2D and 3D seismic data.
- Methods for well tie, structural interpretation, 3D visualization, and attribute analysis of seismic volumes.
- Creating structure contour maps, depth structure maps, and 3D structural models from seismic horizon picks and velocity modeling.
- Using attributes like coherency to enhance fault and feature imaging at the limit of seismic resolution.
- Analyzing sequences, unconformities,
Changes in dam break hydrodynamic modelling practice - Suter et al (Stephen Flood)
Abstract: Today, many organisations rely on hydrodynamic modelling to assess the consequences of dam break failure on downstream populations and infrastructure. The availability of finite volume shock-capturing schemes and flexible mesh schematisations in widely used software platforms imply that dam break modelling projects will be carried out differently in the future: Finite volume based platforms allow widespread application of shock-capturing methods and flexible mesh platforms can represent features in the study area more realistically and are more flexible thanks to varying mesh resolutions. Furthermore, the recent adoption of Graphics Processing Unit (GPU) technology in mainstream scientific and engineering computing will also significantly decrease computation times at relatively low cost.
This paper examines the application of finite volume, flexible mesh and GPU technologies to dam break modelling. One-dimensional (1D) modelling results are compared to those from two-dimensional (2D) finite difference and finite volume approaches. The results demonstrate that there are differences between modelling approaches and that the computational speeds of 2D simulations can be significantly reduced by the use of GPU processors.
Static corrections are needed to account for irregular near-surface layers and topography in land seismic data. They shift seismic traces to a common datum to obtain a correct subsurface image and enhance resolution. Statics shift traces to account for variations in elevation, weathering-layer thickness, and velocity. Several methods calculate statics, including field statics from acquisition parameters, elevation statics for flat areas, and refraction/reflection methods using first arrivals or residual shifts. Correcting statics aligns events, improves stack quality, and avoids structural distortions in the subsurface image.
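The elevation-static component described above reduces to a simple time shift to a common datum. A minimal sketch; note that sign conventions vary between processing systems, so the convention below is just one choice, and the elevations are made up:

```python
def elevation_static_ms(source_elev_m, receiver_elev_m, datum_elev_m,
                        replacement_velocity_m_s):
    """Elevation static (ms) to shift a trace from true source/receiver
    elevations down to a flat datum, using a replacement velocity.
    Here a positive value is the delay to subtract before stacking."""
    delay_s = ((source_elev_m - datum_elev_m)
               + (receiver_elev_m - datum_elev_m)) / replacement_velocity_m_s
    return 1000.0 * delay_s

# Source at 350 m, receiver at 330 m, datum at 300 m, replacement 2000 m/s
shift = elevation_static_ms(350.0, 330.0, 300.0, 2000.0)
print(round(shift, 3))  # 40.0 ms
```

Applied trace by trace, shifts like this are what align events and remove the topography imprint from the stack.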
This document discusses static correction in seismic data processing. It covers:
1) Static correction removes the effects of surface elevation changes and weathering layers on seismic data.
2) Examples are given of how water depth variations can induce pull-down of reflectors, though this does not represent real geology.
3) A figure from a research paper shows a seismic section with associated velocity information, geology, and an approximate static corrections diagram.
A Fully Automated System for Monitoring Pit Wall Displacements (Jose Espejo Vasquez)
ABSTRACT.
Automated monitoring of steep slopes, excavations, and high embankments allows early detection of instability and can be used to avoid or mitigate potential slope failures.
Systems using multiple different sensor types have been developed and successfully tested at the Highland Valley Copper Mine in British Columbia. These systems use robotic total stations (RTS) as the main measurement sensors, with surveys repeated at predefined intervals selected to optimize operational efficiency.
This methodology has been developed to improve system accuracy and reliability by reducing the effects of systematic errors created by atmospheric refraction and by unstable instrument and reference point positions. The inclusion of GPS sensors to monitor the RTS positions creates additional operational flexibility and maintains system integrity when the available reference stations are insufficient.
PetroTeach Free Webinar by Dr. Andrew Ross on Seismic Reservoir Characterization (Petro Teach)
A reliable reservoir model is an invaluable tool for risk reduction. I will give an overview of seismic reservoir characterization and the quantitative interpretation workflow including the use of pre and post stack seismic attributes and inversion outputs for mapping reservoir properties and integration of the attribute output with petrophysical data to create quantitative reservoir models.
Assessment of subsurface shallow gas expressions at Netherlands Offshore F3 Block, in the Dutch Central Graben of the North Sea (Mahmoud Hossam)
Graduation Project, Geophysics Department, Faculty of Science, Cairo University, June 27, 2015
ASEG-PESA-AIG_2016_Abstract_North West Shelf 3D Velocity Modeling_ESTIMAGES (Laureline Monteignies)
The document describes the creation of a 3D velocity model covering the entire North West Shelf of Australia using seismic and well data. Over 200 seismic surveys and nearly 900 wells were integrated using an innovative 3D modeling approach. The major challenges were honoring geological features at this wide regional scale and ensuring consistency across basins. The resulting geologically consistent model has a depth uncertainty of less than +/-100m even over 100km from wells.
Fugro Survey performs geophysical surveys and site surveys in Norwegian waters to identify hazards for offshore drilling. They use seismic data to interpret shallow soils and identify features like shallow gas. An amplitude anomaly workflow in ArcGIS is used to standardize mapping and visualizing interpreted seismic amplitude anomalies from site surveys in a geodatabase. This allows the data to be easily incorporated into reports, presentations, web maps, and 3D visualizations.
The document proposes an improved change vector analysis (ICVA) method to more accurately detect land cover changes using multi-temporal remote sensing data. ICVA combines traditional change vector analysis with a cross-correlogram spectral matching algorithm to 1) preliminarily detect changes, 2) identify and eliminate areas of vegetation variation rather than conversion using profile similarity analysis, and 3) determine actual land cover conversion types. The method is tested on MODIS EVI data for a region in China, achieving higher accuracy than traditional change vector analysis alone.
This document provides an overview of gravity and seismic geophysical exploration methods. It begins with introductions to gravity, its units of measurement, and factors that cause gravity variations. It then discusses gravity data acquisition, processing steps like tidal and elevation corrections to derive anomaly maps, and interpretation. For seismic exploration, it describes data acquisition using common midpoint gathers and factors like fold, followed by processing steps like normal moveout correction and stacking to improve signal-to-noise ratio and imaging resolutions. It concludes with discussions on filtering, migration, and how these improve subsurface representations.
Development of Methodology for Determining Earth Work Volume Using Combined S... (IJMER)
This document summarizes the seismic hazard assessment conducted for the Kathmandu valley in Nepal. It describes the procedures used, including setting scenario earthquakes, developing a ground model, and assessing characteristics of the 2015 Gorkha earthquake. Scenario earthquakes of magnitudes 7.8-8.6 were set, and a ground model was developed using over 400 drilling data points, microtremor measurements, and geological cross sections. Site response analyses were then conducted to estimate seismic ground motions and risks of liquefaction and slope failure across the valley.
02 chapter: Earthquake: Strong Motion and Estimation of Seismic Hazard (Priodeep Chowdhury)
This document discusses strong ground motion from earthquakes and methods for measuring and analyzing it. It describes how modern accelerographs can record ground acceleration digitally up to 100 Hz. Parameters derived from ground motion records are used to analyze earthquake and site characteristics and their impact on structures. Evaluating seismic hazard requires understanding characteristics controlling ground motion as well as the seismicity and tectonics of the surrounding region, using either deterministic or probabilistic approaches.
Today, very high resolution DEMs derived from satellite image data, with resolutions of about one meter, make it possible to depict very detailed surface changes.
A high-resolution DEM improves satellite image geometry, and adding DGPS ground control points increases x, y, z accuracy.
Wrong positioning of objects or poor parameter calculation often results in bad image geometry.
From along-track stereo pairs of VHR satellite optical data, a DEM can be generated automatically.
Applications:
Ortho-rectification of satellite images, 3D display.
Creation of accurate topographic references and relief maps.
Topographic profiles and contour generation.
Surface analysis.
Calculation of slope, orientation, and shading.
Calculation of volume and elevation.
Extraction of terrain and morphometric parameters.
Geomorphology and structural analysis.
Geological quantifications (dips, lithological thicknesses, geometry of faults and folds, etc.).
3D reference maps of resource extraction zones (quarries, open pits).
Calculation of hydrographic networks and watershed basins.
Determination of hypsometric curves, knickpoints, etc.
Characterization of eroded areas.
Flood simulation, risk evaluation.
Volume calculation for dam reservoirs.
This document presents an overview of using GIS data to design a pipeline over varying terrain. It demonstrates how to:
1. Use digital elevation models and satellite imagery in a GIS system to plan the pipeline centerline route and avoid obstructions.
2. Generate a plan view and profile of the pipeline to account for elevation changes along the route.
3. Perform hydraulic analysis of the pipeline considering factors like diameter, pressure, temperature, and water content to optimize the design.
The presentation shows how integrating GIS data improves pipeline design by enabling optimization of the route and hydraulic performance analysis.
This study developed a rainfall-runoff model using HEC-HMS to simulate runoff from Irwin Creek watershed in Charlotte, North Carolina under current and future climate change precipitation scenarios. The model was calibrated and validated using stream gauge data and produced comparable results. Simulation of design storms indicated that an 18% increase in storm depth due to climate change could increase peak discharge by 43%. The study concluded HEC-HMS is a useful tool for watershed modeling and that future flood management should consider potential impacts of climate change.
Velocity model building in Petrel
1. Velocity model building using Petrel software
Universiti Teknologi Petronas
Centre Of Excellence In Subsurface Seismic Imaging & Hydrocarbon Prediction (CSI)
Amir Abbas Babasafari
November 2019
2. Outline
• Velocity modeling: principles and pitfalls
• Well and seismic velocity data
• Incorporating velocity data to build a reliable model in Petrel software
• Time to Depth conversion (Map and reservoir property)
• Residual error correction and well marker adjustment
• Structural uncertainty
In this presentation, some figures are adapted from Dr. Badley, Dr. Robertson, Dr. Abdollahi far and Dr. Nosrat, and appear courtesy of Schlumberger, CGG, Jason and dGB.
3. Seismic Structural Interpretation
• Data gathering, loading and QC
• Well top correlation
• Data conditioning
  Seismic data conditioning
  Well data conditioning
• Well to seismic tie and horizon identification
• Time structural interpretation
  Seismic attribute generation
  Horizon picking
  Fault interpretation
• Velocity model building
• Depth conversion and mapping
5. Depth Conversion
Geometric distortions due to velocity changes (pitfalls) will be removed
To predict drilling depth to the target horizon
More accurate Reserve Calculations and Uncertainty Quantification
For basin modelling purposes
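The basic time-to-depth step behind drilling-depth prediction can be sketched with a single average velocity. This is a deliberately simplified scheme with illustrative numbers; the rest of the presentation refines it with well and seismic velocity data:

```python
def depth_from_twt(twt_s, average_velocity_m_s):
    """Depth below datum from two-way time (TWT) with a single average
    velocity: z = v_avg * t / 2 (the factor 2 removes the two-way path)."""
    return average_velocity_m_s * twt_s / 2.0

# A horizon picked at 2.0 s TWT, with a 2500 m/s average velocity to that horizon
z = depth_from_twt(2.0, 2500.0)
print(z)  # 2500.0 m predicted drilling depth
```

Errors in the average velocity feed directly into the predicted depth, which is why the velocity model, not the time interpretation, usually dominates structural uncertainty.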
6. Pitfalls and issues in seismic data interpretation affecting seismic data quality and S/N ratio
Inherent: steep dip, fault zone, reflectivity
Acquisition: acquisition footprint, surface condition, navigation, receiver problem, shot problem, missed shots, recording problem, crooked line, feathering in marine
Processing: time mismatches, mute, polarity differences, vertical anomalies, static problem, filtering
Others: migration & sideswipe, display, tuning, velocity effects, multiples and bottom simulating reflectors, limits of software packages
7. Common Velocity Pitfalls:
• Anomalous high/low velocity zone (lithology)
• Lateral lithofacies changes
• Fault zones
• Gas effect
9. Velocity effects
Variations in velocity produce apparent structures which may not exist.
Velocity pull up
Velocity push down
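The size of a pull-up or push-down can be estimated from the thickness and velocity contrast of the anomalous body. A rough sketch with illustrative salt and gas values (not from the presentation):

```python
def time_distortion_ms(body_thickness_m, background_velocity_m_s,
                       body_velocity_m_s):
    """Two-way time shift on a flat reflector beneath an anomalous body.
    Negative = pull-up (fast body, e.g. salt or carbonate);
    positive = push-down (slow body, e.g. a gas-charged zone)."""
    dt_s = 2.0 * body_thickness_m * (1.0 / body_velocity_m_s
                                     - 1.0 / background_velocity_m_s)
    return 1000.0 * dt_s

# 500 m salt body (4500 m/s) in a 2500 m/s background: pull-up
pull_up = time_distortion_ms(500.0, 2500.0, 4500.0)
# 100 m gas-charged zone (1800 m/s) in the same background: push-down
push_down = time_distortion_ms(100.0, 2500.0, 1800.0)
print(round(pull_up, 1), round(push_down, 1))
```

Shifts of this size (tens to hundreds of milliseconds) can easily mimic or destroy a structural closure on a time section, which is the motivation for the depth conversion that follows.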
10. Velocity effects and depth migration
Depth migration accounts for lateral variations in velocity and can minimise the appearance of spurious structures
Time migrated section
Depth migrated section
16. Depth Conversion
Time section: note that the water depth increases from 100 m on the right to 2.2 km on the left.
Depth section: the prospect is now imaged as a structural closure. The rapid lateral variations in water depth and overburden are responsible for the distortion of the time section.
18. Input data
1. Well data (markers and velocity)
2. Seismic velocity (stacking or migration)
3. Time (TWT) surfaces
Well velocity data include check-shot and VSP
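A check-shot survey yields time-depth pairs, and depths between measurement points are commonly interpolated. A minimal sketch with made-up check-shot values:

```python
# A minimal time-depth relation (TDR) from check-shot pairs (TWT in s, depth in m).
# The pairs are illustrative, not from the presentation.
checkshots = [(0.0, 0.0), (0.5, 550.0), (1.0, 1200.0), (1.5, 1950.0), (2.0, 2800.0)]

def depth_at_twt(twt_s, tdr):
    """Linearly interpolate depth at a given two-way time inside the TDR."""
    for (t0, z0), (t1, z1) in zip(tdr, tdr[1:]):
        if t0 <= twt_s <= t1:
            frac = (twt_s - t0) / (t1 - t0)
            return z0 + frac * (z1 - z0)
    raise ValueError("TWT outside check-shot coverage")

z = depth_at_twt(1.25, checkshots)
print(z)  # 1575.0 m, halfway between the 1.0 s and 1.5 s check-shots
```

This well-based TDR is the "hard" depth constraint that the areally dense but less accurate seismic velocities are later calibrated against.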
40. Why is this important?
In Field Development: Example Field Study
• Water breakthrough problems in all 3 wells
• Decision made to inject water in well 2 to stimulate production in well 3
Well 1 Well 2 Well 3
After Weber et al., 1995
Grainstone distribution
Seismic data contribution
41. Why is this important?
Well 1 Well 2 Well 3
Wrong decision because:
• Original correlation based on lithostratigraphy
• New correlation based on chronostratigraphy using seismic data
After Weber et al., 1995
Grainstone distribution
42. Incorporating well and seismic data
[Figure: map panels a), b) and c) comparing well data and seismic data coverage; north arrow, 2000 m scale bar]
Objective: incorporating well and seismic data for a reliable velocity model
44. Some QC steps for horizon interpretation before velocity modeling
Seismic data conditioning
• Using a DSMF volume to enhance auto-tracking quality and time-horizon interpretation
• Using variance and ant-track cubes to illustrate fault trends
Loop tying
• Various inlines, crosslines and arbitrary lines passing through all wells to cover the entire field
Auto tracking / manual picking
• 2D auto tracking / manual picking
• Using the paint brush by setting parameters for 3D tracking
• Displaying the next and previous horizons as a guide
• Flattening horizons to check reflector continuity
• Quality control in the crossline direction to follow reflectors
• Using seismic surface attributes such as extracted amplitude values
• Isochron map generation to control thickness variations
• TDR creation for interval-velocity checking at well locations
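The last QC step above, checking interval velocities at well locations from the TDR, amounts to computing v_int = 2·Δz/Δt between consecutive time-depth pairs (the factor 2 converts two-way time to one-way time). A minimal sketch, with hypothetical check-shot pairs:

```python
def interval_velocities(tdr):
    """Interval velocity (m/s) between consecutive TDR samples.
    tdr: list of (depth_m, twt_s) pairs, sorted by increasing depth."""
    out = []
    for (z0, t0), (z1, t1) in zip(tdr, tdr[1:]):
        out.append(2.0 * (z1 - z0) / (t1 - t0))  # factor 2: TWT -> one-way
    return out

# Hypothetical check-shot pairs at one well
tdr = [(0.0, 0.0), (1000.0, 1.0), (2500.0, 2.0)]
print(interval_velocities(tdr))  # → [2000.0, 3000.0]
```

Implausible jumps in these values flag bad picks in the TDR before it feeds the velocity model.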
45. Some QC steps for fault interpretation before velocity modeling
1. Extracting a steered cube for dip and azimuth calculation based on seismic events.
2. Generating variance, chaos and curvature attribute volumes to illustrate fault trends and orientations.
3. Providing an ant-track cube confined by dip and azimuth to evaluate minor faults and fractures within the limits of the seismic data resolution.
4. Generating surface attribute maps of variance and ant tracking.
5. Fault interpretation on seismic sections using the co-volume cubes generated above, at an interval of every 10th (or 5th, depending on the tectonic setting) inline, quality-checked against the variance attribute maps.
6. Building fault sticks and fault planes in the time domain.
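The variance attribute used in steps 2 and 4 is, at its core, the local variance of amplitudes in a small window: it is near zero where reflectors are continuous and lights up across discontinuities. A minimal sketch on a synthetic 2D section (window size and the toy "fault" are assumptions; commercial implementations are dip-steered and 3D):

```python
import numpy as np

def variance_attribute(section, win=3):
    """Local variance of amplitudes in a win x win window (edge-padded)."""
    pad = win // 2
    padded = np.pad(section, pad, mode="edge")
    out = np.empty_like(section, dtype=float)
    for i in range(section.shape[0]):
        for j in range(section.shape[1]):
            out[i, j] = padded[i:i + win, j:j + win].var()
    return out

# Synthetic section with a sharp lateral amplitude break (a crude "fault")
sec = np.ones((50, 50))
sec[:, 25:] = -1.0
var = variance_attribute(sec)
# Variance is zero in continuous areas and high along the break near column 25
```
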
46. Well (red points) and seismic (green points) velocity data in Petrel
Seismic stacking velocity grid: 200 × 200 m or 100 × 100 m
48. Data preparation in Petrel
1. Sonic log (DT) correction with check-shots
2. Well-to-seismic tie using the corrected sonic log
3. Applying the obtained TDR (time-depth relation) to the well
A better match between well markers and the predicted depth map is achieved at well locations after applying the sequence above.
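Step 1 above, the check-shot correction, can be sketched as a drift correction: the sonic-derived times are compared with check-shot times at shared depths, and the residual (drift) is interpolated along the log and added back. All values below are hypothetical; Petrel offers several drift-curve options (e.g. linear vs. spline), and this sketch uses simple piecewise-linear interpolation:

```python
import numpy as np

# Hypothetical inputs: TWT from the integrated sonic log and from
# check-shots, at four shared depths.
depths = np.array([500.0, 1000.0, 1500.0, 2000.0])    # m
twt_sonic = np.array([0.52, 1.03, 1.49, 1.98])        # s, integrated DT
twt_cs = np.array([0.50, 1.00, 1.50, 2.00])           # s, check-shots

drift = twt_cs - twt_sonic            # residual at each check-shot level

# Piecewise-linear drift curve sampled on a (coarse) log depth axis
log_depths = np.linspace(500.0, 2000.0, 7)
correction = np.interp(log_depths, depths, drift)
twt_corrected = np.interp(log_depths, depths, twt_sonic) + correction
# At the check-shot depths the corrected times honor the check-shots exactly
```
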
84. Note:
• Average velocity surface for the first horizon, by incorporating well and seismic data
• Interval velocity surfaces for the second horizon onward, by incorporating well and seismic data
3. Layer cake approach
1. Seismic interval velocity extraction between main horizons
2. Outlier elimination using a time vs. interval-velocity cross-plot
3. Interpolation, smoothing and interval-velocity map creation
4. Calibration with well interval velocities using the collocated co-kriging method
5. Depth conversion using the velocity grid
6. Well-top adjustment
7. Blind tests and cross-validation of the depth conversion
8. Cross-section QC
9. Thickness-map QC
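The depth-conversion step of a layer cake model reduces to accumulating thicknesses: each horizon's depth is the previous horizon's depth plus the interval velocity times the one-way isochron. A minimal sketch with hypothetical TWT and velocity values:

```python
def layer_cake_depth(twt_surfaces_s, v_int_ms, datum_depth_m=0.0,
                     datum_twt_s=0.0):
    """Convert TWT horizons to depth, layer by layer.
    v_int_ms[i] is the interval velocity between horizon i-1 and i."""
    depths, z, t = [], datum_depth_m, datum_twt_s
    for twt, v in zip(twt_surfaces_s, v_int_ms):
        z += v * (twt - t) / 2.0   # interval velocity * one-way isochron
        t = twt
        depths.append(z)
    return depths

# Hypothetical three-horizon example (TWT in s, velocities in m/s)
print(layer_cake_depth([1.0, 1.5, 2.0], [2000.0, 3000.0, 4000.0]))
# → [1000.0, 1750.0, 2750.0]
```

In practice this runs per map node, with the interval-velocity maps from steps 3-4 supplying v at each location.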
107. Note: once a reservoir property, e.g. porosity or water saturation, is converted to the depth domain, the correlation coefficient and error between the measured and predicted property at well locations should be checked.
A slight change in correlation and error between the time and depth domains is acceptable; if a significant change is observed, the velocity model needs to be updated.
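The check described in the note can be sketched as computing the Pearson correlation coefficient and RMSE between measured and predicted values at the wells, once in the time domain and once in depth. The well values below are hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rmse(x, y):
    """Root-mean-square error between measured and predicted values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))

# Hypothetical porosity at 4 wells: measured vs. depth-converted model
measured = [0.12, 0.18, 0.21, 0.15]
predicted = [0.13, 0.17, 0.22, 0.14]
print(pearson_r(measured, predicted), rmse(measured, predicted))
```

Comparing these two numbers before and after depth conversion is the test the note calls for: a large drop in r or jump in RMSE points back to the velocity model.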
130. Calibrated method
1. Simple grid construction and layering
2. Scaling up the well average velocity (TDR) at well locations
3. Interpolation and smoothing of the average velocity derived from seismic stacking velocities, generating an average-velocity map for each interval separately
4. Calculating a correction factor by dividing the well average velocity (TDR) by the seismic-derived average velocity at well locations
5. Interpolating the correction factors using kriging, with major/minor directions and ranges determined by variography
6. Multiplying the seismic-derived average velocity (3) by the interpolated correction factor (5) to calibrate it at well locations (velocity model)
7. Depth conversion using the velocity model
8. Well-top adjustment
9. Blind tests and cross-validation of the depth conversion
10. Cross-section QC
11. Thickness-map QC
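Steps 4-6 can be sketched as follows, with all coordinates and velocities hypothetical. The kriging interpolation of step 5 is replaced here by inverse-distance weighting purely to keep the sketch self-contained; the structure (ratio at wells, interpolate, multiply) is the same:

```python
import numpy as np

def idw(xy_wells, values, xy_grid, power=2.0):
    """Inverse-distance interpolation (stand-in for the kriging in step 5)."""
    out = np.empty(len(xy_grid))
    for k, p in enumerate(xy_grid):
        d = np.linalg.norm(xy_wells - p, axis=1)
        if d.min() < 1e-9:               # exactly at a well: honor the well
            out[k] = values[d.argmin()]
        else:
            w = 1.0 / d ** power
            out[k] = (w * values).sum() / w.sum()
    return out

# Hypothetical wells and velocities
wells = np.array([[0.0, 0.0], [1000.0, 0.0]])
v_well = np.array([2100.0, 2300.0])              # well average velocity (TDR)
v_seis_at_wells = np.array([2000.0, 2200.0])     # seismic avg vel at wells
ratio = v_well / v_seis_at_wells                 # step 4: correction factor

grid = np.array([[0.0, 0.0], [500.0, 0.0], [1000.0, 0.0]])
v_seis_grid = np.array([2000.0, 2100.0, 2200.0])
v_model = v_seis_grid * idw(wells, ratio, grid)  # steps 5-6
# At the well locations v_model reproduces the well velocities exactly
```
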
131. Co-kriging method
1. Simple grid construction and layering
2. Scaling up the well average velocity (TDR) at well locations
3. Interpolation and smoothing of the average velocity derived from seismic stacking velocities, generating an average-velocity map for each interval separately
4. Velocity model building by geostatistically combining the well average velocity (2) as primary data with the seismic-derived average velocity (3) as secondary data, using the collocated co-kriging algorithm ("Petrophysical modeling" in Petrel)
5. Depth conversion using the velocity model
6. Well-top adjustment
7. Blind tests and cross-validation of the depth conversion
8. Cross-section QC
9. Thickness-map QC
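A full collocated co-kriging needs a variogram model and a joint kriging system, which Petrel solves internally. As a much-simplified stand-in that mimics the idea of step 4 (primary well data, secondary seismic trend), the sketch below regresses well velocity on the collocated seismic velocity and spreads the regression residuals away from the wells by inverse distance. All values are hypothetical:

```python
import numpy as np

# Primary data (well average velocity) and collocated secondary data
# (seismic average velocity) at three hypothetical wells
v_well = np.array([2150.0, 2250.0, 2400.0])
v_seis_at_wells = np.array([2000.0, 2100.0, 2300.0])

# Least-squares trend: v_well ≈ a * v_seis + b
a, b = np.polyfit(v_seis_at_wells, v_well, 1)
residuals = v_well - (a * v_seis_at_wells + b)

xy_wells = np.array([[0.0, 0.0], [1000.0, 0.0], [2000.0, 0.0]])
xy_grid = np.array([[500.0, 0.0]])
v_seis_grid = np.array([2050.0])

# Inverse-distance spread of the residuals (kriging stand-in)
d = np.linalg.norm(xy_wells - xy_grid[0], axis=1)
w = 1.0 / d ** 2
res_grid = (w * residuals).sum() / w.sum()
v_model = a * v_seis_grid + b + res_grid
```

The point of the real co-kriging is the same: away from wells the seismic trend controls the model, while near wells the primary data pulls it back to the measured velocities.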
132. Trend method
1. Simple grid construction and layering
2. Scaling up the well average velocity (TDR) at well locations
3. Interpolation and smoothing of the average velocity derived from seismic stacking velocities, generating an average-velocity map for each interval separately
4. Velocity model building by geostatistically combining the well average velocity (2) as primary data with the seismic-derived average velocity (3) as a secondary trend: a residual is calculated by subtracting the seismic average velocity from the well average velocity (TDR) at well locations, then interpolated and added back to the seismic average velocity for calibration ("Petrophysical modeling" in Petrel)
5. Depth conversion using the velocity model
6. Well-top adjustment
7. Blind tests and cross-validation of the depth conversion
8. Cross-section QC
9. Thickness-map QC
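Step 4 of the trend method differs from the calibrated method (slide 130) only in the arithmetic: a residual (well minus seismic) is added back, instead of a ratio being multiplied in. A minimal sketch, with hypothetical values and inverse-distance weighting standing in for the geostatistical interpolation:

```python
import numpy as np

# Hypothetical residuals at two wells: well velocity minus seismic velocity
v_well = np.array([2100.0, 2300.0])
v_seis_at_wells = np.array([2000.0, 2250.0])
residual = v_well - v_seis_at_wells          # +100 and +50 m/s

xy_wells = np.array([[0.0, 0.0], [1000.0, 0.0]])
grid = np.array([[0.0, 0.0], [500.0, 0.0], [1000.0, 0.0]])
v_seis_grid = np.array([2000.0, 2120.0, 2250.0])

v_model = np.empty(len(grid))
for k, p in enumerate(grid):
    d = np.linalg.norm(xy_wells - p, axis=1)
    if d.min() < 1e-9:                       # exactly at a well
        corr = residual[d.argmin()]
    else:                                    # inverse-distance spread
        w = 1.0 / d ** 2
        corr = (w * residual).sum() / w.sum()
    v_model[k] = v_seis_grid[k] + corr       # add residual back (step 4)
```

As with the ratio variant, the model honors the wells exactly and follows the seismic trend between them.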