Hydraulic fracturing stimulation designs are moving towards tighter spaced clusters, longer stage length, and more proppant volumes. However, effectively evaluating the hydraulic fracturing stimulation efficiency remains a challenge. Distributed fiber optic sensing, which includes DAS and DTS, can continuously monitor the hydraulic fracturing stimulation downhole and be compared with other monitoring technology such as microseismic.
The DAS and DTS data, when integrated with the microseismic, highlight processes relevant to the completion design and allow for a better understanding and interpretation of each dataset.
This paper outlines a workflow to improve processing and interpretation of DAS and DTS data. In addition,
an estimate of the slurry distribution can be made. These methods will be demonstrated for a horizontal
Wolfcamp well in the Permian Basin. Here we compare key aspects of the microseismic, DAS, and DTS
results in several fracture stages to understand the downhole geomechanical processes. In order to interpret
the DTS data a thermal model is developed (using DTS data) to simulate the temperature behavior after
pumping has ceased. A slurry distribution is obtained by matching the simulated temperature with the
measured temperature from DTS. In addition, the DAS data signal is studied in the frequency domain and
the dominant frequencies are identified that are mostly related to fluid flow and to reduce the background
noise. This time frequency analysis enhances the ability to monitor and optimize well treatments.
After reducing the background noise, the acoustic intensity is correlated to the slurry distribution. The fluid
distribution data from DAS and DTS are compared with the microseismic and near field strain to better
understand the completion processes. We utilized fiber optic microseismic to better understand and
compare it to conventional microseismic.
Finally, we highlight the dynamics of strain and microseismic signature as fluid moves from an offset well
completion into the prior stimulated fiber well to better understand the reservoir and far field effects of the
completion.
The document analyzes 3 years of ice and wave data collected by an instrument in Lake Erie. The maximum ice thickness was slightly over 1 meter, though one reading of 6 meters was likely stacked ice. Waves reached 1.5 meters but were likely exaggerated by stirred sediments. A statistical analysis found the 50-year return wave for Lake Erie to be 3.75 meters based on NOAA buoy data and a Weibull distribution. A website was also created to organize wind energy research group data.
This document proposes a new methodology to link ocean wave spectra to atmospheric circulation patterns (CPs) in order to improve wave modeling. The methodology involves:
1. Partitioning wave spectra into low frequency swell and locally generated wind waves.
2. Estimating the origin of swell waves using wave dispersion relationships and tracking algorithms.
3. Grouping swell wave origins by location or spectral characteristics.
4. Identifying and classifying CPs that correspond to swell wave origins using fuzzy logic.
5. Obtaining typical spectral characteristics associated with specific CPs that can then be used to model waves driven by different forcing mechanisms like tropical cyclones. The methodology is tested using wave data from South Africa.
This document discusses using additional data from the TAMDAR sensor network to improve forecasts of cold air damming events in the Southeast United States. Cold air damming occurs when cold air becomes trapped against mountain slopes, causing winter weather. The study aims to enhance existing detection algorithms by combining TAMDAR data with radiosonde data to better predict the life cycle and severity of cold air damming events. It is hoped that the increased data availability from TAMDAR will allow forecast models to more accurately predict cold air damming frequencies, timing, and associated weather patterns. The methodology will evaluate forecasts from the last 10 years with and without TAMDAR data to determine if the additional observations significantly improve cold air damming forecasts.
The document summarizes a study that deployed various hydrological tools along the East Fork of the Jemez River in New Mexico to understand linked physical and chemical processes. Key findings include:
1) The Jemez River is overall gaining based on positive head gradients measured along the riverbed and higher water table depths measured farther from the river.
2) Analysis of chloride concentrations and thermal data from instruments in the riverbed suggest mixing is occurring between groundwater and the hyporheic zone of the riverbed and banks.
3) Spatial temperature variations recorded over two days indicate surface water is entering the hyporheic zone in some locations with lower magnitude variations under vegetation.
Investigation of the effect of ground and air temperature on very high freque...Alexander Decker
This document summarizes a study that investigated the effect of ground surface temperature, air temperature, and relative humidity on very high frequency (VHF) radio signal propagation. The study found that as ground surface temperature increased, path loss of the VHF signal decreased at a rate of 21.8dB/°C. It also found that as air temperature increased, path loss decreased but at a lower rate of 7.5dB/°C. Additionally, as relative humidity increased, path loss slightly increased at a rate of 2.32dB/°C, likely due to refraction, diffraction and scattering effects of water vapor in the air. The results indicate that VHF radio waves propagate more through the ground surface than through the
This document provides a design manual for hydro-meteorology networks in India. It discusses the physics of rainfall and evaporation processes and the design of rainfall and climate observation networks. Key sections include network design and optimization, site selection criteria, measurement frequencies, and measurement techniques for rainfall and climatic variables. Guidelines are also provided on station design, construction, and equipment installation. The overall aim is to establish standard procedures for India's Hydrological Information System to accurately measure hydro-meteorological quantities needed for water resources assessment and management.
Atmospheric Dispersion in Nuclear Power Plant SitingHussain Majid
This document discusses atmospheric dispersion modeling for nuclear power plant siting. Meteorological data collection is important for modeling radioactive releases from plants during normal operation and accidents. Stability classes are determined from factors like temperature lapse rate, wind fluctuations, insolation, and cloud cover to characterize turbulence levels affecting dispersion. Both simple and complex terrain features must be considered when collecting and interpreting meteorological data used in dispersion calculations.
This document discusses a method for classifying atmospheric circulation patterns (CPs) based on their links to extreme wave events. It proposes using an entropy measure to objectively evaluate the quality of CP classifications and determine the optimal number of CP classes. The method is applied to wave data from Durban, South Africa to classify CPs driving extreme wave heights over 3.5m. The results indicate that 15-20 CP classes are needed for a good quality classification but one persistent class explains a large proportion of extreme events regardless of the number of classes.
The document analyzes 3 years of ice and wave data collected by an instrument in Lake Erie. The maximum ice thickness was slightly over 1 meter, though one reading of 6 meters was likely stacked ice. Waves reached 1.5 meters but were likely exaggerated by stirred sediments. A statistical analysis found the 50-year return wave for Lake Erie to be 3.75 meters based on NOAA buoy data and a Weibull distribution. A website was also created to organize wind energy research group data.
This document proposes a new methodology to link ocean wave spectra to atmospheric circulation patterns (CPs) in order to improve wave modeling. The methodology involves:
1. Partitioning wave spectra into low frequency swell and locally generated wind waves.
2. Estimating the origin of swell waves using wave dispersion relationships and tracking algorithms.
3. Grouping swell wave origins by location or spectral characteristics.
4. Identifying and classifying CPs that correspond to swell wave origins using fuzzy logic.
5. Obtaining typical spectral characteristics associated with specific CPs that can then be used to model waves driven by different forcing mechanisms like tropical cyclones. The methodology is tested using wave data from South Africa.
This document discusses using additional data from the TAMDAR sensor network to improve forecasts of cold air damming events in the Southeast United States. Cold air damming occurs when cold air becomes trapped against mountain slopes, causing winter weather. The study aims to enhance existing detection algorithms by combining TAMDAR data with radiosonde data to better predict the life cycle and severity of cold air damming events. It is hoped that the increased data availability from TAMDAR will allow forecast models to more accurately predict cold air damming frequencies, timing, and associated weather patterns. The methodology will evaluate forecasts from the last 10 years with and without TAMDAR data to determine if the additional observations significantly improve cold air damming forecasts.
The document summarizes a study that deployed various hydrological tools along the East Fork of the Jemez River in New Mexico to understand linked physical and chemical processes. Key findings include:
1) The Jemez River is overall gaining based on positive head gradients measured along the riverbed and higher water table depths measured farther from the river.
2) Analysis of chloride concentrations and thermal data from instruments in the riverbed suggest mixing is occurring between groundwater and the hyporheic zone of the riverbed and banks.
3) Spatial temperature variations recorded over two days indicate surface water is entering the hyporheic zone in some locations with lower magnitude variations under vegetation.
Investigation of the effect of ground and air temperature on very high freque...Alexander Decker
This document summarizes a study that investigated the effect of ground surface temperature, air temperature, and relative humidity on very high frequency (VHF) radio signal propagation. The study found that as ground surface temperature increased, path loss of the VHF signal decreased at a rate of 21.8dB/°C. It also found that as air temperature increased, path loss decreased but at a lower rate of 7.5dB/°C. Additionally, as relative humidity increased, path loss slightly increased at a rate of 2.32dB/°C, likely due to refraction, diffraction and scattering effects of water vapor in the air. The results indicate that VHF radio waves propagate more through the ground surface than through the
This document provides a design manual for hydro-meteorology networks in India. It discusses the physics of rainfall and evaporation processes and the design of rainfall and climate observation networks. Key sections include network design and optimization, site selection criteria, measurement frequencies, and measurement techniques for rainfall and climatic variables. Guidelines are also provided on station design, construction, and equipment installation. The overall aim is to establish standard procedures for India's Hydrological Information System to accurately measure hydro-meteorological quantities needed for water resources assessment and management.
Atmospheric Dispersion in Nuclear Power Plant SitingHussain Majid
This document discusses atmospheric dispersion modeling for nuclear power plant siting. Meteorological data collection is important for modeling radioactive releases from plants during normal operation and accidents. Stability classes are determined from factors like temperature lapse rate, wind fluctuations, insolation, and cloud cover to characterize turbulence levels affecting dispersion. Both simple and complex terrain features must be considered when collecting and interpreting meteorological data used in dispersion calculations.
This document discusses a method for classifying atmospheric circulation patterns (CPs) based on their links to extreme wave events. It proposes using an entropy measure to objectively evaluate the quality of CP classifications and determine the optimal number of CP classes. The method is applied to wave data from Durban, South Africa to classify CPs driving extreme wave heights over 3.5m. The results indicate that 15-20 CP classes are needed for a good quality classification but one persistent class explains a large proportion of extreme events regardless of the number of classes.
This document summarizes a study of wind conditions and their impact on transmission line ratings in Northern California. The study found that local winds are statistically predictable and not independent of temperature and solar radiation. This means transmission line ratings could potentially be increased through probabilistic wind analysis. The study was based on high-quality weather data collected from sensors placed near transmission lines, which found a high probability that many line ratings in Northern California could be raised.
The document describes the development of an adaptive weather sensing framework using phased array radar. The framework consists of four main processes: 1) storm cell identification using reflectivity data, 2) storm cell tracking to associate cells over time, 3) task configuration to determine optimal update times, and 4) scheduling of tasks. Simulated phased array radar observations were generated from operational WSR-88D data to test the framework. Results demonstrated the feasibility of adaptively scanning regions of interest with different update times while maintaining surveillance, allowing for higher temporal resolution compared to conventional radar sensing.
This study used a 2D hydrodynamic model to evaluate wind-induced sea level fluctuations in the Persian Gulf and Gulf of Oman over a 10-year period. The model was calibrated using water level measurements from two stations, with tidal levels removed to isolate the wind effect. Results showed wind drag coefficients were higher than open oceans. Extreme wind setups and setdowns were calculated for ports, with the northeast Persian Gulf experiencing over 1.5m of setup. Maximum setup maps showed southern Bahrain and areas from Doha to Dubai experienced over 1m of wind-induced water level rise.
- The document discusses two studies that analyzed how vertically integrated liquid (VIL) and vertically integrated liquid density (VILD) relate to hail size in thunderstorms. The first study from 1997 found little correlation between VIL and hail size above 0.75 inches, while the second study from 2005 found a strong correlation between VILD and hail size.
- A 2005 study by Conor Lahiff compared VILD to the WSR-88D hail detection algorithm and found that VILD slightly outperformed the algorithm in predicting severe hail, with a critical success index of 0.89 for VILD versus 0.83 for the algorithm. VILD also more accurately estimated reported hail sizes.
IAHR 2015 - SWAN's underestimation of long wave penetration into coastal syst...Deltares
1) The study re-analyzed existing wave hindcasts from three coastal regions to investigate why the SWAN model underestimates long wave penetration into tidal inlet systems.
2) The results showed that SWAN underestimation could be explained by the hypothesis that two-dimensional nonlinear wave interactions play an important role in transporting wave energy across channels, whereas SWAN only models one-dimensional interactions.
3) The study concluded that developing a two-dimensional alternative for SWAN's three-wave interaction formulation could help improve its ability to model long wave penetration into complex coastal regions.
Extinction of Millimeter wave on Two Dimensional Slices of Foam-Covered Sea-s...IJSRED
This document discusses millimeter wave (mmW) extinction due to its interaction with layers of air bubbles (sea foam) on the ocean surface. It presents the following key points:
1) A numerical model using the split-step Fourier method was used to evaluate mmW attenuation through layers of sea foam of varying thickness, frequency, polarization, and incidence angle.
2) Estimates of the effective dielectric constant of sea foam layers were calculated for different foam configurations and WindSat frequencies based on a two-dimensional model of randomly packed air bubbles coated with thin seawater layers.
3) The parabolic wave equation method, which approximates solutions to Maxwell's equations, was discussed as an efficient way to
1) The document describes a wavelet-based technique for denoising underwater signals affected by wind-driven ambient noise. It uses discrete wavelet transform to decompose the noisy signal into coefficients. 2) A threshold is calculated using the universal threshold method and applied to the coefficients to remove noise. Hard and soft thresholding are evaluated. 3) The denoised signal is then reconstructed from the modified coefficients using inverse discrete wavelet transform. The technique is shown to effectively reduce wind noise and improve the signal-to-noise ratio.
The document summarizes a study determining atmospheric stability classes in Mazoe, Zimbabwe over two years. The modified Pasquill-Gifford method was used to classify stability based on wind speed, solar radiation, temperature and other factors. Results showed neutral stability conditions (class D) were most prevalent, occurring over 60% of the time with moderate to strong winds and over 50% cloud cover. Classes A-C represented unstable conditions up to 20% of the time, while very stable classes E and F were rare or non-existent in 2012. In general, the atmosphere was found to be neutrally stable for dispersion of pollutants.
Early kick detection and nonlinear behavior of drilling mu…Frank-Michael Jäger
The following test measurements serve the quantification of resolution and achievable sensitivity of parameters of sound velocity and sound absorption in wellbore fluids. More precisely, these studies refer to tools and methods to identify the flow of liquids or gases, preferably hydrocarbons in the well bore in real time during the drilling. The aim is a way to show with the highly sensitive and robust tools for use in the deep ocean can be realized.
A study confined to the lower tapi basin in Gujarat, India to find out the primary causes for 2006 floods in Surat city. The study involves collection of topographical data from the local geological survey organization, rainfall data from meteorological department of india and the application of HEC-HMS software from US Army corps of engineers to identify the primary cause of the runoff.
The document summarizes the Multi-sensor Improved Sea-Surface Temperature for IOOS (MISST) project. The MISST project involves 28 scientists from various organizations working to provide near real-time satellite SST data. The project aims to 1) integrate new SST observations and improve data access, 2) explore improving SST products through expanded Arctic in situ observations, 3) focus on improving SST accuracy and uncertainties in high latitudes, and 4) research air-sea-ice interactions to improve SST products in marginal ice zones.
The document describes seismic interpretation workflows, including conventional and unconventional techniques. Conventional techniques involve horizon interpretations, fault picking, and tying seismic data to well logs to understand subsurface geology. Unconventional techniques analyze seismic attribute variations like amplitudes to identify hydrocarbon indicators. The workflow includes generating synthetics from well logs, interpreting horizons on seismic sections, identifying structures like faults and gas chimneys, and determining direct hydrocarbon indicators.
This document describes a microfluidic method for measuring interfacial tension between immiscible fluids using a microfluidic device. The device contains two tapered microchannels connected by a pair of modified Laplace sensors. Interfacial tension is determined by monitoring the pressure drop across the microchannels where the interfaces are formed and measuring the curvatures of the interfaces. The method was tested using oil/water systems and results agreed well with a commercial tensiometry. This provides a low-cost and fast way to measure interfacial tension in microfluidics.
2003-12-04 Evaluation of the ASOS Light Scattering NetworkRudolf Husar
The document reports on an evaluation of the Automated Surface Observing System (ASOS) light scattering network. It analyzes data from 220 ASOS stations to evaluate the precision and performance of the ASOS visibility sensors. It finds that some stations show excellent correlation between duplicate sensors while others show poorer correlation or significant offsets. It also examines diurnal patterns and the effects of relative humidity on visibility readings.
This study aimed to calibrate the peak flow coefficient (C value) used in the NRCS dimensionless unit hydrograph equation for watersheds in New Jersey. The researchers calculated C values for watersheds less than 20 square miles using historical rainfall and flow data. C values ranged widely from 360 to 765, varying significantly even over short distances. The study found no clear regional patterns and that assuming a constant C value of 484 could produce inaccurate peak flow predictions. More precise locally calibrated C values should improve design of hydraulic structures in New Jersey.
Considerations on the collection of data from bio-argo floats across sampling...SeaBirdScientific
Ian D. Walsh, Ph. D, Joel Reiter, Dan Quittman, David J. Murphy, Thomas O. Mitchell, Ph.D. Sea-Bird Scientific. GAIC 2015 Meeting, Galway, Ireland, 14 – 18 Sept. 2015.
ABSTRACT
The flexibility of the current generation of float sensor packages peovides an opportunity to craft mission specific sampling schemes that balance the collection of data for specific sampling goals with the practicalities of float operation. Autonomous floats operate within constraints of battery life and data transfer rates.
For simplicity of data transfer and handling, most float data sets are transmitted after binning on pressure. Within a given pressure bin different instruments will be sampling within a particular defined sequence. A sampling sequence should be balanced towards minimizing energy consumption while maximizing data accuracy of each instrument. As the number of sensors increases and the breadth of mission parameters expands it becomes more difficult to optimize data sequencing and reporting.
We consider methods to reduce the size of the problem by setting rules for sequence development and test those rules relative to field data. We examine a set of data from a float that was equipped with internal memory that captured the full set of sample data taken during the profiling mission.
Comparing the ‘raw’ data and the transmitted data we examine the variance around the transmitted data and discuss the impact of data sequencing on the data.
Effectiveness of the telemetric flood monitoring deviceHarhar Caparida
This document summarizes a study that aimed to determine the best telemetric flood monitoring device design between a floating sensor design and an ultrasonic sensor design. Twenty trials were conducted to test the water level readings and response times of each design. The results showed that the floating sensor design had an average actual water level reading of 7.55 inches, while the ultrasonic sensor was 5.90 inches. The average response time for the floating sensor was 7.90 seconds and 12.67 seconds for the ultrasonic sensor. The study concluded that the floating sensor design was more effective based on the water level readings and faster response times.
This document summarizes a study on water level sensors conducted under the guidance of Dr. N. Sai Bhaskar Reddy. It discusses various types of level measurement sensors including capacitance, ultrasound, radar, and mobile canal control sensors. It reviews literature on the importance of accurate water level measurement and different sensor technologies. It also describes the site selection process for sensor installation based on factors like natural controls, safety, and maintaining a stationary record.
The document evaluates the performance of the TRMM Multi-satellite Precipitation Analysis (TMPA) product in estimating daily precipitation in the Central Andes region, compared to gauge measurements. It finds large biases in daily precipitation amounts from TMPA for the regions of Cuzco, Peru and La Paz, Bolivia, though strong precipitation events are generally detected. Correlation with gauge data increases significantly when aggregating TMPA estimates to longer time periods like weekly or monthly sums. Spatial aggregation has little effect on performance. The document proposes blending TMPA with daily gauge data to improve daily estimates.
This document summarizes the methodology used to optimize hydraulic fracturing in the San Jorge Basin in Argentina. State-of-the-art well logging tools and collaboration between operating and service companies were used to better understand reservoir conditions and design fractures. NMR logging, sonic logs, and pressure diagnostics during fracturing were integrated to determine fracture heights and calibrate models. This approach resulted in improved well performance through more accurate fracture design tailored to each reservoir's characteristics.
DEVELOPING THE OPTIMIZED OCEAN CURRENT STRENGTHENING DESALINATION SEMI-PERMEA...ijbesjournal
Alongside improvements in desalination operation and development of new technologies, problems of weakened counter current and global warming have emerged. Therefore, our study suggests a new desalination model, based on the experimental Support Vector Machine (SVM) algorithm, for semipermeable membrane separation. First, the reverse osmosis (RO) process used semi-permeable membrane and osmotic pressure to remove the solutes dissolved in seawater and obtain pure freshwater. The desalination process also applied MSF and MED, which are the best technologies developed through elimination of various problems that were previously experienced. This research is directed towards suggesting a model that can effectively create the semi-permeable membrane used in the desalination process. To efficiently prevent a counter current and safely obtain the water resources, an innovative technology is suggested by applying Genetic Algorithm (GA) to the SVM model for the semi-p
This document summarizes a study of wind conditions and their impact on transmission line ratings in Northern California. The study found that local winds are statistically predictable and not independent of temperature and solar radiation. This means transmission line ratings could potentially be increased through probabilistic wind analysis. The study was based on high-quality weather data collected from sensors placed near transmission lines, which found a high probability that many line ratings in Northern California could be raised.
The document describes the development of an adaptive weather sensing framework using phased array radar. The framework consists of four main processes: 1) storm cell identification using reflectivity data, 2) storm cell tracking to associate cells over time, 3) task configuration to determine optimal update times, and 4) scheduling of tasks. Simulated phased array radar observations were generated from operational WSR-88D data to test the framework. Results demonstrated the feasibility of adaptively scanning regions of interest with different update times while maintaining surveillance, allowing for higher temporal resolution compared to conventional radar sensing.
This study used a 2D hydrodynamic model to evaluate wind-induced sea level fluctuations in the Persian Gulf and Gulf of Oman over a 10-year period. The model was calibrated using water level measurements from two stations, with tidal levels removed to isolate the wind effect. Results showed wind drag coefficients were higher than open oceans. Extreme wind setups and setdowns were calculated for ports, with the northeast Persian Gulf experiencing over 1.5m of setup. Maximum setup maps showed southern Bahrain and areas from Doha to Dubai experienced over 1m of wind-induced water level rise.
- The document discusses two studies that analyzed how vertically integrated liquid (VIL) and vertically integrated liquid density (VILD) relate to hail size in thunderstorms. The first study from 1997 found little correlation between VIL and hail size above 0.75 inches, while the second study from 2005 found a strong correlation between VILD and hail size.
- A 2005 study by Conor Lahiff compared VILD to the WSR-88D hail detection algorithm and found that VILD slightly outperformed the algorithm in predicting severe hail, with a critical success index of 0.89 for VILD versus 0.83 for the algorithm. VILD also more accurately estimated reported hail sizes.
IAHR 2015 - SWAN's underestimation of long wave penetration into coastal syst...Deltares
1) The study re-analyzed existing wave hindcasts from three coastal regions to investigate why the SWAN model underestimates long wave penetration into tidal inlet systems.
2) The results showed that SWAN underestimation could be explained by the hypothesis that two-dimensional nonlinear wave interactions play an important role in transporting wave energy across channels, whereas SWAN only models one-dimensional interactions.
3) The study concluded that developing a two-dimensional alternative for SWAN's three-wave interaction formulation could help improve its ability to model long wave penetration into complex coastal regions.
Extinction of Millimeter wave on Two Dimensional Slices of Foam-Covered Sea-s...IJSRED
This document discusses millimeter wave (mmW) extinction due to its interaction with layers of air bubbles (sea foam) on the ocean surface. It presents the following key points:
1) A numerical model using the split-step Fourier method was used to evaluate mmW attenuation through layers of sea foam of varying thickness, frequency, polarization, and incidence angle.
2) Estimates of the effective dielectric constant of sea foam layers were calculated for different foam configurations and WindSat frequencies based on a two-dimensional model of randomly packed air bubbles coated with thin seawater layers.
3) The parabolic wave equation method, which approximates solutions to Maxwell's equations, was discussed as an efficient way to
1) The document describes a wavelet-based technique for denoising underwater signals affected by wind-driven ambient noise. It uses discrete wavelet transform to decompose the noisy signal into coefficients. 2) A threshold is calculated using the universal threshold method and applied to the coefficients to remove noise. Hard and soft thresholding are evaluated. 3) The denoised signal is then reconstructed from the modified coefficients using inverse discrete wavelet transform. The technique is shown to effectively reduce wind noise and improve the signal-to-noise ratio.
The document summarizes a study determining atmospheric stability classes in Mazoe, Zimbabwe over two years. The modified Pasquill-Gifford method was used to classify stability based on wind speed, solar radiation, temperature and other factors. Results showed neutral stability conditions (class D) were most prevalent, occurring over 60% of the time with moderate to strong winds and over 50% cloud cover. Classes A-C represented unstable conditions up to 20% of the time, while very stable classes E and F were rare or non-existent in 2012. In general, the atmosphere was found to be neutrally stable for dispersion of pollutants.
Early kick detection and nonlinear behavior of drilling mu…Frank-Michael Jäger
The following test measurements serve the quantification of resolution and achievable sensitivity of parameters of sound velocity and sound absorption in wellbore fluids. More precisely, these studies refer to tools and methods to identify the flow of liquids or gases, preferably hydrocarbons in the well bore in real time during the drilling. The aim is a way to show with the highly sensitive and robust tools for use in the deep ocean can be realized.
A study confined to the lower tapi basin in Gujarat, India to find out the primary causes for 2006 floods in Surat city. The study involves collection of topographical data from the local geological survey organization, rainfall data from meteorological department of india and the application of HEC-HMS software from US Army corps of engineers to identify the primary cause of the runoff.
The document summarizes the Multi-sensor Improved Sea-Surface Temperature for IOOS (MISST) project. The MISST project involves 28 scientists from various organizations working to provide near real-time satellite SST data. The project aims to 1) integrate new SST observations and improve data access, 2) explore improving SST products through expanded Arctic in situ observations, 3) focus on improving SST accuracy and uncertainties in high latitudes, and 4) research air-sea-ice interactions to improve SST products in marginal ice zones.
The document describes seismic interpretation workflows, including conventional and unconventional techniques. Conventional techniques involve horizon interpretations, fault picking, and tying seismic data to well logs to understand subsurface geology. Unconventional techniques analyze seismic attribute variations like amplitudes to identify hydrocarbon indicators. The workflow includes generating synthetics from well logs, interpreting horizons on seismic sections, identifying structures like faults and gas chimneys, and determining direct hydrocarbon indicators.
This document describes a microfluidic method for measuring interfacial tension between immiscible fluids using a microfluidic device. The device contains two tapered microchannels connected by a pair of modified Laplace sensors. Interfacial tension is determined by monitoring the pressure drop across the microchannels where the interfaces are formed and measuring the curvatures of the interfaces. The method was tested using oil/water systems and results agreed well with a commercial tensiometry. This provides a low-cost and fast way to measure interfacial tension in microfluidics.
2003-12-04 Evaluation of the ASOS Light Scattering NetworkRudolf Husar
The document reports on an evaluation of the Automated Surface Observing System (ASOS) light scattering network. It analyzes data from 220 ASOS stations to evaluate the precision and performance of the ASOS visibility sensors. It finds that some stations show excellent correlation between duplicate sensors while others show poorer correlation or significant offsets. It also examines diurnal patterns and the effects of relative humidity on visibility readings.
This study aimed to calibrate the peak flow coefficient (C value) used in the NRCS dimensionless unit hydrograph equation for watersheds in New Jersey. The researchers calculated C values for watersheds less than 20 square miles using historical rainfall and flow data. C values ranged widely from 360 to 765, varying significantly even over short distances. The study found no clear regional patterns and that assuming a constant C value of 484 could produce inaccurate peak flow predictions. More precise locally calibrated C values should improve design of hydraulic structures in New Jersey.
Considerations on the collection of data from bio-argo floats across sampling...SeaBirdScientific
Ian D. Walsh, Ph. D, Joel Reiter, Dan Quittman, David J. Murphy, Thomas O. Mitchell, Ph.D. Sea-Bird Scientific. GAIC 2015 Meeting, Galway, Ireland, 14 – 18 Sept. 2015.
ABSTRACT
The flexibility of the current generation of float sensor packages peovides an opportunity to craft mission specific sampling schemes that balance the collection of data for specific sampling goals with the practicalities of float operation. Autonomous floats operate within constraints of battery life and data transfer rates.
For simplicity of data transfer and handling, most float data sets are transmitted after binning on pressure. Within a given pressure bin different instruments will be sampling within a particular defined sequence. A sampling sequence should be balanced towards minimizing energy consumption while maximizing data accuracy of each instrument. As the number of sensors increases and the breadth of mission parameters expands it becomes more difficult to optimize data sequencing and reporting.
We consider methods to reduce the size of the problem by setting rules for sequence development and test those rules relative to field data. We examine a set of data from a float that was equipped with internal memory that captured the full set of sample data taken during the profiling mission.
Comparing the ‘raw’ data and the transmitted data we examine the variance around the transmitted data and discuss the impact of data sequencing on the data.
Effectiveness of the telemetric flood monitoring deviceHarhar Caparida
This document summarizes a study that aimed to determine the best telemetric flood monitoring device design between a floating sensor design and an ultrasonic sensor design. Twenty trials were conducted to test the water level readings and response times of each design. The results showed that the floating sensor design had an average actual water level reading of 7.55 inches, while the ultrasonic sensor was 5.90 inches. The average response time for the floating sensor was 7.90 seconds and 12.67 seconds for the ultrasonic sensor. The study concluded that the floating sensor design was more effective based on the water level readings and faster response times.
This document summarizes a study on water level sensors conducted under the guidance of Dr. N. Sai Bhaskar Reddy. It discusses various types of level measurement sensors including capacitance, ultrasound, radar, and mobile canal control sensors. It reviews literature on the importance of accurate water level measurement and different sensor technologies. It also describes the site selection process for sensor installation based on factors like natural controls, safety, and maintaining a stationary record.
The document evaluates the performance of the TRMM Multi-satellite Precipitation Analysis (TMPA) product in estimating daily precipitation in the Central Andes region, compared to gauge measurements. It finds large biases in daily precipitation amounts from TMPA for the regions of Cuzco, Peru and La Paz, Bolivia, though strong precipitation events are generally detected. Correlation with gauge data increases significantly when aggregating TMPA estimates to longer time periods like weekly or monthly sums. Spatial aggregation has little effect on performance. The document proposes blending TMPA with daily gauge data to improve daily estimates.
This document summarizes the methodology used to optimize hydraulic fracturing in the San Jorge Basin in Argentina. State-of-the-art well logging tools and collaboration between operating and service companies were used to better understand reservoir conditions and design fractures. NMR logging, sonic logs, and pressure diagnostics during fracturing were integrated to determine fracture heights and calibrate models. This approach resulted in improved well performance through more accurate fracture design tailored to each reservoir's characteristics.
DEVELOPING THE OPTIMIZED OCEAN CURRENT STRENGTHENING DESALINATION SEMI-PERMEA...ijbesjournal
Alongside improvements in desalination operation and development of new technologies, problems of weakened counter current and global warming have emerged. Therefore, our study suggests a new desalination model, based on the experimental Support Vector Machine (SVM) algorithm, for semipermeable membrane separation. First, the reverse osmosis (RO) process used semi-permeable membrane and osmotic pressure to remove the solutes dissolved in seawater and obtain pure freshwater. The desalination process also applied MSF and MED, which are the best technologies developed through elimination of various problems that were previously experienced. This research is directed towards suggesting a model that can effectively create the semi-permeable membrane used in the desalination process. To efficiently prevent a counter current and safely obtain the water resources, an innovative technology is suggested by applying Genetic Algorithm (GA) to the SVM model for the semi-p
Real Time Downhole Flow Measurement SensorsSurajit Haldar
1. The document describes using a new coiled tubing real-time flow (CTRF) tool to measure bottom-hole parameters during an acid stimulation treatment of an open-hole horizontal water injector well in the Arab-D formation in Ghawar field, Saudi Arabia.
2. The CTRF tool directly measures fluid velocity and direction using heat transfer sensors, providing real-time data on flow distribution between zones to help optimize stimulation.
3. During the field operation, the CTRF tool was calibrated and used along with distributed temperature surveys (DTS) to identify high-flow zones for diversion and evaluate the treatment effectiveness. The intervention successfully improved well injectivity.
An ultrasonic level sensor uses high frequency sound waves to measure the distance from the sensor to the surface of the liquid being measured. It works by transmitting sound pulses and measuring the time it takes for the echoes to return. The sensor contains piezoelectric crystals that generate and detect sound waves. It can provide continuous level monitoring or point level control and has advantages of being non-contact, requiring no calibration, and working with viscous or solid materials. However, its accuracy can be affected by temperature, turbulence, foam and other process factors.
Ecological Society of America Workshop on Incentives for Data SharingTom Moritz
The document describes sap flow data collected from multiple branches of manzanita shrubs from December 2007 to July 2008 using heat dissipation probes. The data was collected every 5 minutes using a datalogger and includes measurements of sap flow from 13 different branches as well as date/time stamps and other metadata. The data is intended to correlate physiological activity with below-ground measures of root growth and CO2 production.
Moving beyond the 4th Dimension of Quantifying, Analyzing & Visualizing A...chrismalzone
Presentation & Paper By Chris Malzone and given by Mike Mutschler, RESON. Focuses on the benefits of fusing multiple sources of acoustic data through fusion. Presentation also introduces the concept of fusion & looking at multisource 4-dimensional data. It also brings home the point through a Habitat Mapping Case Study in the US Virgin Islands as analysed via the Eonfusion software.
GIS based spatial distribution of Temperature and Chlorophyll-a along Kalpakk...IJERA Editor
This paper briefly describes the status of Temperature and Chlorophyll-a trend in Kalpakkam Coast, discusses its ecological and temperature impacts recommending measures to achieve long term sustainability using advanced tools like Geographic Information System (GIS). Present study reveals the monthly spatial distribution of Temperature and Chlorophyll-a at Kalpakkam. Transect based in-situ Temperature and Chlorophyll-a collected at 200m, 500m and 1 km distance into the sea was interpolated using the Inverse Distance Weightage (IDW) method in ARC GIS. Data revealed the extent of spatial distribution of thermal effluent in Kalpakkam. It could be found that temperature range of 26.2 – 31.9°C provided substantial Chlorophyll-a concentration between 0.8 – 2.9 mg/m3 for surface and bottom waters. Further, increase of Chlorophyll-a levels did not lead to higher productivity. Combined temperature and chlorophyll a showed little synergistic effects. It is concluded that the effect of thermal discharge from the power plant into the receiving water body is quite localized and productivity of the coastal waters are not affected. From the results obtained, the spatial data has been found to be useful in determining zones of safe use of seawater and to understand the extent of relationship between the relatable parameters.
Briana Sullivan from UNH introduces NOAA data visualizations created by her lab. The visualizations include WindVis2, a weather forecast visualization; a "tide-aware" chart combining tides, bathymetry and nautical charts; and GeoNav3D, which allows planning a safe navigation path using predicted tide levels. The lab also visualizes surface currents using streamlines and animations, as well as combining wind, wave and current data for mission planning tools. Collaborations between the lab and government agencies and private sector allow further research into visualizing ocean and weather data.
Similar to Hydraulic Fracturing Stimulation Monitoring with Distributed Fiber Optic Sensing and Microseismic in the Permian Wolfcamp Shale Play (20)
Interpretation Special-Section: Insights into digital oilfield data using ar...Pioneer Natural Resources
Invitation to contribute to our special issue titled “Insights to the Digital Oil Field data using Artificial Intelligence & Big Data Analytics”. The scope of this special issue is to further bridge the gap between the geophysical interpretation and well planning (drilling, completions and production).
The technology communities both in industry and academia are utilizing advanced signal processing / machine learning algorithms, high compute / Big Data architectures at scale to develop these practical solutions. Your contributions to advanced algorithms in signal processing/machine learning, subsurface imaging and interpretation will be a good fit for this issue.
We hope you will find this participation both rewarding and worthwhile for our industry and to the Interpretation community in general.
Dear Colleagues,
Call for papers for another Machine Learning special issue of SEG/AAPG Journal of Interpretation focusing on the Seismic Data Analysis has been announced.
We look forward to your contribution.
Vikram Jayaram
Special Section Editor
Interpretation
An approach to offer management: maximizing sales with fare products and anci...Pioneer Natural Resources
With the growth in ancillary sales, an area of increasing importance for airlines is the concept of offer management, which entails the creation of dynamic, custom, personalized offers consisting of a flight itinerary and ancillary products offered by an airline. This practice-oriented, overview paper provides an end-to-end, future-oriented framework for determining the composition of optimal base fare and ancillary bundles by customer trip purpose segment followed by 1:1 personalization to maximize total sales. Our focus in this paper is primarily on the proposed offer management framework and its sub-components.
During the past decade, the size of 3D seismic data volumes and the number of seismic attributes have increased
to the extent that it is difficult, if not impossible, for interpreters to examine every seismic line and time
slice. To address this problem, several seismic facies classification algorithms including k-means, self-organizing
maps, generative topographic mapping, support vector machines, Gaussian mixture models, and artificial neural
networks have been successfully used to extract features of geologic interest from multiple volumes. Although
well documented in the literature, the terminology and complexity of these algorithms may bewilder the average
seismic interpreter, and few papers have applied these competing methods to the same data volume. We have
reviewed six commonly used algorithms and applied them to a single 3D seismic data volume acquired over the
Canterbury Basin, offshore New Zealand, where one of the main objectives was to differentiate the architectural
elements of a turbidite system. Not surprisingly, the most important parameter in this analysis was the choice of
the correct input attributes, which in turn depended on careful pattern recognition by the interpreter. We found
that supervised learning methods provided accurate estimates of the desired seismic facies, whereas unsupervised
learning methods also highlighted features that might otherwise be overlooked.
New exploration challenges and current research demands
3D gravity modeling with 3D geology interpretations. In
the near future, multi-parameter and multi-dimensional
interpretations represent the observed and expected in situ
geology, geophysical, and petro-physical data that will be
used for join multi-parameter, multi-dimensional
inversions. We present an initial 3D gravity model of
Osage County in northeastern Oklahoma, where there is a
greater than 40 mGal, 100 km diameter semi-circular
gravity anomaly that cannot be effectively removed by
traditional gravity processing techniques.
OPTIMIZED RATE ALLOCATION OF HYPERSPECTRAL IMAGES IN COMPRESSED DOMAIN USING ...Pioneer Natural Resources
This document discusses optimizing the rate allocation of hyperspectral images compressed using JPEG2000. It presents a mixed model for bit allocation that combines high and low bit rate models. This mixed model and an optimal rate allocation approach based on minimizing mean squared error under a rate constraint provide lower reconstruction errors than traditional approaches. Computational tests on hyperspectral data show the discrete wavelet transform allows for faster processing and less memory usage compared to the Karhunen-Loeve transform.
Receiver deghosting method to mitigate F-K transform artifacts: A non-windo...Pioneer Natural Resources
In this study, we implemented and tested a new processing- based broadband solution for mitigating F-K transform arti- facts for receiver deghosting in a marine environment. The F- K transform has traditionally been used for flat cable (constant depth) deghosting and often times tailored to meet the slanted (variable depth) cable criteria. Recently, the usage of τ − p do- main deterministic deghost operator has been more prominent with slant cable deghosting. Irrespective of the type of trans- form or deghost operator used, a windowed process is essential due to the time and offset varying character of the ghost. This use of a windowed process usually results in poor reconstruc- tion of deghosted signals and artifacts beyond the control of the transform(s) itself. The windowing in time and offset produces edgy effects which can be clearly seen in the difference plots. Our method, using a non-windowing approach, demonstrates a better representation of the deghosted signals without the arti- facts caused by the boundary of the windows. This method has also been well-tested for both the flat and slant cable receiver deghosting workflows in synthetic and field data examples.
Distance Metric Based Multi-Attribute Seismic Facies Classification to Identi...Pioneer Natural Resources
Conventional reservoirs benefit from a long scientific history that correlates successful plays to seismic measurements through depositional, tectonic, and digenetic models. Unconventional reservoirs are less well understood, however benefit from significantly denser well control. Thus, allowing us to establish statistical rather than model-based correlations between seismic data, geology, and successful completion strategies. One of the more commonly encountered correlation techniques is based on computer assisted pattern recognition. The pattern recognition techniques have found their niche in a plethora of applications ranging from flagging suspicious credit card purchase patterns to rewarding repeating online buying patterns. Classification of a given seismic response as having a “good” or “bad” pattern requires a “distance metric”. Distance metric “learning” uses past experiences (well performance) as training data to develop a distance metric. Alternative distance metrics have demonstrated significant value in the identification and classification of repeated or anomalous behaviors in public health, security, and marketing. In this paper we examine the value of three of these alternative distance metrics of 3D seismic attributes to the identification of sweet spots in a Barnett Shale play.
We illustrate unsupervised and supervised learning algorithms that accurately classify the lithological variations in the 3D seismic data. We demonstrate blind source separation techniques such as the principal components (PCA) and noise adjusted principal
components in conjunction with Kohonen Self organizing maps to produce superior unsupervised classification maps.
Further, we utilize the PCA space training in Maximum likelihood (ML) supervised classification. Results demonstrate that the ML supervised classification produces an improved classification of the facies in the 3D seismic dataset from the Anadarko basin in central Oklahoma.
This document discusses using hidden Markov models (HMMs) for unsupervised learning in hyperspectral image classification. It proposes an HMM-based probability density function classifier that models hyperspectral data using a reduced feature space. The approach uses an unsupervised learning scheme for maximum likelihood parameter estimation, combining both model selection and estimation. This HMM method can accurately model and synthesize approximate observations of true hyperspectral data in a reduced feature space without relying on supervised learning.
Directional Analysis and Filtering for Dust Storm detection in NOAA-AVHRR Ima...Pioneer Natural Resources
This document proposes techniques for detecting dust storms and determining their direction of transport using NOAA-AVHRR satellite imagery. It introduces a two-part approach using image processing algorithms: 1) A visualization technique uses filters, edge detectors and classification to locate dust sources. 2) An automation technique performs power spectrum analysis to detect dust storm direction by analyzing texture orientation in image blocks. The goal is to automatically detect and track dust storms for applications like hazard monitoring.
1. The document discusses using statistical learning methods like Gaussian mixture models (GMM) and dynamic component allocation (DCA) for hyperspectral image classification.
2. GMM represents pixel spectra as mixtures of Gaussian distributions. DCA is an algorithm that dynamically adds and removes Gaussian components to better characterize the data during training.
3. The document outlines how DCA works, including merging similar Gaussian modes, splitting modes with high kurtosis, and pruning insignificant modes. These techniques aim to learn the appropriate number of mixture components from the data.
This document describes a new method for training Gaussian mixture classifiers for hyperspectral image classification. The method uses dynamic pruning, splitting, and merging of Gaussian mixture kernels to automatically determine the appropriate number of components during training. This "structural learning" approach is employed to model and classify hyperspectral imagery data. Experimental results on AVIRIS hyperspectral data sets suggest this approach is a potential alternative to traditional Gaussian mixture modeling and classification using expectation-maximization.
The document discusses using discrete wavelet transform (DWT) and principal component analysis (PCA) as decorrelating transforms for hyperspectral image classification under JPEG2000 compression. It compares the classification performance of DWT and PCA when applying lossless compression and two JPEG2000 scalability options: color and quality. Color scalability decompresses a subset of bands, while quality scalability assigns more bits to important bands. The DWT provides similar classification to PCA but is faster and does not require additional files. Reordering bands by variance before color decompression improved DWT classification results compared to using the initial DWT band order.
The document discusses optimizing rate allocation for compressing hyperspectral images using JPEG2000. It proposes using the discrete wavelet transform instead of the Karhunen-Loeve transform for decorrelation due to lower computational complexity. A mixed model is used for rate distortion optimal bit allocation instead of experimentally obtained rate distortion curves. Comparisons show the mixed model approach results in lower mean squared error than traditional bit allocation schemes, while having lower implementation complexity than prior methods.
Detection and Classification in Hyperspectral Images using Rate Distortion an...Pioneer Natural Resources
This document summarizes an experiment that compares two methods of bit allocation for compressing hyperspectral imagery using JPEG2000: 1) the traditional high bit rate quantizer approach and 2) the rate distortion optimal (RDO) approach. The experiment shows that both methods perform well at relatively low bit rates, achieving over 96% classification accuracy. However, at very low bit rates, the RDO approach outperforms the high bit rate quantizer approach, achieving 90% accuracy at 0.0375 bpppb compared to less than 90% for the high bit rate method. The RDO approach also achieves lower mean squared error than the high bit rate quantizer approach.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you...Zilliz
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Speck&Tech
ABSTRACT: A prima vista, un mattoncino Lego e la backdoor XZ potrebbero avere in comune il fatto di essere entrambi blocchi di costruzione, o dipendenze di progetti creativi e software. La realtà è che un mattoncino Lego e il caso della backdoor XZ hanno molto di più di tutto ciò in comune.
Partecipate alla presentazione per immergervi in una storia di interoperabilità, standard e formati aperti, per poi discutere del ruolo importante che i contributori hanno in una comunità open source sostenibile.
BIO: Sostenitrice del software libero e dei formati standard e aperti. È stata un membro attivo dei progetti Fedora e openSUSE e ha co-fondato l'Associazione LibreItalia dove è stata coinvolta in diversi eventi, migrazioni e formazione relativi a LibreOffice. In precedenza ha lavorato a migrazioni e corsi di formazione su LibreOffice per diverse amministrazioni pubbliche e privati. Da gennaio 2020 lavora in SUSE come Software Release Engineer per Uyuni e SUSE Manager e quando non segue la sua passione per i computer e per Geeko coltiva la sua curiosità per l'astronomia (da cui deriva il suo nickname deneb_alpha).
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
UiPath Test Automation using UiPath Test Suite series, part 6DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
UiPath Test Automation with generative AI and Open AI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with Open AI advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Building RAG with self-deployed Milvus vector database and Snowpark Container...Zilliz
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
Monitoring Java Application Security with JDK Tools and JFR Events
Hydraulic Fracturing Stimulation Monitoring with Distributed Fiber Optic Sensing and Microseismic in the Permian Wolfcamp Shale Play
1. URTeC: 291
Hydraulic Fracturing Stimulation Monitoring with Distributed Fiber
Optic Sensing and Microseismic in the Permian Wolfcamp Shale Play
Vikram Jayaram, Robert Hull, Jed Wagner and Shuang Zhang
Pioneer Natural Resources Company
Copyright 2019, Unconventional Resources Technology Conference (URTeC) DOI 10.15530/urtec-2019-291
This paper was prepared for presentation at the Unconventional Resources Technology Conference held in Denver, Colorado, USA,
22 to 24 July 2019.
The URTeC Technical Program Committee accepted this presentation on the basis of information contained in an abstract submitted
by the author(s). The contents of this paper have not been reviewed by URTeC and URTeC does not warrant the accuracy, reliability,
or timeliness of any information herein. All information is the responsibility of, and, is subject to corrections by the author(s). Any
person or entity that relies on any information obtained from this paper does so at their own risk. The information herein does not
necessarily reflect any position of URTeC. Any reproduction, distribution, or storage of any part of this paper by anyone other than the
author without the written consent of URTeC is prohibited.
Abstract
Hydraulic fracturing stimulation designs are moving towards tighter spaced clusters, longer stage length,
and more proppant volumes. However, effectively evaluating the hydraulic fracturing stimulation
efficiency remains a challenge. Distributed fiber optic sensing, which includes Distributed Acoustic
Sensing (DAS) and Distributed Temperature Sensing (DTS), can continuously monitor the hydraulic
fracturing stimulation downhole and be compared with other monitoring technology such as microseismic.
The DAS and DTS data, when integrated with the microseismic, highlight processes relevant to the
completion design and allow for a better understanding and interpretation of each dataset.
This paper outlines a workflow to improve processing and interpretation of DAS and DTS data. In addition,
an estimate of the slurry distribution can be made. These methods will be demonstrated for a horizontal
Wolfcamp well in the Permian Basin. Here we compare key aspects of the microseismic, DAS, and DTS
results in several fracture stages to understand the downhole geomechanical processes. In order to interpret
the DTS data a thermal model is developed (using DTS data) to simulate the temperature behavior after
pumping has ceased. A slurry distribution is obtained by matching the simulated temperature with the
measured temperature from DTS. In addition, the DAS data signal is studied in the frequency domain and
the dominant frequencies are identified that are mostly related to fluid flow and to reduce the background
noise. This time frequency analysis enhances the ability to monitor and optimize well treatments.
After reducing the background noise, the acoustic intensity is correlated to the slurry distribution. The fluid
distribution data from DAS and DTS are compared with the microseismic and near field strain to better
understand the completion processes. We utilized fiber optic microseismic to better understand and
compare it to conventional microseismic.
Finally, we highlight the dynamics of strain and microseismic signature as fluid moves from an offset well
completion into the prior stimulated fiber well to better understand the reservoir and far field effects of the
completion.
2. URTeC 291 2
Introduction
The Permian basin is the largest productive basin in the United States. It is currently responsible for most
of the recent increase in U.S. oil production.
In Figure 1 we show several key deep shale reservoirs within the Permian basin, which is divided into the
Delaware basin to the west and the Midland basin to the east. Multiple operators are currently targeting
horizontal wells in this basin.
In 2017, Pioneer Natural Resources installed a fiber optic system on the outside of a horizontal well for the
purposes of (1) recording the development of physical changes resulting from the completion near the
injection site; and (2) recording the far field interaction with offsetting horizontal stimulations. Pioneer has
implemented similar efforts elsewhere to better understand the horizontal and vertical extents of various
completion designs in unconventional resources (Hull et al. 2019).
A horizontal well in the Midland basin was equipped with several downhole pressure and temperature
gauges. The well also had a fiber optic cable for DAS and DTS measurements. The monitor well and several
adjacent horizontal wells were stimulated in a zipper sequence to develop our understanding of pressure,
temperature, and strain changes related to the stimulations. Data were processed in-house and integrated
with the addition data that was collected in an effort to develop our understanding of the physical mechanics
at play.
Figure 1. Shows some of the key landing zones in the Midland side of the Permian basin, regional geology of key uplifts
surrounding the study area, and a geographic map of the study area highlighted with the box.
3. URTeC 291 3
In our workflow, we first convert the measured fiber-based temperature collected during the stimulation
into a temperature difference by subtracting it from the pre-frac formation temperature. We then build a
2D thermal model for the well and surrounding rock matrix. From this we can create a simulated
temperature for the DTS during and after the stimulation. Our results show a very good match with the
measured temperatures. Using the DTS model and the temporal DTS data, we can construct a slurry
distribution for the stage and the individual clusters.
In our example, we observe changes in the DTS signal to detect fluid entering the formation across clusters
and what may be the effective transmission of the completion through time. The DAS signal is also
traditionally used to determine the fluid flow through the perforations into the reservoir.
The background acoustic signal is first studied and then removed from the main signal. The dominant
frequency bands related to the injection process are retained. To better quantify the performance of the
stage, the parameter “uniformity” is defined as an indicator of how the slurry is distributed into each cluster.
We can compare the uniformity of the DAS slurry measured data to the DTS slurry data.
In our data sets, we have found the DTS slurry allocations can be utilized to better understand the
distribution of fluid and, in general, track the more conventionally presented DAS allocation. By having
both calculations we can interpret some of the physical processes taking place at the perforations, recognize
and understand variations between data sets, and provide insight into the stimulation that we may have
otherwise missed.
Beyond modeling and comparing the near wellbore DTS and DAS, the microseismic data can also be
integrated into these data. This combination of data sets further define key relationships between fluid,
pressure, and acoustic activity within a stimulation stage. In addition the development of the hydraulic
stimulation through time in the far field, away from the stimulated well, is also highlighted.
Acquisition Setup for the Project
Pioneer Natural Resources ran a permanently installed fiber optic line in a 10,000-ft horizontal Wolfcamp
Shale well in Midland County. This well was also equipped with downhole pressure gauges and was
observed by downhole geophones. During the stimulation of the instrumented well and its offsets, DAS,
DTS, and microseismic were recorded. The DAS and DTS data were used for both near field, instrumented
well stimulation, and far field offset well stimulation observations. Microseismic was recorded on both
wells. In Figure 2 we show the collection of instrumentation utilized for the acquisition of DAS and DTS.
The downhole pressure and temperature sensors were installed outside the wellbore at various locations
across the horizontal section of the wellbore.
Figure 2. Represents the collection of instrumentation utilized for recording the hydraulic stimulation. We used VSI
(Versatile Seismic Imager) geophone arrays (left), as well as external pressure gauges (bottom left), and fiber optics (upper,
middle and right) for the acquisition.
4. URTeC 291 4
Distributed Acoustic Sensing (DAS)
DAS is a newly adapted technology that can measure the acoustic signature in the near wellbore region.
These data can be used to visualize and understand important downhole parameters such as active
perforations, flow rate, etc. As noted previously, the permanent fiber provides an ability to monitor for the
life of the well the entire length of the wellbore. However, DAS is still not completely understood due to
the complexity of the acoustic phenomenon it records and our lack of understanding around what that
represents physically.
During a hydraulic stimulation, real-time fluid distribution was recorded for each cluster using DAS and
DTS. The DTS real-time temperature recorded during and after the stimulation provides a window into
understanding the treatment effects, as shown in Figure 3. In addition, cross-well interaction during the
offset well treatment was also observed from DAS and DTS.
It is important to note that the raw acoustic information from the DAS was processed using signal
processing workflows, and various metrics were computed, including sound pressure level and other signal
metrics. Fluid distribution results were provided during stimulation from the DAS data. Project data were
recorded, processed, and delivered to stakeholders, allowing for observations to be made prior to subsequent
stages. In addition, analytics were compiled during the project to allow for the comparison of stage-to-stage
performance.
The processing framework also involved a cloud-based solution where terabytes worth of DAS data were
processed and stored on the Data Lake. Processing and computations were performed using a high-end
Linux data science virtual machine from the Data Lake.
Figure 3. Shows the responses from DAS and DTS correlated with pump schedule during a single stage. The red triangles
on the left edge of the DAS/DTS plot represent the location of the perf clusters and the green triangle block indicates the
location of the plug for this stage.
5. URTeC 291 5
Distributed Temperature Sensing (DTS) and Numerical Modeling
While temperature logs have been a part of standard production logging packages for years, downhole fiber
optic DTS technology has introduced a continuous measurement of both temporal and spatial temperature,
allowing an entire well’s response to flow to be recorded. Downhole temperature can be recorded during
fracturing, shut-in, and production to provide continuous and integrated information. Integrated DTS
interpretation provides information on fracture/flow distribution, providing key insights into what occurred
during fracture treatments. It also identifies variations in treatment design and execution, should they exist,
and makes it potentially possible to improve the efficiency of multistage fracture stimulation. To
demonstrate the sensitivity of the DTS measurement, Figure 4 shows the change in the DTS at the casing
collars, which is then used for depth calibration of the fiber. As shown here, the sensitivity of the DTS
imaging allows the geoscientist to even pick up casing collar locations with a high degree of accuracy.
In this paper, a thermal model (Figure 5) is developed to simulate the temperature behavior after pumping
stops. A slurry/proppant distribution is obtained by matching the simulated temperature with the measured
temperature from DTS.
Figure 4. This figure shows the change in the DTS at the casing collars which is then used for depth calibration of the fiber.
From DTS, we first convert the temperature into a temperature difference by subtracting the geothermal
temperature. With the calibration of the thermal properties, the simulated temperature matches the measured
temperature very well, as indicated in Figure 6. A slurry/proppant distribution is generated from this
information. To better quantify the performance of the stage, the parameter “uniformity” is defined as an
indicator of how evenly the slurry/proppant is distributed into each cluster. In addition, the fluid interaction
with previous stages are detected automatically based on the rate of temperature change.
6. URTeC 291 6
The temperature behavior during the fracturing process has previously been studied, taking in to account
both a fracture propagation model and a temperature model (Huckabee 2009).
Below we briefly discuss the numerical model to simulate downhole temperature. We start the discussions
of the pre-requisites of the model and governing equations. The problem is simplified by assuming that a
single transverse fracture is created instantaneously at the beginning of injection, thus fixing the geometry
for the entire injection period. Seth et al. (2010) presented a simple analytical solution for fluid temperature
along the fracture during the hydraulic stimulation process. The fluid leak-off to the formation is ignored
for purpose of simplicity. We can now use an analytical solution to generate the initial temperature profile.
2
where –
Y = coordinate in the y-direction which origin is shifted to the
reservoir center
w = fracture width
h = heat transfer coefficient on fracture face
r and fr = liquid phase, rock matrix, and fracture, respectively
Initial Condition: for warmback temperature simulation, the initial condition is the temperature
profile after injection.
Thermal properties include which is the material conductivity (from general heat conduction
equation), is the density and is the specific heat capacity
Boundary Condition: Neumann
Meshing: Tartan grid in x-direction, uniform grid in r-direction
51 grids in x-direction, 61 grids in r-direction
Accordingly, in Seth et al. 2010 two dimensionless quantities are defined:
2
Here and denote initial temperature and injection fluid temperature respectively. The temperature
profile itself does not change with time while the temperature front advanced with time. Then, the analytical
solution is highly dependent on the value of the heat transfer coefficient .
7. URTeC 291 7
Figure 5 shows a tabulation of commonly used thermal properties for Water and Shale. The figure also shows the simulation
result depicting the injecting fluid volume influence on long term warmback.
Given the boundary conditions are , 0 0 and 0, 1, the analytical solution is
provided as:
0 0
0
The solution is then given as a similarity solution provided below (Han 2012):
, erfc
√4
0
Where, erfc (·) denotes the complementary error function. In Figure 6 we can see a high degree of
correlation between warmback time and difference in temperature when we compare simulated versus
measured in each of the clusters (utilizing the modeled approach).
8. URTeC 291 8
Figure 6. Shows the actual vs modeled thermal changes in the wellbore when warmback begins. We have adjusted the
warmback time to be zero once a cluster shuts down. From the different slopes in the warmback signatures we can estimate
the contribution of fluid across individual clusters.
Integration of the Microseismic with the Near Feld Strain
By integrating the microseismic with the fiber optic data, we can observe changes near and far field within
the rock during stimulation. Figure 7 shows some of the key aspects of the microseismic event timing
including where it occurs in the near field along the wellbore. The actual microseismic positional data
beyond their relationship along the wellbore is not shown here, but in general for this stage, the
microseismic cloud develops from the perforations extending in distance in the SHmax direction in time
throughout the stimulation.
Microseismic events can initiate adjacent to the completion perforation interval as the guns are being
pumped as well as when the sealing ball is pumped prior to a stage. Figure 7 (A), DAS indicates good
transmission of energy across the perforation starts at point B as the stage comes up to rate with cooling of
the fiber shortly afterward at point C. One key aspect is that the microseismic occurs early across most of
the perforation interval, and develops outward here in a slightly toe-ward direction through time as indicated
at time D. Minor variations in surface pressure, DAS changes, and MS are noted at time line highlighted
at point E. At the end of the stage during shutdown, pressure is abruptly lowered and along the main
perforation interval we observe an increase in the microseismic events F. Thermal warming begins shortly
afterward as noted by the point G. For this stage and other stages, the noted increase in microseismic
activity as pumps shut down suggests that we may be imaging the closing down of the fracture aperture.
9. URTeC 291 9
Figure 7. Shows the temporal relationship of the DAS, DTS, and completion, as well as the occurrence of the microseismic
along the wellbore. Microseismic events are colored and sized for magnitude. The horizontal blue lines highlight the
treatment interval while the vertical lines mark key points in the stage.
Offset Stimulation Monitoring with Fiber Optics to Understand the Far Field
Pioneer Natural Resources recorded offset completions into the permanent fiber well to better understand
far field deformational changes in the reservoir. The goal in this study is to record the low frequency strain
and thermal variations on the fiber for stimulations at various well spacings, and then relate this information
to the progression of the zipper frac through time and space. The method involves both leading and lagging
the offset completion with the fiber well in the zipper frac sequence to better understand the relationship
between the stimulation of the virgin rock and that of prior stimulated zones.
Beyond recording the offset stimulation strain on the fiber, Pioneer utilized the fiber to recorded
microseismic events. We also utilized a conventional microseismic geophone array to compare the two
technologies and calibrate the geophone based microseismic data to the fiber data. These data allow us to
establish a temporal understanding of the hydraulic fracture geometry and conductivity in three-dimensions.
The fiber based microseismic can complement our understanding of the progression of the stimulation
adjacent to the offsetting fiber well.
The complexity of the hydraulic stimulation and well-to-well interaction can be better described and
understood using the microseismic, strain and thermal data. Establishing strain and thermal relationships
through the virgin reservoir can highlight the degree of natural fractures, which potentially contribute to the
complexity of the stimulation. For example, does the hydraulic stimulation show up on the offset well along
SHmax at one or two localized points, or is it more diffuse along the lateral position of the fiber well beyond
what we would consider the stimulation deformation width? These data can be utilized to help answer
questions like these while also reveal the existence of natural fractures that may provide communication
pathways in unconventional plays.
10. URTeC 291 10
Figure 8. Shows the development of the microseismic onto the offset fiber well for one stage. The microseismic events are
colored in time and sized by magnitude. Here the fiber well has been completed prior. We see no effect on the thermal DTS
for this stage but do see the interaction across the perforations in the DAS as 4 or 5 lineation through time. The injection
depths of the offset well are noted in the red arrows. The DAS interaction occurs over 400ft of interval over midrange to low
frequencies and starts up after about 1/2 of the stage has been pumped and continues after the pumps have stopped.
In our far field study, we can track the evolution of the stimulation from the offset well stimulation with the
conventional and fiber based microseismic, tying together the DAS strain and DTS in the instrumented
well. In Figure 8 we observed strain hits and thermal interactions with the fiber from the offset well and
tied these observations to the microseismic development. By integrating these data, we have developed a
more complete understanding of the physical processes occurring within these stimulations and their
interactions with the reservoir.
Some of the key observations we have made utilizing these various datasets, as they relate to the far field,
are noted below.
From the microseismic data:
In Figure 8 we establish a direct correlation of the progression of the stimulation from the offsetting
well on the fiber well with the progression of the microseismic through time.
Microseismic extents correlate in general to the strain deformation envelopes noted on the fiber.
Fiber-based microseismic, not shown here, gives a qualitative estimation of the stimulation building
out from the offset well towards the fiber well.
From the strain and thermal interactions we note:
There appears to be more complex strain hits on the fiber when we stimulate the offset well through
previously stimulated rock.
Strain effects appear to be overall broad and occur earlier in time than the thermal warming related
to the pressure front. We observe cooling only on some stages.
We see more thermal pressure related warming heel-ward than toe-ward from the offset
stimulation. This may be related to a stress shadow of the current offset stage.
Strain and microseismic data indicate that fluid moving within the hydraulic fracture continued
through time even after pumping stopped on the offset well as shown in Figure 8.
Strain through the virgin rock often focuses over short intervals on the fiber, and at times broadens
out to a length exceeding two stages during the pumping of the offset stimulation.
When stimulating the offset adjacent well after the stimulation of the fiber well (as shown in Figure
8), we see broad strain signatures exceeding 2 to 3 stage lengths. This highlights the perforation
clusters of the original fiber well as pressure and likely fluid communication is established.
11. URTeC 291 11
A low frequency strain signature is observed more than two stages beyond the current stage
Extension of the stimulation over 750ft typically took over 30 minutes to travel through the prior
completed stimulation of the fiber well.
Extension for the most part followed the max horizontal stress direction out of the perforations in
the offset well, with limited interpreted natural fracture interaction carrying fluid substantially heel-
ward beyond the offset deformation zone.
Thermal variations are noted on the offset fiber for some stages that correlate in depth to the change
in strain on the fiber.
Fiber-Based Microseismic
As outlined prior, fiber optics can be used to record microseismic data. There are numerous advantages, but
also some disadvantages (Hull et. al 2017). While fiber optics can image a microseismic event, it is not
omnidirectional. A fiber optic line is sensitive to energy propagating along the length of the fiber more than
it is across it. Further, the fiber acts as a single component geophone. To locate an event in 3D space requires
the fiber to be located across multiple azimuths to correctly position the event. Here, fiber-based
microseismic was typically recorded only from the horizontal part of the instrumented well, resulting in
some limitations in defining the exact placement of the microseismic events for most of the well.
There is no information provided on depth for fiber oriented only horizontally, given that the fiber records
are a single component. Further, when events late in a stage span both sides of the fiber well, it is unknown
from which direction the events originate.
Fortunately, in this project additional depth work was possible in heel stages where both vertical and
horizontal fiber could record large events. This allowed us to establish some depth control for the fiber-
based microseismic events that in general compared well to the depths recorded by the geophone-based
microseismic. Correlation of both microseismic datasets gave confidence that in the fiber based
microseismic depiction of the cloud extents for a given stage were good.
Distributed Acoustic Processing
The frequency of the energy occurring along the fiber during the completion was analyzed. Low frequency
strain build up on the fiber was observed and correlated with the current and prior stage. This is commonly
referred to as a stress shadow. Stress shadow effects are noted by Ugueto et al. (2019), and for some parts
of the analysis we removed these trends to better normalize variability at the cluster level. In addition,
localized changes occurring at the cluster level at higher frequencies within the stage are noted, as shown
in Figure 9.
In Figure 9 waterfall plots are utilized to assess common frequencies and noted changes occurring across
the clusters that could be related to changes in fluid depicted in Figure 10. Once DTS and DAS fluid
allocations at the cluster level had been obtained, the two techniques were compared to better understand
stage to stage uniformity.
12. URTeC 291 12
Figure 9. Shows the frequency of energy in hertz across five clusters and their temporal changes in energy through time.
From here we can compare key changes in the energy for each cluster and its frequency content in an attempt to relate this
to the downhole dynamics and fluid distribution over each cluster. Some of the noted variations are highlighted in the red
boxes. As shown above for Cluster 4, the DAS data terminated early.
Figure 10. On the left, the DAS shows computed energy thru each cluster for a given stage for fluid and an allocated
percentage based on energy when proppant was being placed downhole. On the right, the DTS assessment is shown for the
same stage using the modeled approach.
Geomechanics and Microseismic
The datasets acquired here support the basic concept that as the pressure increases in the hydraulic fractures,
as well as in the surrounding rock adjacent to the fractures, stress increases. The fiber, because it is coupled
to the rock through the cement, translates these stress changes as strain changes within the fiber optic line.
The temporal strain changes on the DAS can be recorded and displayed over multiple frequencies. As the
hydraulic fracture tip propagates, it introduces deformation and slippage along the fracture face as well as
introduces increases in pressure and strain to the surrounding rock (van der Baan et al. 2013). The
13. URTeC 291 13
deformation process may result in mechanical slippage over a broad range of frequencies that are recorded
as stress drop. Microseismic recording of these stress drops tracks the development of this stress envelope
through time.
The microseismic data presented in this study was processed using commercial vendors. This microseismic
dataset has some uncertainty in the locations of the microseismic events as all such datasets do. Despite
these uncertainties, this dataset has demonstrated clear associations with the strain and thermal effects noted
on the fiber data. That is, in general, for most stages examined herein we see direct correlation of the fiber-
recorded strain changes with the presence of the microseismic near or adjacent to the fiber shown in Figure
8.
It is important to note the microseismic is not typically tracking individual fractures in these very tight
reservoir rock stimulations, but the broader strain-induced deformational zone and the mechanical processes
occurring within these zones.
Geomechanics and Integrating the Near Field and Far Field Data
Two key observations can be made on the fiber-based data that have not been sufficiently discussed in the
literature.
One observation is that fast warmback was experienced on the thermal data. Here the data suggest that the
stimulation fluid is moving away from the well and down the fracture after the completion has stopped
pumping. If the fluid were to remain around the perforations and fiber, the rock should have retained the
cooler fluid temperature with a generally observed slow warmback occurring over days. Note in the near
field data are observations for clusters that return to reservoir temperature within a short period of time
(minutes), shown in Figure 8.
This phenomenon is interpreted to be the in-situ fluid in the stimulated rock next to the fracture and fiber
may be quickly replacing the cold fluid introduced into the fracture as the fracture extends post shutdown.
In fact, in the data displayed in Figure 7, evidence of aperture closing is observed. This was also noted by
van der Baan et al. (2013) who demonstrated that microseismic event activity nearest to the wellbore falls
off when the pumps shut down.
A second observation we have noted on the microseismic data and offsetting fiber well, is that the recorded
deformation continues for a substantial amount of time, often tens of minutes, after pumping has stopped
on the adjacent well, shown in Figure 8. Our data suggests, as noted by Meyer and Bazan (2001), that the
fluid is moving, introducing stress and strain changes in and around the fractured reservoir for upwards of
an hour or more.
Conclusions
A better understanding of the physical processes taking place in the near and far field wellbore environment
during a hydraulic stimulation can be obtained by integrating microseismic, fiber optics, and downhole
pressure data. This understanding can be further improved by visualizing these data, both three
dimensionally as well as temporally,
Developing multidisciplinary teams that can integrate these substantial data sets and reduce the information
into transferable learnings is key to a successful outcome.
Based on this work, we can further develop our hydraulic stimulation models to optimize completions,
increase recovery factors, and reduce communication with offset wells in unconventional plays.
Acknowledgements
We would like to thank Pioneer Natural Resources for allowing us to publish this material; Silixa for
recording the fiber data; and Schlumberger for the microseismic data.
The data acquired and assessed here could not have been accomplished without the help of a large team
effort across multiple groups in multiple cities. We would like to thank Pioneer for allowing us to pursue
such a robust project.
14. URTeC 291 14
References
Han, J.C., 2012. Analytical heat transfer. CRC Press, Boca Raton, Florida.
Huckabee, P.T. 2009. Optic Fiber Distributed Temperature for Fracture Stimulation Diagnostics and Well
Performance Evaluation. Presented at the SPE Hydraulic Fracturing Technology Conference, The
Woodlands, Texas, on 19-21 January. SPE-118831- MS. DOI: 10.2118/118831- MS.
Hull, R.A., Meek, R., Bello, H., and Miller, D., URTeC 2017 2695282, Case History of DAS Fiber-Based
Microseismic and Strain Data, Monitoring Horizontal Hydraulic Stimulations Using Various Tools to
Highlight Physical Deformation Processes.
Hull, R.A., Meek, R., Bello, H., Woller, K., and Wagner, J., Monitoring horizontal well hydraulic
stimulations and geomechanical deformation processes in the unconventional shales of the Midland Basin
using fiber-based time-lapse VSPs, microseismic, and strain data. The Leading Edge volume 38, Issue 2,
Feb 2019.
Meyer, B. R., and Bazan, L. W.: "A Discrete Fracture Network Model for Hydraulically Induced Fractures:
Theory, Parametric and Case Studies," SPE 140514, February 2011
Seth, G., Reynolds, A.C., and Mahadevan, J. 2010. Numerical Model for Interpretation of Distributed-
Temperature-Sensor Data during Hydraulic Fracturing. Paper presented at the SPE Annual Technical
Conference and Exhibition, Florence, Italy. SPE-135603- MS. DOI: 10.2118/135603- MICROSEISMIC.
Ugueto, G., Huckabee, P., Wojtaszek, M., Daredia, T., Reynolds, A., 2019. New Near-Wellbore Insights
from Fiber Optics and Downhole Pressure Gauge Data. Paper presented at the SPE Hydraulic Fracturing
Technology Conference and Exhibition held in the Woodlands, Texas, USA. SPE-194371- MS. DOI:
10.2118/194371- MS.
Van der Baan, M., Eaton, D., and Dusseault, M., 2013. Microseismic Monitoring Developments in
Hydraulic Fracture Stimulation in: Bunger, A.P., McLennan, J., Jeffery, R., Effective and Sustainable
Hydraulic Fracturing, InTech, Rijeka, http://dx.doi.org/10.5772/56444