O. Ovcharenko, V. Kazei, D. Peter, T. Alkhalifah
December 4, 2017
Neural network-based low-frequency data extrapolation
oleg.ovcharenko@kaust.edu.sa
Outline
- Low-frequency data
- Artificial Neural Networks
- Results for a crop from BP 2004
- Application for bandwidth extension
Acquisition data
Lack of low-frequency data:
- Due to instrumental limitations
- Due to noise
Low-frequency data in FWI
- Inverts large-scale velocity structures
- Less chance of getting stuck in local minima
- Reveals deep model structures, e.g. below salt
[Figure: misfit landscapes; multiple local minima at f_High vs a smooth landscape at f_Low (Kazei et al., 2016)]
FWI without low frequencies
Modifications of the misfit/gradient
(Warner et al., 2015; van Leeuwen & Herrmann, 2014; Métivier et al., 2016; Alkhalifah, 2015, 2016; Kazei et al., 2016), etc.
Pros:
- Established workflow
- Relative robustness
Cons:
- Computational costs
- Prone to event mismatching
FWI without low frequencies
Extrapolation of low-frequency data
(Smith et al., 2008; Hu et al., 2014; Li & Demanet, 2015, 2016), etc.
Pros:
- Cheaper computations
Cons:
- Robustness not well explored
- Wavefield approximations
Low-frequency extrapolation
- Beat-tone inversion (Hu et al., 2014)
- Bandwidth extension for atomic events (Li & Demanet, 2015, 2016)
- Bandwidth extension using the Continuous Wavelet Transform (Smith et al., 2008)
This work: low-frequency data extrapolation using an Artificial Neural Network
Machine Learning
Learning paradigms: Supervised, Unsupervised, Reinforcement
Statistical tasks: Classification, Regression, etc.
Methods: Artificial Neural Networks, Regression trees, etc.
Feed-forward ANN
a.k.a. Multilayer Perceptron
Layers: input, hidden, output
[Figure: inputs x1-x4 feed through hidden neurons to outputs t1-t3; each connection carries a weight w, each neuron a bias b]
Training = tuning up the weights
Feed-forward ANN
Each neuron computes a(w·x + b·w0):
1. Dot product of input and weight vectors, plus bias
2. Substitution into the activation function
Activation function: a(x) = tanh(x), which maps onto (-1, 1)
[Figure: neuron with inputs x1-x4, weights w1-w4, bias b with weight w0, and the tanh curve]
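As a minimal illustration of these two steps (a sketch with made-up numbers, not the talk's code):

```python
import numpy as np

def neuron(x, w, b, w0=1.0):
    """One neuron: weighted sum of inputs plus bias, then tanh."""
    z = np.dot(w, x) + b * w0  # step 1: dot product + bias
    return np.tanh(z)          # step 2: activation function

x = np.array([0.5, -1.0, 2.0, 0.1])  # inputs x1..x4
w = np.array([0.2, 0.4, -0.3, 0.8])  # weights w1..w4
print(neuron(x, w, b=0.1))           # a value in (-1, 1)
```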
Feed-forward ANN
In matrix form, one layer computes: W x + b = t
[Figure: input, hidden and output layers; weight matrix W acting on input vector x, plus bias vector b, giving outputs t1-t3]
(A sketch in code follows.)
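The same operation for a whole layer, in matrix form (a sketch; the shapes are illustrative):

```python
import numpy as np

def layer(x, W, b):
    """One layer: affine map W x + b followed by the activation."""
    return np.tanh(W @ x + b)

x = np.array([0.5, -1.0, 2.0, 0.1])  # 4 inputs x1..x4
W = 0.1 * np.random.randn(3, 4)      # 3 neurons, each with 4 weights
b = np.zeros(3)                      # one bias per neuron
t = layer(x, W, b)                   # 3 outputs t1..t3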
Training a Feed-forward ANN
Given pairs of training inputs and training outputs, minimize the L2 norm of the misfit between network predictions and the training outputs.
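In code, the objective could look like this (a sketch; in practice the minimization is done by backpropagation in a framework such as Keras, see the configuration slide below):

```python
import numpy as np

def l2_misfit(prediction, target):
    """L2 norm of the misfit between network output and training output."""
    return 0.5 * np.sum((prediction - target) ** 2)

# training = adjust all weights W and biases b to minimize the sum of
# l2_misfit over the whole set of training input/output pairs
```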
Neural Networks: pros and cons
Pros:
+ Good for highly nonlinear problems
+ Good for large inputs
+ Data-driven
+ Easy to implement and parallelize
Cons:
- Lots of parameters
- Hard to interpret
- Computational cost of training
Not a magic wand; use with care.
Selection of network configuration
- Neural-network architecture
- Feature selection
- Training parameters
All chosen by a trial-and-error approach.
Main idea
High-frequency data → Neural network → Low-frequency data
Predict data at a single low frequency from data at multiple high frequencies.
Data selection
Single source, single frequency
[Figure: real and imaginary parts, and amplitude, of the monochromatic wavefield recorded at the receivers for a single source]
Multi-source, single frequency
[Figure: real and imaginary parts arranged as NSRC × NREC matrices, one monochromatic sample per source-receiver pair]
Raw training data
[Figure: real and imaginary parts of the high-frequency data at f1-f4 and of the low-frequency data at f0]
Raw training data
[Figure: real and imaginary parts of the high-frequency data concatenated into INPUT feature vectors; the low-frequency data give the OUTPUT features]
(A sketch of this assembly follows.)
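One possible way to assemble a training pair from these panels (a sketch; the array shapes and names are our assumptions, not the talk's code):

```python
import numpy as np

def make_sample(d_high, d_low):
    """Build one training pair from frequency-domain data of one source.

    d_high: complex array (NFREQ, NREC) -- data at the high frequencies
    d_low:  complex array (NREC,)       -- data at the target low frequency
    """
    x = np.concatenate([d_high.real.ravel(), d_high.imag.ravel()])  # input features
    t = np.concatenate([d_low.real, d_low.imag])                    # output features
    return x, t
```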
Data processing
Inputs are normalized by offset; outputs are de-normalized in a data-driven way.
[Figure: real and imaginary parts before and after normalization]
(A possible scheme is sketched below.)
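The talk does not spell out the exact scaling, so the following is only a plausible sketch of offset normalization and its inverse:

```python
import numpy as np

def normalize_by_offset(d, offsets):
    """Scale each trace by its source-receiver offset so that near and
    far traces have comparable magnitude (one possible choice)."""
    scale = np.maximum(np.abs(offsets), 1.0)  # avoid blow-up near zero offset
    return d * scale, scale

def denormalize(d_norm, scale):
    """Data-driven de-normalization: undo the stored per-trace scaling."""
    return d_norm / scale
```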
Data processing
[Figure: INPUT and OUTPUT features before and after processing; for the chosen configuration, training and true data from multiple models overlap]
Random model generation
Sampling a multidimensional model space:
- Random Gaussian field
- Flat bathymetry
- Fixed model size
- Permissible velocity range
- Use data for each src-rec pair
The more diverse the data, the better. (A generator sketch follows this list.)
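A minimal sketch of such a generator, assuming the listed properties; the smoothing method, grid size and velocity bounds here are our assumptions, not the authors' code:

```python
import numpy as np

def random_velocity_model(nz=100, nx=100, vmin=1500.0, vmax=4500.0,
                          smooth=10.0, nwater=10, seed=None):
    """Smoothed random Gaussian field mapped into a permissible velocity
    range, with a flat water layer on top (flat bathymetry)."""
    rng = np.random.default_rng(seed)
    field = rng.standard_normal((nz, nx))
    # low-pass the field in the wavenumber domain to make it smooth
    kz = np.fft.fftfreq(nz)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    taper = np.exp(-(kz ** 2 + kx ** 2) * smooth ** 2)
    field = np.fft.ifft2(np.fft.fft2(field) * taper).real
    # map to the permissible velocity range
    field = (field - field.min()) / (field.max() - field.min())
    v = vmin + field * (vmax - vmin)
    v[:nwater, :] = 1500.0  # flat bathymetry: constant water layer
    return v
```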
Main idea
High-frequency data for random velocity models → Neural network → Low-frequency data for the same models
Predict data at a single low frequency from data at multiple high frequencies.
Each random velocity model yields NSRC × NFREQ training samples.
Results
Crop from the BP 2004 velocity model (Billette and Brandsberg-Dahl, 2005)
True vs predicted 0.5 Hz
High frequencies: 2.41, 3.14, 3.5, 4.07 Hz, chosen following f_{n+1} = k f_n (Sirgue & Pratt, 2004)
Low frequency: 0.5 Hz, i.e. a wavelength on the order of the target depth
[Figure: real part, imaginary part and phase, each as an NSRC × NREC panel]
True vs predicted 0.5 Hz
[Figure: real and imaginary parts of true vs predicted data]
Real part
[Figures: true vs predicted real part at 0.5 Hz, 0.84 Hz and 1.42 Hz, each extrapolated from 2.4-4 Hz data]
Phase
[Figures: true vs predicted phase at 0.5 Hz, 0.84 Hz and 1.42 Hz, each extrapolated from 2.4-4 Hz data]
Beta version of single-frequency FWI at 0.5 Hz
[Figure: true and initial models; FWI results using the true 0.5 Hz data vs the predicted 0.5 Hz data]
Computational facts
Training samples: 48,000 total [figure also lists 960 and 240]
Network: 3 hidden layers
- 2 × N inputs
- 2 × N outputs
- 1 × N outputs
Batch size: 1024
Learning rate: 0.005
Optimizer: Adam
Weight regularization: 0.005
Initialization: "Xavier" (Glorot & Bengio, 2010)
Stack: Python 3.6, TensorFlow 1.3.0, Keras 2.0.5, Matlab R2016b; NVIDIA Quadro K2200
Training time: ~5 min; prediction time: ~5 s
Training data generation: ~40 min on 24 cores
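Putting the quoted hyperparameters together, the setup could look roughly like this in the Keras 2.0.5 / TensorFlow 1.3 stack listed above (the layer widths and the value of N are our reading of the slide, not the authors' code):

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.regularizers import l2
from keras.optimizers import Adam

N = 240  # assumed, e.g. the number of receivers

model = Sequential()
# three hidden layers with "xavier" (glorot) initialization and
# L2 weight regularization of 0.005, as quoted on the slide
model.add(Dense(2 * N, activation='tanh', input_dim=2 * N,
                kernel_initializer='glorot_normal', kernel_regularizer=l2(0.005)))
model.add(Dense(2 * N, activation='tanh',
                kernel_initializer='glorot_normal', kernel_regularizer=l2(0.005)))
model.add(Dense(N, activation='tanh',
                kernel_initializer='glorot_normal', kernel_regularizer=l2(0.005)))
model.add(Dense(2 * N, activation='linear'))  # real + imaginary output features

model.compile(optimizer=Adam(lr=0.005), loss='mse')
# model.fit(X_train, T_train, batch_size=1024, epochs=...)
```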
Conclusions
• Phase is predicted better than amplitude
• Lower frequencies are predicted better
• The model generator is crucial
• The current network type and architecture are not optimal
Acknowledgements
We are grateful to Professor Xiangliang Zhang, Professor Gerhard Pratt, Basmah Altaf, Jubran Akram, and the SMI and SWAG groups at KAUST for fruitful discussions.
Next steps:
• Improve the data generator
• Search for an optimal configuration
• Explore stability
• Compare with other techniques
The End

Some NN applications in geophysics
- Automated fault detection (Araya-Polo et al., 2017)
- Salt body picking (Guillen et al., 2017)
- Mapping reservoirs on migrated seismic (Bougher, 2016)
- Facies classification and reservoir property prediction (Hall, 2017; Ahmed et al., 2010)
- Event detection (Akram et al., 2017)
- Interpolation of missing data (Jia and Ma, 2017)
- Denoising (Zhang et al., 2017)
- Inversion of seismic, DC data, etc. (Röth & Tarantola, 1994; Neyamadpour et al., 2009)
Links
http://onlinelibrary.wiley.com/doi/10.1029/93JB01563/full
http://ieeexplore.ieee.org/document/5584501/figures
https://library.seg.org/doi/abs/10.1190/1.3298443
https://library.seg.org/doi/abs/10.1190/segam2017-17761195.1
https://library.seg.org/doi/full/10.1190/tle36030208.1
https://library.seg.org/doi/abs/10.1190/segam2015-5931401.1
https://link.springer.com/article/10.1007/s11200-010-0027-5
http://blackecho.github.io/blog/machine-learning/2016/02/29/denoising-autoencoder-tensorflow.html
https://www.slim.eos.ubc.ca/content/machine-learning-applications-geophysical-data-analysis
https://library.seg.org/doi/abs/10.1190/1.1443221
https://library.seg.org/doi/pdf/10.1190/tle35100906.1
https://library.seg.org/doi/abs/10.1190/1.1444797
Assumptions made
+ Explicit assumption about the source signature
[Figure: source amplitude spectra as functions of ω]
Stack of images to an image: GAN?