
Neural network-based low-frequency data extrapolation

Slides for my talk at the SEG Workshop in Manama, Bahrain, December 2017. We introduce an approach to extrapolate missing low-frequency data from the frequency representation of multi-offset seismic data: data at multiple high frequencies is used to infer a single low frequency for each receiver. Finally, we demonstrate a preliminary example of building an initial model for FWI from the extrapolated data.

Neural network-based low-frequency data extrapolation

  1. Neural network-based low-frequency data extrapolation. O. Ovcharenko, V. Kazei, D. Peter, T. Alkhalifah. December 4, 2017.
  2. oleg.ovcharenko@kaust.edu.sa | Low-frequency data extrapolation. Outline: low-frequency data; artificial neural networks; results for a crop from BP 2004; application to bandwidth extension.
  3. Acquisition data. Low-frequency data is lacking due to instrumental limitations and due to noise.
  4. Low-frequency data in FWI: inverts large-scale velocity structures; less chance of getting stuck in local minima; reveals deep model structures, e.g. below salt. [Figure: misfit with multiple local minima at high frequencies vs a smooth misfit at low frequencies] (Kazei et al., 2016)
  5. FWI without low frequencies: modifications of the misfit/gradient (Warner et al., 2015; van Leeuwen & Herrmann, 2014; Métivier et al., 2016; Alkhalifah, 2015, 2016; Kazei et al., 2016), etc. Pros: established workflow; relative robustness. Cons: computational cost; prone to event mismatching.
  6. FWI without low frequencies: extrapolation of low-frequency data (Smith et al., 2008; Hu et al., 2014; Li & Demanet, 2015, 2016), etc. Pros: cheaper computation. Cons: robustness not yet well explored; relies on wavefield approximations.
  7. Low-frequency extrapolation: beat-tone inversion (Hu et al., 2014); bandwidth extension for atomic events (Li & Demanet, 2015, 2016); bandwidth extension using the continuous wavelet transform (Smith et al., 2008). This work: low-frequency data extrapolation using an artificial neural network.
  8. Machine learning. Learning paradigms: supervised, unsupervised, reinforcement. Statistical tasks: classification, regression, etc. Methods: regression trees, artificial neural networks, etc.
  9. Feed-forward ANN, a.k.a. multilayer perceptron. Layers: input (x1...x4), hidden, output (t1...t3); connections carry weights w and each neuron has a bias b. Training = tuning up the weights.
  10. Feed-forward ANN. A neuron computes a(w·x + b·w0): 1) the dot product of the input and weight vectors plus a bias term; 2) substitution into the activation function, here a(x) = tanh(x), which maps outputs into (-1, 1).
  11. Feed-forward ANN, matrix form: a hidden layer computes a(Wx + b) with weight matrix W, input vector x and bias vector b, applied layer by layer from input (x1...x4) to output (t1...t3).
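The layer operation a(Wx + b) on slides 10-12 can be sketched in a few lines of NumPy; the layer sizes below are illustrative, not those of the trained network:

```python
import numpy as np

def forward(x, layers):
    """Feed-forward pass: each layer computes a(W x + b) with a = tanh."""
    for W, b in layers:
        x = np.tanh(W @ x + b)
    return x

rng = np.random.default_rng(0)
# Illustrative sizes: 4 inputs -> 5 hidden units -> 3 outputs
layers = [(rng.standard_normal((5, 4)), rng.standard_normal(5)),
          (rng.standard_normal((3, 5)), rng.standard_normal(3))]
t = forward(np.array([1.0, -0.5, 0.2, 0.8]), layers)
```

Because every layer here ends in tanh, all activations stay strictly inside (-1, 1), which is why inputs and targets are normalized before training.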
  13. Training a feed-forward ANN: given training inputs and training outputs, minimize the L2 norm of the prediction error.
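Slide 13's "minimize the L2 norm" can be illustrated with a toy gradient-descent loop on a single tanh layer; the sizes, data and learning rate are made up for the sketch (the talk itself used Keras with the Adam optimizer, per the computational-facts slide):

```python
import numpy as np

# Toy regression: fit targets T from inputs X by minimizing the L2 misfit
# ||tanh(X W^T) - T||^2 with plain gradient descent.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))               # training inputs
T = np.tanh(X @ rng.standard_normal((4, 3)))    # training outputs (realizable)

W = np.zeros((3, 4))
lr = 0.5
loss0 = np.mean((np.tanh(X @ W.T) - T) ** 2)    # misfit before training
for _ in range(1000):
    P = np.tanh(X @ W.T)                        # forward pass
    G = (P - T) * (1 - P ** 2)                  # chain rule through tanh
    W -= lr * (G.T @ X) / len(X)                # gradient step on the weights
loss = np.mean((np.tanh(X @ W.T) - T) ** 2)     # misfit after training
```

The loop drives the misfit down by orders of magnitude; a real multi-layer network backpropagates the same chain-rule factors through every layer.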
  14. Neural network pros and cons. Pros: good for highly nonlinear problems; good for large inputs; data-driven; easy to implement and parallelize. Cons: lots of parameters; hard to interpret; computational cost of training. Not a magic wand, use with care.
  15. Selection of the network configuration: the architecture, feature selection and training parameters are chosen by trial and error.
  16. Main idea: predict data at a single low frequency from data at multiple high frequencies. High-frequency data → neural network → low-frequency data.
  17. Data selection.
  18. Single source, single frequency: real part, imaginary part and amplitude of the data across receivers.
  19. Single source, single frequency: real and imaginary parts across receivers.
  20. Multi-source, single frequency: NSRC x NREC data matrices (real and imaginary parts).
  21. Raw training data: high-frequency data at f1, f2, f3, f4 and low-frequency data at f0 (real and imaginary parts).
  22. Raw training data: the input feature vector holds the high-frequency real and imaginary parts; the output feature vector holds the low-frequency real and imaginary parts.
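One way to assemble the input/output pairs of slides 21-22 is to stack the real and imaginary parts of the multi-frequency, multi-receiver data into flat feature vectors; the array layout below is my assumption, not taken from the talk:

```python
import numpy as np

def make_sample(d_high, d_low):
    """d_high: complex array (n_freq, n_rec) at high frequencies f1..fn;
    d_low: complex array (n_rec,) at the target low frequency f0.
    Returns flat real-valued (input, output) feature vectors."""
    x = np.concatenate([d_high.real.ravel(), d_high.imag.ravel()])
    t = np.concatenate([d_low.real, d_low.imag])
    return x, t

# Fake data for one source: 4 high frequencies, 120 receivers
rng = np.random.default_rng(3)
d_high = rng.standard_normal((4, 120)) + 1j * rng.standard_normal((4, 120))
d_low = rng.standard_normal(120) + 1j * rng.standard_normal(120)
x, t = make_sample(d_high, d_low)
```

With 4 high frequencies and 120 receivers this gives 2*4*120 = 960 input and 2*120 = 240 output features, which matches the counts on the computational-facts slide; the receiver count itself is an inference.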
  23. Data processing: data-driven normalization of the real and imaginary parts by offset, with de-normalization after prediction.
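The slide does not spell out the normalization beyond "by offset" and "data-driven"; one plausible reading is per-offset standardization with stored statistics for the inverse transform, sketched here:

```python
import numpy as np

def normalize(data):
    """Per-offset (per-column) standardization; keep stats to invert later."""
    mu = data.mean(axis=0)
    sigma = data.std(axis=0)
    sigma = np.where(sigma == 0, 1.0, sigma)  # guard against constant traces
    return (data - mu) / sigma, (mu, sigma)

def denormalize(data, stats):
    """Invert normalize() using the stored per-offset statistics."""
    mu, sigma = stats
    return data * sigma + mu

rng = np.random.default_rng(2)
d = rng.standard_normal((100, 16)) * 5.0 + 3.0  # fake per-offset data samples
dn, stats = normalize(d)
```

The same stored statistics are applied in reverse to the network's predictions, so the extrapolated low-frequency data comes back in physical units.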
  24. Data processing, before vs after: after normalization, input and output features from multiple models overlap for a given acquisition configuration (train vs true).
  25. Random model generation.
  26. Random model generation: random Gaussian field; flat bathymetry; fixed model size; permissible velocity range; use data for each source-receiver pair. Sampling a multidimensional model space: the more diverse the data, the better.
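A generator satisfying the listed constraints (random Gaussian field, flat bathymetry, fixed size, permissible velocity range) could look like the sketch below; the Fourier-domain smoothing filter and all parameter values are my choices, not the authors':

```python
import numpy as np

def random_velocity_model(nz=100, nx=100, vmin=1500.0, vmax=4500.0,
                          water_depth=10, corr=10.0, seed=0):
    """Random smooth velocity model: low-pass-filtered Gaussian noise mapped
    to a permissible velocity range, with a flat water layer on top."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((nz, nx))
    # Gaussian low-pass in the Fourier domain -> smooth Gaussian random field
    kz = np.fft.fftfreq(nz)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    filt = np.exp(-(kz**2 + kx**2) * (2 * np.pi * corr)**2 / 2)
    field = np.fft.ifft2(np.fft.fft2(noise) * filt).real
    # Map the field to the permissible velocity range [vmin, vmax]
    field = (field - field.min()) / (field.max() - field.min())
    v = vmin + field * (vmax - vmin)
    v[:water_depth, :] = 1500.0  # flat bathymetry: constant water velocity
    return v

v = random_velocity_model()
```

Varying the seed (and, in a fuller version, the correlation length) samples the multidimensional model space; synthetic data modeled on each realization supplies the diverse training pairs the slide calls for.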
  29. Main idea, training setup: predict data at a single low frequency from multiple high-frequency data computed for random velocity models. Each random velocity model yields NSRC * NFREQ training samples.
  30. Results: crop from the BP 2004 velocity model (Billette and Brandsberg-Dahl, 2005).
  31. True vs predicted 0.5 Hz data (real part, imaginary part and phase; NSRC x NREC). High frequencies: 2.41, 3.14, 3.5, 4.07 Hz, chosen following fn+1 = k * fn (Sirgue & Pratt, 2004) so that the wavelength scales with depth; target low frequency: 0.5 Hz.
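The Sirgue & Pratt (2004) rule fn+1 = k * fn defines a geometric frequency ladder; with k ≈ 1.3 (my fit, not a value from the talk) it roughly reproduces the slide's high frequencies, although the listed 3.5 Hz does not follow the geometric pattern exactly:

```python
def frequency_ladder(f0, fmax, k):
    """Geometric frequency progression f_{n+1} = k * f_n (Sirgue & Pratt, 2004)."""
    freqs = [f0]
    while freqs[-1] * k <= fmax:
        freqs.append(freqs[-1] * k)
    return freqs

ladder = [round(f, 2) for f in frequency_ladder(2.41, 4.1, 1.3)]
# ladder: [2.41, 3.13, 4.07], close to the 2.41, 3.14, 4.07 Hz on the slide
```

The constant ratio k keeps continuous wavenumber coverage between successive frequencies, which is the original motivation for the rule in frequency-domain FWI.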
  32. True vs predicted 0.5 Hz data: real and imaginary parts.
  34. Real part.
  35. 0.5 Hz extrapolated from 2.4-4 Hz data.
  36. 0.84 Hz extrapolated from 2.4-4 Hz data.
  37. 1.42 Hz extrapolated from 2.4-4 Hz data.
  40. Phase.
  41. 0.5 Hz extrapolated from 2.4-4 Hz data.
  42. 0.84 Hz extrapolated from 2.4-4 Hz data.
  43. 1.42 Hz extrapolated from 2.4-4 Hz data.
  46. Beta version of single-frequency FWI at 0.5 Hz: true model vs initial model; true 0.5 Hz data vs predicted 0.5 Hz data.
  47. Computational facts. Training data: 48000 samples in total, with 960 inputs and 240 outputs each; generation took ~40 min on 24 cores. Network: 3 hidden layers of sizes 2 x N_inputs, 2 x N_outputs and 1 x N_outputs. Batch size: 1024; learning rate: 0.005; optimizer: Adam; weight regularization: 0.005; "xavier" initialization (Glorot & Bengio, 2010). Software: Python 3.6, TensorFlow 1.3.0, Keras 2.0.5, Matlab R2016b; hardware: NVIDIA Quadro K2200. Training time ~5 min; prediction time ~5 s.
  48. Conclusions.
  49. Conclusions: phase is predicted better than amplitude; lower frequencies are predicted better than higher ones; the model generator is crucial; the current network type and architecture are not optimal.
  50. Acknowledgements. We are grateful to Professor Xiangliang Zhang, Professor Gerhard Pratt, Basmah Altaf, Jubran Akram, and the SMI and SWAG groups at KAUST for fruitful discussions.
  51. Conclusions, continued. Next steps: improve the data generator; search for an optimal configuration; explore stability; compare with other techniques.
  52. The End. Some NN applications in geophysics: inversion of seismic, DC and other data (Röth & Tarantola, 1994; Neyamadpour et al., 2009); automated fault detection (Araya-Polo et al., 2017); salt body picking (Guillen et al., 2017); mapping reservoirs on migrated seismic (Bougher, 2016); facies classification and reservoir property prediction (Hall, 2017; Ahmed et al., 2010); event detection (Akram et al., 2017); interpolation of missing data (Jia and Ma, 2017); denoising (Zhang et al., 2017).
  53. Links:
     http://onlinelibrary.wiley.com/doi/10.1029/93JB01563/full
     http://ieeexplore.ieee.org/document/5584501/figures
     https://library.seg.org/doi/abs/10.1190/1.3298443
     https://library.seg.org/doi/abs/10.1190/segam2017-17761195.1
     https://library.seg.org/doi/full/10.1190/tle36030208.1
     https://library.seg.org/doi/abs/10.1190/segam2015-5931401.1
     https://link.springer.com/article/10.1007/s11200-010-0027-5
     http://blackecho.github.io/blog/machine-learning/2016/02/29/denoising-autoencoder-tensorflow.html
     https://www.slim.eos.ubc.ca/content/machine-learning-applications-geophysical-data-analysis
     https://library.seg.org/doi/abs/10.1190/1.1443221
     https://library.seg.org/doi/pdf/10.1190/tle35100906.1
     https://library.seg.org/doi/abs/10.1190/1.1444797
  54. Assumptions made: an explicit assumption about the source signature spectrum.
  55. Stack of images to an image: a GAN?
