Robust 3D gravity gradient inversion by planting anomalous densities

Leonardo Uieda
Valéria C. F. Barbosa

Observatório Nacional

September, 2011
Outline

    Forward Problem
    Inverse Problem
    Planting Algorithm (inspired by René, 1986)
    Synthetic Data
    Real Data (Quadrilátero Ferrífero, Brazil)
Forward problem


            
Observed data: g^αβ
    Produced by an anomalous density distribution (this is what we want to model)
Interpretative model
    Mesh of right rectangular prisms
    Anomalous density of the jth prism: Δρ = p_j
    Parameter vector: p = [p_1  p_2  ...  p_M]^T
    (Prisms with p_j = 0 not shown)
Predicted data: d^αβ

    d^αβ = ∑_{j=1}^{M} p_j a_j^αβ

    a_j^αβ : contribution of the jth prism (per unit density contrast)
    p = [p_1  p_2  ...  p_M]^T
    (Prisms with p_j = 0 not shown)
More components: d^xx, d^xy, d^xz, d^yy, d^yz, d^zz

    d = ∑_{j=1}^{M} p_j a_j = A p

    A   : Jacobian (sensitivity) matrix
    a_j : jth column vector of A
Forward problem: given p, compute d = ∑_{j=1}^{M} p_j a_j
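A minimal NumPy sketch of this linear forward problem (an illustration, not the authors' code; the sensitivity matrix A and parameter vector p are assumed to be precomputed):

    import numpy as np

    def forward(A, p):
        # Predicted data d = A p = sum_j p_j * a_j.
        # A: (N, M) sensitivity matrix; column j holds the effect of the
        #    jth prism for a unit density contrast.
        # p: (M,) vector of anomalous densities (the parameters).
        return A @ p

    def forward_columns(A, p):
        # Same result, accumulated column by column as in the sum above.
        d = np.zeros(A.shape[0])
        for j in range(A.shape[1]):
            d += p[j] * A[:, j]
        return d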
Inverse problem: given observed data g, estimate p̂
Inverse problem


            
Minimize the difference between g and d
    Residual vector: r = g − d

    Data-misfit function:

        ϕ(p) = ∥r∥_2 = ( ∑_{i=1}^{N} (g_i − d_i)^2 )^{1/2}     (ℓ2-norm of r: least-squares fit)

        ϕ(p) = ∥r∥_1 = ∑_{i=1}^{N} |g_i − d_i|                 (ℓ1-norm of r: robust fit)
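A sketch of the two misfit functions in plain NumPy (hypothetical helper names; g and d are assumed to be the stacked data vectors):

    import numpy as np

    def misfit_l2(g, d):
        # Least-squares misfit: l2-norm of the residual r = g - d.
        r = g - d
        return np.sqrt(np.sum(r ** 2))

    def misfit_l1(g, d):
        # Robust misfit: l1-norm of the residual r = g - d.
        return np.sum(np.abs(g - d))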
Ill-posed problem:
    the solution may be non-existent, non-unique, and unstable

Constraints make it a well-posed problem:
    the solution exists, is unique, and is stable
Constraints:
    1. Compact: no holes inside
    2. Concentrated around “seeds”:
        ● user-specified prisms
        ● given density contrasts ρ_s
        ● any number of different density contrasts
    3. Only p_j = 0 or p_j = ρ_s
    4. p_j = ρ_s of the closest seed
Well-posed problem: minimize the goal function

    Γ(p) = ϕ(p) + μ θ(p)

    ϕ(p) : data-misfit function
    μ    : regularizing parameter (trade-off between fit and regularization)
    θ(p) : regularizing function, similar to Silva Dias et al. (2009):

        θ(p) = ∑_{j=1}^{M} [ p_j / (p_j + ϵ) ] l_j^β

    l_j : distance between the jth prism and its seed

    Imposes compactness and concentration around the seeds
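A sketch of the regularizing function θ(p), assuming a precomputed array of prism-to-seed distances l_j; the absolute value of p_j is used here so that negative density contrasts also contribute (an assumption, not necessarily the authors' exact formulation):

    import numpy as np

    def theta(p, dist, beta=1.0, eps=1e-7):
        # theta(p) = sum_j [ p_j / (p_j + eps) ] * l_j**beta
        # p:    (M,) anomalous densities (absolute values used, see note above)
        # dist: (M,) distance l_j between the jth prism and its seed
        pj = np.abs(p)
        return np.sum((pj / (pj + eps)) * dist ** beta)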
Constraints:
    1. Compact
    2. Concentrated around “seeds”
       (imposed by the regularization)
    3. Only p_j = 0 or p_j = ρ_s
    4. p_j = ρ_s of the closest seed
       (imposed by the algorithm, based on René, 1986)
Planting Algorithm


             
Setup:    (g = observed data)
    Define the interpretative model
    Set all parameters to zero
    N_S seeds
    Include the seeds    (d = predicted data, here predicted by the seeds alone)
    Compute the initial residuals:

        r^(0) = g − ∑_{s=1}^{N_S} ρ_s a_{j_s}

    Find the neighbors of the seeds
    (Prisms with p_j = 0 not shown)
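A sketch of the initial-residual computation (hypothetical names): only the columns of A that correspond to the seed prisms are needed.

    import numpy as np

    def initial_residual(g, A, seeds):
        # r(0) = g - sum over seeds of rho_s * a_{j_s}
        # seeds: list of (j_s, rho_s) pairs with the prism index and the
        #        density contrast of each seed.
        r = g.astype(float).copy()
        for j_s, rho_s in seeds:
            r -= rho_s * A[:, j_s]
        return r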
Growth:
    For each of the N_S seeds, try accretion to the sth seed:
        Choose the neighbor that
            1. reduces the data misfit
            2. gives the smallest goal function
        j = chosen neighbor:  p_j = ρ_s  (new element)
        Update the residuals:

            r^(new) = r^(old) − p_j a_j     (p_j a_j = contribution of prism j)

        If no such neighbor is found, there is no accretion for this seed
    Did at least one seed grow?
        Yes: repeat the growth step
        No: done!
    (The prisms can have variable sizes; prisms with p_j = 0 not shown)
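A simplified sketch of one growth pass (hypothetical helpers neighbors_of and goal; this illustrates the accretion rule above, not the authors' implementation):

    import numpy as np

    def grow_one_pass(r, p, A, seeds, neighbors_of, goal):
        # Try one accretion per seed; return True if at least one seed grew.
        # r and p are modified in place; goal(r, p) = misfit + mu * theta.
        grew = False
        for j_s, rho_s in seeds:
            current_misfit = np.sum(np.abs(r))        # l1 misfit of the current fit
            best_j, best_goal = None, None
            for j in neighbors_of(j_s):
                if p[j] != 0:
                    continue                          # already part of a body
                r_try = r - rho_s * A[:, j]           # residual if prism j is added
                if np.sum(np.abs(r_try)) >= current_misfit:
                    continue                          # must reduce the data misfit
                p_try = p.copy()
                p_try[j] = rho_s
                goal_try = goal(r_try, p_try)
                if best_goal is None or goal_try < best_goal:
                    best_j, best_goal = j, goal_try
            if best_j is not None:                    # accrete the chosen neighbor
                p[best_j] = rho_s
                r -= rho_s * A[:, best_j]
                grew = True
        return grew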
Advantages:
    Compact & non-smooth
    Any number of sources
    Any number of different density contrasts
    No large equation system
    Search limited to neighbors
Remember the equations:

    Initial residual:       r^(0) = g − ∑_{s=1}^{N_S} ρ_s a_{j_s}

    Residual update:        r^(new) = r^(old) − p_j a_j

    No matrix multiplication (only vector sums)
    Only some columns of A are needed
    Each column is calculated only when needed and deleted after the update
    Lazy evaluation of the Jacobian
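A sketch of the lazy-evaluation idea (the column-computing function prism_effect is hypothetical): each Jacobian column is computed only when its prism is tested, used for a vector update, and then discarded.

    import numpy as np

    def update_residual_lazily(r, p_j, j, prism_effect):
        # r(new) = r(old) - p_j * a_j, without ever storing the full matrix A.
        a_j = prism_effect(j)      # column a_j computed on demand
        r_new = r - p_j * a_j      # only vector operations, no matrix product
        del a_j                    # the column is not kept in memory
        return r_new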
Advantages:
    Compact & non-smooth
    Any number of sources
    Any number of different density contrasts
    No large equation system
    Search limited to neighbors
    No matrix multiplication (only vector sums)
    Lazy evaluation of the Jacobian

    Fast inversion + low memory usage
Synthetic Data


           
Data set:
    ● 3 components
    ● 51 × 51 points
    ● 2,601 points per component
    ● 7,803 measurements
    ● 5 Eötvös noise
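A sketch of how such a synthetic data set might be contaminated (an assumption about the procedure, not taken from the slides): pseudo-random Gaussian noise with a 5 Eötvös standard deviation added to each 51 × 51 component.

    import numpy as np

    rng = np.random.default_rng(seed=0)
    n_points = 51 * 51                        # 2,601 points per component
    components = {name: np.zeros(n_points)    # placeholders for the noise-free data
                  for name in ("gxx", "gxz", "gzz")}   # hypothetical choice of components
    noisy = {name: data + rng.normal(0.0, 5.0, n_points)   # 5 Eotvos noise
             for name, data in components.items()}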
Model:
    ● 11 prisms
    ● 4 outcropping
    ● Strongly interfering gravitational effects
    ● What if we are only interested in some of the sources (the targets)?
    ● A common scenario
    ● We may not have prior information about the non-targeted sources:
        ● density contrast
        ● approximate depth
    ● No way to provide seeds for them
    ● Difficult to isolate the effect of the targets
Robust procedure:
    ● Seeds only for the targets
    ● ℓ1-norm misfit to “ignore” the non-targeted sources
Inversion:  ● 13 seeds  ● 7,803 data  ● 37,500 prisms
    (Only prisms with zero density contrast are not shown)
    ● Recovers the shape of the targets
    ● Total time = 2.2 minutes (on a laptop)
    (Figure: predicted data in contours, compared with the effect of the true targeted sources)
Real Data


         
Data:
    ● 3 components
    ● FTG survey
    ● Quadrilátero Ferrífero, Brazil

Targets:
    ● Iron ore bodies
    ● BIFs of the Cauê Formation
Seeds for the iron ore:
    ● Density contrast: 1.0 g/cm³
    ● Depth: 200 m
Inversion:  ● 46 seeds  ● 13,746 data  ● 164,892 prisms
    (Figure: observed and predicted data)
    (Only prisms with zero density contrast are not shown)
    ● Agrees with previous interpretations (Martinez et al., 2010)
    ● Total time = 14 minutes (on a laptop)
Conclusions


          
Conclusions
    ● New 3D gravity gradient inversion
    ● Multiple sources
    ● Interfering gravitational effects
    ● Non-targeted sources
    ● No matrix multiplications
    ● No linear systems
    ● Lazy evaluation of the Jacobian matrix
Conclusions
    ● Estimates geometry
    ● Given density contrasts
    ● Ideal for:
        ● sharp contacts
        ● well-constrained physical properties:
            – ore bodies
            – intrusive rocks
            – salt domes
Thank you


         

 
A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...
 
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdfFIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
 
RESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for studentsRESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for students
 

Robust 3D gravity gradient inversion by planting anomalous densities

  • 1. Robust 3D gravity gradient inversion by planting anomalous densities Leonardo Uieda Valéria C. F. Barbosa Observatório Nacional September, 2011    
  • 3. Outline Forward Problem    
  • 4. Outline Forward Problem Inverse Problem    
  • 5. Outline Forward Problem Inverse Problem Planting Algorithm Inspired by René (1986)    
  • 6. Outline Forward Problem Inverse Problem Planting Algorithm Inspired by René (1986) Synthetic Data    
  • 7. Outline Forward Problem Inverse Problem Planting Algorithm Inspired by René (1986) Synthetic Data Real Data Quadrilátero Ferrífero, Brazil    
  • 9. Observed $g^{\alpha\beta}$
  • 10. Observed $g^{\alpha\beta}$
  • 11. Observed $g^{\alpha\beta}$: anomalous density
  • 12. Observed $g^{\alpha\beta}$: we want to model this anomalous density
  • 14. Interpretative model: right rectangular prisms, each with density contrast $\Delta\rho = p_j$
  • 15. Prisms with $p_j = 0$ not shown
  • 16. $\mathbf{p} = [p_1, p_2, \dots, p_M]^T$. Prisms with $p_j = 0$ not shown
  • 17. Predicted $g^{\alpha\beta}$: $d^{\alpha\beta}$, with $\mathbf{p} = [p_1, p_2, \dots, p_M]^T$. Prisms with $p_j = 0$ not shown
  • 18. Predicted $g^{\alpha\beta}$: $d^{\alpha\beta} = \sum_{j=1}^{M} p_j a_j^{\alpha\beta}$. Prisms with $p_j = 0$ not shown
  • 19. Predicted $g^{\alpha\beta}$: $d^{\alpha\beta} = \sum_{j=1}^{M} p_j a_j^{\alpha\beta}$, where $a_j^{\alpha\beta}$ is the contribution of the jth prism. Prisms with $p_j = 0$ not shown
  • 20. More components: $d^{xx}, d^{xy}, d^{xz}, d^{yy}, d^{yz}, d^{zz}$
  • 21. More components: $d^{xx}, d^{xy}, d^{xz}, d^{yy}, d^{yz}, d^{zz}$
  • 22. More components, stacked into a single data vector: $\mathbf{d} = \sum_{j=1}^{M} p_j \mathbf{a}_j$
  • 23. More components: $\mathbf{d} = \sum_{j=1}^{M} p_j \mathbf{a}_j = \mathbf{A}\mathbf{p}$
  • 24. More components: $\mathbf{d} = \mathbf{A}\mathbf{p}$, where $\mathbf{A}$ is the Jacobian (sensitivity) matrix
  • 25. More components: $\mathbf{d} = \sum_{j=1}^{M} p_j \mathbf{a}_j$, where $\mathbf{a}_j$ is the jth column vector of $\mathbf{A}$
  • 26. Forward problem: given $\mathbf{p}$, compute $\mathbf{d} = \sum_{j=1}^{M} p_j \mathbf{a}_j$
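Because the forward problem is linear in the density contrasts, the predicted data are just a weighted sum of the per-prism effects $\mathbf{a}_j$. The following is a minimal NumPy sketch of $\mathbf{d} = \mathbf{A}\mathbf{p}$; the random sensitivity values are placeholders standing in for the analytic gravity-gradient effect of each right rectangular prism, which is not reproduced here.

```python
import numpy as np

def predicted_data(sensitivity, p):
    """Forward problem: d = sum_j p_j * a_j = A @ p.

    sensitivity : (N, M) array whose jth column is a_j, the effect of the
                  jth prism with unit density contrast at the N data points
    p           : (M,) array of density contrasts
    """
    return sensitivity @ p

# Toy example: the random A stands in for the analytic prism effects.
rng = np.random.default_rng(seed=0)
N, M = 7803, 100
A = rng.normal(size=(N, M))
p = np.zeros(M)
p[:3] = 1000.0            # only a few prisms carry a non-zero density contrast
d = predicted_data(A, p)
print(d.shape)            # (7803,)
```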
  • 27. Inverse problem: estimate $\hat{\mathbf{p}}$ from the observed data $\mathbf{g}$
  • 29. Minimize the difference between $\mathbf{g}$ and $\mathbf{d}$. Residual vector: $\mathbf{r} = \mathbf{g} - \mathbf{d}$
  • 30. Data-misfit function: $\phi(\mathbf{p}) = \|\mathbf{r}\|_2 = \left(\sum_{i=1}^{N} (g_i - d_i)^2\right)^{1/2}$
  • 31. Data-misfit function: $\phi(\mathbf{p}) = \|\mathbf{r}\|_2$, the $\ell_2$-norm of $\mathbf{r}$
  • 32. Data-misfit function: $\phi(\mathbf{p}) = \|\mathbf{r}\|_2$, the $\ell_2$-norm of $\mathbf{r}$: least-squares fit
  • 33. Alternative data-misfit function: $\phi(\mathbf{p}) = \|\mathbf{r}\|_1 = \sum_{i=1}^{N} |g_i - d_i|$, the $\ell_1$-norm of $\mathbf{r}$
  • 34. $\ell_2$-norm of $\mathbf{r}$: least-squares fit. $\ell_1$-norm of $\mathbf{r}$: robust fit
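The two misfit choices differ only in the norm of the residual vector; the $\ell_1$ version grows linearly with large residuals, which is what makes it less sensitive to signal the model does not try to explain. A minimal sketch, assuming g and d are NumPy arrays of the same length:

```python
import numpy as np

def l2_misfit(g, d):
    """phi(p) = ||r||_2: Euclidean norm of r = g - d (least-squares fit)."""
    r = g - d
    return np.sqrt(np.sum(r ** 2))

def l1_misfit(g, d):
    """phi(p) = ||r||_1: sum of absolute residuals (robust fit)."""
    r = g - d
    return np.sum(np.abs(r))

# With residuals r = (0.1, 0.1, 0.1, 5.0) the outlier contributes 25.0 to the
# sum of squares but only 5.0 to the sum of absolute values, so it dominates
# the least-squares fit far more than the robust fit.
```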
  • 35. Ill-posed problem: solutions may be non-existent, non-unique, and unstable
  • 36. Constraints turn the ill-posed problem into a well-posed one
  • 37. With constraints, the ill-posed problem (non-existent, non-unique, unstable solutions) becomes a well-posed problem (a solution exists, is unique, and is stable)
  • 38. Constraints: 1. Compact
  • 39. Constraints: 1. Compact (no holes inside)
  • 40. Constraints: 1. Compact (no holes inside); 2. Concentrated around "seeds"
  • 41. Constraints: 1. Compact; 2. Concentrated around "seeds": user-specified prisms with given density contrasts $\rho_s$ (any number of different density contrasts)
  • 42. Constraints: 1. Compact; 2. Concentrated around "seeds"; 3. Only $p_j = 0$ or $p_j = \rho_s$
  • 43. Constraints: 1. Compact; 2. Concentrated around "seeds"; 3. Only $p_j = 0$ or $p_j = \rho_s$; 4. $p_j = \rho_s$ of the closest seed
  • 44. Well-posed problem: minimize the goal function $\Gamma(\mathbf{p}) = \phi(\mathbf{p}) + \mu\,\theta(\mathbf{p})$
  • 45. $\phi(\mathbf{p})$ is the data-misfit function
  • 46. $\mu$ is the regularizing parameter (trade-off between fit and regularization)
  • 47. $\theta(\mathbf{p})$ is the regularizing function: $\theta(\mathbf{p}) = \sum_{j=1}^{M} \frac{p_j}{p_j + \epsilon}\, l_j^{\beta}$
  • 48. The regularizing function is similar to Silva Dias et al. (2009)
  • 49. $l_j$ is the distance between the jth prism and the seed
  • 50. The regularizing function imposes compactness and concentration around the seeds
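Putting the pieces together, the goal function adds the regularizer to the data misfit. Below is a minimal sketch assuming an $\ell_1$ data misfit; the use of $|p_j|$ in the indicator term (so negative density contrasts also count) and the default values for mu, beta, and eps are assumptions made for illustration only.

```python
import numpy as np

def regularizer(p, dist_to_seed, beta=3.0, eps=1e-10):
    """theta(p) = sum_j [p_j / (p_j + eps)] * l_j**beta.

    dist_to_seed[j] is l_j, the distance between the jth prism and the seed
    it grew from.  The fraction is ~1 for prisms inside the body and 0 for
    prisms with p_j = 0, so theta penalizes bodies that spread far from
    their seeds (compactness + concentration around the seeds).
    """
    indicator = np.abs(p) / (np.abs(p) + eps)
    return np.sum(indicator * dist_to_seed ** beta)

def goal_function(g, d, p, dist_to_seed, mu=1.0, beta=3.0):
    """Gamma(p) = phi(p) + mu * theta(p), here with an l1 data misfit."""
    phi = np.sum(np.abs(g - d))
    return phi + mu * regularizer(p, dist_to_seed, beta)
```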
  • 51. Constraints 1 (compact) and 2 (concentrated around seeds) are imposed by the regularization
  • 52. Constraints 3 (only $p_j = 0$ or $p_j = \rho_s$) and 4 ($p_j = \rho_s$ of the closest seed) are imposed by the algorithm, which is based on René (1986)
  • 54. Setup: $\mathbf{g}$ = observed data
  • 55. Setup: define the interpretative model
  • 56. Setup: set all parameters to zero
  • 57. Setup: choose $N_S$ seeds
  • 58. Setup: include the seeds in the model (prisms with $p_j = 0$ not shown)
  • 59. Setup: $\mathbf{d}$ = predicted data; compute the initial residuals $\mathbf{r}^{(0)} = \mathbf{g} - \mathbf{d}^{(0)}$
  • 60. Setup: $\mathbf{d}^{(0)}$ is predicted by the seeds alone
  • 61. Setup: $\mathbf{r}^{(0)} = \mathbf{g} - \sum_{s=1}^{N_S} \rho_s \mathbf{a}_{j_s}$
  • 62. Setup: find the neighbors of the seeds
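A sketch of the setup step under the assumptions of the slides above: the parameters start at zero except for the seed prisms, and the initial residual is the observed data minus the effect of the seeds. The callable effect_of_prism is a hypothetical stand-in for the unit-density-contrast effect $\mathbf{a}_j$ of one prism; finding the mesh neighbors of each seed is a bookkeeping step left out here.

```python
import numpy as np

def setup(g, seeds, effect_of_prism):
    """Initial state of the planting algorithm.

    g               : (N,) observed data
    seeds           : list of (prism_index, density_contrast) chosen by the user
    effect_of_prism : callable j -> a_j, the (N,) effect of prism j with
                      unit density contrast, evaluated on demand
    """
    p = {}                                   # sparse: only non-zero p_j are stored
    r = np.asarray(g, dtype=float).copy()
    for j, rho in seeds:
        p[j] = rho
        r -= rho * effect_of_prism(j)        # r(0) = g - sum_s rho_s * a_{j_s}
    return p, r
```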
  • 63. Growth: try accretion to the sth seed (prisms with $p_j = 0$ not shown)
  • 64. Growth: choose a neighbor that 1. reduces the data misfit and 2. gives the smallest goal function
  • 65. Growth: $j$ = chosen neighbor; set $p_j = \rho_s$ (new element of the body)
  • 66. Growth: update the residuals, $\mathbf{r}^{(\text{new})} = \mathbf{r}^{(\text{old})} - p_j \mathbf{a}_j$
  • 67. Growth: $p_j \mathbf{a}_j$ is the contribution of the accreted prism $j$
  • 68. Growth: if no suitable neighbor is found, there is no accretion for that seed, so the bodies can have different sizes
  • 69. Growth: repeat the accretion attempt for each of the $N_S$ seeds
  • 70. Growth: repeat the accretion attempt for each of the $N_S$ seeds
  • 71. Growth: did at least one seed grow?
  • 72. Growth: did at least one seed grow? Yes: run another growth iteration
  • 73. Growth: did at least one seed grow? No: done! (Prisms with $p_j = 0$ not shown)
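The growth iteration below is a sketch of the accretion rule stated on these slides: for each seed, test its current neighbors, keep only those that reduce the data misfit, accrete the one with the smallest goal function, and update the residual with a single vector subtraction. effect_of_prism and goal are the hypothetical helpers from the previous sketches; the bookkeeping that expands a seed's neighbor list after an accretion is only indicated by a comment.

```python
import numpy as np

def grow_once(r, p, seeds, neighbors, effect_of_prism, goal):
    """Try one accretion per seed; return the updated residual and a flag.

    neighbors[s] : set of prism indices adjacent to the body grown from seed s
    goal(r, p)   : goal function Gamma evaluated for a candidate (r, p) state
    """
    grew = False
    for s, (_, rho) in enumerate(seeds):
        misfit_now = np.sum(np.abs(r))
        best = None                                  # (goal value, j, candidate r)
        for j in neighbors[s]:
            a_j = effect_of_prism(j)                 # Jacobian column, on demand
            r_cand = r - rho * a_j                   # r_new = r_old - p_j * a_j
            if np.sum(np.abs(r_cand)) >= misfit_now:
                continue                             # must reduce the data misfit
            cand = goal(r_cand, {**p, j: rho})
            if best is None or cand < best[0]:
                best = (cand, j, r_cand)
        if best is not None:                         # accretion: p_j = rho_s
            _, j, r = best
            p[j] = rho
            neighbors[s].discard(j)
            # untested prisms adjacent to j would be added to neighbors[s] here
            grew = True
    return r, grew

# Outer loop: call grow_once repeatedly and stop when it returns grew = False,
# i.e. when no seed could accrete another prism.
```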
  • 74. Advantages: compact and non-smooth solutions; any number of sources; any number of different density contrasts; no large equation system; search limited to the neighbors
  • 75. Remember the equations: initial residual $\mathbf{r}^{(0)} = \mathbf{g} - \sum_{s=1}^{N_S} \rho_s \mathbf{a}_{j_s}$; residual update $\mathbf{r}^{(\text{new})} = \mathbf{r}^{(\text{old})} - p_j \mathbf{a}_j$
  • 76. No matrix multiplication is needed, only vector addition
  • 77. Only some columns of $\mathbf{A}$ are ever needed
  • 78. Each column is calculated only when needed
  • 79. Each column is calculated only when needed and deleted after the residual update
  • 80. Lazy evaluation of the Jacobian
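Because the residual is kept up to date with vector subtractions alone, the full N x M Jacobian never has to exist in memory: each column is generated when a neighbor is tested and can be discarded right after the update. A minimal sketch of that idea; single_prism_effect is a hypothetical stand-in for the analytic effect of one right rectangular prism with unit density contrast.

```python
def make_effect_of_prism(prisms, data_points, single_prism_effect):
    """Return a callable j -> a_j that builds one Jacobian column on demand.

    Nothing is cached: a column lives only for the duration of the residual
    update r_new = r_old - p_j * a_j, so memory use stays at a few vectors
    of length N instead of an N x M matrix.
    """
    def effect_of_prism(j):
        return single_prism_effect(prisms[j], data_points)
    return effect_of_prism
```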
  • 81. Advantages: compact and non-smooth solutions; any number of sources; any number of different density contrasts; no large equation system; search limited to the neighbors
  • 82. Advantages (continued): no matrix multiplication (only vector addition); lazy evaluation of the Jacobian
  • 83. Advantages (continued): fast inversion with low memory usage
  • 85. Data set: 3 components; 51 x 51 points; 2,601 points per component; 7,803 measurements; 5 Eötvös noise
  • 86. Model:
  • 87. Model: 11 prisms
  • 88. Model: 11 prisms, 4 outcropping
  • 89. Model: 11 prisms, 4 outcropping
  • 90. Model: 11 prisms, 4 outcropping
  • 92. Strongly interfering effects. What if we are only interested in these?
  • 93. A common scenario
  • 94. A common scenario: we may not have prior information, such as the density contrast or the approximate depth
  • 95. A common scenario: without prior information (density contrast, approximate depth) there is no way to provide seeds
  • 96. A common scenario: without prior information (density contrast, approximate depth) there is no way to provide seeds, and it is difficult to isolate the effect of the targets
  • 98. Robust procedure: seeds only for the targets
  • 99. Robust procedure: seeds only for the targets
  • 100. Robust procedure: seeds only for the targets; $\ell_1$-norm misfit to "ignore" the non-targeted sources
  • 101. Robust procedure: seeds only for the targets; $\ell_1$-norm misfit to "ignore" the non-targeted sources
  • 107. Inversion: 13 seeds; 7,803 data; 37,500 prisms
  • 108. Inversion: 13 seeds; 7,803 data; 37,500 prisms (prisms with zero density contrast not shown)
  • 109. Inversion: 13 seeds; 7,803 data; 37,500 prisms (prisms with zero density contrast not shown)
  • 110. Inversion: 13 seeds; 7,803 data; 37,500 prisms (prisms with zero density contrast not shown)
  • 111. Inversion: 13 seeds; 7,803 data; 37,500 prisms (prisms with zero density contrast not shown)
  • 112. Inversion: 13 seeds; 7,803 data; 37,500 prisms (prisms with zero density contrast not shown)
  • 113. Inversion recovers the shape of the targets (prisms with zero density contrast not shown)
  • 114. Inversion recovers the shape of the targets; total time = 2.2 minutes (on a laptop)
  • 115. Inversion: 13 seeds; 7,803 data; 37,500 prisms; predicted data shown in contours
  • 116. Inversion: 13 seeds; 7,803 data; 37,500 prisms; predicted data shown in contours; effect of the true targeted sources
  • 118. Data: 3 components; FTG survey; Quadrilátero Ferrífero, Brazil
  • 119. Data: 3 components, FTG survey, Quadrilátero Ferrífero, Brazil. Targets: iron ore bodies in the BIFs of the Cauê Formation
  • 120. Data: 3 components, FTG survey, Quadrilátero Ferrífero, Brazil. Targets: iron ore bodies in the BIFs of the Cauê Formation
  • 121. Seeds for the iron ore: density contrast 1.0 g/cm³, depth 200 m
  • 122. Inversion: 46 seeds; 13,746 data. Observed and predicted data
  • 123. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 124. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 125. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 126. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 127. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 128. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 129. Inversion: 46 seeds; 13,746 data; 164,892 prisms (prisms with zero density contrast not shown)
  • 130. Inversion: 46 seeds; 13,746 data; 164,892 prisms (prisms with zero density contrast not shown)
  • 131. Inversion: 46 seeds; 13,746 data; 164,892 prisms (prisms with zero density contrast not shown)
  • 132. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 133. Inversion: 46 seeds; 13,746 data; 164,892 prisms
  • 134. Results agree with previous interpretations (Martinez et al., 2010)
  • 135. Results agree with previous interpretations (Martinez et al., 2010); total time = 14 minutes (on a laptop)
  • 137. Conclusions: a new 3D gravity gradient inversion that handles multiple sources, interfering gravitational effects, and non-targeted sources, with no matrix multiplications, no linear systems, and lazy evaluation of the Jacobian matrix
  • 138. Conclusions: estimates the geometry of the sources for given density contrasts; ideal for sharp contacts and well-constrained physical properties, such as ore bodies, intrusive rocks, and salt domes