Fishing for Errors:
Extending the Treatment of Errors in the Fisher Matrix Formalism

Moumita Aich
Mohammed El-Mufti
Eli Kasai
Brian Nord
Marina Seikel
Sahba Yahya
Alan Heavens
Bruce Bassett

12 April 2012
Fisher Power!

The Fisher Matrix forecasts astronomical constraints for model
parameters: it can be used to predict confidence contours on
cosmological parameters.

[Figure: confidence contours in the (θA, θB) plane; 68% confidence
that the parameters lie within the blue (dashed) contour, 99% within
the outer contour.]

Ingredients required:
  - a parametrized, physical model for an observable [ y = f(x; θ) ]
  - a set of errors in the observable [ σy ]

Example: Baryon Acoustic Oscillations [e.g., BigBOSS]
  - Model and parameters: dA(z; H0, Ωk) and H(z; H0, Ωk, Ωm)
  - a set of errors in the observable variables: σ_dA, σ_H

General Formalism:

Definition:

  F_{AB} = \left\langle -\frac{\partial^2 \ln L}{\partial\theta_A\, \partial\theta_B} \right\rangle

For a model y = f(x; \theta) with covariance of the data-variable errors
C = \mathrm{diag}(\sigma^2_{y_1}, \dots, \sigma^2_{y_N}), this evaluates to

  F_{AB} = \frac{\partial f}{\partial\theta_A}^T C^{-1} \frac{\partial f}{\partial\theta_B}
           + \frac{1}{2}\,\mathrm{Tr}\left[ C^{-1} \frac{\partial C}{\partial\theta_A}\, C^{-1} \frac{\partial C}{\partial\theta_B} \right]
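As a quick numerical illustration (not from the slides), the canonical Fisher forecast can be sketched for a toy Hubble-rate model H(z; H0, Ωm) with a diagonal, parameter-independent covariance, so only the derivative term contributes. The model form, fiducial values, redshifts, and error bars below are all illustrative assumptions.

```python
import numpy as np

def model(z, theta):
    # Toy flat-universe Hubble rate H(z; H0, Om) -- an assumed example model
    H0, Om = theta
    return H0 * np.sqrt(Om * (1 + z)**3 + (1 - Om))

def fisher(z, theta, sigma_y, eps=1e-6):
    C_inv = np.diag(1.0 / sigma_y**2)            # C = diag(sigma_y^2)
    # central-difference derivatives df/dtheta_A at the fiducial parameters
    derivs = []
    for A in range(len(theta)):
        t_plus, t_minus = np.array(theta, float), np.array(theta, float)
        t_plus[A] += eps
        t_minus[A] -= eps
        derivs.append((model(z, t_plus) - model(z, t_minus)) / (2 * eps))
    D = np.array(derivs)                         # shape (n_params, N)
    return D @ C_inv @ D.T                       # F_AB = (df/dA)^T C^-1 (df/dB)

z = np.linspace(0.1, 1.0, 10)                    # assumed redshift sampling
theta_fid = (70.0, 0.3)                          # assumed fiducial (H0, Om)
F = fisher(z, theta_fid, sigma_y=2.0 * np.ones_like(z))
cov = np.linalg.inv(F)                           # forecast parameter covariance
print(np.sqrt(np.diag(cov)))                     # 1-sigma forecasts on (H0, Om)
```

Inverting F gives the forecast parameter covariance; its diagonal yields the marginalized 1σ constraints that the contours on the previous slide visualize.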

                     This work focuses on the covariance matrix, C
Primary Goals and Questions


    •   Will the errors in the independent variable [e.g.,
        redshift] impact predicted constraints on model
        parameters?

    •   What is the impact of the dependent and
        independent variables being correlated?

    •   Can we account for multi-peaked distributions in the
        independent variable (e.g., double-peaked distributions
        for photometric redshifts)? [Next time]
The covariance structures considered, in increasing generality:

  C = \mathrm{diag}(\sigma^2_{y_1}, \dots, \sigma^2_{y_N})
      [errors in the dependent variable only]

  C = \begin{pmatrix} \sigma^2_{xx} & 0 \\ 0 & \sigma^2_{yy} \end{pmatrix}
      [independent errors in x and y]

  C = \begin{pmatrix} \sigma^2_{xx} & \sigma^2_{xy} \\ \sigma^2_{yx} & \sigma^2_{yy} \end{pmatrix}
      [correlated errors in x and y]
Outline



•   The motivation: enhance FM predictions with more
    comprehensive error accounting.

•   General approach: Derive FM from scratch.

•   Introducing Covariances in Observables.
Re-Derive FM From First Principles

Setup [Observables]
   Measured observables:          {X_i}, {Y_i};  i = 1, ..., N
   True values of observables:    {x_i}, {y_i};  i = 1, ..., N

Setup [Model]
   Errors: X and Y are Gaussian-distributed about the true values, x and y, respectively.
   Mean model: x and y are related by  y = f(x)
Calculate Likelihood

  L(\theta) = p(X, Y \mid \theta) \propto p(\theta \mid X, Y)        (likelihood of the parameters via Bayes' theorem)

  L = \int p(X, Y \mid x, y, \theta)\, p(y \mid x, \theta)\, p(x \mid \theta)\, d^N x\, d^N y        (unpack the conditional probabilities)

Let y always be a function of x:

  p(y \mid x, \theta) = \delta(y - f(x))

Assume a uniform distribution for x:

  x_i \sim U = p(x_i \mid \theta)
  \Rightarrow L = \int p(X, Y \mid x, f, \theta)\, d^N x\, d^N y
Calculate Likelihood (II.)

  L = \int p(X, Y \mid x, f, \theta)\, d^N x\, d^N y

The distributions in X and Y are Normal, but the integral is not
generally analytically soluble. Therefore, we Taylor-expand the model,
assuming it is linear across the width of the Gaussian distribution of
x and retaining only the linear term:

  f(x_i) \rightarrow f^*(x_i) = f(X_i) + (x_i - X_i) f'(X_i)
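The quality of this linearization is easy to eyeball numerically. The sketch below (the model f(x) = x² and the values of X and σ_x are arbitrary assumptions, not from the slides) compares f* to f across ±3σ_x of a measured point:

```python
import numpy as np

f = lambda x: x**2                        # toy nonlinear model (assumption)
fprime = lambda x: 2 * x

Xi, sig_x = 2.0, 0.05                     # assumed measured value and error
x = np.linspace(Xi - 3 * sig_x, Xi + 3 * sig_x, 101)

# f*(x) = f(X) + (x - X) f'(X): the linear expansion about the measurement
f_star = f(Xi) + (x - Xi) * fprime(Xi)

# Worst-case relative error of the linearization across +/- 3 sigma_x
rel_err = np.max(np.abs(f_star - f(x)) / np.abs(f(x)))
print(rel_err)                            # small when sig_x^2 f'' << f
```

The approximation is excellent when the curvature of f is small over a few σ_x, which is the regime the slides assume.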

Let Z be a 2N-dimensional vector containing both measured and true observables:

  \{z_i, Z_i\} = \{x_i, X_i\}     for i \le N
  \{z_i, Z_i\} = \{f^*_i, Y_i\}   for i > N

This provides the canonical form of the multivariate normal distribution:

  \Rightarrow L \propto \int \frac{1}{\sqrt{\det C}}\, \exp\left[-\frac{1}{2}\,(Z - z)^T C^{-1} (Z - z)\right]

With {z, Z} as vectors, the covariance matrix can be written in block form:

  C = \begin{pmatrix} C_{XX} & C_{XY} \\ C_{XY}^T & C_{YY} \end{pmatrix}
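This block structure is straightforward to assemble in practice. A minimal sketch (the N and the variance values are arbitrary assumptions):

```python
import numpy as np

N = 3                                     # small example size (assumption)
C_XX = 0.2**2 * np.eye(N)                 # x-error covariance
C_YY = 0.5**2 * np.eye(N)                 # y-error covariance
C_XY = 0.01 * np.eye(N)                   # x-y cross-covariance

# Assemble the full 2N x 2N covariance in the block form above
C = np.block([[C_XX,   C_XY],
              [C_XY.T, C_YY]])
print(C.shape)                            # (6, 6)
```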

Notice that this form natively contains covariance among the X's,
among the Y's, and between the X's and Y's.
Calculate Likelihood (III.)

  L \propto \int \frac{1}{\sqrt{\det C}}\, \exp\left[-\frac{1}{2}\,(Z - z)^T C^{-1} (Z - z)\right]

Evaluating the exponent and simplifying,

  \Rightarrow L \propto \frac{1}{\sqrt{\det R}}\, \exp\left[-\frac{1}{2}\, \tilde{Y}^T R^{-1} \tilde{Y}\right]

where R is a function of the C_{ij} and f^*:

  R = C_{YY} + C_{XY}^T\, T + T\, C_{XY} + T\, C_{XX}\, T

  T = \mathrm{diag}\left(\left.\frac{df(x)}{dx}\right|_{x=X}\right)




Result:   Where the x data are irrelevant,
          i.e., when the derivatives of f are zero,
          or when C_XX (or σx) = 0,
          we recover the original form: R → C = C_YY.
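This limit, and the structure of R itself, can be checked numerically. The sketch below (the linear model, error bars, and sample size are arbitrary assumptions) evaluates R for uncorrelated, diagonal errors, verifies that σ_x → 0 recovers C_YY, and confirms by Monte Carlo that the scatter of Y − f(X) matches diag(R):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
X = np.linspace(1.0, 5.0, N)                 # measured x values (assumption)
sig_x, sig_y = 0.2, 0.5                      # assumed error bars

f = lambda x: 3.0 * x + 1.0                  # toy linear model, f'(x) = 3
T = np.diag(np.full(N, 3.0))                 # T = diag(df/dx at x = X)

C_XX = sig_x**2 * np.eye(N)
C_YY = sig_y**2 * np.eye(N)
C_XY = np.zeros((N, N))                      # no x-y correlation here

# R = C_YY + C_XY^T T + T C_XY + T C_XX T   (the slide's expression)
R = C_YY + C_XY.T @ T + T @ C_XY + T @ C_XX @ T

# Limit check: with sigma_x = 0 every extra term vanishes and R -> C_YY
R_limit = C_YY + C_XY.T @ T + T @ C_XY + T @ (0.0 * C_XX) @ T
print(np.allclose(R_limit, C_YY))            # True (C_XY = 0 in this example)

# Monte Carlo check: the scatter of Y - f(X) should match diag(R)
x_true = rng.normal(X, sig_x, size=(200_000, N))
Y = f(x_true) + rng.normal(0.0, sig_y, size=(200_000, N))
print(np.var(Y - f(X), axis=0))              # ~ sig_y^2 + 9 sig_x^2 = 0.61 each
print(np.diag(R))
```

For this uncorrelated case R is simply the y covariance inflated by the x errors propagated through the slope, diag(σ_y² + f'(X)² σ_x²).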
Main Result:

With the help of Tegmark, Taylor and Heavens (1997), R then takes the place of the
covariance, C, in the canonical formulation:

  F_{AB} = \frac{\partial f}{\partial\theta_A}^T R^{-1} \frac{\partial f}{\partial\theta_B}
           + \frac{1}{2}\,\mathrm{Tr}\left[ R^{-1} \frac{\partial R}{\partial\theta_A}\, R^{-1} \frac{\partial R}{\partial\theta_B} \right]




Even if C does not depend on the parameters, R does depend on
them [via f]. The trace term is in general non-zero.
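The extended forecast can be sketched end to end. In the toy below (a straight-line model y = a·x + b with diagonal, uncorrelated errors; all values are illustrative assumptions, not from the slides), R = C_YY + T C_XX T with T = a·I, only ∂R/∂a is non-zero, and the forecast with x errors is compared to the classic σ_x = 0 case:

```python
import numpy as np

X = np.linspace(0.0, 1.0, 20)                # assumed x sampling
N = len(X)
a_fid, b_fid = 2.0, 1.0                      # assumed fiducial parameters
sig_y = 0.1

def fisher(sig_x):
    # For this model: T = a*I, C_XY = 0, so R = (sig_y^2 + a^2 sig_x^2) I
    R = (sig_y**2 + a_fid**2 * sig_x**2) * np.eye(N)
    R_inv = np.linalg.inv(R)
    D = np.vstack([X, np.ones(N)])           # df/da = x, df/db = 1
    F = D @ R_inv @ D.T                      # derivative term
    # Trace term: R depends on a through f', so dR/da = 2 a sig_x^2 I
    dR_da = 2 * a_fid * sig_x**2 * np.eye(N)
    F[0, 0] += 0.5 * np.trace(R_inv @ dR_da @ R_inv @ dR_da)
    return F

F_no_x = fisher(sig_x=0.0)                   # classic Fisher: R -> C_YY
F_with = fisher(sig_x=0.05)                  # extended Fisher with x errors
err_no_x = np.sqrt(np.diag(np.linalg.inv(F_no_x)))
err_with = np.sqrt(np.diag(np.linalg.inv(F_with)))
print(err_no_x, err_with)                    # x errors loosen both forecasts
```

As expected, ignoring the x errors overstates the constraints: the forecast 1σ intervals on both a and b widen once C_XX enters through R.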
Summary: An Extended Formalism
•   Development of a general process for evaluating arbitrary
    model functions to first order in the FM formalism.

•   Incorporation of correlated errors among observables.




                     Next Steps
     •   Application: Double-peaked Error distributions

     •   Check: Compare to MCMC

     •   Cosmological Application

     •   Incorporate into Fisher4Cast

FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Albert Hoitingh
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
Kari Kakkonen
 
A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...
sonjaschweigert1
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
Ana-Maria Mihalceanu
 
Free Complete Python - A step towards Data Science
Free Complete Python - A step towards Data ScienceFree Complete Python - A step towards Data Science
Free Complete Python - A step towards Data Science
RinaMondal9
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
Neo4j
 
20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
Matthew Sinclair
 
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdfFIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance
 
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
James Anderson
 
Essentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FMEEssentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FME
Safe Software
 
Elizabeth Buie - Older adults: Are we really designing for our future selves?
Elizabeth Buie - Older adults: Are we really designing for our future selves?Elizabeth Buie - Older adults: Are we really designing for our future selves?
Elizabeth Buie - Older adults: Are we really designing for our future selves?
Nexer Digital
 
FIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance Osaka Seminar: Overview.pdfFIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance
 

Recently uploaded (20)

GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
 
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdfFIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
 
Communications Mining Series - Zero to Hero - Session 1
Communications Mining Series - Zero to Hero - Session 1Communications Mining Series - Zero to Hero - Session 1
Communications Mining Series - Zero to Hero - Session 1
 
20240605 QFM017 Machine Intelligence Reading List May 2024
20240605 QFM017 Machine Intelligence Reading List May 202420240605 QFM017 Machine Intelligence Reading List May 2024
20240605 QFM017 Machine Intelligence Reading List May 2024
 
PCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase TeamPCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase Team
 
Video Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the FutureVideo Streaming: Then, Now, and in the Future
Video Streaming: Then, Now, and in the Future
 
UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4UiPath Test Automation using UiPath Test Suite series, part 4
UiPath Test Automation using UiPath Test Suite series, part 4
 
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdfFIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
FIDO Alliance Osaka Seminar: Passkeys and the Road Ahead.pdf
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
 
A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
 
Free Complete Python - A step towards Data Science
Free Complete Python - A step towards Data ScienceFree Complete Python - A step towards Data Science
Free Complete Python - A step towards Data Science
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
 
20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
 
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdfFIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
FIDO Alliance Osaka Seminar: FIDO Security Aspects.pdf
 
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
 
Essentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FMEEssentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FME
 
Elizabeth Buie - Older adults: Are we really designing for our future selves?
Elizabeth Buie - Older adults: Are we really designing for our future selves?Elizabeth Buie - Older adults: Are we really designing for our future selves?
Elizabeth Buie - Older adults: Are we really designing for our future selves?
 
FIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance Osaka Seminar: Overview.pdfFIDO Alliance Osaka Seminar: Overview.pdf
FIDO Alliance Osaka Seminar: Overview.pdf
 

Fishermatrix extended ctics_reloaded

  • 1. Fishing for Errors: Extending the Treatment of Errors in the Fisher Matrix Formalism [Reloaded]
       Moumita Aich, Mohammed El-Mufti, Eli Kasai, Brian Nord, Marina Seikel, Sahba Yahya, Alan Heavens, Bruce Bassett
       12 April 2012
  • 2. Fisher Power! The Fisher Matrix forecasts astronomical constraints on model parameters: it can be used to predict confidence contours for cosmological parameters.
       [Figure: confidence contours in the (θA, θB) plane; 68% confidence that the parameters lie within the blue (dashed) contour, surrounded by a wider 99% confidence contour.]
       Ingredients required:
       - a parametrized, physical model for an observable [ y = f(x; θ) ]
       - a set of errors in the observable [ σy ]
       Example: Baryon Acoustic Oscillations [e.g., BigBOSS]
       - Model and parameters: dA(z; H0, Ωk) and H(z; H0, Ωk, Ωm)
       - a set of errors in the observable variables: σdA, σH
  • 3. Fisher Power! [build on slide 2] General Formalism:
       F_AB = ⟨ −∂² ln L / ∂θ_A ∂θ_B ⟩   [definition]
       F_AB = (∂f(x)/∂θ_A)ᵀ C⁻¹ (∂f(x)/∂θ_B) + (1/2) Tr[ C⁻¹ (∂C/∂θ_A) C⁻¹ (∂C/∂θ_B) ]
       with the model y = f(x; θ) and C the covariance of the data-variable errors, C = diag(σ_y1², ..., σ_yN²).
  • 4. Fisher Power! [build on slide 3] This work focuses on the covariance matrix, C.
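The canonical forecast described on these slides can be sketched numerically. The following is a minimal, illustrative example for a toy linear model with a diagonal data covariance; the model, parameter values, and error sizes are all assumptions for demonstration, not the BAO model from the slide.

```python
import numpy as np

# Toy model y = f(x; theta) with theta = (a, b); illustrative only.
def f(x, theta):
    a, b = theta
    return a * x + b

def fisher_matrix(x, theta, sigma_y, eps=1e-6):
    """F_AB = (df/dtheta_A)^T C^-1 (df/dtheta_B) for a diagonal, theta-independent C."""
    theta = np.asarray(theta, dtype=float)
    n_par = theta.size
    # Central-difference derivatives of the model w.r.t. each parameter
    derivs = np.empty((n_par, x.size))
    for a in range(n_par):
        dt = np.zeros(n_par); dt[a] = eps
        derivs[a] = (f(x, theta + dt) - f(x, theta - dt)) / (2 * eps)
    Cinv = np.diag(1.0 / sigma_y**2)
    return derivs @ Cinv @ derivs.T

x = np.linspace(0.1, 1.0, 10)
F = fisher_matrix(x, theta=(1.0, 0.5), sigma_y=0.1 * np.ones_like(x))
cov = np.linalg.inv(F)   # forecast parameter covariance; 1-sigma = sqrt(diag)
```

Inverting F gives the forecast parameter covariance from which the 68% and 99% contours of slide 2 would be drawn. The trace term of the general formula vanishes here because C does not depend on θ.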
  • 5. Primary Goals and Questions
       - Will the errors in the independent variable [e.g., redshift] impact predicted constraints on model parameters?
       - What is the impact of the dependent and independent variables being correlated?
       - Can we account for multi-peaked distributions in the independent variable? E.g., double-peaked distributions for photometric redshifts. [Next time]
  • 6. Primary Goals and Questions [build on slide 5, showing the covariance structure each question probes]
       - Errors in the dependent variable only: C = diag(σ_y1², ..., σ_yN²)
       - Uncorrelated errors in both variables: C = ( σ_xx²  0 ; 0  σ_yy² )
       - Correlated errors: C = ( σ_xx²  σ_xy² ; σ_yx²  σ_yy² )
  • 7. Outline
       - The motivation: enhance FM predictions with more comprehensive error accounting.
       - General approach: derive the FM from scratch.
       - Introducing covariances in observables.
  • 8. Re-Derive the FM From First Principles
       Setup [Observables]:
       - Measured observables: {Xi}, {Yi}; i = 1, ..., N
       - True values of the observables: {xi}, {yi}; i = 1, ..., N
       Setup [Model]:
       - Errors: X and Y are Gaussian-distributed about the true values, x and y, respectively.
       - Mean model: x and y are related by y = f(x).
  • 9. Re-Derive the FM From First Principles [build on slide 8]
       Calculate the likelihood:
       L(θ) = p(X, Y | θ) ∝ p(θ | X, Y)   (likelihood of the parameters via Bayes' theorem)
       L = ∫ p(X, Y | x, y, θ) p(y | x, θ) p(x | θ) dᴺx dᴺy   (unpack all the conditional probabilities)
       Let y always be a function of x: p(y | x, θ) = δ(y − f(x))
       Assume a uniform distribution for x: xi ~ U = p(xi | θ)
  • 10. Re-Derive the FM From First Principles (repeats slide 9)
  • 11. Re-Derive the FM From First Principles [build on slide 9]
       Substituting the delta function and the uniform prior:
       ⇒ L = ∫ p(X, Y | x, f, θ) dᴺx dᴺy
  • 12. Calculate Likelihood (II.)
       L = ∫ p(X, Y | x, f, θ) dᴺx dᴺy
       The distributions in X and Y are normal, but the integral is not generally analytically soluble. Therefore, we Taylor-expand the model, assuming it is linear across the width of the Gaussian distribution of x and retaining only linear terms:
       f(xi) → f*(xi) = f(Xi) + (xi − Xi) f′(Xi)
  • 13. Calculate Likelihood (II.) [build on slide 12]
       Let Z be a 2N-dimensional vector containing both measured and true observables:
       {zi, Zi} = {xi, Xi} for i ≤ N;   {zi, Zi} = {f*, Yi} for i > N
       This provides the canonical form of the multivariate normal distribution:
       ⇒ L ∝ (1/√det C) exp[ −(1/2) (Z − z)ᵀ C⁻¹ (Z − z) ]
       With {z, Z} as vectors, the covariance matrix can be written in block form:
       C = ( C_XX  C_XY ; C_XYᵀ  C_YY )
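The 2N × 2N block covariance above can be assembled directly. A small sketch with toy numbers (the error sizes and the cross-covariance value are assumptions for illustration):

```python
import numpy as np

# Assemble C = [[C_XX, C_XY], [C_XY^T, C_YY]] from its N x N blocks.
N = 3
C_XX = 0.05**2 * np.eye(N)     # covariance among the X's
C_YY = 0.10**2 * np.eye(N)     # covariance among the Y's
C_XY = 0.002 * np.eye(N)       # cross-covariance between X's and Y's (assumed)

C = np.block([[C_XX,   C_XY],
              [C_XY.T, C_YY]])   # shape (2N, 2N)

# A valid covariance must be symmetric and positive definite.
assert np.allclose(C, C.T)
assert np.all(np.linalg.eigvalsh(C) > 0)
```

As the next slide notes, this single matrix natively carries all three covariance types from slide 6.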
  • 14. Calculate Likelihood (II.) [build on slide 13] Notice that this form natively contains covariance among the X's, among the Y's, and between the X's and Y's.
  • 15. Calculate Likelihood (III.)
       L ∝ (1/√det C) exp[ −(1/2) (Z − z)ᵀ C⁻¹ (Z − z) ]
       Evaluating the exponent, marginalizing over the true values, and simplifying:
       ⇒ L ∝ (1/√det R) exp[ −(1/2) Ỹᵀ R⁻¹ Ỹ ]
       where Ỹ is the residual Y − f(X), and R is a function of Cij and f*:
       R = C_YY + C_XYᵀ T + T C_XY + T C_XX T,   T = diag( df(x)/dx |_{x=X} )
  • 16. Calculate Likelihood (III.) [build on slide 15]
       Result: where the x data are irrelevant, i.e., when the derivatives of f are zero, or when C_XX (σx) = 0, we recover the original form: R → C = C_YY.
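The modified covariance on this slide, and its limiting case, can be sketched numerically. Everything here is illustrative: the error sizes are toy values, the cross-covariance is set to zero, and f(x) = x² merely supplies a nonzero derivative.

```python
import numpy as np

# Sketch of R = C_YY + C_XY^T T + T C_XY + T C_XX T, with T = diag(f'(X)).
N = 4
sigma_x, sigma_y = 0.05, 0.1
C_XX = sigma_x**2 * np.eye(N)
C_YY = sigma_y**2 * np.eye(N)
C_XY = np.zeros((N, N))          # no x-y correlation in this example

def modified_covariance(fprime_at_X, C_XX, C_YY, C_XY):
    T = np.diag(fprime_at_X)
    return C_YY + C_XY.T @ T + T @ C_XY + T @ C_XX @ T

X = np.linspace(0.1, 1.0, N)
R = modified_covariance(2.0 * X, C_XX, C_YY, C_XY)   # f(x) = x^2, so f'(X) = 2X

# Limiting case from the slide: f' = 0 (or sigma_x = 0) recovers R -> C_YY.
R0 = modified_covariance(np.zeros(N), C_XX, C_YY, C_XY)
```

Each diagonal entry of R is σy² inflated by f′(Xi)² σx², i.e., the x error propagated through the slope of the model.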
  • 18. Main Result: with the help of Tegmark, Taylor and Heavens (1997), R then takes the place of the covariance, C, in the canonical formulation:
       F_AB = (∂f/∂θ_A)ᵀ R⁻¹ (∂f/∂θ_B) + (1/2) Tr[ R⁻¹ (∂R/∂θ_A) R⁻¹ (∂R/∂θ_B) ]
  • 19. Main Result [build on slide 18]: Even if C does not depend on the parameters, R does depend on them [via f]. The trace term is therefore in general non-zero.
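A numerical sketch of this extended Fisher matrix, including the now non-vanishing trace term. The model f(x; a, b) = a x² + b, the parameter values, and the error sizes are assumptions for illustration; this is not the authors' code or Fisher4Cast.

```python
import numpy as np

def f(x, theta):
    a, b = theta
    return a * x**2 + b          # toy model; df/dx = 2 a x

def R_of_theta(x, theta, sx, sy):
    """R for uncorrelated x and y errors: R = sy^2 I + T (sx^2 I) T, T = diag(f'(X))."""
    a, _ = theta
    T = np.diag(2.0 * a * x)
    return sy**2 * np.eye(x.size) + T @ (sx**2 * np.eye(x.size)) @ T

def extended_fisher(x, theta, sx, sy, eps=1e-6):
    """F_AB = (df/dθ_A)^T R^-1 (df/dθ_B) + 1/2 Tr[R^-1 dR/dθ_A R^-1 dR/dθ_B]."""
    theta = np.asarray(theta, dtype=float)
    n = theta.size
    Rinv = np.linalg.inv(R_of_theta(x, theta, sx, sy))
    dfd, dRd = [], []
    for a in range(n):                      # central-difference derivatives
        dt = np.zeros(n); dt[a] = eps
        dfd.append((f(x, theta + dt) - f(x, theta - dt)) / (2 * eps))
        dRd.append((R_of_theta(x, theta + dt, sx, sy)
                    - R_of_theta(x, theta - dt, sx, sy)) / (2 * eps))
    F = np.empty((n, n))
    for A in range(n):
        for B in range(n):
            F[A, B] = dfd[A] @ Rinv @ dfd[B] \
                      + 0.5 * np.trace(Rinv @ dRd[A] @ Rinv @ dRd[B])
    return F

x = np.linspace(0.1, 1.0, 10)
F = extended_fisher(x, theta=(1.0, 0.5), sx=0.05, sy=0.1)    # with x errors
F0 = extended_fisher(x, theta=(1.0, 0.5), sx=0.0, sy=0.1)    # sx = 0 limit
```

With sx = 0, R reduces to the constant C_YY, the trace term vanishes, and the standard forecast is recovered; switching on sx weakens the constraints, as expected.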
  • 20. Summary: An Extended Formalism
       - Development of a general process for evaluating arbitrary model functions to 1st order in the FM formalism.
       - Incorporation of correlated errors among observables.
       Next Steps:
       - Application: double-peaked error distributions
       - Check: compare to MCMC
       - Cosmological application
       - Incorporate into Fisher4Cast

Editor's Notes

  1. (no notes)
  2. Define the Fisher matrix and its pieces. Start the motivation: 1) why it's used; 2) why we want to modify it: note where the errors enter. Given a model, it propagates errors from data onto model parameter estimates.
  3. (repeats note 2)
  4. Mention Trotta and previous works, and propagation of error.
  5. (no notes)
  6. Step through this derivation, noting the key features.
  7. (repeats note 6)
  8. (repeats note 6)
  9. Key features: 1) the small variation over the interval allows for Taylor expansion.
  10. (repeats note 9)
  11. Start generally and choose the cases that are of interest. What are the key elements in the derivation that we should mention? What are the applications for this? [This is a big question for us, since it still needs to be addressed and worked on for the paper.]
  12. (no notes)
  13. (no notes)
  14. (no notes)
  15. Show some basic results from the propagation-of-error method: show the resulting equation for the FM and the behavior with varying sigma_x or sigma_y (plots!); this will be the naive version starting with the FM from slide 2, where people always start. Examples from the analytics for the linear case.
  16. The naive version also starting with the canonical form of the FM. Examples with the linear case.
  17. Did we ever nail down why this difference occurred? Does it simply come from the fact that the MOE doesn't have the 2nd [covariance] term in the canonical FM equation?
  18. The motivation for going deeper was simply that the methods disagreed?