Bayes Estimates for the Linear Model
by D.V. Lindley and A. F. M. Smith
Paolo Baudissone
April 17, 2015
Outline
1. Introduction
2. An introductory example
3. General Bayesian Linear Model
4. An application
Introduction
The object of this paper is the linear model E[y] = Aθ, where y
is a vector of observations, A a known design matrix and θ a
vector of unknown parameters. The usual estimate of θ in this
framework is the one derived by the method of least squares.
The authors argue that prior information about the parameters
is often available and can be exploited to obtain improved
estimates; in particular, they focus on situations in which the
parameters themselves have a general linear structure in terms
of other quantities, called hyperparameters.
A particular form of prior information is assumed: the one
based on De Finetti’s (1964) idea of exchangeability.
An introductory example
Suppose, in the general linear model, that the design matrix is the
identity, so that E[yi] = θi for i = 1, . . . , n, and that
y1, . . . , yn are independent random variables, normally distributed
with known variance σ². Assume moreover that the distribution of the
θi's is exchangeable; this exchangeable prior knowledge determines
E[θi] = µ, a common value for each i. In other words, there is a
linear structure on the parameters analogous to the linear structure
assumed for the observations y.
In this simple example, µ is the only hyperparameter. Denoting by
τ² the variance of each θi, it can be shown (Lindley, 1971) that
the posterior mean is given by

E[θi | y] = (yi/σ² + ȳ/τ²) / (1/σ² + 1/τ²), where ȳ = (1/n) Σ_{i=1}^n yi.

Notice that the Bayes estimate is a weighted average of yi and the
overall mean ȳ, with weights inversely proportional to the
variances of yi and θi.
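This shrinkage formula is easy to check numerically. A minimal NumPy sketch (the values of σ² and τ² below are illustrative assumptions, not taken from the paper) computes the posterior mean directly and confirms it is the stated weighted average of yi and ȳ:

```python
import numpy as np

rng = np.random.default_rng(0)

sigma2, tau2 = 4.0, 1.0  # illustrative observation and prior variances
y = rng.normal(5.0, np.sqrt(sigma2), size=10)
ybar = y.mean()

# Posterior mean: precision-weighted combination of y_i and the overall mean
post_mean = (y / sigma2 + ybar / tau2) / (1.0 / sigma2 + 1.0 / tau2)

# Equivalent shrinkage form: w * y_i + (1 - w) * ybar,
# with w the share of precision contributed by the observation
w = (1.0 / sigma2) / (1.0 / sigma2 + 1.0 / tau2)
assert np.allclose(post_mean, w * y + (1 - w) * ybar)
```

As expected, the smaller τ² is relative to σ², the harder each yi is pulled towards the overall mean.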
Notation
The notation y ∼ N(µ, D) means that the column vector y has a
multivariate normal distribution with mean µ, a column vector,
and dispersion matrix D, a positive semi-definite matrix.
Theorem
Suppose that, given θ1, y | θ1 ∼ N(A1θ1, C1) and that, given θ2, a
vector of p2 hyperparameters, θ1 | θ2 ∼ N(A2θ2, C2). Then

1. y ∼ N(A1A2θ2, C1 + A1 C2 A1^T);
2. θ1 | y ∼ N(Bb, B), where

B^{-1} = A1^T C1^{-1} A1 + C2^{-1} and b = A1^T C1^{-1} y + C2^{-1} A2 θ2.
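The second part of the theorem can be verified numerically: the precision-form posterior N(Bb, B) must coincide with what standard Gaussian conditioning on the joint distribution of (θ1, y) gives. A small NumPy sketch with randomly generated matrices (the dimensions are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p1, p2 = 6, 4, 2

A1 = rng.normal(size=(n, p1))
A2 = rng.normal(size=(p1, p2))
theta2 = rng.normal(size=p2)

def rand_spd(k):
    # random symmetric positive-definite matrix
    M = rng.normal(size=(k, k))
    return M @ M.T + k * np.eye(k)

C1, C2 = rand_spd(n), rand_spd(p1)
y = rng.normal(size=n)

# Precision form from the theorem
Binv = A1.T @ np.linalg.inv(C1) @ A1 + np.linalg.inv(C2)
B = np.linalg.inv(Binv)
b = A1.T @ np.linalg.inv(C1) @ y + np.linalg.inv(C2) @ (A2 @ theta2)

# Standard conditioning on the joint Gaussian of (theta1, y):
# Cov(y) = C1 + A1 C2 A1^T, Cov(theta1, y) = C2 A1^T
S = C1 + A1 @ C2 @ A1.T
K = C2 @ A1.T @ np.linalg.inv(S)  # regression ("gain") matrix
cond_mean = A2 @ theta2 + K @ (y - A1 @ A2 @ theta2)
cond_cov = C2 - K @ A1 @ C2

assert np.allclose(B @ b, cond_mean)
assert np.allclose(B, cond_cov)
```

The agreement of the two routes is exactly the matrix identity stated in the next result.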
Proof.
Part 1. Observe that y = A1θ1 + u, where u ∼ N(0, C1), and
θ1 = A2θ2 + v, where v ∼ N(0, C2); it follows that
y = A1A2θ2 + A1v + u. Moreover, A1v + u is a linear function of
independent (multivariate) normal random variables and is therefore
distributed as N(0, C1 + A1 C2 A1^T), and the result follows.
Proof.
Part 2. To prove the second part we use Bayes’ theorem and write
the posterior distribution of θ1 as

p(θ1 | y) ∝ p(y | θ1) p(θ1).

The right-hand side can be written as e^{−Q/2}, where Q is given by

Q = (y − A1θ1)^T C1^{-1} (y − A1θ1) + (θ1 − A2θ2)^T C2^{-1} (θ1 − A2θ2),

and, completing the square in θ1, Q is equal to

(θ1 − Bb)^T B^{-1} (θ1 − Bb) + {y^T C1^{-1} y + θ2^T A2^T C2^{-1} A2 θ2 − b^T B b}.

Since the term in braces does not involve θ1, the posterior is
proportional to the kernel of N(Bb, B).
The following result allows us to rewrite the inverse of the
dispersion matrix of the marginal distribution of y in a form that
will be useful in subsequent developments.

Theorem
For any matrices A1, A2, C1 and C2 of appropriate dimensions and
for which the inverses stated in the result exist, we have

(C1 + A1 C2 A1^T)^{-1} = C1^{-1} − C1^{-1} A1 (A1^T C1^{-1} A1 + C2^{-1})^{-1} A1^T C1^{-1}.
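This is a form of the Sherman–Morrison–Woodbury identity, and it lends itself to a quick numerical spot-check with random positive-definite C1 and C2 (the sizes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 5, 3

A1 = rng.normal(size=(n, p))
M = rng.normal(size=(n, n)); C1 = M @ M.T + n * np.eye(n)  # SPD
N = rng.normal(size=(p, p)); C2 = N @ N.T + p * np.eye(p)  # SPD

# Left-hand side: direct inverse of the marginal dispersion of y
lhs = np.linalg.inv(C1 + A1 @ C2 @ A1.T)

# Right-hand side: the identity from the theorem
C1i = np.linalg.inv(C1)
rhs = C1i - C1i @ A1 @ np.linalg.inv(A1.T @ C1i @ A1 + np.linalg.inv(C2)) @ A1.T @ C1i

assert np.allclose(lhs, rhs)
```

The identity is useful in practice because the matrix inverted on the right-hand side is p × p rather than n × n.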
Posterior distribution of θ1
Theorem
Suppose that, given θ1, y | θ1 ∼ N(A1θ1, C1); given θ2,
θ1 | θ2 ∼ N(A2θ2, C2); and given θ3, θ2 | θ3 ∼ N(A3θ3, C3). Then
the posterior distribution of θ1 is

θ1 | {Ai}, {Ci}, θ3, y ∼ N(Dd, D),

where D^{-1} = A1^T C1^{-1} A1 + (C2 + A2 C3 A2^T)^{-1} and
d = A1^T C1^{-1} y + (C2 + A2 C3 A2^T)^{-1} A2 A3 θ3.
Remarks I
Observe that, thanks to the first theorem stated above, it is
possible to write down the marginal distribution of θ1 (that is,
its prior distribution) free of the hyperparameters θ2:

θ1 ∼ N(A2A3θ3, C2 + A2 C3 A2^T).

The mean of the posterior distribution may be regarded as a point
estimate of θ1, replacing the usual least-squares estimate.
Remarks II
This estimate is a weighted average of the least-squares estimate
(A1^T C1^{-1} A1)^{-1} A1^T C1^{-1} y and the prior mean A2A3θ3,
with weights given by the inverses of the corresponding dispersion
matrices: A1^T C1^{-1} A1 for the least-squares values and
(C2 + A2 C3 A2^T)^{-1} for the prior distribution of θ1.
Thanks to the second theorem stated above, we are now in a position
to obtain several alternative expressions for the posterior mean and
variance, as the two following results show.

Theorem
An alternative expression for D^{-1} is given by

D^{-1} = A1^T C1^{-1} A1 + C2^{-1} − C2^{-1} A2 (A2^T C2^{-1} A2 + C3^{-1})^{-1} A2^T C2^{-1}.
Theorem
If C3^{-1} = 0, the posterior distribution of θ1 is N(D0 d0, D0), where

D0^{-1} = A1^T C1^{-1} A1 + C2^{-1} − C2^{-1} A2 (A2^T C2^{-1} A2)^{-1} A2^T C2^{-1}

and

d0 = A1^T C1^{-1} y.

This result gives the form most often used in applications.
An application: Two-factor Experimental Designs
Consider t "treatments" assigned to n experimental units arranged
in b "blocks". The usual model is

E[yij] = µ + αi + βj, for 1 ≤ i ≤ t, 1 ≤ j ≤ b,

with the errors independent N(0, σ²). In the general notation,
θ1^T = (µ, α1, . . . , αt, β1, . . . , βb) and A1 describes the
design used.
At the second stage it seems reasonable to assume exchangeable
prior knowledge within the treatment constants {αi} and within the
block constants {βj}, with the two sets independent of each other.
Adding the assumption of normality, the second stage can be described by

αi ∼ N(0, σα²), βj ∼ N(0, σβ²), µ ∼ N(ω, σµ²),

these distributions being independent. A third stage is not
necessary.
Our goal now is to derive the expressions for D^{-1} and d. Since
C2 is diagonal, so is C2^{-1}, with leading diagonal
(σµ^{-2}, σα^{-2}, . . . , σα^{-2}, σβ^{-2}, . . . , σβ^{-2}).
Furthermore, remember that C3 = 0. Hence we get

D^{-1} = σ^{-2} A1^T A1 + C2^{-1} and d = σ^{-2} A1^T y.

The Bayes estimate Dd, call it θ1*, satisfies the equation

(A1^T A1 + σ² C2^{-1}) θ1* = A1^T y,

which differs from the least-squares normal equations only in the
inclusion of the extra term σ² C2^{-1}.
The Bayes estimate θ1* is given by the following expressions:

µ* = ȳ.., αi* = b σα² (ȳi. − ȳ..) / (b σα² + σ²), βj* = t σβ² (ȳ.j − ȳ..) / (t σβ² + σ²),

where ȳ.., ȳi. and ȳ.j are the grand, treatment and block sample
means. Observe that the estimates of the treatment and block effects
are shrunk towards zero by a factor depending on σ² and on σα² or σβ².
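These closed-form estimates can be checked against a direct solve of the equation (A1^T A1 + σ²C2^{-1})θ1* = A1^T y. The NumPy sketch below builds the design matrix for a complete two-way layout with one observation per cell; note that it sets σµ^{-2} = 0 (a vague prior on µ), an assumption under which the grand-mean form µ* = ȳ.. holds exactly:

```python
import numpy as np

rng = np.random.default_rng(3)
t, b = 4, 5                            # treatments, blocks (illustrative sizes)
sigma2, s_a2, s_b2 = 1.0, 2.0, 3.0     # sigma^2, sigma_alpha^2, sigma_beta^2

y = rng.normal(size=(t, b))            # one observation per treatment-block cell

# Design matrix A1 for E[y_ij] = mu + alpha_i + beta_j, theta1 = (mu, alpha, beta)
rows = []
for i in range(t):
    for j in range(b):
        r = np.zeros(1 + t + b)
        r[0] = 1.0
        r[1 + i] = 1.0
        r[1 + t + j] = 1.0
        rows.append(r)
A1 = np.array(rows)
yv = y.reshape(-1)                     # same (i outer, j inner) ordering as rows

# C2^{-1}: vague prior on mu (precision 0), proper priors on effects
C2inv = np.diag([0.0] + [1 / s_a2] * t + [1 / s_b2] * b)

theta = np.linalg.solve(A1.T @ A1 + sigma2 * C2inv, A1.T @ yv)

# Closed-form shrinkage estimates from the slide
ydd = y.mean(); yid = y.mean(axis=1); ydj = y.mean(axis=0)
alpha_star = b * s_a2 * (yid - ydd) / (b * s_a2 + sigma2)
beta_star = t * s_b2 * (ydj - ydd) / (t * s_b2 + sigma2)

assert np.allclose(theta[0], ydd)
assert np.allclose(theta[1:1 + t], alpha_star)
assert np.allclose(theta[1 + t:], beta_star)
```

The solve and the closed form agree: the extra term σ²C2^{-1} is what produces the shrinkage, in the same way a ridge penalty would.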
Paolo Baudissone Bayes Estimates for the Linear Modelby D.V. Lindley an

More Related Content

Similar to Lindley smith 1972

Talk iccf 19_ben_hammouda
Talk iccf 19_ben_hammoudaTalk iccf 19_ben_hammouda
Talk iccf 19_ben_hammouda
Chiheb Ben Hammouda
 
IRJET- On Distributive Meet-Semilattices
IRJET- On Distributive Meet-SemilatticesIRJET- On Distributive Meet-Semilattices
IRJET- On Distributive Meet-Semilattices
IRJET Journal
 
Minimum mean square error estimation and approximation of the Bayesian update
Minimum mean square error estimation and approximation of the Bayesian updateMinimum mean square error estimation and approximation of the Bayesian update
Minimum mean square error estimation and approximation of the Bayesian update
Alexander Litvinenko
 
Mm chap08 -_lossy_compression_algorithms
Mm chap08 -_lossy_compression_algorithmsMm chap08 -_lossy_compression_algorithms
Mm chap08 -_lossy_compression_algorithms
Eellekwameowusu
 
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
SSA KPI
 
Episode 50 : Simulation Problem Solution Approaches Convergence Techniques S...
Episode 50 :  Simulation Problem Solution Approaches Convergence Techniques S...Episode 50 :  Simulation Problem Solution Approaches Convergence Techniques S...
Episode 50 : Simulation Problem Solution Approaches Convergence Techniques S...
SAJJAD KHUDHUR ABBAS
 
201977 1-1-1-pb
201977 1-1-1-pb201977 1-1-1-pb
201977 1-1-1-pb
AssociateProfessorKM
 
1648796607723_Material-8---Concept-on-Estimation-Variance.pdf
1648796607723_Material-8---Concept-on-Estimation-Variance.pdf1648796607723_Material-8---Concept-on-Estimation-Variance.pdf
1648796607723_Material-8---Concept-on-Estimation-Variance.pdf
andifebby2
 
cswiercz-general-presentation
cswiercz-general-presentationcswiercz-general-presentation
cswiercz-general-presentation
Chris Swierczewski
 
Chapter 14 Part I
Chapter 14 Part IChapter 14 Part I
Chapter 14 Part I
Matthew L Levy
 
Affine Yield Curves: Flexibility versus Incompleteness
Affine Yield Curves: Flexibility versus IncompletenessAffine Yield Curves: Flexibility versus Incompleteness
Affine Yield Curves: Flexibility versus Incompleteness
Dhia Eddine Barbouche
 
A Note on the Generalization of the Mean Value Theorem
A Note on the Generalization of the Mean Value TheoremA Note on the Generalization of the Mean Value Theorem
A Note on the Generalization of the Mean Value Theorem
ijtsrd
 
Threshold network models
Threshold network modelsThreshold network models
Threshold network models
Naoki Masuda
 
Econ 103 Homework 2Manu NavjeevanAugust 15, 2022S
Econ 103 Homework 2Manu NavjeevanAugust 15, 2022SEcon 103 Homework 2Manu NavjeevanAugust 15, 2022S
Econ 103 Homework 2Manu NavjeevanAugust 15, 2022S
EvonCanales257
 
presentasi
presentasipresentasi
presentasi
Deddy Rahmadi
 
Lecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdfLecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdf
MDNomanCh
 
Temporal disaggregation methods
Temporal disaggregation methodsTemporal disaggregation methods
Temporal disaggregation methods
Stephen Bradley
 
Introduction to Machine Learning Lectures
Introduction to Machine Learning LecturesIntroduction to Machine Learning Lectures
Introduction to Machine Learning Lectures
ssuserfece35
 
Stability criterion of periodic oscillations in a (9)
Stability criterion of periodic oscillations in a (9)Stability criterion of periodic oscillations in a (9)
Stability criterion of periodic oscillations in a (9)
Alexander Decker
 
Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...
Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...
Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...
Steven Duplij (Stepan Douplii)
 

Similar to Lindley smith 1972 (20)

Talk iccf 19_ben_hammouda
Talk iccf 19_ben_hammoudaTalk iccf 19_ben_hammouda
Talk iccf 19_ben_hammouda
 
IRJET- On Distributive Meet-Semilattices
IRJET- On Distributive Meet-SemilatticesIRJET- On Distributive Meet-Semilattices
IRJET- On Distributive Meet-Semilattices
 
Minimum mean square error estimation and approximation of the Bayesian update
Minimum mean square error estimation and approximation of the Bayesian updateMinimum mean square error estimation and approximation of the Bayesian update
Minimum mean square error estimation and approximation of the Bayesian update
 
Mm chap08 -_lossy_compression_algorithms
Mm chap08 -_lossy_compression_algorithmsMm chap08 -_lossy_compression_algorithms
Mm chap08 -_lossy_compression_algorithms
 
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
Efficient Solution of Two-Stage Stochastic Linear Programs Using Interior Poi...
 
Episode 50 : Simulation Problem Solution Approaches Convergence Techniques S...
Episode 50 :  Simulation Problem Solution Approaches Convergence Techniques S...Episode 50 :  Simulation Problem Solution Approaches Convergence Techniques S...
Episode 50 : Simulation Problem Solution Approaches Convergence Techniques S...
 
201977 1-1-1-pb
201977 1-1-1-pb201977 1-1-1-pb
201977 1-1-1-pb
 
1648796607723_Material-8---Concept-on-Estimation-Variance.pdf
1648796607723_Material-8---Concept-on-Estimation-Variance.pdf1648796607723_Material-8---Concept-on-Estimation-Variance.pdf
1648796607723_Material-8---Concept-on-Estimation-Variance.pdf
 
cswiercz-general-presentation
cswiercz-general-presentationcswiercz-general-presentation
cswiercz-general-presentation
 
Chapter 14 Part I
Chapter 14 Part IChapter 14 Part I
Chapter 14 Part I
 
Affine Yield Curves: Flexibility versus Incompleteness
Affine Yield Curves: Flexibility versus IncompletenessAffine Yield Curves: Flexibility versus Incompleteness
Affine Yield Curves: Flexibility versus Incompleteness
 
A Note on the Generalization of the Mean Value Theorem
A Note on the Generalization of the Mean Value TheoremA Note on the Generalization of the Mean Value Theorem
A Note on the Generalization of the Mean Value Theorem
 
Threshold network models
Threshold network modelsThreshold network models
Threshold network models
 
Econ 103 Homework 2Manu NavjeevanAugust 15, 2022S
Econ 103 Homework 2Manu NavjeevanAugust 15, 2022SEcon 103 Homework 2Manu NavjeevanAugust 15, 2022S
Econ 103 Homework 2Manu NavjeevanAugust 15, 2022S
 
presentasi
presentasipresentasi
presentasi
 
Lecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdfLecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdf
 
Temporal disaggregation methods
Temporal disaggregation methodsTemporal disaggregation methods
Temporal disaggregation methods
 
Introduction to Machine Learning Lectures
Introduction to Machine Learning LecturesIntroduction to Machine Learning Lectures
Introduction to Machine Learning Lectures
 
Stability criterion of periodic oscillations in a (9)
Stability criterion of periodic oscillations in a (9)Stability criterion of periodic oscillations in a (9)
Stability criterion of periodic oscillations in a (9)
 
Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...
Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...
Steven Duplij, Raimund Vogl, "Polyadic Braid Operators and Higher Braiding Ga...
 

More from Julyan Arbel

UCD_talk_nov_2020
UCD_talk_nov_2020UCD_talk_nov_2020
UCD_talk_nov_2020
Julyan Arbel
 
Bayesian neural networks increasingly sparsify their units with depth
Bayesian neural networks increasingly sparsify their units with depthBayesian neural networks increasingly sparsify their units with depth
Bayesian neural networks increasingly sparsify their units with depth
Julyan Arbel
 
Species sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian NonparametricsSpecies sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian Nonparametrics
Julyan Arbel
 
Dependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian NonparametricsDependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian Nonparametrics
Julyan Arbel
 
Asymptotics for discrete random measures
Asymptotics for discrete random measuresAsymptotics for discrete random measures
Asymptotics for discrete random measures
Julyan Arbel
 
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketingBayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
Julyan Arbel
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian Nonparametrics
Julyan Arbel
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian Nonparametrics
Julyan Arbel
 
Berger 2000
Berger 2000Berger 2000
Berger 2000
Julyan Arbel
 
Seneta 1993
Seneta 1993Seneta 1993
Seneta 1993
Julyan Arbel
 
Lehmann 1990
Lehmann 1990Lehmann 1990
Lehmann 1990
Julyan Arbel
 
Diaconis Ylvisaker 1985
Diaconis Ylvisaker 1985Diaconis Ylvisaker 1985
Diaconis Ylvisaker 1985
Julyan Arbel
 
Hastings 1970
Hastings 1970Hastings 1970
Hastings 1970
Julyan Arbel
 
Jefferys Berger 1992
Jefferys Berger 1992Jefferys Berger 1992
Jefferys Berger 1992
Julyan Arbel
 
Bayesian Classics
Bayesian ClassicsBayesian Classics
Bayesian Classics
Julyan Arbel
 
Bayesian Classics
Bayesian ClassicsBayesian Classics
Bayesian Classics
Julyan Arbel
 
R in latex
R in latexR in latex
R in latex
Julyan Arbel
 
Arbel oviedo
Arbel oviedoArbel oviedo
Arbel oviedo
Julyan Arbel
 
Poster DDP (BNP 2011 Veracruz)
Poster DDP (BNP 2011 Veracruz)Poster DDP (BNP 2011 Veracruz)
Poster DDP (BNP 2011 Veracruz)
Julyan Arbel
 
Causesof effects
Causesof effectsCausesof effects
Causesof effects
Julyan Arbel
 

More from Julyan Arbel (20)

UCD_talk_nov_2020
UCD_talk_nov_2020UCD_talk_nov_2020
UCD_talk_nov_2020
 
Bayesian neural networks increasingly sparsify their units with depth
Bayesian neural networks increasingly sparsify their units with depthBayesian neural networks increasingly sparsify their units with depth
Bayesian neural networks increasingly sparsify their units with depth
 
Species sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian NonparametricsSpecies sampling models in Bayesian Nonparametrics
Species sampling models in Bayesian Nonparametrics
 
Dependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian NonparametricsDependent processes in Bayesian Nonparametrics
Dependent processes in Bayesian Nonparametrics
 
Asymptotics for discrete random measures
Asymptotics for discrete random measuresAsymptotics for discrete random measures
Asymptotics for discrete random measures
 
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketingBayesian Nonparametrics, Applications to biology, ecology, and marketing
Bayesian Nonparametrics, Applications to biology, ecology, and marketing
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian Nonparametrics
 
A Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian NonparametricsA Gentle Introduction to Bayesian Nonparametrics
A Gentle Introduction to Bayesian Nonparametrics
 
Berger 2000
Berger 2000Berger 2000
Berger 2000
 
Seneta 1993
Seneta 1993Seneta 1993
Seneta 1993
 
Lehmann 1990
Lehmann 1990Lehmann 1990
Lehmann 1990
 
Diaconis Ylvisaker 1985
Diaconis Ylvisaker 1985Diaconis Ylvisaker 1985
Diaconis Ylvisaker 1985
 
Hastings 1970
Hastings 1970Hastings 1970
Hastings 1970
 
Jefferys Berger 1992
Jefferys Berger 1992Jefferys Berger 1992
Jefferys Berger 1992
 
Bayesian Classics
Bayesian ClassicsBayesian Classics
Bayesian Classics
 
Bayesian Classics
Bayesian ClassicsBayesian Classics
Bayesian Classics
 
R in latex
R in latexR in latex
R in latex
 
Arbel oviedo
Arbel oviedoArbel oviedo
Arbel oviedo
 
Poster DDP (BNP 2011 Veracruz)
Poster DDP (BNP 2011 Veracruz)Poster DDP (BNP 2011 Veracruz)
Poster DDP (BNP 2011 Veracruz)
 
Causesof effects
Causesof effectsCausesof effects
Causesof effects
 

Recently uploaded

Digital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments UnitDigital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments Unit
chanes7
 
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
National Information Standards Organization (NISO)
 
PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.
Dr. Shivangi Singh Parihar
 
Hindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdfHindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdf
Dr. Mulla Adam Ali
 
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat  Leveraging AI for Diversity, Equity, and InclusionExecutive Directors Chat  Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
TechSoup
 
S1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptxS1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptx
tarandeep35
 
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
Nguyen Thanh Tu Collection
 
Life upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for studentLife upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for student
NgcHiNguyn25
 
How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17
Celine George
 
clinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdfclinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdf
Priyankaranawat4
 
Digital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental DesignDigital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental Design
amberjdewit93
 
Your Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective UpskillingYour Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective Upskilling
Excellence Foundation for South Sudan
 
Main Java[All of the Base Concepts}.docx
Main Java[All of the Base Concepts}.docxMain Java[All of the Base Concepts}.docx
Main Java[All of the Base Concepts}.docx
adhitya5119
 
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdfANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
Priyankaranawat4
 
The basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptxThe basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptx
heathfieldcps1
 
Cognitive Development Adolescence Psychology
Cognitive Development Adolescence PsychologyCognitive Development Adolescence Psychology
Cognitive Development Adolescence Psychology
paigestewart1632
 
How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17
Celine George
 
MARY JANE WILSON, A “BOA MÃE” .
MARY JANE WILSON, A “BOA MÃE”           .MARY JANE WILSON, A “BOA MÃE”           .
MARY JANE WILSON, A “BOA MÃE” .
Colégio Santa Teresinha
 
Community pharmacy- Social and preventive pharmacy UNIT 5
Community pharmacy- Social and preventive pharmacy UNIT 5Community pharmacy- Social and preventive pharmacy UNIT 5
Community pharmacy- Social and preventive pharmacy UNIT 5
sayalidalavi006
 
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPLAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
RAHUL
 

Recently uploaded (20)

Digital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments UnitDigital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments Unit
 
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
 
PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.
 
Hindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdfHindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdf
 
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat  Leveraging AI for Diversity, Equity, and InclusionExecutive Directors Chat  Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
 
S1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptxS1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptx
 
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
 
Life upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for studentLife upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for student
 
How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17
 
clinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdfclinical examination of hip joint (1).pdf
clinical examination of hip joint (1).pdf
 
Digital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental DesignDigital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental Design
 
Your Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective UpskillingYour Skill Boost Masterclass: Strategies for Effective Upskilling
Your Skill Boost Masterclass: Strategies for Effective Upskilling
 
Main Java[All of the Base Concepts}.docx
Main Java[All of the Base Concepts}.docxMain Java[All of the Base Concepts}.docx
Main Java[All of the Base Concepts}.docx
 
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdfANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
ANATOMY AND BIOMECHANICS OF HIP JOINT.pdf
 
The basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptxThe basics of sentences session 6pptx.pptx
The basics of sentences session 6pptx.pptx
 
Cognitive Development Adolescence Psychology
Cognitive Development Adolescence PsychologyCognitive Development Adolescence Psychology
Cognitive Development Adolescence Psychology
 
How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17How to Make a Field Mandatory in Odoo 17
How to Make a Field Mandatory in Odoo 17
 
MARY JANE WILSON, A “BOA MÃE” .
MARY JANE WILSON, A “BOA MÃE”           .MARY JANE WILSON, A “BOA MÃE”           .
MARY JANE WILSON, A “BOA MÃE” .
 
Community pharmacy- Social and preventive pharmacy UNIT 5
Community pharmacy- Social and preventive pharmacy UNIT 5Community pharmacy- Social and preventive pharmacy UNIT 5
Community pharmacy- Social and preventive pharmacy UNIT 5
 
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UPLAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP
 

Lindley smith 1972

  • 1. Outline Introduction An introductory example General Bayesian Linear Model An application Bayes Estimates for the Linear Model by D.V. Lindley and A. F. M. Smith Paolo Baudissone April 17, 2015 Paolo Baudissone Bayes Estimates for the Linear Modelby D.V. Lindley an
  • 2. Outline Introduction An introductory example General Bayesian Linear Model An application Outline 1 Introduction 2 An introductory example 3 General Bayesian Linear Model 4 An application Paolo Baudissone Bayes Estimates for the Linear Modelby D.V. Lindley an
  • 3. Outline Introduction An introductory example General Bayesian Linear Model An application Introduction Object of this paper is the linear model E[y] = Aθ, where y is a vector of observations, A a known design matrix and θ a vector of unknown parameters. Usual estimate of θ employed in this framework is that derived by the least squares method. Authors argue the availability of prior information about the parameters and the fact that this may be exploited in order to find improved estimates; in particular they focus on situations in which the parameters themselves have a general linear structure in terms of other quantities called hyperparameters. A particular form of prior information is assumed: the one based on De Finetti’s (1964) idea of exchangeability. Paolo Baudissone Bayes Estimates for the Linear Modelby D.V. Lindley an
  • 4. An introductory example. Suppose, in the general linear model, that we are dealing with a unit design matrix, so that E[y_i] = θ_i for i = 1, ..., n, and that y_1, ..., y_n are independent, normally distributed random variables with known variance σ². Assume moreover that the distribution of the θ_i is exchangeable; this exchangeable prior knowledge implies E[θ_i] = µ, a common value for each i. In other words, there is a linear structure on the parameters analogous to the linear structure supposed for the observations y.
  • 5. In this simple example, µ is the only hyperparameter. Denoting by τ² the variance of each θ_i, it can be shown (Lindley, 1971) that the posterior mean is given by E[θ_i | y] = (y_i/σ² + ȳ/τ²) / (1/σ² + 1/τ²), where ȳ = (1/n) Σ_{i=1}^n y_i is the overall mean. Notice that the Bayes estimate is a weighted average of y_i and the overall mean ȳ, with weights inversely proportional to the variances of y_i and θ_i.
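The shrinkage in this estimate is easy to see numerically. The following sketch (ours, not from the paper; all numbers are made up for illustration) computes the posterior means with numpy:

```python
import numpy as np

# Exchangeable-means example: y_i | theta_i ~ N(theta_i, sigma2),
# theta_i exchangeable with common mean mu and variance tau2.
# The posterior mean is a precision-weighted average of y_i and ybar.
sigma2, tau2 = 4.0, 1.0
y = np.array([7.2, 5.1, 6.3, 8.0, 4.9])
ybar = y.mean()

post_mean = (y / sigma2 + ybar / tau2) / (1.0 / sigma2 + 1.0 / tau2)

# Equivalently: each estimate is y_i shrunk towards the overall mean,
# with weight on y_i inversely proportional to its variance.
w = (1.0 / sigma2) / (1.0 / sigma2 + 1.0 / tau2)
assert np.allclose(post_mean, w * y + (1 - w) * ybar)
```

Here σ² > τ², so the weight on each observation is small (w = 0.2) and the estimates sit close to the overall mean.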
  • 6. Notation. y ∼ N(µ, D) means that the column vector y has a multivariate normal distribution with mean µ, a column vector, and dispersion D, a positive semi-definite matrix.
  • 7. Theorem. Suppose, given θ1, that y | θ1 ∼ N(A1 θ1, C1) and that, given θ2, a vector of p2 hyperparameters, θ1 | θ2 ∼ N(A2 θ2, C2). Then (1) y ∼ N(A1 A2 θ2, C1 + A1 C2 A1^T); (2) θ1 | y ∼ N(Bb, B), where B^{-1} = A1^T C1^{-1} A1 + C2^{-1} and b = A1^T C1^{-1} y + C2^{-1} A2 θ2.
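Both parts of the theorem can be cross-checked numerically against standard Gaussian conditioning on the joint distribution of (θ1, y). The sketch below (ours; dimensions and data are arbitrary) compares the two routes on a random instance:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p1, p2 = 5, 3, 2
A1 = rng.standard_normal((n, p1))
A2 = rng.standard_normal((p1, p2))
theta2 = rng.standard_normal(p2)

def random_spd(k):
    # A random symmetric positive-definite dispersion matrix.
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)

C1, C2 = random_spd(n), random_spd(p1)
y = rng.standard_normal(n)  # any observed vector will do for the check

# Posterior (theta1 | y) ~ N(Bb, B) as stated in the theorem.
C1inv, C2inv = np.linalg.inv(C1), np.linalg.inv(C2)
B = np.linalg.inv(A1.T @ C1inv @ A1 + C2inv)
b = A1.T @ C1inv @ y + C2inv @ A2 @ theta2

# Independent route: condition the joint Gaussian of (theta1, y).
mean_th = A2 @ theta2
mean_y = A1 @ mean_th
cov_y = C1 + A1 @ C2 @ A1.T      # part 1 of the theorem
cov_th_y = C2 @ A1.T             # Cov(theta1, y)
cond_mean = mean_th + cov_th_y @ np.linalg.inv(cov_y) @ (y - mean_y)
cond_cov = C2 - cov_th_y @ np.linalg.inv(cov_y) @ cov_th_y.T

assert np.allclose(B @ b, cond_mean)
assert np.allclose(B, cond_cov)
```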
  • 8. Proof, part 1. Observing that y = A1 θ1 + u, where u ∼ N(0, C1), and θ1 = A2 θ2 + v, where v ∼ N(0, C2), it follows that y = A1 A2 θ2 + A1 v + u. Moreover, A1 v + u is a linear function of independent (multivariate) normal random variables and is distributed as N(0, C1 + A1 C2 A1^T); hence the result follows.
  • 9. Proof, part 2. To prove the second part we use Bayes' theorem and write the posterior distribution of θ1 as p(θ1 | y) ∝ p(y | θ1) p(θ1). The right-hand side can be written as e^{-Q/2}, where Q is given by (y − A1 θ1)^T C1^{-1} (y − A1 θ1) + (θ1 − A2 θ2)^T C2^{-1} (θ1 − A2 θ2), and after some calculations we obtain that Q is equal to (θ1 − Bb)^T B^{-1} (θ1 − Bb) + {y^T C1^{-1} y + θ2^T A2^T C2^{-1} A2 θ2 − b^T B b}.
  • 10. The following result allows us to rewrite the inverse of the dispersion matrix of the marginal distribution of y in a way that will be useful for subsequent developments of the topic. Theorem. For any matrices A1, C1 and C2 of appropriate dimensions and for which the inverses stated in the result exist, we have (C1 + A1 C2 A1^T)^{-1} = C1^{-1} − C1^{-1} A1 (A1^T C1^{-1} A1 + C2^{-1})^{-1} A1^T C1^{-1}.
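This identity (a form of the Sherman-Morrison-Woodbury formula) is easy to verify numerically; the check below (ours, on a random instance with arbitrary dimensions) compares the two sides:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 3
A1 = rng.standard_normal((n, p))

def random_spd(k):
    # Symmetric positive definite, so every inverse in the statement exists.
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)

C1, C2 = random_spd(n), random_spd(p)

lhs = np.linalg.inv(C1 + A1 @ C2 @ A1.T)
C1inv = np.linalg.inv(C1)
rhs = C1inv - C1inv @ A1 @ np.linalg.inv(
    A1.T @ C1inv @ A1 + np.linalg.inv(C2)) @ A1.T @ C1inv

assert np.allclose(lhs, rhs)
```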
  • 11. Posterior distribution of θ1. Theorem. Suppose that, given θ1, y | θ1 ∼ N(A1 θ1, C1); given θ2, θ1 | θ2 ∼ N(A2 θ2, C2); and given θ3, θ2 | θ3 ∼ N(A3 θ3, C3). Then the posterior distribution of θ1 is θ1 | {Ai}, {Ci}, θ3, y ∼ N(Dd, D), where D^{-1} = A1^T C1^{-1} A1 + (C2 + A2 C3 A2^T)^{-1} and d = A1^T C1^{-1} y + (C2 + A2 C3 A2^T)^{-1} A2 A3 θ3.
  • 12. Remarks I. Observe that, thanks to the first theorem stated above, it is possible to write down the marginal distribution of θ1, that is, the prior distribution free of the hyperparameters θ2: θ1 ∼ N(A2 A3 θ3, C2 + A2 C3 A2^T). The mean of the posterior distribution may be regarded as a point estimate of θ1 to replace the usual least-squares estimate.
  • 13. Remarks II. The form of this estimate is a weighted average of the least-squares estimate (A1^T C1^{-1} A1)^{-1} A1^T C1^{-1} y and the prior mean A2 A3 θ3, with weights given by the inverses of the corresponding dispersion matrices: A1^T C1^{-1} A1 for the least-squares values and (C2 + A2 C3 A2^T)^{-1} for the prior distribution of θ1.
  • 14. Thanks to the second theorem stated above, we are now in a position to obtain several alternative expressions for the posterior mean and variance; this is shown in the two following results. Theorem. An alternative expression for D^{-1} is given by A1^T C1^{-1} A1 + C2^{-1} − C2^{-1} A2 (A2^T C2^{-1} A2 + C3^{-1})^{-1} A2^T C2^{-1}.
  • 15. Theorem. If C3^{-1} = 0, the posterior distribution of θ1 is N(D0 d0, D0), where D0^{-1} = A1^T C1^{-1} A1 + C2^{-1} − C2^{-1} A2 (A2^T C2^{-1} A2)^{-1} A2^T C2^{-1} and d0 = A1^T C1^{-1} y. This result gives the form most often used in applications.
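The vague-prior result can be checked as a limit of the three-stage formulas: with C3 = c·I and c large, D^{-1} and d should approach D0^{-1} and d0. The sketch below (ours; all dimensions and data are arbitrary) does exactly that:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p1, p2, p3 = 6, 4, 2, 2
A1 = rng.standard_normal((n, p1))
A2 = rng.standard_normal((p1, p2))
A3 = rng.standard_normal((p2, p3))
theta3 = rng.standard_normal(p3)
y = rng.standard_normal(n)

def random_spd(k):
    M = rng.standard_normal((k, k))
    return M @ M.T + k * np.eye(k)

C1, C2 = random_spd(n), random_spd(p1)
C1inv, C2inv = np.linalg.inv(C1), np.linalg.inv(C2)

# Vague-prior (C3^{-1} = 0) expressions from the theorem.
D0inv = (A1.T @ C1inv @ A1 + C2inv
         - C2inv @ A2 @ np.linalg.inv(A2.T @ C2inv @ A2) @ A2.T @ C2inv)
d0 = A1.T @ C1inv @ y

# Three-stage expressions with C3 = c I, c large.
c = 1e7
M = np.linalg.inv(C2 + c * A2 @ A2.T)
Dinv = A1.T @ C1inv @ A1 + M
d = A1.T @ C1inv @ y + M @ A2 @ A3 @ theta3

assert np.allclose(Dinv, D0inv, atol=1e-5)
assert np.allclose(d, d0, atol=1e-5)
```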
  • 16. An application: Two-factor Experimental Designs. Consider t "treatments" assigned to n experimental units arranged in b "blocks". The usual model is E[y_ij] = µ + α_i + β_j for 1 ≤ i ≤ t, 1 ≤ j ≤ b, with the errors independent N(0, σ²). In the general notation we have θ1^T = (µ, α1, ..., αt, β1, ..., βb), and A1 describes the design used.
  • 17. In the second stage it seems reasonable to assume exchangeable prior knowledge within the treatment constants {α_i} and within the block constants {β_j}, the two sets being independent. Adding the assumption of normality, the second stage can be described by α_i ∼ N(0, σ_α²), β_j ∼ N(0, σ_β²) and µ ∼ N(ω, σ_µ²), these distributions being independent. A third stage is not necessary.
  • 18. Our goal now is to derive the expressions for D^{-1} and d. Since C2 is diagonal, so is C2^{-1}, and its leading diagonal is (σ_µ^{-2}, σ_α^{-2}, ..., σ_α^{-2}, σ_β^{-2}, ..., σ_β^{-2}). Furthermore, remember that C3 = 0. Hence we get D^{-1} = σ^{-2} A1^T A1 + C2^{-1} and d = σ^{-2} A1^T y. The Bayes estimate Dd, call it θ1*, satisfies the equation (A1^T A1 + σ² C2^{-1}) θ1* = A1^T y, which differs from the least-squares equations only in the inclusion of the extra term σ² C2^{-1}.
  • 19. The Bayes estimate θ1* is given by the following expressions: µ* = ȳ.., α_i* = b σ_α² (ȳ_i. − ȳ..) / (b σ_α² + σ²), β_j* = t σ_β² (ȳ_.j − ȳ..) / (t σ_β² + σ²), where ȳ.., ȳ_i. and ȳ_.j are sample means. Observe that the estimators of the treatment and block effects are shrunk towards zero by a factor depending on σ² and on σ_α² or σ_β².
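These closed forms can be verified against a direct solve of (A1^T A1 + σ² C2^{-1}) θ1* = A1^T y for a balanced layout. The sketch below (ours; toy data, and we take σ_µ^{-2} = 0, i.e. a vague prior on µ, which is the assumption under which µ* reduces to the grand mean) builds the design matrix and compares the two routes:

```python
import numpy as np

rng = np.random.default_rng(3)
t, b = 4, 3                       # treatments, blocks
sigma2, sa2, sb2 = 2.0, 5.0, 3.0  # sigma^2, sigma_alpha^2, sigma_beta^2
y = rng.standard_normal((t, b)) + 10.0

# Design matrix for theta1 = (mu, alpha_1..alpha_t, beta_1..beta_b),
# one row per observation y_ij (row-major, matching y.reshape(-1)).
rows = []
for i in range(t):
    for j in range(b):
        r = np.zeros(1 + t + b)
        r[0] = 1.0
        r[1 + i] = 1.0
        r[1 + t + j] = 1.0
        rows.append(r)
A1 = np.array(rows)
yv = y.reshape(-1)

# Diagonal C2^{-1}; the 0 entry encodes the vague prior on mu.
C2inv = np.diag([0.0] + [1 / sa2] * t + [1 / sb2] * b)
theta_star = np.linalg.solve(A1.T @ A1 + sigma2 * C2inv, A1.T @ yv)

# Closed forms from the slide.
grand, trt, blk = y.mean(), y.mean(axis=1), y.mean(axis=0)
alpha_star = b * sa2 * (trt - grand) / (b * sa2 + sigma2)
beta_star = t * sb2 * (blk - grand) / (t * sb2 + sigma2)

assert np.isclose(theta_star[0], grand)
assert np.allclose(theta_star[1:1 + t], alpha_star)
assert np.allclose(theta_star[1 + t:], beta_star)
```

The treatment and block estimates come out shrunk towards zero relative to the raw contrasts ȳ_i. − ȳ.. and ȳ_.j − ȳ.., exactly as the shrinkage factors predict.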