Statisticsassignmentexperts.com is an organisation committed to providing world-class education solutions in the subject of Statistics to students across the globe. We specialise in offering a multitude of educational services in Statistics to students studying at various institutes and universities, be it assignments, online tutoring, project work, dissertations/theses, or exam preparation.
When it comes to information and communication technologies, we employ state-of-the-art tools and technology to connect students with expert tutors. Students from various countries, including the USA, UK, Canada, UAE and Australia, have used our services for the past several years to achieve excellence in their academic and professional pursuits. Statisticsassignmentexperts.com works closely with its strong and dynamic team of subject experts to create new models for the exchange of information, in consonance with the changing needs of students as well as academic and professional programs.
Bayesian Inference Homework Help
Bayesian Inference
For Bayesian analysis, the posterior distribution $f(\theta|x)$ plays an important role in statistical inferential procedures. Some Bayesians suggest that inference should ideally consist of simply reporting the entire posterior distribution $f(\theta|x)$ (perhaps for a non-informative prior). However, some standard uses of the posterior are still helpful!
Some statistical inference problems are
I. Estimation (point estimate), estimation error
II. Interval estimate
III. Hypothesis testing
IV. Predictive inference
I. Estimation (point estimate)
(a) Estimation
The generalized maximum likelihood estimate of $\theta$ is the value $\hat{\theta}$ that maximizes $f(\theta|x)$; it is the most likely value of $\theta$ given the prior and the sample $X$.
Example 3 (continued):
$X_1, \ldots, X_n \sim N(\theta, 1)$, with
$$f(x_1, \ldots, x_n|\theta) = \frac{1}{(2\pi)^{n/2}} \exp\left(-\frac{1}{2}\sum_{i=1}^{n}(x_i - \theta)^2\right),$$
and $\pi(\theta) \sim N(\mu, 1)$. Then,
$$f(\theta|x_1, \ldots, x_n) \sim N\left(\frac{n\bar{x} + \mu}{n+1},\ \frac{1}{n+1}\right).$$
$$\hat{\theta} = \frac{n\bar{x} + \mu}{n+1}$$
is then the posterior mode (also the posterior mean).
Other commonly used Bayesian estimates of $\theta$ include the posterior mean and the posterior median. In the Normal example, posterior mode = posterior mean = posterior median.
Note:
The mean and median of the posterior are frequently better estimates of $\theta$ than the mode. It is worthwhile to calculate and compare all three in a Bayesian study.
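The normal-normal computation above can be checked numerically. The sketch below (the data values and the prior mean are arbitrary assumptions for illustration) evaluates the posterior on a grid and compares the grid-based mode, mean, and median against the closed form (n·x̄ + μ)/(n + 1):

```python
import math

# Assumed example data and prior mean (arbitrary, for illustration only)
data = [1.2, 0.7, 2.1, 1.5, 0.9]
mu_prior = 0.0

n = len(data)
xbar = sum(data) / n
closed_form = (n * xbar + mu_prior) / (n + 1)   # posterior mode/mean/median

# Unnormalized posterior: N(mu_prior, 1) prior times N(theta, 1) likelihood
def post_unnorm(theta):
    ll = -0.5 * sum((x - theta) ** 2 for x in data)
    lp = -0.5 * (theta - mu_prior) ** 2
    return math.exp(ll + lp)

# Evaluate on a fine grid over theta in [-5, 10]
grid = [i / 1000.0 - 5.0 for i in range(15001)]
w = [post_unnorm(t) for t in grid]
total = sum(w)

mode = grid[max(range(len(grid)), key=lambda i: w[i])]
mean = sum(t * wi for t, wi in zip(grid, w)) / total

# Median: first grid point where cumulative weight passes one half
acc, median = 0.0, None
for t, wi in zip(grid, w):
    acc += wi
    if acc >= total / 2:
        median = t
        break

print(closed_form, mode, mean, median)  # all four agree to grid precision
```

All three point estimates coincide here only because the posterior is symmetric; for skewed posteriors they generally differ.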
Example 4:
$X \sim N(\theta, 1)$, $\pi(\theta) = 1$, $\theta > 0$. Then,
$$f(\theta|x) \propto f(x|\theta)\pi(\theta) = I_{(0,\infty)}(\theta)\, e^{-\frac{(x-\theta)^2}{2}} \propto I_{(0,\infty)}(\theta)\, e^{-\frac{(\theta - x)^2}{2}}.$$
Thus,
$$f(\theta|x) = \frac{I_{(0,\infty)}(\theta)\, e^{-\frac{(\theta - x)^2}{2}}}{\int_0^\infty e^{-\frac{(\theta - x)^2}{2}}\, d\theta}.$$
Further,
$$E^{f(\theta|x)}[\theta] = \frac{\int_0^\infty \theta\, e^{-\frac{(\theta - x)^2}{2}}\, d\theta}{\int_0^\infty e^{-\frac{(\theta - x)^2}{2}}\, d\theta} = \frac{\int_{-x}^\infty (\eta + x)\, e^{-\frac{\eta^2}{2}}\, d\eta}{\int_{-x}^\infty e^{-\frac{\eta^2}{2}}\, d\eta} \qquad (\eta = \theta - x)$$
$$= x + \frac{\int_{-x}^\infty \eta\, e^{-\frac{\eta^2}{2}}\, d\eta}{\int_{-x}^\infty e^{-\frac{\eta^2}{2}}\, d\eta} = x + \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}\,[1 - \Phi(-x)]},$$
since $\int_{-x}^\infty \eta\, e^{-\eta^2/2}\, d\eta = e^{-x^2/2}$ and $\int_{-x}^\infty e^{-\eta^2/2}\, d\eta = \sqrt{2\pi}\,[1 - \Phi(-x)]$, where $\Phi$ is the standard normal CDF.
Note:
The classical MLE in this example is $x$. However, $\theta > 0$, so when the observed $x$ is negative the classical MLE gives a senseless conclusion!
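As a quick sanity check on the posterior mean formula, the sketch below (the observed value x = -0.5 is an arbitrary assumption, chosen negative to show the Bayes estimate staying positive while the MLE does not) compares the closed form against direct numerical integration over (0, ∞), with Φ obtained from math.erf:

```python
import math

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def posterior_mean_closed(x):
    # E[theta | x] = x + exp(-x^2/2) / (sqrt(2*pi) * (1 - Phi(-x)))
    return x + math.exp(-x * x / 2) / (math.sqrt(2 * math.pi) * (1 - Phi(-x)))

def posterior_mean_numeric(x, upper=12.0, steps=200000):
    # Midpoint Riemann sums of theta*f and f over (0, upper),
    # where f(theta) is proportional to exp(-(theta - x)^2 / 2)
    h = upper / steps
    num = den = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        w = math.exp(-(t - x) ** 2 / 2)
        num += t * w
        den += w
    return num / den

x = -0.5   # assumed observation: the MLE x is negative even though theta > 0
print(posterior_mean_closed(x), posterior_mean_numeric(x))
```

The two values agree to integration precision, and both are positive, unlike the classical MLE.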
(b) Estimation error
The posterior variance of $\delta(x)$ is
$$V^{\delta}(x) = E^{f(\theta|x)}\left\{[\theta - \delta(x)]^2\right\}.$$
The posterior variance is defined as
$$V(x) = E^{f(\theta|x)}\left[(\theta - \mu(x))^2\right],$$
where $\mu(x) = E^{f(\theta|x)}[\theta]$ is the posterior mean.
Note:
$$V^{\delta}(x) = V(x) + [\mu(x) - \delta(x)]^2.$$
Example 3 (continued):
$$f(\theta|x_1, \ldots, x_n) \sim N\left(\frac{n\bar{x} + \mu}{n+1},\ \frac{1}{n+1}\right).$$
Then, the posterior mean is
$$\mu(x_1, \ldots, x_n) = \frac{n\bar{x} + \mu}{n+1},$$
and the posterior variance is
$$V(x_1, \ldots, x_n) = \frac{1}{n+1}.$$
Suppose the classical MLE $\delta(x_1, \ldots, x_n) = \bar{x}$ is used. Then
$$V^{\delta}(x_1, \ldots, x_n) = V(x_1, \ldots, x_n) + [\mu(x_1, \ldots, x_n) - \delta(x_1, \ldots, x_n)]^2 = \frac{1}{n+1} + \left[\frac{n\bar{x} + \mu}{n+1} - \bar{x}\right]^2 = \frac{1}{n+1} + \left[\frac{\mu - \bar{x}}{n+1}\right]^2.$$
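The decomposition of the posterior variance of the MLE can be illustrated with a small Monte Carlo sketch (the sample values and the prior mean are arbitrary assumptions): draws from the conjugate posterior should reproduce 1/(n+1) + [(μ − x̄)/(n+1)]².

```python
import math
import random

random.seed(0)

data = [1.2, 0.7, 2.1, 1.5, 0.9]   # assumed sample (illustration only)
mu_prior = 0.0                      # assumed prior mean
n = len(data)
xbar = sum(data) / n

post_mean = (n * xbar + mu_prior) / (n + 1)
post_var = 1.0 / (n + 1)

# Closed form: V^delta = 1/(n+1) + ((mu - xbar)/(n+1))^2
v_delta_formula = post_var + ((mu_prior - xbar) / (n + 1)) ** 2

# Monte Carlo: E[(theta - xbar)^2] under the posterior N(post_mean, post_var)
draws = [random.gauss(post_mean, math.sqrt(post_var)) for _ in range(200000)]
v_delta_mc = sum((t - xbar) ** 2 for t in draws) / len(draws)

print(v_delta_formula, v_delta_mc)  # agree up to Monte Carlo error
```

The squared-bias term is what the MLE pays, relative to the posterior mean, as measured by the posterior.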
Example 5:
$X \sim N(\theta, \sigma^2)$, $\pi(\theta) = 1$. Then,
$$f(\theta|x) \sim N(x, \sigma^2).$$
Thus,
the posterior mean = the posterior mode = the posterior median = $x$ = classical MLE.
Note:
The Bayesian analysis based on a non-informative prior is often formally the same as
the usual classical maximum likelihood analysis.
Example 4 (continued):
The posterior density is
$$f(\theta|x) = \frac{I_{(0,\infty)}(\theta)\, e^{-\frac{(\theta - x)^2}{2}}}{\int_0^\infty e^{-\frac{(\theta - x)^2}{2}}\, d\theta},$$
and the posterior mean is
$$\mu(x) = E^{f(\theta|x)}[\theta] = x + \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}\,[1 - \Phi(-x)]} = x + \varphi(x),$$
where
$$\varphi(x) = \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}\,[1 - \Phi(-x)]}.$$
If $\delta(x) = x$, then
$$V^{\delta}(x) = V(x) + [\mu(x) - \delta(x)]^2 = V(x) + [x + \varphi(x) - x]^2 = V(x) + \varphi^2(x).$$
Therefore, $V(x) = V^{\delta}(x) - \varphi^2(x) = 1 - \varphi(x)[x + \varphi(x)]$,
since
$$V^{\delta}(x) = E^{f(\theta|x)}\left[(\theta - x)^2\right] = \frac{\int_0^\infty (\theta - x)^2\, e^{-\frac{(\theta - x)^2}{2}}\, d\theta}{\int_0^\infty e^{-\frac{(\theta - x)^2}{2}}\, d\theta} = \frac{-x\, e^{-\frac{x^2}{2}} + \int_{-x}^\infty e^{-\frac{\mu^2}{2}}\, d\mu}{\int_{-x}^\infty e^{-\frac{\mu^2}{2}}\, d\mu} = 1 - x\varphi(x),$$
where
$$\int_0^\infty (\theta - x)^2\, e^{-\frac{(\theta - x)^2}{2}}\, d\theta = \int_{-x}^\infty \mu^2\, e^{-\frac{\mu^2}{2}}\, d\mu = \left[-\mu\, e^{-\frac{\mu^2}{2}}\right]_{-x}^{\infty} + \int_{-x}^\infty e^{-\frac{\mu^2}{2}}\, d\mu = -x\, e^{-\frac{x^2}{2}} + \int_{-x}^\infty e^{-\frac{\mu^2}{2}}\, d\mu$$
(substituting $\mu = \theta - x$ and integrating by parts).
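Both variance formulas for the truncated posterior can be verified numerically. In the sketch below (x = 0.8 is an arbitrary assumed observation, and `hazard` is a hypothetical helper name for φ(x)), Riemann sums over (0, ∞) are compared against 1 − xφ(x) and 1 − φ(x)[x + φ(x)]:

```python
import math

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def hazard(x):
    # phi(x) = exp(-x^2/2) / (sqrt(2*pi) * (1 - Phi(-x)))
    return math.exp(-x * x / 2) / (math.sqrt(2 * math.pi) * (1 - Phi(-x)))

def moments_numeric(x, upper=12.0, steps=200000):
    # Posterior mean, E[(theta - x)^2], and variance by midpoint sums on (0, upper)
    h = upper / steps
    s0 = s1 = s2 = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        w = math.exp(-(t - x) ** 2 / 2)
        s0 += w
        s1 += t * w
        s2 += (t - x) ** 2 * w
    mean = s1 / s0
    v_delta = s2 / s0                 # E[(theta - x)^2]
    var = v_delta - (mean - x) ** 2   # posterior variance V(x)
    return v_delta, var

x = 0.8   # assumed observation
p = hazard(x)
v_delta_formula = 1 - x * p          # V^delta(x) = 1 - x*phi(x)
var_formula = 1 - p * (x + p)        # V(x) = 1 - phi(x)[x + phi(x)]
v_delta_num, var_num = moments_numeric(x)
print(v_delta_formula, var_formula, v_delta_num, var_num)
```

Note that V(x) is below 1: truncating the support to (0, ∞) shrinks the posterior variance relative to the untruncated N(x, 1).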
(c) Multivariate estimation
Let $\theta = (\theta_1, \ldots, \theta_p)^t$ be a $p$-dimensional parameter. Then, the posterior mean is
$$\mu(x) = [\mu_1(x), \mu_2(x), \ldots, \mu_p(x)]^t = \left[E^{f(\theta|x)}[\theta_1], E^{f(\theta|x)}[\theta_2], \ldots, E^{f(\theta|x)}[\theta_p]\right]^t,$$
and the posterior variance is
$$V(x) = E^{f(\theta|x)}\left[(\theta - \mu(x))(\theta - \mu(x))^t\right].$$
Further, the posterior variance of $\delta(x)$ is
$$V^{\delta}(x) = E^{f(\theta|x)}\left[(\theta - \delta(x))(\theta - \delta(x))^t\right] = V(x) + [\mu(x) - \delta(x)][\mu(x) - \delta(x)]^t.$$
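The matrix version of the decomposition can be illustrated by Monte Carlo. The sketch below (the posterior parameters and the estimator value δ(x) are arbitrary assumptions, with independent components chosen for simplicity) checks elementwise that the average of (θ − δ)(θ − δ)ᵗ over posterior draws matches V(x) + [μ(x) − δ(x)][μ(x) − δ(x)]ᵗ:

```python
import random

random.seed(1)

# Assumed 2-dimensional posterior: independent components (illustration only)
post_mean = [1.0, -0.5]
post_sd = [0.6, 0.3]
delta = [0.8, 0.0]          # some assumed estimator value delta(x)

N = 200000
draws = [[random.gauss(m, s) for m, s in zip(post_mean, post_sd)]
         for _ in range(N)]

def avg_outer(vs, center):
    """Average of (v - center)(v - center)^t over the draws."""
    p = len(center)
    acc = [[0.0] * p for _ in range(p)]
    for v in vs:
        d = [vi - ci for vi, ci in zip(v, center)]
        for i in range(p):
            for j in range(p):
                acc[i][j] += d[i] * d[j]
    return [[a / len(vs) for a in row] for row in acc]

V = avg_outer(draws, post_mean)     # posterior covariance V(x)
V_delta = avg_outer(draws, delta)   # posterior variance matrix of delta(x)

# Right-hand side: V(x) + (mu - delta)(mu - delta)^t
bias = [m - d for m, d in zip(post_mean, delta)]
rhs = [[V[i][j] + bias[i] * bias[j] for j in range(2)] for i in range(2)]
print(V_delta, rhs)  # the two matrices agree up to Monte Carlo error
```

The rank-one correction term is the outer product of the bias vector, the direct multivariate analogue of the scalar [μ(x) − δ(x)]² penalty.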