Definition 2 
Likelihood Function
Likelihood function 
• The likelihood function of the random variables X1, X2, ..., Xn is defined to be the joint density of the n random variables, say f_{X1,...,Xn}(x1, ..., xn; Θ), considered as a function of Θ. In particular, if X1, ..., Xn is a random sample from the density f(x; Θ), then the likelihood function is f(x1; Θ) f(x2; Θ) ··· f(xn; Θ).
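As a small sketch of this definition (not part of the original slides), the likelihood of a hypothetical iid sample can be formed as the product of the individual densities; an exponential density f(x; Θ) = Θ e^(−Θx) and made-up data values are assumed purely for illustration.

import numpy as np

def likelihood(theta, data):
    # L(theta) = f(x1; theta) * f(x2; theta) * ... * f(xn; theta)
    # for the exponential density f(x; theta) = theta * exp(-theta * x)
    return np.prod(theta * np.exp(-theta * np.asarray(data)))

sample = [0.8, 1.3, 0.4, 2.1]           # hypothetical observations
print(likelihood(0.5, sample))          # likelihood evaluated at theta = 0.5
print(likelihood(1.0, sample))          # likelihood evaluated at theta = 1.0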
• In statistics, the likelihood function (often simply the likelihood) is a function of the parameters of a statistical model that plays a key role in statistical inference. 
• In statistical usage, however, a clear technical distinction is made: the probability of some observed outcomes given a set of parameter values is referred to as the likelihood of that set of parameter values given the observed outcomes.
1. The likelihood function is a function of the parameter. 
2. The likelihood function is not a probability density function. 
3. If the data are iid, then the likelihood is the product of the individual densities, f(x1; Θ) ··· f(xn; Θ) (iid case only). 
4. The likelihood is only defined up to a constant of proportionality. 
5. The likelihood function is used (i) to generate estimators (the maximum likelihood estimator) and (ii) as a key ingredient in Bayesian inference; a small numerical sketch of (i) follows this list.
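A minimal numerical sketch of point (i), assuming a hypothetical iid exponential sample and using scipy's bounded scalar minimizer on the negative log-likelihood; the data values and the search bounds are illustrative only.

import numpy as np
from scipy.optimize import minimize_scalar

data = np.array([0.8, 1.3, 0.4, 2.1])          # hypothetical iid sample

def neg_log_likelihood(theta):
    # log L(theta) = sum_i log f(x_i; theta) for the exponential density theta * exp(-theta * x)
    return -np.sum(np.log(theta) - theta * data)

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")
print(result.x)               # numerical maximum likelihood estimate
print(1.0 / data.mean())      # closed-form MLE for the exponential model: 1 / x-bar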
• Mathematically, writing X for the set of observed data and Θ for the set of parameter values, the expression P(X | Θ), the probability of X given Θ, can be interpreted as the expression L(Θ | X), the likelihood of Θ given X. The interpretation of L(Θ | X) as a function of Θ is especially obvious when X is fixed and Θ is allowed to vary.
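To make "X fixed, Θ varying" concrete, the following sketch (illustrative only; a Bernoulli model and made-up coin-flip data are assumed) holds the data fixed and traces the likelihood over a grid of parameter values.

import numpy as np

x = np.array([1, 0, 1, 1, 0, 1])                 # fixed observed data (hypothetical coin flips)

def L(theta):
    # P(X | theta) read as a function of theta, with X held fixed
    return np.prod(theta ** x * (1 - theta) ** (1 - x))

for theta in np.linspace(0.1, 0.9, 9):
    print(round(theta, 1), L(theta))             # the likelihood traced over the grid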
• Generally, L(Θ | X) is permitted to be any positive multiple of P(X | Θ). More precisely, a likelihood function is any representative from the equivalence class of functions L(Θ | X) = α P(X | Θ), 
• where the constant of proportionality α > 0 is not permitted to depend upon Θ. In particular, the numerical value of L(Θ | X) alone is immaterial; all that matters are likelihood ratios, such as those of the form L(Θ1 | X) / L(Θ2 | X), which are invariant with respect to the constant of proportionality α.
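The invariance to the constant α can be checked numerically; the sketch below reuses the hypothetical Bernoulli likelihood from above and shows that likelihood ratios are unchanged when every value is scaled by the same α > 0.

import numpy as np

x = np.array([1, 0, 1, 1, 0, 1])                 # same fixed hypothetical data as above

def L(theta, alpha=1.0):
    # any positive multiple alpha * P(X | theta) is an equally valid likelihood
    return alpha * np.prod(theta ** x * (1 - theta) ** (1 - x))

# likelihood ratios do not depend on alpha
print(L(0.7) / L(0.3))
print(L(0.7, alpha=42.0) / L(0.3, alpha=42.0))   # identical ratio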
Notation 
• To remind ourselves to think of the likelihood function as a function of Θ, we shall use the notation L(Θ; x1, ..., xn) or L(·; x1, ..., xn) for the function.
The Likelihood Principle 
• An informal summary of the likelihood principle is that inferences from data to hypotheses should depend on how likely the actual data are under competing hypotheses, not on how likely imaginary data would have been under a single "null" hypothesis or on any other properties of merely possible data. 
• Bayesian inferences depend only on the probabilities assigned to the observed data, not on data that might have been observed but were not.
• A more precise interpretation is that inference procedures which make inferences about simple hypotheses should not be justified by appealing to probabilities assigned to observations that have not occurred. 
• The usual interpretation is that any two probability models with the same likelihood function yield the same inference for θ; a classic illustration follows.
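A standard illustration of this point (not taken from the slides): a binomial experiment stopped after n trials and a negative-binomial experiment stopped at the k-th success produce proportional likelihoods for the same observed data, so under the likelihood principle they support the same inferences about p. The counts below are hypothetical.

from math import comb

k, n = 3, 12   # hypothetical data: 3 successes observed over 12 trials

def binomial_L(p):
    # sampling plan: stop after a fixed number of trials n
    return comb(n, k) * p**k * (1 - p)**(n - k)

def neg_binomial_L(p):
    # sampling plan: stop once the k-th success occurs, which happened on trial n
    return comb(n - 1, k - 1) * p**k * (1 - p)**(n - k)

# the two likelihoods differ only by a constant factor, so every likelihood
# ratio (and hence the MLE) for p is identical under both sampling plans
for p in (0.2, 0.25, 0.3):
    print(binomial_L(p) / neg_binomial_L(p))     # constant ratio, independent of p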
Difference between Probability and Likelihood 
• 1. "Probability" and "likelihood" can both be used to express a prediction and the odds of occurrences. 
• 2. "Probability" refers to a "chance," while "likelihood" refers to a "possibility." 
• 3. A probability follows clear parameters and computations, while a likelihood is based merely on observed factors.
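One way to see the contrast numerically (a rough sketch with a hypothetical coin model, not part of the slides): with the parameter fixed, probabilities over all possible outcomes sum to 1, whereas with the data fixed, the likelihood viewed over the parameter need not integrate to 1.

import numpy as np

theta = 0.6                                  # fixed parameter: hypothetical coin bias
# probability: with theta fixed, P(x | theta) summed over the outcomes x = 1 and x = 0 is 1
print(theta + (1 - theta))                   # 1.0

x = 1                                        # fixed observation: a single "success"
grid = np.linspace(0.0, 1.0, 100001)         # candidate values of theta
L = grid ** x * (1 - grid) ** (1 - x)        # L(theta | x) = theta in this case
# likelihood: averaging L over theta on [0, 1] approximates its integral, about 0.5, not 1
print(L.mean())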
End of the slide 
Thank you for listening :)
Definition 
• Let X^n = (X1, ..., Xn) have joint density p(x^n; θ) = p(x1, ..., xn; θ), where θ ∈ Θ. The likelihood function L : Θ → [0, ∞) is defined by 
L(θ) = L(θ; x^n) = p(x^n; θ), 
where x^n is fixed and θ varies in Θ.
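As a worked instance of this definition (illustrative, assuming an iid Bernoulli(θ) sample; not taken from the slides):

L(\theta; x^n) = \prod_{i=1}^{n} \theta^{x_i} (1 - \theta)^{1 - x_i}
               = \theta^{\sum_i x_i} (1 - \theta)^{\,n - \sum_i x_i},
\qquad \theta \in \Theta = [0, 1],

and maximizing this function of θ with the data held fixed gives the maximum likelihood estimator, the sample mean.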

Editor's Notes

  1. Write the formula from slide 3 on the board.
  2. Philosophically speaking, the two words have the same denotative meaning; nevertheless, in statistical usage they are strictly used in different contexts.