Methods of Point
Estimation
By
Suruchi Somwanshi
M.Sc. (Mathematics)
M.Sc. (Statistics)
TOPICS TO BE COVERED
1. Introduction to statistical inference
2. Theory of estimation
3. Methods of estimation
3.1 Method of maximum likelihood estimation
3.2 Method of moments
1. Introduction to statistical inference
Statistics is broadly divided into two branches:
➢ Inferential statistics: estimation and testing of hypotheses.
➢ Descriptive statistics: descriptive analysis and graphical presentation.
What do we mean by Statistical Inference?
Drawing conclusions or making decisions about a population based on information collected from a representative sample drawn from it.
– Statistical inference is further divided into two parts:
Testing of hypotheses &
Theory of estimation
Testing of hypotheses –
➢ The theory of testing of hypotheses was initiated by J. Neyman and E. S. Pearson.
➢ It provides rules for deciding whether to accept or reject the hypothesis under study.
Theory of estimation –
➢ The theory of estimation was founded by Prof. R. A. Fisher.
➢ It discusses ways of assigning values to a population parameter based on the values of the corresponding statistics (functions of the sample observations).
2. Theory of estimation
➢ The theory of estimation was founded by R. A. Fisher.
➢ Inferential statistics divides into estimation and testing of hypotheses; estimation, in turn, divides into point estimation and interval estimation.
What do we mean by Estimation?
Estimation discusses ways of assigning values to a population parameter based on the values of the corresponding statistics (functions of the sample observations).
The statistic used to estimate a population parameter is called an estimator.
The value taken by the estimator for a given sample is called an estimate.
Types of estimation
There are two types of estimation: point estimation and interval estimation.
Point Estimation
It involves the use of sample data to calculate a single value (known as a point estimate) which serves as a best guess or best estimate of an unknown population parameter. More formally, it is the application of a point estimator to the data to obtain a point estimate.
Interval estimation
It is the use of sample data to calculate an interval of possible values of an unknown population parameter; this is in contrast to point estimation, which gives a single value.
The interval is formed by two quantities computed from the sample data, and it is constructed so that the parameter lies within it with very high probability.
3. Methods of Estimation
– Following are some of the important methods for obtaining good estimators:
➢ Method of maximum likelihood estimation
➢ Method of moments
3.1 Method of maximum likelihood
estimation
– It was initially formulated by C. F. Gauss.
– In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of a probability distribution by maximizing a likelihood function, so that under the assumed statistical model the observed data are most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate.
Likelihood function
It is formed from the joint density function of the sample, i.e.,
$L = L(\theta) = f(x_1, \theta) \cdots f(x_n, \theta) = \prod_{i=1}^{n} f(x_i, \theta)$
where $x_1, x_2, \ldots, x_n$ is a random sample of size $n$ from a population with density function $f(x, \theta)$.
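As a minimal sketch of this definition, assuming a $N(\theta, 1)$ population (the sample values below are purely illustrative, not from the slides), the likelihood of a sample is just the product of the individual densities:

```python
import numpy as np
from scipy.stats import norm

# hypothetical sample of size n = 5, assumed to come from a N(theta, 1) population
x = np.array([4.8, 5.1, 5.6, 4.9, 5.3])

def likelihood(theta):
    # L(theta) = product over i of f(x_i, theta); here f is the N(theta, 1) density
    return np.prod(norm.pdf(x, loc=theta, scale=1.0))

print(likelihood(5.0))  # a theta close to the data gives a larger likelihood
print(likelihood(3.0))  # a theta far from the data gives a much smaller likelihood
```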
Steps to perform in MLE
1. Define the likelihood, making sure you are using the correct distribution for your data.
2. Take the natural logarithm, which reduces the product to a sum.
3. Then compute the parameter by solving
$\frac{\partial}{\partial \theta} \log L = 0$ and checking $\frac{\partial^2}{\partial \theta^2} \log L < 0$.
These equations are usually referred to as the likelihood equations for estimating the parameters.
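When the likelihood equation has no convenient closed-form solution, step 3 is often carried out numerically. A minimal sketch, assuming the same illustrative $N(\theta, 1)$ sample as above: maximize $\log L$ by minimizing $-\log L$ with scipy.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# same hypothetical N(theta, 1) sample as above
x = np.array([4.8, 5.1, 5.6, 4.9, 5.3])

def neg_log_likelihood(theta):
    # -log L(theta) for the N(theta, 1) density
    return 0.5 * np.sum((x - theta) ** 2) + 0.5 * x.size * np.log(2 * np.pi)

# step 3 done numerically: minimizing -log L is the same as maximizing L
result = minimize_scalar(neg_log_likelihood, bounds=(0.0, 10.0), method="bounded")
print(result.x)    # numerical MLE of theta
print(x.mean())    # analytic MLE of a normal mean, for comparison
```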
Example
Suppose we have a random sample $X_1, X_2, \ldots, X_n$ where:
$X_i = 0$ if a randomly selected student does not own a car, and
$X_i = 1$ if a randomly selected student does own a car.
Assuming that the $X_i$ are independent Bernoulli random variables with unknown parameter $p$, find the maximum likelihood estimator of $p$, the proportion of students who own a car.
Answer
If the $X_i$ are independent Bernoulli random variables with unknown parameter $p$, then the probability mass function of each $X_i$ is:
$f(x; p) = p^x (1-p)^{1-x}$
for $x = 0$ or $1$ and $0 < p < 1$.
Therefore, the likelihood function $L(p)$ is, by definition:
$L(p) = \prod_{i=1}^{n} f(x_i; p) = p^{x_1}(1-p)^{1-x_1} \times p^{x_2}(1-p)^{1-x_2} \times \cdots \times p^{x_n}(1-p)^{1-x_n}$
for $0 < p < 1$.
Simplifying by summing the exponents, we get:
$L(p) = p^{\sum_{i=1}^{n} x_i}\,(1-p)^{\,n - \sum_{i=1}^{n} x_i}$ …………… (1)
Now, in order to implement the method of maximum likelihood, we need to find the value of the unknown parameter $p$ that maximizes the likelihood $L(p)$ given in equation (1).
To maximize the function, we need to differentiate the likelihood function with respect to $p$.
To make the differentiation easier, we work with the logarithm of the likelihood function, since the logarithm is an increasing function: if $x_1 < x_2$, then $\log(x_1) < \log(x_2)$. This means the value of $p$ that maximizes the natural logarithm of the likelihood function, $\log L(p)$, is also the value of $p$ that maximizes the likelihood function $L(p)$.
So we take the derivative of $\log L(p)$ with respect to $p$ instead of taking the derivative of $L(p)$.
In this case, the log-likelihood function is:
$\log L(p) = \left(\sum_{i=1}^{n} x_i\right) \log p + \left(n - \sum_{i=1}^{n} x_i\right) \log(1-p)$ …………… (2)
Taking the derivative of $\log L(p)$ with respect to $p$ and equating it to 0, we get:
$\frac{\partial \log L(p)}{\partial p} = \frac{\sum x_i}{p} - \frac{n - \sum x_i}{1-p} = 0$ …………… (3)
Now, simplifying this for $p$, we get:
$\hat{p} = \frac{\sum_{i=1}^{n} x_i}{n}$
Here the hat ("^") is used to represent the estimate of the parameter $p$.
Although we have found an estimate of the parameter $p$, we technically still have to verify that it is a maximum. For that, the second derivative of $\log L(p)$ with respect to $p$ should be negative. Differentiating (3) once more,
$\frac{\partial^2 \log L(p)}{\partial p^2} = -\frac{\sum x_i}{p^2} - \frac{n - \sum x_i}{(1-p)^2} < 0$
for all $0 < p < 1$, so the stationary point is indeed a maximum.
Thus, $\hat{p} = \frac{\sum_{i=1}^{n} x_i}{n}$ is the maximum likelihood estimator of $p$.
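As a quick sanity check on this result, one can evaluate $\log L(p)$ from equation (2) on a grid of $p$ values and confirm that it peaks at the sample proportion. A minimal sketch with made-up 0/1 data (the sample below is an assumption, not from the slides):

```python
import numpy as np

# hypothetical 0/1 responses: does each sampled student own a car?
x = np.array([1, 0, 0, 1, 1, 0, 1, 1, 0, 1])
n = x.size

def log_likelihood(p):
    # equation (2): log L(p) = (sum x_i) log p + (n - sum x_i) log(1 - p)
    return x.sum() * np.log(p) + (n - x.sum()) * np.log(1 - p)

grid = np.linspace(0.01, 0.99, 99)
p_grid_max = grid[np.argmax(log_likelihood(grid))]

print(p_grid_max)   # maximizer found on the grid
print(x.sum() / n)  # closed-form MLE: the sample proportion
```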
3.2 Method of moments
– This method was discovered and studied in detail by Karl Pearson.
– The basic idea behind this form of the method is to:
1. Equate the first sample moment about the origin, $M_1 = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{x}$, to the first theoretical moment $E(X)$.
2. Equate the second sample moment about the origin, $M_2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2$, to the second theoretical moment $E(X^2)$.
3. Continue equating sample moments about the origin, $M_k$, with the corresponding theoretical moments $E(X^k)$, $k = 3, 4, \ldots$, until you have as many equations as you have parameters.
4. Solve these equations for the parameters.
– The resulting values are called method of moments estimators. It
seems reasonable that this method would provide good estimates,
since the empirical distribution converges in some sense to the
probability distribution. Therefore, the corresponding moments
should be about equal.
Another Form of the Method
– The basic idea behind this form of the method is to:
1. Equate the first sample moment about the origin, $M_1 = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{x}$, to the first theoretical moment $E(X)$.
2. Equate the second sample moment about the mean, $M_2^{*} = \frac{1}{n}\sum_{i=1}^{n} (x_i - \bar{x})^2$, to the second theoretical moment about the mean, $E[(X - \mu)^2]$.
3. Continue equating sample moments about the mean, $M_k^{*}$, with the corresponding theoretical moments about the mean, $E[(X - \mu)^k]$, $k = 3, 4, \ldots$, until you have as many equations as you have parameters.
4. Solve for the parameters.
– Again, the resulting values are called method of moments
estimators.
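For the two-parameter normal case treated in the example that follows, step 4 can also be carried out symbolically. A minimal sketch using sympy (purely illustrative; m1 and m2 stand for the first two sample moments about the origin):

```python
import sympy as sp

mu, sigma2, m1, m2 = sp.symbols("mu sigma2 m1 m2")

# Raw-moment equations for the normal case:
#   E(X)   = mu             (equated to the first sample moment m1)
#   E(X^2) = sigma2 + mu^2  (equated to the second sample moment m2)
equations = [sp.Eq(mu, m1), sp.Eq(sigma2 + mu**2, m2)]

solution = sp.solve(equations, [mu, sigma2], dict=True)
print(solution)  # [{mu: m1, sigma2: m2 - m1**2}]
```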
Example
Let $X_1, X_2, \ldots, X_n$ be normal random variables with mean $\mu$ and variance $\sigma^2$. What are the method of moments estimators of the mean $\mu$ and variance $\sigma^2$?
Answer
The first and second theoretical moments about the origin are:
$E(X_i) = \mu$ and $E(X_i^2) = \sigma^2 + \mu^2$
Here we have two parameters for which we are trying to derive method of moments estimators.
Therefore, we need two equations here. Equating the first theoretical moment about the origin with the corresponding sample moment, we get:
$E(X_i) = \mu = \frac{1}{n}\sum_{i=1}^{n} X_i$ …………… (1)
And equating the second theoretical moment about the origin with the corresponding sample moment, we get:
$E(X_i^2) = \sigma^2 + \mu^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2$ …………… (2)
Now from equation (1) we see that the method of moments estimator for the mean $\mu$ is the sample mean:
$\hat{\mu}_{MM} = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{X}$
And by substituting the sample mean as the estimator of $\mu$ into the second equation and solving for $\sigma^2$, we get the method of moments estimator for the variance $\sigma^2$:
$\hat{\sigma}^2_{MM} = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \hat{\mu}_{MM}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2 = \frac{1}{n}\sum_{i=1}^{n} (X_i - \bar{X})^2$
For this example, if we cross-check, the method of moments estimators turn out to be the same as the maximum likelihood estimators.
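A quick numerical cross-check of that last claim, with simulated data (the simulation is illustrative, not part of the slides): the moment estimators above coincide with the maximum likelihood fit returned by scipy.stats.norm.fit, which uses MLE by default.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=500)  # simulated N(mu=10, sigma=2) sample

# Method of moments estimators derived above
mu_mm = x.mean()
sigma2_mm = np.mean(x**2) - mu_mm**2           # equals the mean of (x - x-bar)^2

# Maximum likelihood estimators (norm.fit returns loc = mu-hat, scale = sigma-hat)
mu_mle, sigma_mle = norm.fit(x)

print(mu_mm, mu_mle)            # identical up to rounding
print(sigma2_mm, sigma_mle**2)  # identical up to rounding
```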
Thank You