METHOD OF LEAST SQUARES
By: Varun Luthra
11CSU163
INTRODUCTION
In engineering, two types of applications are encountered:
• Trend analysis: predicting values of the dependent variable, which may include extrapolation beyond the data points or interpolation between them.
• Hypothesis testing: comparing an existing mathematical model with measured data.
CURVE FITTING
There are two general approaches to curve fitting:
• Least-squares regression: the data exhibit a significant degree of scatter, and the strategy is to derive a single curve that represents the general trend of the data.
• Interpolation: the data are very precise, and the strategy is to pass a curve, or a series of curves, exactly through each of the points.
OVERVIEW
• The method of least squares is a standard approach to the approximate solution of overdetermined systems, i.e., sets of equations in which there are more equations than unknowns (a small numerical sketch follows this list).
• "Least squares" means that the overall solution minimizes the sum of the squares of the errors made in the results of every single equation.
• The least-squares method is usually credited to Carl Friedrich Gauss (1795), but it was first published by Adrien-Marie Legendre (1805).
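
For illustration (not part of the original slides), the sketch below solves a small overdetermined system in the least-squares sense with NumPy; the numbers are made up:

import numpy as np

# Overdetermined system: 3 equations, 2 unknowns, no exact solution.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.1, 1.9, 3.2])

# The least-squares solution minimizes ||A x - b||^2.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)  # approximate solution [intercept, slope]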
Scatter Diagram:
To find a relationship between a set of paired observations x and y (say), we plot their corresponding values (x1, y1), (x2, y2), ..., (xn, yn) on a graph, taking one of the variables along the x-axis and the other along the y-axis. The resulting diagram, showing a collection of dots, is called a scatter diagram. A smooth curve that approximates this set of points is known as the approximating curve.
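
A minimal sketch (assuming matplotlib is available; not part of the original slides) of drawing such a scatter diagram, using the straight-line example data that appears later in these slides:

import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

# Plot each (xi, yi) pair as a dot to reveal the general trend.
plt.scatter(x, y)
plt.xlabel("x")
plt.ylabel("y")
plt.show()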
WORKING PROCEDURE
• To fit the straight line y = a + bx:
• Substitute the observed set of n values in this equation.
• Form the normal equations for each constant, i.e.
  ∑y = na + b∑x and ∑xy = a∑x + b∑x².
• Solve these normal equations as simultaneous equations in a and b.
• Substitute the values of a and b in y = a + bx, which is the required line of best fit (a code sketch follows this list).
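
As a concrete sketch of this procedure (not part of the original slides; the helper name fit_line is illustrative), the two normal equations can be formed and solved with NumPy:

import numpy as np

def fit_line(x, y):
    """Fit y = a + b*x by solving the normal equations
    sum(y) = n*a + b*sum(x) and sum(xy) = a*sum(x) + b*sum(x^2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    A = np.array([[n,       x.sum()],
                  [x.sum(), (x**2).sum()]])
    rhs = np.array([y.sum(), (x*y).sum()])
    a, b = np.linalg.solve(A, rhs)
    return a, b

# Data from the straight-line example later in these slides.
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
print(fit_line(x, y))  # approximately (0.0714, 0.8393)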
• To fit the parabola y = a + bx + cx²:
• Form the normal equations
  ∑y = na + b∑x + c∑x²,
  ∑xy = a∑x + b∑x² + c∑x³, and
  ∑x²y = a∑x² + b∑x³ + c∑x⁴.
• Solve these as simultaneous equations for a, b, c.
• Substitute the values of a, b, c in y = a + bx + cx², which is the required parabola of best fit.
• In general, the curve y = a + bx + cx² + … + kx^(m−1) can be fitted to given data by writing m normal equations (a polynomial sketch follows this list).
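
A minimal sketch (not from the original slides; the helper name fit_polynomial and the sample data are illustrative) of fitting a polynomial of any degree by forming and solving its normal equations:

import numpy as np

def fit_polynomial(x, y, degree):
    """Least-squares fit of y = a0 + a1*x + ... + a_degree*x**degree
    by solving the normal equations (X^T X) coeffs = X^T y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix with columns 1, x, x^2, ..., x^degree.
    X = np.vander(x, degree + 1, increasing=True)
    coeffs = np.linalg.solve(X.T @ X, X.T @ y)
    return coeffs  # coefficients in increasing powers of x

# Illustrative example: fit a parabola (degree 2) to sample points.
x = [0, 1, 2, 3, 4]
y = [1.0, 1.8, 3.3, 6.2, 9.1]
print(fit_polynomial(x, y, 2))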
• The normal equation for the unknown a is obtained by multiplying the equations by the coefficient of a and adding.
• The normal equation for b is obtained by multiplying the equations by the coefficients of b (i.e. x) and adding.
• The normal equation for c is obtained by multiplying the equations by the coefficients of c (i.e. x²) and adding. The matrix form shown after this list collects the three equations together.
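
Collected together, the parabola's normal equations can be written in matrix form (a standard restatement, not taken from the original slides):

[ n    ∑x    ∑x²  ] [ a ]   [ ∑y   ]
[ ∑x   ∑x²   ∑x³  ] [ b ] = [ ∑xy  ]
[ ∑x²  ∑x³   ∑x⁴  ] [ c ]   [ ∑x²y ]

This is exactly the system (XᵀX)a = Xᵀy that the fit_polynomial sketch above solves.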
Example of a Straight Line
Fit a straight line to the x and y values in the following table:

xi    yi     xi·yi   xi²
1     0.5     0.5     1
2     2.5     5       4
3     2.0     6       9
4     4.0    16      16
5     3.5    17.5    25
6     6.0    36      36
7     5.5    38.5    49
28   24.0   119.5   140   (sums)

so that ∑xi = 28, ∑yi = 24.0, ∑xi·yi = 119.5 and ∑xi² = 140.
With n = 7, solving the normal equations gives

 b = (n∑xi·yi − ∑xi·∑yi) / (n∑xi² − (∑xi)²)
   = (7 × 119.5 − 28 × 24) / (7 × 140 − 28²)
   = 0.8392857

 a = ȳ − b·x̄ = 3.428571 − 0.8392857 × 4 = 0.07142857

Hence the required line of best fit is

 y = 0.07142857 + 0.8392857x
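
As a quick cross-check (not in the original slides), NumPy's polyfit applied to the tabulated data reproduces the same coefficients:

import numpy as np

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

# polyfit returns coefficients in decreasing powers: [b, a].
b, a = np.polyfit(x, y, 1)
print(a, b)  # approximately 0.0714 and 0.8393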
Example of Another Curve
Fit the equation y = a2·x^b2 to the data in the following table:

xi    yi
1     0.5
2     1.7
3     3.4
4     5.7
5     8.4
15   19.7   (sums)

Taking logarithms of both sides gives log y = log a2 + b2·log x.
Let Y* = log y, X* = log x, a0 = log a2 and a1 = b2; the model then becomes the straight line Y* = a0 + a1·X*.

X* = log xi   Y* = log yi
0.000         −0.301
0.301          0.230
0.477          0.531
0.602          0.756
0.699          0.924
2.079          2.141   (sums)
Tabulating all the required quantities:

xi    yi      X*=log xi   Y*=log yi   X*·Y*    X*²
1     0.5     0.0000      -0.3010     0.0000   0.0000
2     1.7     0.3010       0.2304     0.0694   0.0906
3     3.4     0.4771       0.5315     0.2536   0.2276
4     5.7     0.6021       0.7559     0.4551   0.3625
5     8.4     0.6990       0.9243     0.6460   0.4886
15    19.700  2.079        2.141      1.424    1.169   (sums)

With n = 5, the normal equations for the line Y* = a0 + a1·X* give

 a1 = (n∑X*Y* − ∑X*·∑Y*) / (n∑X*² − (∑X*)²)
    = (5 × 1.424 − 2.079 × 2.141) / (5 × 1.169 − (2.079)²)
    ≈ 1.75

 a0 = Ȳ* − a1·X̄* = 0.4282 − 1.75 × 0.41584 ≈ −0.300
Curve Fitting
The fitted line in the log variables is log y = −0.300 + 1.75·log x, so a2 = 10^(−0.300) ≈ 0.5 and the required curve is

 y ≈ 0.5·x^1.75
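
A short sketch (not from the original slides) of the same log-transform fit in NumPy, using base-10 logarithms as in the example:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.5, 1.7, 3.4, 5.7, 8.4])

# Linearize y = a2 * x**b2 by taking logs: log y = log a2 + b2 * log x.
X_star = np.log10(x)
Y_star = np.log10(y)

b2, a0 = np.polyfit(X_star, Y_star, 1)  # slope, intercept
a2 = 10 ** a0

print(b2, a2)  # approximately 1.75 and 0.5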
APPLICATION
•The most important application is in data fitting. The
best fit in the least-squares sense minimizes the sum of
squared residuals, a residual being the difference
between an observed value and the fitted value provided
by a model.
• When the problem has substantial uncertainties in the independent variable (the x variable), simple regression and least-squares methods run into problems; in such cases, the methodology for fitting errors-in-variables models may be considered instead of ordinary least squares (a sketch follows).
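
As an illustrative sketch only (not part of the original slides), SciPy's orthogonal distance regression can fit a straight line when both variables carry uncertainty; the data values and the standard deviations sx and sy below are assumed for demonstration:

import numpy as np
from scipy import odr

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.9, 2.1, 2.9, 4.2, 5.1])

def linear(beta, x):
    # Straight-line model y = beta[0] + beta[1] * x.
    return beta[0] + beta[1] * x

# RealData attaches (assumed) standard deviations to both variables.
data = odr.RealData(x, y, sx=0.1, sy=0.2)
model = odr.Model(linear)
fit = odr.ODR(data, model, beta0=[0.0, 1.0]).run()

print(fit.beta)  # fitted [intercept, slope] accounting for errors in x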
Thank You!
Have a nice day.