Principles of Least Squares
Introduction
• In surveying, we often have geometric
constraints for our measurements
– Differential leveling loop closure = 0
– Sum of interior angles of a polygon = (n-2)180°
– Closed traverse: Σlats = Σdeps = 0
• Because of measurement errors, these
constraints are generally not met exactly, so an
adjustment should be performed
Random Error Adjustment
• We assume (hope?) that all systematic errors
have been removed so only random error
remains
• Random error conforms to the laws of
probability
• Should adjust the measurements accordingly
• Why?
Definition of a Residual
If M represents the most probable value of a measured
quantity, and zi represents the ith measurement, then the ith
residual, vi is:
vi = M – zi
Fundamental Principle of Least
Squares
Σv² = v1² + v2² + v3² + … + vn² → minimum
In order to obtain most probable values (MPVs), the sum
of squares of the residuals must be minimized. (See book
for derivation.) In the weighted case, the weighted squares
of the residuals must be minimized.
Σwv² = w1v1² + w2v2² + w3v3² + … + wnvn² → minimum
Technically the weighted form shown assumes that the
measurements are independent, but we can handle the
general case involving covariance.
Stochastic Model
• The covariances (including variances) and hence
the weights as well, form the stochastic model
• Even an “unweighted” adjustment assumes that
all observations have equal weight, which is
itself a stochastic model
• The stochastic model is different from the
mathematical model
• Stochastic models may be determined through
sample statistics and error propagation, but are
often a priori estimates.
Mathematical Model
• The mathematical model is a set of one or more
equations that define an adjustment condition
• Examples are the constraints mentioned earlier
• Models also include collinearity equations in
photogrammetry and the equation of a line in linear
regression
• It is important that the model properly represents
reality – for example, the angles of a plane triangle
should total 180°, but if the triangle is large, spherical
excess causes a systematic error, so a more elaborate
model is needed.
Types of Models
Conditional and Parametric
• A conditional model enforces geometric conditions on
the measurements and their residuals
• A parametric model expresses equations in terms of
unknowns that were not directly measured, but relate to
the measurements (e.g. a distance expressed by
coordinate inverse)
• Parametric models are more commonly used because it
can be difficult to express all of the conditions in a
complicated measurement network
Observation Equations
• Observation equations are written for the
parametric model
• One equation is written for each observation
• Each equation sets a function of the
unknown variables (such as
coordinates) equal to a measurement plus a
residual
• We want more measurements than unknowns,
which gives a redundant adjustment
Elementary Example
Consider the following three equations involving two unknowns.
If Equations (1) and (2) are solved, x = 1.5 and y = 1.5. However,
if Equations (2) and (3) are solved, x = 1.3 and y = 1.1 and if
Equations (1) and (3) are solved, x = 1.6 and y = 1.4.
(1) x + y = 3.0
(2) 2x – y = 1.5
(3) x – y = 0.2
If we consider the right side terms to be measurements, they
have errors and residual terms must be included for consistency.
Example - Continued
x + y – 3.0 = v1
2x – y – 1.5 = v2
x – y – 0.2 = v3
To find the MPVs for x and y we use a least squares solution by
minimizing the sum of squares of residuals.
f(x, y) = (x + y − 3.0)² + (2x − y − 1.5)² + (x − y − 0.2)²
Example - Continued
To minimize, we take partial derivatives with respect to each of the
variables and set them equal to zero. Then solve the two equations.
∂f/∂x = 2(x + y − 3.0) + 2(2x − y − 1.5)(2) + 2(x − y − 0.2) = 0
∂f/∂y = 2(x + y − 3.0) + 2(2x − y − 1.5)(−1) + 2(x − y − 0.2)(−1) = 0
These equations simplify to the following normal equations.
6x – 2y = 6.2
-2x + 3y = 1.3
Example - Continued
Solve by matrix methods.
[  6  −2 ] [ x ]   [ 6.2 ]
[ −2   3 ] [ y ] = [ 1.3 ]

[ x ]   1  [ 3  2 ] [ 6.2 ]   [ 1.514 ]
[ y ] = —  [ 2  6 ] [ 1.3 ] = [ 1.443 ]
        14
We should also compute residuals:
v1 = 1.514 + 1.443 – 3.0 = -0.043
v2 = 2(1.514) – 1.443 – 1.5 = 0.086
v3 = 1.514 – 1.443 – 0.2 = -0.129
Systematic Formation of Normal
Equations
Resultant Equations
Following the derivation in the book, observation equations of the
form ax + by = l + v lead to the normal equations:
(Σa²)x + (Σab)y = Σal
(Σab)x + (Σb²)y = Σbl
Example – Systematic Approach
Now let’s try the systematic approach to the example.
(1) x + y = 3.0 + v1
(2) 2x – y = 1.5 + v2
(3) x – y = 0.2 + v3
Create a table:
 a    b    l     a²    ab    b²    al     bl
 1    1   3.0    1     1     1    3.0    3.0
 2   −1   1.5    4    −2     1    3.0   −1.5
 1   −1   0.2    1    −1     1    0.2   −0.2
                Σ=6   Σ=−2  Σ=3  Σ=6.2  Σ=1.3
Note that this yields the same normal equations.
Matrix Method
Matrix form for linear observation equations:
AX = L + V
Where:

A = [ a11 a12 … a1n ]    X = [ x1 ]    L = [ l1 ]    V = [ v1 ]
    [ a21 a22 … a2n ]        [ x2 ]        [ l2 ]        [ v2 ]
    [  ⋮   ⋮      ⋮  ]        [  ⋮ ]        [  ⋮ ]        [  ⋮ ]
    [ am1 am2 … amn ]        [ xn ]        [ lm ]        [ vm ]
Note: m is the number of observations and n is the number of
unknowns. For a redundant solution, m > n.
Least Squares Solution
Applying the condition of minimizing the sum of squared residuals:
AᵀAX = AᵀL
or
NX = AᵀL
Solution is:
X = (AᵀA)⁻¹AᵀL = N⁻¹AᵀL
and residuals are computed from:
V = AX – L
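The matrix solution above can be sketched in a few lines of plain Python. The helper functions below are ad hoc (not from the slides); the data are the three observation equations from the elementary example, so the result should reproduce x ≈ 1.514, y ≈ 1.443.

```python
# Sketch: forming and solving the normal equations A'A X = A'L
# for the elementary example x + y = 3.0, 2x - y = 1.5, x - y = 0.2.

def transpose(P):
    return [list(col) for col in zip(*P)]

def matmul(P, Q):
    """Multiply two matrices given as lists of rows."""
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def solve2x2(N, t):
    """Solve the 2x2 system N X = t by Cramer's rule."""
    det = N[0][0] * N[1][1] - N[0][1] * N[1][0]
    x = (t[0][0] * N[1][1] - N[0][1] * t[1][0]) / det
    y = (N[0][0] * t[1][0] - t[0][0] * N[1][0]) / det
    return [[x], [y]]

A = [[1, 1], [2, -1], [1, -1]]
L = [[3.0], [1.5], [0.2]]

N   = matmul(transpose(A), A)   # A'A -> [[6, -2], [-2, 3]]
AtL = matmul(transpose(A), L)   # A'L -> [[6.2], [1.3]]
X   = solve2x2(N, AtL)          # x = 1.514..., y = 1.443...
V   = [[sum(A[i][j] * X[j][0] for j in range(2)) - L[i][0]]
       for i in range(3)]       # residuals V = AX - L
```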
Example – Matrix Approach
A = [ 1  1 ]    X = [ x ]    L = [ 3.0 ]
    [ 2 −1 ]        [ y ]        [ 1.5 ]
    [ 1 −1 ]                     [ 0.2 ]

AᵀA = [  6 −2 ]    AᵀL = [ 6.2 ]
      [ −2  3 ]          [ 1.3 ]

X = (AᵀA)⁻¹AᵀL = [ 1.514 ]    V = AX − L
                 [ 1.443 ]
Matrix Form With Weights
Weighted linear observation equations:
WAX = WL + WV
Normal equations:
AᵀWAX = AᵀWL, where N = AᵀWA
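A minimal sketch of the weighted normal equations, reusing the earlier three-equation example. The weights w = (1, 2, 1) are hypothetical, chosen only for illustration; with them the solution shifts toward satisfying the second (heavier) equation.

```python
# Sketch: weighted normal equations A'WA X = A'WL, formed by
# direct summation, with hypothetical weights w = (1, 2, 1).

A = [[1, 1], [2, -1], [1, -1]]
L = [3.0, 1.5, 0.2]
w = [1.0, 2.0, 1.0]   # hypothetical a priori weights

# Elements of A'WA and A'WL as weighted sums over the rows.
n11 = sum(wi * a[0] * a[0] for wi, a in zip(w, A))       # 10
n12 = sum(wi * a[0] * a[1] for wi, a in zip(w, A))       # -4
n22 = sum(wi * a[1] * a[1] for wi, a in zip(w, A))       # 4
t1  = sum(wi * a[0] * li for wi, a, li in zip(w, A, L))  # 9.2
t2  = sum(wi * a[1] * li for wi, a, li in zip(w, A, L))  # -0.2

# Solve the 2x2 system by Cramer's rule.
det = n11 * n22 - n12 * n12
x = (t1 * n22 - n12 * t2) / det   # 1.50
y = (n11 * t2 - n12 * t1) / det   # 1.45
```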
Matrix Form – Nonlinear System
We use a Taylor series approximation. We will need the Jacobian
matrix and a set of initial approximations.
The observation equations are:
JX = K + V
Where: J is the Jacobian matrix (partial derivatives)
X contains corrections for the approximations
K has observed minus computed values
V has the residuals
The least squares solution is: X = (JᵀJ)⁻¹JᵀK = N⁻¹JᵀK
Weighted Form – Nonlinear System
The observation equations are:
WJX = WK + WV
The least squares solution is: X = (JᵀWJ)⁻¹JᵀWK = N⁻¹JᵀWK
Example 10.2
Determine the least squares solution for the following:
F(x,y) = x + y – 2y² = -4
G(x,y) = x² + y² = 8
H(x,y) = 3x² – y² = 7.7
Use x0 = 2, and y0 = 2 for initial approximations.
Example - Continued
Take partial derivatives and form the Jacobian matrix.

∂F/∂x = 1      ∂F/∂y = 1 − 4y
∂G/∂x = 2x     ∂G/∂y = 2y
∂H/∂x = 6x     ∂H/∂y = −2y

J = [ 1     1 − 4y0 ]   [  1  −7 ]
    [ 2x0     2y0   ] = [  4   4 ]
    [ 6x0    −2y0   ]   [ 12  −4 ]

(evaluated at x0 = 2, y0 = 2)
Example - Continued
Form K matrix (observed minus computed) and set up least squares solution.

K = [ −4 − F(x0, y0) ]   [ −4 − (−4) ]   [  0.0 ]
    [  8 − G(x0, y0) ] = [  8 − 8    ] = [  0.0 ]
    [ 7.7 − H(x0, y0) ]  [ 7.7 − 8   ]   [ −0.3 ]
JᵀJ = [ 161 −39 ]      JᵀK = [ −3.6 ]
      [ −39  81 ]            [  1.2 ]
Example - Continued
X = [ 161 −39 ]⁻¹ [ −3.6 ]   [ −0.02125 ]
    [ −39  81 ]   [  1.2 ] = [  0.00458 ]

Second iteration, with J and K re-evaluated at the updated approximations:

X = [ 157.61806 −38.75082 ]⁻¹ [ −0.12393 ]   [ 0.00168 ]
    [ −38.75082  81.40354 ]   [  0.75219 ] = [ 0.01004 ]
Add the corrections to get new approximations and repeat.
x0 = 2.00 – 0.02125 = 1.97875 y0 = 2.00 + 0.00458 = 2.00458
Add the new corrections to get better approximations.
x0 = 1.97875 + 0.00168 = 1.98043 y0 = 2.00458 + 0.01004 = 2.01462
Further iterations give negligible corrections so the final solution is:
x = 1.98 y = 2.01
Linear Regression
Fitting x,y data points to a straight line: y = mx + b
Observation Equations
yA + vA = m·xA + b
yB + vB = m·xB + b
yC + vC = m·xC + b
yD + vD = m·xD + b

In matrix form: AX = L + V

[ xA 1 ]          [ yA ]   [ vA ]
[ xB 1 ] [ m ]    [ yB ]   [ vB ]
[ xC 1 ] [ b ] =  [ yC ] + [ vC ]
[ xD 1 ]          [ yD ]   [ vD ]
Example 10.3
Fit a straight line to the points in the table.
Compute m and b by least squares.

point    x      y
A      3.00   4.50
B      4.25   4.25
C      5.50   5.50
D      8.00   5.50
In matrix form:
[ 3.00 1 ]            [ 4.50 ]   [ vA ]
[ 4.25 1 ] [ m ]      [ 4.25 ]   [ vB ]
[ 5.50 1 ] [ b ]  =   [ 5.50 ] + [ vC ]
[ 8.00 1 ]            [ 5.50 ]   [ vD ]
Example - Continued
X = [ m ] = (AᵀA)⁻¹(AᵀL) = [ 121.3125  20.75 ]⁻¹ [ 105.8125 ]   [ 0.246 ]
    [ b ]                  [  20.75    4.00  ]   [  19.7500 ] = [ 3.663 ]
V = AX − L = [ 3.00 1 ]               [ 4.50 ]   [ −0.10 ]
             [ 4.25 1 ] [ 0.246 ]     [ 4.25 ]   [  0.46 ]
             [ 5.50 1 ] [ 3.663 ]  −  [ 5.50 ] = [ −0.48 ]
             [ 8.00 1 ]               [ 5.50 ]   [  0.13 ]
Standard Deviation of Unit Weight
S0 = √( Σv² / (m − n) ) = √( 0.47 / (4 − 2) ) = 0.48
Where: m is the number of observations and
n is the number of unknowns
Question: What about x-values? Are they
observations?
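The Example 10.3 fit and its standard deviation of unit weight can be reproduced with plain summation formulas, which are algebraically equivalent to X = (AᵀA)⁻¹AᵀL for the two-parameter line. This is only an illustrative sketch, not code from the course.

```python
# Sketch: least squares line fit for Example 10.3 and the
# standard deviation of unit weight S0 = sqrt(sum(v^2)/(m - n)).
import math

xs = [3.00, 4.25, 5.50, 8.00]
ys = [4.50, 4.25, 5.50, 5.50]
n = len(xs)

sx  = sum(xs)                              # sum of x
sy  = sum(ys)                              # sum of y
sxx = sum(x * x for x in xs)               # sum of x^2
sxy = sum(x * y for x, y in zip(xs, ys))   # sum of x*y

# Normal equations [sxx sx; sx n][m; b] = [sxy; sy], by Cramer's rule.
det = sxx * n - sx * sx
m = (n * sxy - sx * sy) / det        # slope, about 0.246
b = (sxx * sy - sx * sxy) / det      # intercept, about 3.663

residuals = [m * x + b - y for x, y in zip(xs, ys)]
s0 = math.sqrt(sum(v * v for v in residuals) / (n - 2))  # about 0.48
```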
Fitting a Parabola to a Set of Points
Equation: Ax² + Bx + C = y
This is still a linear problem in terms of the unknowns
A, B, and C.
Need more than 3 points for a redundant solution.
Example - Parabola
Parabola Fit Solution - 1
Set up matrices for the observation equations (columns of A are x², x, 1
for x = 0, 1, 2, 3, 4, 5):

A = [  0  0  1 ]        L = [ 103.84 ]
    [  1  1  1 ]            [ 105.43 ]
    [  4  2  1 ]            [ 104.77 ]
    [  9  3  1 ]            [ 102.21 ]
    [ 16  4  1 ]            [  98.43 ]
    [ 25  5  1 ]            [  93.41 ]
Parabola Fit Solution - 2
x = (AᵀA)⁻¹(AᵀL) = [ 979 225 55 ]⁻¹ [ 5354.53 ]   [  −0.813 ]
                   [ 225  55 15 ]   [ 1482.37 ] = [   1.902 ]
                   [  55  15  6 ]   [  608.09 ]   [ 104.046 ]
Solve by unweighted least squares solution. Compute residuals:
V = AX − L = [  0.206 ]
             [ −0.295 ]
             [ −0.172 ]
             [  0.225 ]
             [  0.216 ]
             [ −0.180 ]
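The parabola fit is still an ordinary linear least squares problem, just with a 3×3 set of normal equations. Below is an illustrative pure-Python sketch using the six data points read off the slide's matrices (x = 0…5); the small Gaussian-elimination solver is ad hoc, not from the course.

```python
# Sketch: fitting Ax^2 + Bx + C = y by least squares for the
# parabola example, solving the 3x3 normal equations directly.

xs = [0, 1, 2, 3, 4, 5]
ys = [103.84, 105.43, 104.77, 102.21, 98.43, 93.41]

# Rows of the design matrix: [x^2, x, 1].
rows = [[x * x, x, 1.0] for x in xs]
# Normal equations N p = t for parameters p = (A, B, C).
N = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
t = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]

def gauss_solve(N, t):
    """Solve N p = t by Gaussian elimination with partial pivoting."""
    a = [row[:] + [ti] for row, ti in zip(N, t)]
    n = len(a)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    p = [0.0] * n
    for r in range(n - 1, -1, -1):
        p[r] = (a[r][n] - sum(a[r][c] * p[c]
                              for c in range(r + 1, n))) / a[r][r]
    return p

A, B, C = gauss_solve(N, t)   # about -0.813, 1.902, 104.046
```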
Condition Equations
• Establish all independent, redundant
conditions
• Residual terms are treated as unknowns in the
problem
• Method is suitable for “simple” problems
where there is only one condition (e.g. interior
angles of a polygon, horizon closure)
Condition Equation Example
Condition Example - Continued
Condition Example - Continued
Condition Example - Continued
Note that the angle with the smallest standard deviation has
the smallest residual and the largest SD has the largest residual
Example Using Observation Equations
Observation Example - Continued
A = [  1   0 ]    L = [ 134°38′56″ ]          W = [ 1/6.7²    0        0    ]
    [  0   1 ]        [  83°17′35″ ]              [   0     1/9.9²     0    ]
    [ −1  −1 ]        [ 142°03′14″ − 360° ]       [   0       0     1/4.3²  ]

AᵀWA = [ 0.07636  0.05408 ]      AᵀWL = [ 14.7867721 ]
       [ 0.05408  0.06429 ]             [ 12.6370848 ]
Observation Example - Continued
X = (AᵀWA)⁻¹(AᵀWL) = [ 134°39′00.2″ ]
                     [  83°17′44.1″ ]

a3 = 360° − 134°39′00.2″ − 83°17′44.1″ = 142°03′15.7″
Note that the answer is the same as that obtained with
condition equations.
Simple Method for Angular Closure
Given a set of angles and associated variances and a
misclosure, C, residuals can be computed by the following:
vi = ( σi² / Σσj² ) · C,  with the sum taken over j = 1 … n
Angular Closure – Simple Method
Σσ² = 6.7² + 9.9² + 4.3² = 161.39

v1 = 15″ (6.7²) / 161.39 = 4.2″
v2 = 15″ (9.9²) / 161.39 = 9.1″
v3 = 15″ (4.3²) / 161.39 = 1.7″
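The simple method amounts to a one-line proportional distribution of the misclosure, as in this sketch using the same three angles:

```python
# Sketch: distribute a 15" misclosure among three angles in
# proportion to their variances, v_i = (sigma_i^2 / sum sigma^2) * C.

sigmas = [6.7, 9.9, 4.3]   # standard deviations of the angles, in seconds
C = 15.0                   # angular misclosure, in seconds

total = sum(s * s for s in sigmas)          # 161.39
vs = [C * s * s / total for s in sigmas]    # about 4.2", 9.1", 1.7"
```

Note that the residuals sum exactly to the misclosure, so the adjusted angles close perfectly.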
More Related Content

What's hot

Lecture 1-aerial photogrammetry
Lecture 1-aerial photogrammetryLecture 1-aerial photogrammetry
Lecture 1-aerial photogrammetryVidhi Khokhani
 
Lecture 2 errors and uncertainty
Lecture 2 errors and uncertaintyLecture 2 errors and uncertainty
Lecture 2 errors and uncertaintySarhat Adam
 
Matrix and Determinants
Matrix and DeterminantsMatrix and Determinants
Matrix and DeterminantsAarjavPinara
 
Moving average method maths ppt
Moving average method maths pptMoving average method maths ppt
Moving average method maths pptAbhishek Mahto
 
Grouped Data Calculation.pdf
Grouped Data Calculation.pdfGrouped Data Calculation.pdf
Grouped Data Calculation.pdfBJYXSZD2
 
Moments, Kurtosis N Skewness
Moments, Kurtosis N SkewnessMoments, Kurtosis N Skewness
Moments, Kurtosis N SkewnessNISHITAKALYANI
 
Measure of Central Tendency (Mean, Median, Mode and Quantiles)
Measure of Central Tendency (Mean, Median, Mode and Quantiles)Measure of Central Tendency (Mean, Median, Mode and Quantiles)
Measure of Central Tendency (Mean, Median, Mode and Quantiles)Salman Khan
 
Determination of Soil Type by Ternary Diagram textural plotting
Determination of Soil Type by Ternary Diagram textural plottingDetermination of Soil Type by Ternary Diagram textural plotting
Determination of Soil Type by Ternary Diagram textural plottingMithun Ray
 
Traverse Survey Part 1/2
Traverse Survey Part 1/2Traverse Survey Part 1/2
Traverse Survey Part 1/2Muhammad Zubair
 
Discrete and continuous probability distributions ppt @ bec doms
Discrete and continuous probability distributions ppt @ bec domsDiscrete and continuous probability distributions ppt @ bec doms
Discrete and continuous probability distributions ppt @ bec domsBabasab Patil
 
Tacheometric surveying
Tacheometric surveying Tacheometric surveying
Tacheometric surveying neharajpl
 
Least Squares Regression Method | Edureka
Least Squares Regression Method | EdurekaLeast Squares Regression Method | Edureka
Least Squares Regression Method | EdurekaEdureka!
 
Least square method
Least square methodLeast square method
Least square methodPandidurai P
 
PROBLEMS ON BEARINGS
PROBLEMS ON BEARINGSPROBLEMS ON BEARINGS
PROBLEMS ON BEARINGSPralhad Kore
 

What's hot (20)

Lecture 1-aerial photogrammetry
Lecture 1-aerial photogrammetryLecture 1-aerial photogrammetry
Lecture 1-aerial photogrammetry
 
Lecture 2 errors and uncertainty
Lecture 2 errors and uncertaintyLecture 2 errors and uncertainty
Lecture 2 errors and uncertainty
 
Matrix and Determinants
Matrix and DeterminantsMatrix and Determinants
Matrix and Determinants
 
Moving average method maths ppt
Moving average method maths pptMoving average method maths ppt
Moving average method maths ppt
 
Grouped Data Calculation.pdf
Grouped Data Calculation.pdfGrouped Data Calculation.pdf
Grouped Data Calculation.pdf
 
Types of curves
Types of curvesTypes of curves
Types of curves
 
Moments, Kurtosis N Skewness
Moments, Kurtosis N SkewnessMoments, Kurtosis N Skewness
Moments, Kurtosis N Skewness
 
Lecture9 traverse
Lecture9 traverseLecture9 traverse
Lecture9 traverse
 
Measure of Central Tendency (Mean, Median, Mode and Quantiles)
Measure of Central Tendency (Mean, Median, Mode and Quantiles)Measure of Central Tendency (Mean, Median, Mode and Quantiles)
Measure of Central Tendency (Mean, Median, Mode and Quantiles)
 
Index numbers
Index numbersIndex numbers
Index numbers
 
Determination of Soil Type by Ternary Diagram textural plotting
Determination of Soil Type by Ternary Diagram textural plottingDetermination of Soil Type by Ternary Diagram textural plotting
Determination of Soil Type by Ternary Diagram textural plotting
 
Introduction to surveying
Introduction to surveyingIntroduction to surveying
Introduction to surveying
 
statistic
statisticstatistic
statistic
 
Traverse Survey Part 1/2
Traverse Survey Part 1/2Traverse Survey Part 1/2
Traverse Survey Part 1/2
 
Discrete and continuous probability distributions ppt @ bec doms
Discrete and continuous probability distributions ppt @ bec domsDiscrete and continuous probability distributions ppt @ bec doms
Discrete and continuous probability distributions ppt @ bec doms
 
Tacheometric surveying
Tacheometric surveying Tacheometric surveying
Tacheometric surveying
 
Least Squares Regression Method | Edureka
Least Squares Regression Method | EdurekaLeast Squares Regression Method | Edureka
Least Squares Regression Method | Edureka
 
Least square method
Least square methodLeast square method
Least square method
 
Curves.pptx
Curves.pptxCurves.pptx
Curves.pptx
 
PROBLEMS ON BEARINGS
PROBLEMS ON BEARINGSPROBLEMS ON BEARINGS
PROBLEMS ON BEARINGS
 

Similar to 18-21 Principles of Least Squares.ppt

Solution of System of Linear Equations
Solution of System of Linear EquationsSolution of System of Linear Equations
Solution of System of Linear Equationsmofassair
 
Numerical integration
Numerical integration Numerical integration
Numerical integration Dhyey Shukla
 
Maths iii quick review by Dr Asish K Mukhopadhyay
Maths iii quick review by Dr Asish K MukhopadhyayMaths iii quick review by Dr Asish K Mukhopadhyay
Maths iii quick review by Dr Asish K MukhopadhyayDr. Asish K Mukhopadhyay
 
MODULE_05-Matrix Decomposition.pptx
MODULE_05-Matrix Decomposition.pptxMODULE_05-Matrix Decomposition.pptx
MODULE_05-Matrix Decomposition.pptxAlokSingh205089
 
Gauss jordan and Guass elimination method
Gauss jordan and Guass elimination methodGauss jordan and Guass elimination method
Gauss jordan and Guass elimination methodMeet Nayak
 
Numerical Techniques
Numerical TechniquesNumerical Techniques
Numerical TechniquesYasir Mahdi
 
Regression Analysis.pptx
Regression Analysis.pptxRegression Analysis.pptx
Regression Analysis.pptxMdRokonMia1
 
Applied numerical methods lec6
Applied numerical methods lec6Applied numerical methods lec6
Applied numerical methods lec6Yasser Ahmed
 
Ch9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdfCh9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdfRahulUkhande
 
Section 0.7 Quadratic Equations from Precalculus Prerequisite.docx
Section 0.7 Quadratic Equations from Precalculus Prerequisite.docxSection 0.7 Quadratic Equations from Precalculus Prerequisite.docx
Section 0.7 Quadratic Equations from Precalculus Prerequisite.docxbagotjesusa
 
Business Math Chapter 3
Business Math Chapter 3Business Math Chapter 3
Business Math Chapter 3Nazrin Nazdri
 
Chapter 3: Linear Systems and Matrices - Part 2/Slides
Chapter 3: Linear Systems and Matrices - Part 2/SlidesChapter 3: Linear Systems and Matrices - Part 2/Slides
Chapter 3: Linear Systems and Matrices - Part 2/SlidesChaimae Baroudi
 
Analytic Geometry Period 1
Analytic Geometry Period 1Analytic Geometry Period 1
Analytic Geometry Period 1ingroy
 
Lecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdfLecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdfMDNomanCh
 

Similar to 18-21 Principles of Least Squares.ppt (20)

Solution of System of Linear Equations
Solution of System of Linear EquationsSolution of System of Linear Equations
Solution of System of Linear Equations
 
Numerical integration
Numerical integration Numerical integration
Numerical integration
 
Maths iii quick review by Dr Asish K Mukhopadhyay
Maths iii quick review by Dr Asish K MukhopadhyayMaths iii quick review by Dr Asish K Mukhopadhyay
Maths iii quick review by Dr Asish K Mukhopadhyay
 
MODULE_05-Matrix Decomposition.pptx
MODULE_05-Matrix Decomposition.pptxMODULE_05-Matrix Decomposition.pptx
MODULE_05-Matrix Decomposition.pptx
 
Gauss jordan and Guass elimination method
Gauss jordan and Guass elimination methodGauss jordan and Guass elimination method
Gauss jordan and Guass elimination method
 
Numerical Techniques
Numerical TechniquesNumerical Techniques
Numerical Techniques
 
Regression Analysis.pptx
Regression Analysis.pptxRegression Analysis.pptx
Regression Analysis.pptx
 
Applied numerical methods lec6
Applied numerical methods lec6Applied numerical methods lec6
Applied numerical methods lec6
 
Ch9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdfCh9-Gauss_Elimination4.pdf
Ch9-Gauss_Elimination4.pdf
 
Section 0.7 Quadratic Equations from Precalculus Prerequisite.docx
Section 0.7 Quadratic Equations from Precalculus Prerequisite.docxSection 0.7 Quadratic Equations from Precalculus Prerequisite.docx
Section 0.7 Quadratic Equations from Precalculus Prerequisite.docx
 
Business Math Chapter 3
Business Math Chapter 3Business Math Chapter 3
Business Math Chapter 3
 
5 3 solving trig eqns
5 3 solving trig eqns5 3 solving trig eqns
5 3 solving trig eqns
 
Chapter 3: Linear Systems and Matrices - Part 2/Slides
Chapter 3: Linear Systems and Matrices - Part 2/SlidesChapter 3: Linear Systems and Matrices - Part 2/Slides
Chapter 3: Linear Systems and Matrices - Part 2/Slides
 
Analytic Geometry Period 1
Analytic Geometry Period 1Analytic Geometry Period 1
Analytic Geometry Period 1
 
Input analysis
Input analysisInput analysis
Input analysis
 
numerical.ppt
numerical.pptnumerical.ppt
numerical.ppt
 
1 rules for exponents
1 rules for exponents1 rules for exponents
1 rules for exponents
 
Matrix.pptx
Matrix.pptxMatrix.pptx
Matrix.pptx
 
Lecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdfLecture Notes in Econometrics Arsen Palestini.pdf
Lecture Notes in Econometrics Arsen Palestini.pdf
 
Ca 1.6
Ca 1.6Ca 1.6
Ca 1.6
 

More from BAGARAGAZAROMUALD2

More from BAGARAGAZAROMUALD2 (13)

water-13-00495-v3.pdf
water-13-00495-v3.pdfwater-13-00495-v3.pdf
water-13-00495-v3.pdf
 
AssessingNormalityandDataTransformations.ppt
AssessingNormalityandDataTransformations.pptAssessingNormalityandDataTransformations.ppt
AssessingNormalityandDataTransformations.ppt
 
5116427.ppt
5116427.ppt5116427.ppt
5116427.ppt
 
240-design.ppt
240-design.ppt240-design.ppt
240-design.ppt
 
Remote Sensing_2020-21 (1).pdf
Remote Sensing_2020-21  (1).pdfRemote Sensing_2020-21  (1).pdf
Remote Sensing_2020-21 (1).pdf
 
Szeliski_NLS1.ppt
Szeliski_NLS1.pptSzeliski_NLS1.ppt
Szeliski_NLS1.ppt
 
AssessingNormalityandDataTransformations.ppt
AssessingNormalityandDataTransformations.pptAssessingNormalityandDataTransformations.ppt
AssessingNormalityandDataTransformations.ppt
 
Ch 11.2 Chi Squared Test for Independence.pptx
Ch 11.2 Chi Squared Test for Independence.pptxCh 11.2 Chi Squared Test for Independence.pptx
Ch 11.2 Chi Squared Test for Independence.pptx
 
lecture12.ppt
lecture12.pptlecture12.ppt
lecture12.ppt
 
Chi-Square Presentation - Nikki.ppt
Chi-Square Presentation - Nikki.pptChi-Square Presentation - Nikki.ppt
Chi-Square Presentation - Nikki.ppt
 
StatWRLecture6.ppt
StatWRLecture6.pptStatWRLecture6.ppt
StatWRLecture6.ppt
 
chapter18.ppt
chapter18.pptchapter18.ppt
chapter18.ppt
 
Corr-and-Regress.ppt
Corr-and-Regress.pptCorr-and-Regress.ppt
Corr-and-Regress.ppt
 

Recently uploaded

Amazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptx
Amazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptxAmazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptx
Amazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptxAbdelrhman abooda
 
VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130
VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130
VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130Suhani Kapoor
 
9654467111 Call Girls In Munirka Hotel And Home Service
9654467111 Call Girls In Munirka Hotel And Home Service9654467111 Call Girls In Munirka Hotel And Home Service
9654467111 Call Girls In Munirka Hotel And Home ServiceSapana Sha
 
How we prevented account sharing with MFA
How we prevented account sharing with MFAHow we prevented account sharing with MFA
How we prevented account sharing with MFAAndrei Kaleshka
 
Schema on read is obsolete. Welcome metaprogramming..pdf
Schema on read is obsolete. Welcome metaprogramming..pdfSchema on read is obsolete. Welcome metaprogramming..pdf
Schema on read is obsolete. Welcome metaprogramming..pdfLars Albertsson
 
Call Girls In Mahipalpur O9654467111 Escorts Service
Call Girls In Mahipalpur O9654467111  Escorts ServiceCall Girls In Mahipalpur O9654467111  Escorts Service
Call Girls In Mahipalpur O9654467111 Escorts ServiceSapana Sha
 
INTERNSHIP ON PURBASHA COMPOSITE TEX LTD
INTERNSHIP ON PURBASHA COMPOSITE TEX LTDINTERNSHIP ON PURBASHA COMPOSITE TEX LTD
INTERNSHIP ON PURBASHA COMPOSITE TEX LTDRafezzaman
 
Call Girls In Dwarka 9654467111 Escorts Service
Call Girls In Dwarka 9654467111 Escorts ServiceCall Girls In Dwarka 9654467111 Escorts Service
Call Girls In Dwarka 9654467111 Escorts ServiceSapana Sha
 
RA-11058_IRR-COMPRESS Do 198 series of 1998
RA-11058_IRR-COMPRESS Do 198 series of 1998RA-11058_IRR-COMPRESS Do 198 series of 1998
RA-11058_IRR-COMPRESS Do 198 series of 1998YohFuh
 
Industrialised data - the key to AI success.pdf
Industrialised data - the key to AI success.pdfIndustrialised data - the key to AI success.pdf
Industrialised data - the key to AI success.pdfLars Albertsson
 
Brighton SEO | April 2024 | Data Storytelling
Brighton SEO | April 2024 | Data StorytellingBrighton SEO | April 2024 | Data Storytelling
Brighton SEO | April 2024 | Data StorytellingNeil Barnes
 
Data Science Jobs and Salaries Analysis.pptx
Data Science Jobs and Salaries Analysis.pptxData Science Jobs and Salaries Analysis.pptx
Data Science Jobs and Salaries Analysis.pptxFurkanTasci3
 
Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)
Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)
Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)jennyeacort
 
High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...
High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...
High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...soniya singh
 
{Pooja: 9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...
{Pooja:  9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...{Pooja:  9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...
{Pooja: 9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...Pooja Nehwal
 
Customer Service Analytics - Make Sense of All Your Data.pptx
Customer Service Analytics - Make Sense of All Your Data.pptxCustomer Service Analytics - Make Sense of All Your Data.pptx
Customer Service Analytics - Make Sense of All Your Data.pptxEmmanuel Dauda
 
From idea to production in a day – Leveraging Azure ML and Streamlit to build...
From idea to production in a day – Leveraging Azure ML and Streamlit to build...From idea to production in a day – Leveraging Azure ML and Streamlit to build...
From idea to production in a day – Leveraging Azure ML and Streamlit to build...Florian Roscheck
 
Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...
Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...
Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...dajasot375
 

Recently uploaded (20)

Amazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptx
Amazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptxAmazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptx
Amazon TQM (2) Amazon TQM (2)Amazon TQM (2).pptx
 
VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130
VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130
VIP Call Girls Service Miyapur Hyderabad Call +91-8250192130
 
9654467111 Call Girls In Munirka Hotel And Home Service
9654467111 Call Girls In Munirka Hotel And Home Service9654467111 Call Girls In Munirka Hotel And Home Service
9654467111 Call Girls In Munirka Hotel And Home Service
 
VIP Call Girls Service Charbagh { Lucknow Call Girls Service 9548273370 } Boo...
VIP Call Girls Service Charbagh { Lucknow Call Girls Service 9548273370 } Boo...VIP Call Girls Service Charbagh { Lucknow Call Girls Service 9548273370 } Boo...
VIP Call Girls Service Charbagh { Lucknow Call Girls Service 9548273370 } Boo...
 
How we prevented account sharing with MFA
How we prevented account sharing with MFAHow we prevented account sharing with MFA
How we prevented account sharing with MFA
 
Schema on read is obsolete. Welcome metaprogramming..pdf
Schema on read is obsolete. Welcome metaprogramming..pdfSchema on read is obsolete. Welcome metaprogramming..pdf
Schema on read is obsolete. Welcome metaprogramming..pdf
 
Call Girls In Mahipalpur O9654467111 Escorts Service
Call Girls In Mahipalpur O9654467111  Escorts ServiceCall Girls In Mahipalpur O9654467111  Escorts Service
Call Girls In Mahipalpur O9654467111 Escorts Service
 
INTERNSHIP ON PURBASHA COMPOSITE TEX LTD
INTERNSHIP ON PURBASHA COMPOSITE TEX LTDINTERNSHIP ON PURBASHA COMPOSITE TEX LTD
INTERNSHIP ON PURBASHA COMPOSITE TEX LTD
 
Call Girls In Dwarka 9654467111 Escorts Service
Call Girls In Dwarka 9654467111 Escorts ServiceCall Girls In Dwarka 9654467111 Escorts Service
Call Girls In Dwarka 9654467111 Escorts Service
 
RA-11058_IRR-COMPRESS Do 198 series of 1998
RA-11058_IRR-COMPRESS Do 198 series of 1998RA-11058_IRR-COMPRESS Do 198 series of 1998
RA-11058_IRR-COMPRESS Do 198 series of 1998
 
Industrialised data - the key to AI success.pdf
Industrialised data - the key to AI success.pdfIndustrialised data - the key to AI success.pdf
Industrialised data - the key to AI success.pdf
 
Brighton SEO | April 2024 | Data Storytelling
Brighton SEO | April 2024 | Data StorytellingBrighton SEO | April 2024 | Data Storytelling
Brighton SEO | April 2024 | Data Storytelling
 
Data Science Jobs and Salaries Analysis.pptx
Data Science Jobs and Salaries Analysis.pptxData Science Jobs and Salaries Analysis.pptx
Data Science Jobs and Salaries Analysis.pptx
 
Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)
Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)
Call Us ➥97111√47426🤳Call Girls in Aerocity (Delhi NCR)
 
High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...
High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...
High Class Call Girls Noida Sector 39 Aarushi 🔝8264348440🔝 Independent Escort...
 
E-Commerce Order PredictionShraddha Kamble.pptx
E-Commerce Order PredictionShraddha Kamble.pptxE-Commerce Order PredictionShraddha Kamble.pptx
E-Commerce Order PredictionShraddha Kamble.pptx
 
{Pooja: 9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...
{Pooja:  9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...{Pooja:  9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...
{Pooja: 9892124323 } Call Girl in Mumbai | Jas Kaur Rate 4500 Free Hotel Del...
 
Customer Service Analytics - Make Sense of All Your Data.pptx
Customer Service Analytics - Make Sense of All Your Data.pptxCustomer Service Analytics - Make Sense of All Your Data.pptx
Customer Service Analytics - Make Sense of All Your Data.pptx
 
From idea to production in a day – Leveraging Azure ML and Streamlit to build...
From idea to production in a day – Leveraging Azure ML and Streamlit to build...From idea to production in a day – Leveraging Azure ML and Streamlit to build...
From idea to production in a day – Leveraging Azure ML and Streamlit to build...
 
Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...
Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...
Indian Call Girls in Abu Dhabi O5286O24O8 Call Girls in Abu Dhabi By Independ...
 

18-21 Principles of Least Squares.ppt

  • 2. Introduction • In surveying, we often have geometric constraints for our measurements – Differential leveling loop closure = 0 – Sum of interior angles of a polygon = (n-2)180° – Closed traverse: Σlats = Σdeps = 0 • Because of measurement errors, these constraints are generally not met exactly, so an adjustment should be performed
  • 3. Random Error Adjustment
  • We assume (hope?) that all systematic errors have been removed so only random error remains
  • Random error conforms to the laws of probability
  • We should adjust the measurements accordingly
  • Why?
  • 4. Definition of a Residual
  If M represents the most probable value of a measured quantity, and zᵢ represents the ith measurement, then the ith residual vᵢ is:
  vᵢ = M − zᵢ
  • 5. Fundamental Principle of Least Squares
  In order to obtain most probable values (MPVs), the sum of squares of the residuals must be minimized (see book for derivation):
  Σv² = v₁² + v₂² + v₃² + … + vₙ² → minimum
  In the weighted case, the weighted squares of the residuals must be minimized:
  Σwv² = w₁v₁² + w₂v₂² + w₃v₃² + … + wₙvₙ² → minimum
  Technically the weighted form shown assumes that the measurements are independent, but we can handle the general case involving covariance.
  • 6. Stochastic Model
  • The covariances (including variances), and hence the weights as well, form the stochastic model
  • Even an “unweighted” adjustment assumes that all observations have equal weight, which is also a stochastic model
  • The stochastic model is different from the mathematical model
  • Stochastic models may be determined through sample statistics and error propagation, but are often a priori estimates
  • 7. Mathematical Model
  • The mathematical model is a set of one or more equations that define an adjustment condition
  • Examples are the constraints mentioned earlier
  • Models also include collinearity equations in photogrammetry and the equation of a line in linear regression
  • It is important that the model properly represents reality – for example, the angles of a plane triangle should total 180°, but if the triangle is large, spherical excess causes a systematic error, so a more elaborate model is needed
  • 8. Types of Models: Conditional and Parametric
  • A conditional model enforces geometric conditions on the measurements and their residuals
  • A parametric model expresses equations in terms of unknowns that were not directly measured, but relate to the measurements (e.g. a distance expressed by coordinate inverse)
  • Parametric models are more commonly used because it can be difficult to express all of the conditions in a complicated measurement network
  • 9. Observation Equations
  • Observation equations are written for the parametric model
  • One equation is written for each observation
  • Each equation is generally expressed as a function of unknown variables (such as coordinates) set equal to a measurement plus a residual
  • We want more measurements than unknowns, which gives a redundant adjustment
  • 10. Elementary Example
  Consider the following three equations involving two unknowns:
  (1) x + y = 3.0
  (2) 2x − y = 1.5
  (3) x − y = 0.2
  If Equations (1) and (2) are solved, x = 1.5 and y = 1.5. However, if Equations (2) and (3) are solved, x = 1.3 and y = 1.1, and if Equations (1) and (3) are solved, x = 1.6 and y = 1.4. If we consider the right-side terms to be measurements, they have errors, and residual terms must be included for consistency.
  • 11. Example – Continued
  x + y − 3.0 = v₁
  2x − y − 1.5 = v₂
  x − y − 0.2 = v₃
  To find the MPVs for x and y we use a least squares solution by minimizing the sum of squares of the residuals:
  f(x, y) = Σv² = (x + y − 3.0)² + (2x − y − 1.5)² + (x − y − 0.2)²
  • 12. Example – Continued
  To minimize, we take partial derivatives with respect to each of the variables and set them equal to zero, then solve the two equations:
  ∂f/∂x = 2(x + y − 3.0) + 2(2x − y − 1.5)(2) + 2(x − y − 0.2) = 0
  ∂f/∂y = 2(x + y − 3.0) + 2(2x − y − 1.5)(−1) + 2(x − y − 0.2)(−1) = 0
  These equations simplify to the following normal equations:
  6x − 2y = 6.2
  −2x + 3y = 1.3
  • 13. Example – Continued
  Solve by matrix methods (matrix rows separated by semicolons):
  [6 −2; −2 3][x; y] = [6.2; 1.3]
  [x; y] = (1/14)[3 2; 2 6][6.2; 1.3] = [1.514; 1.443]
  We should also compute the residuals:
  v₁ = 1.514 + 1.443 − 3.0 = −0.043
  v₂ = 2(1.514) − 1.443 − 1.5 = 0.086
  v₃ = 1.514 − 1.443 − 0.2 = −0.129
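As a quick numerical check (not part of the slides), the 2×2 normal equations of this example can be solved with NumPy:

```python
import numpy as np

# Normal equations from the example: 6x - 2y = 6.2, -2x + 3y = 1.3
N = np.array([[6.0, -2.0],
              [-2.0, 3.0]])
u = np.array([6.2, 1.3])

x, y = np.linalg.solve(N, u)   # MPVs: x ≈ 1.514, y ≈ 1.443

# Residuals from the three observation equations
v1 = x + y - 3.0
v2 = 2*x - y - 1.5
v3 = x - y - 0.2
```

Running this reproduces the most probable values and residuals computed above.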
  • 14. Systematic Formation of Normal Equations
  • 15. Resultant Equations
  Following the derivation in the book, the normal equations can be formed systematically from sums of products of the observation-equation coefficients, as tabulated in the next example.
  • 16. Example – Systematic Approach
  Now let’s try the systematic approach to the example.
  (1) x + y = 3.0 + v₁
  (2) 2x − y = 1.5 + v₂
  (3) x − y = 0.2 + v₃
  Create a table:
  a    b    l    |  a²   ab   b²   al    bl
  1    1    3.0  |  1    1    1    3.0   3.0
  2   −1    1.5  |  4   −2    1    3.0  −1.5
  1   −1    0.2  |  1   −1    1    0.2  −0.2
  Σ              |  6   −2    3    6.2   1.3
  Note that this yields the same normal equations.
  • 17. Matrix Method
  Matrix form for linear observation equations: AX = L + V, where
  A is the m × n matrix of coefficients (a₁₁ … aₘₙ),
  X is the n × 1 vector of unknowns (x₁ … xₙ),
  L is the m × 1 vector of observations (l₁ … lₘ),
  V is the m × 1 vector of residuals (v₁ … vₘ).
  Note: m is the number of observations and n is the number of unknowns. For a redundant solution, m > n.
  • 18. Least Squares Solution
  Applying the condition of minimizing the sum of squared residuals gives the normal equations:
  AᵀAX = AᵀL, or NX = AᵀL
  The solution is: X = (AᵀA)⁻¹AᵀL = N⁻¹AᵀL
  and the residuals are computed from: V = AX − L
  • 19. Example – Matrix Approach
  A = [1 1; 2 −1; 1 −1], X = [x; y], L = [3.0; 1.5; 0.2], V = [v₁; v₂; v₃]
  AX = L + V
  AᵀA = [6 −2; −2 3], AᵀL = [6.2; 1.3]
  which yields the same normal equations and solution as before.
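The matrix approach translates directly into code. This sketch (ours, not from the slides) forms the normal matrix and solves it for the same elementary example:

```python
import numpy as np

# Observation equations AX = L + V for the elementary example
A = np.array([[1.0,  1.0],
              [2.0, -1.0],
              [1.0, -1.0]])
L = np.array([3.0, 1.5, 0.2])

# Least squares solution X = (A^T A)^-1 A^T L
N = A.T @ A                       # normal matrix [[6, -2], [-2, 3]]
X = np.linalg.solve(N, A.T @ L)   # [x, y]
V = A @ X - L                     # residuals
```

A useful property to check: at the least squares solution, AᵀV = 0 (the residual vector is orthogonal to the columns of A).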
  • 20. Matrix Form With Weights
  Weighted linear observation equations: WAX = WL + WV
  Normal equations: AᵀWAX = NX = AᵀWL
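The weighted normal equations can be formed the same way. The weights below are illustrative values we chose, not from the slides; note that with equal weights the solution reduces to the unweighted one:

```python
import numpy as np

A = np.array([[1.0,  1.0],
              [2.0, -1.0],
              [1.0, -1.0]])
L = np.array([3.0, 1.5, 0.2])

def weighted_ls(A, W, L):
    """Solve the weighted normal equations A^T W A X = A^T W L."""
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ L)

# Illustrative weights (hypothetical): trust the first observation most
W = np.diag([4.0, 1.0, 1.0])
Xw = weighted_ls(A, W, L)

# With equal weights (W = I) the result is the unweighted solution
Xu = weighted_ls(A, np.eye(3), L)
```

W need not be diagonal: a full weight matrix (the inverse of the observation covariance matrix) handles correlated measurements with the same formula.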
  • 21. Matrix Form – Nonlinear System
  We use a Taylor series approximation. We will need the Jacobian matrix and a set of initial approximations. The observation equations are: JX = K + V, where
  J is the Jacobian matrix (partial derivatives),
  X contains corrections for the approximations,
  K has observed minus computed values,
  V has the residuals.
  The least squares solution is: X = (JᵀJ)⁻¹JᵀK = N⁻¹JᵀK
  • 22. Weighted Form – Nonlinear System
  The observation equations are: WJX = WK + WV
  The least squares solution is: X = (JᵀWJ)⁻¹JᵀWK = N⁻¹JᵀWK
  • 23. Example 10.2
  Determine the least squares solution for the following:
  F(x, y) = x + y − 2y² = −4
  G(x, y) = x² + y² = 8
  H(x, y) = 3x² − y² = 7.7
  Use x₀ = 2 and y₀ = 2 as initial approximations.
  • 24. Example – Continued
  Take partial derivatives and form the Jacobian matrix:
  ∂F/∂x = 1, ∂F/∂y = 1 − 4y
  ∂G/∂x = 2x, ∂G/∂y = 2y
  ∂H/∂x = 6x, ∂H/∂y = −2y
  J = [1, 1 − 4y₀; 2x₀, 2y₀; 6x₀, −2y₀] = [1 −7; 4 4; 12 −4]
  • 25. Example – Continued
  Form the K matrix and set up the least squares solution:
  K = [−4 − F(x₀, y₀); 8 − G(x₀, y₀); 7.7 − H(x₀, y₀)] = [0; 0; −0.3]
  JᵀJ = [161 −39; −39 81], JᵀK = [−3.6; 1.2]
  • 26. Example – Continued
  X = [161 −39; −39 81]⁻¹ [−3.6; 1.2] = [−0.02125; 0.00458]
  Add the corrections to get new approximations and repeat:
  x₀ = 2.00 − 0.02125 = 1.97875
  y₀ = 2.00 + 0.00458 = 2.00458
  The second iteration gives:
  X = [157.61806 −38.75082; −38.75082 81.40354]⁻¹ [−0.12393; 0.75219] = [0.00168; 0.01004]
  Add the new corrections to get better approximations:
  x₀ = 1.97875 + 0.00168 = 1.98043
  y₀ = 2.00458 + 0.01004 = 2.01462
  Further iterations give negligible corrections, so the final solution is: x = 1.98, y = 2.01
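The iterative scheme can be sketched in a few lines; the function and Jacobian definitions follow the example above, and the first pass reproduces the corrections shown there (the helper names `model` and `jacobian` are ours):

```python
import numpy as np

def model(p):
    """Computed values of F, G, H at p = (x, y)."""
    x, y = p
    return np.array([x + y - 2*y**2, x**2 + y**2, 3*x**2 - y**2])

def jacobian(p):
    """Partial derivatives of F, G, H with respect to x and y."""
    x, y = p
    return np.array([[1.0,   1.0 - 4.0*y],
                     [2.0*x, 2.0*y],
                     [6.0*x, -2.0*y]])

observed = np.array([-4.0, 8.0, 7.7])
p = np.array([2.0, 2.0])               # initial approximations

for _ in range(10):
    J = jacobian(p)
    K = observed - model(p)            # observed minus computed
    dX = np.linalg.solve(J.T @ J, J.T @ K)
    p = p + dX
    if np.abs(dX).max() < 1e-10:       # stop when corrections are negligible
        break
```

On the first pass K = [0, 0, −0.3] and dX = [−0.02125, 0.00458], matching the corrections above.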
  • 27. Linear Regression
  Fitting x, y data points to a straight line: y = mx + b
  • 29. Example 10.3
  Fit a straight line to the points in the table. Compute m and b by least squares.
  point   x      y
  A       3.00   4.50
  B       4.25   4.25
  C       5.50   5.50
  D       8.00   5.50
  In matrix form:
  [3.00 1; 4.25 1; 5.50 1; 8.00 1][m; b] = [4.50; 4.25; 5.50; 5.50] + [v_A; v_B; v_C; v_D]
  • 30. Example – Continued
  X = [m; b] = (AᵀA)⁻¹(AᵀL) = [121.3125 20.7500; 20.7500 4.0000]⁻¹ [105.8125; 19.7500] = [0.246; 3.663]
  V = AX − L = [−0.10; 0.46; −0.48; 0.13]
  • 31. Standard Deviation of Unit Weight
  S₀ = √(Σv² / (m − n)) = √(0.47 / (4 − 2)) = 0.48
  where m is the number of observations and n is the number of unknowns.
  Question: What about the x-values? Are they observations?
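The regression example, including the standard deviation of unit weight, can be checked numerically (a sketch, not from the slides):

```python
import numpy as np

# Data from Example 10.3
x = np.array([3.00, 4.25, 5.50, 8.00])
y = np.array([4.50, 4.25, 5.50, 5.50])

# Observation equations A [m b]^T = y + v
A = np.column_stack([x, np.ones_like(x)])
m, b = np.linalg.solve(A.T @ A, A.T @ y)    # m ≈ 0.246, b ≈ 3.663

V = A @ np.array([m, b]) - y                # residuals
S0 = np.sqrt(V @ V / (len(x) - 2))          # std. deviation of unit weight ≈ 0.48
```

Note the redundancy m − n = 4 − 2 = 2 in the denominator of S₀.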
  • 32. Fitting a Parabola to a Set of Points
  Equation: Ax² + Bx + C = y
  This is still a linear problem in terms of the unknowns A, B, and C. We need more than 3 points for a redundant solution.
  • 34. Parabola Fit Solution – 1
  Set up the matrices for the observation equations (rows are [x² x 1] for x = 0, 1, 2, 3, 4, 5):
  A = [0 0 1; 1 1 1; 4 2 1; 9 3 1; 16 4 1; 25 5 1]
  L = [103.84; 105.43; 104.77; 102.21; 98.43; 93.41]
  • 35. Parabola Fit Solution – 2
  Solve by the unweighted least squares solution:
  X = (AᵀA)⁻¹(AᵀL) = [979 225 55; 225 55 15; 55 15 6]⁻¹ [5354.53; 1482.37; 608.09] = [−0.813; 1.902; 104.046]
  Compute the residuals:
  V = AX − L = [0.206; −0.295; −0.172; 0.225; 0.216; −0.180]
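The parabola fit is the same normal-equation machinery with three unknowns. A numerical check (ours, not from the slides):

```python
import numpy as np

# Six points at x = 0..5 with the observed y-values from the example
x = np.arange(6.0)
y = np.array([103.84, 105.43, 104.77, 102.21, 98.43, 93.41])

# Observation equation columns for A, B, C in  A x^2 + B x + C = y
A = np.column_stack([x**2, x, np.ones_like(x)])
coef = np.linalg.solve(A.T @ A, A.T @ y)     # ≈ [-0.813, 1.902, 104.046]
V = A @ coef - y                             # residuals
```

The same fit could be obtained with `np.polyfit(x, y, 2)`; forming the normal equations explicitly just mirrors the derivation on the slides.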
  • 36. Condition Equations
  • Establish all independent, redundant conditions
  • Residual terms are treated as unknowns in the problem
  • The method is suitable for “simple” problems where there is only one condition (e.g. interior angles of a polygon, horizon closure)
  • 38. Condition Example - Continued
  • 39. Condition Example - Continued
  • 40. Condition Example – Continued
  Note that the angle with the smallest standard deviation has the smallest residual, and the angle with the largest standard deviation has the largest residual.
  • 42. Observation Example – Continued
  A = [1 0; 0 1; 1 1]
  W = [1/6.7² 0 0; 0 1/9.9² 0; 0 0 1/4.3²]
  L = [134°38′56″; 83°17′35″; 360° − 142°03′14″]
  AᵀWA = [0.07636 0.05408; 0.05408 0.06429]
  AᵀWL = [14.7867721; 12.6370848]
  • 43. Observation Example – Continued
  X = (AᵀWA)⁻¹(AᵀWL) = [134°39′00.2″; 83°17′44.1″]
  The third adjusted angle follows from the condition:
  360° − 134°39′00.2″ − 83°17′44.1″ = 142°03′15.7″
  Note that the answer is the same as that obtained with condition equations.
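The weighted angle adjustment can be reproduced numerically; the DMS-conversion helper `dms` is ours, not from the slides:

```python
import numpy as np

def dms(d, m, s):
    """Convert degrees-minutes-seconds to decimal degrees."""
    return d + m / 60.0 + s / 3600.0

# Three measured angles closing the horizon, with std. deviations (arc-sec)
angles = np.array([dms(134, 38, 56), dms(83, 17, 35), dms(142, 3, 14)])
sigmas = np.array([6.7, 9.9, 4.3])

# Parametric model: x = a1 + v1,  y = a2 + v2,  x + y = (360 - a3) + v3
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
L = np.array([angles[0], angles[1], 360.0 - angles[2]])
W = np.diag(1.0 / sigmas**2)

X = np.linalg.solve(A.T @ W @ A, A.T @ W @ L)
adjusted = np.array([X[0], X[1], 360.0 - X[0] - X[1]])   # sums to 360 exactly
```

The adjusted values come out to 134°39′00.2″, 83°17′44.1″, and 142°03′15.7″, matching the slides; by construction the three adjusted angles close the horizon exactly.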
  • 44. Simple Method for Angular Closure
  Given a set of angles, their associated variances, and a misclosure C, the residuals can be computed by the following:
  vᵢ = (σᵢ² / Σσ²) · C
  • 45. Angular Closure – Simple Method
  Σσ² = 6.7² + 9.9² + 4.3² = 161.39
  v₁ = 15″(6.7²)/161.39 = 4.2″
  v₂ = 15″(9.9²)/161.39 = 9.1″
  v₃ = 15″(4.3²)/161.39 = 1.7″
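The simple proportional distribution is a one-liner in code (a sketch of the formula above):

```python
import numpy as np

sigmas = np.array([6.7, 9.9, 4.3])   # standard deviations of the three angles
C = 15.0                             # misclosure, in arc-seconds

# v_i = (sigma_i^2 / sum of sigma^2) * C
v = C * sigmas**2 / np.sum(sigmas**2)   # ≈ [4.2, 9.1, 1.7] arc-seconds
```

Since the corrections are proportional to the variances, they sum to C exactly, and the least reliable angle (largest σ) absorbs the largest share of the misclosure.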