2. OUTCOMES OF CHAPTER 4
• To fit curves to data using available techniques.
• To assess the reliability of the answers obtained.
• To choose the preferred method for any particular problem.
3. LEARNING OUTCOME
• To study different techniques for fitting curves, or approximating
functions, to a set of discrete data, and to manipulate these
approximating functions.
• Least-squares regression. Find the ‘best’ straight line through a
set of uncertain data points.
• Interpolation. Estimate intermediate values between precise data
points by deriving polynomials in equation form. Two methods will
be investigated:
(a) Newton’s interpolating polynomial,
(b) the Lagrange interpolating polynomial.
4. CURVE FITTING
INTRODUCTION
• Curve fitting:
• finding a curve (an approximation) that best fits a series of
discrete data points
• the curve is an estimate of the trend of the dependent variable
• the curve can be used to determine intermediate estimates of the data.
• Approaches for curve fitting:
1. Least-squares regression
• Data with significant error or noise
• Curve does not pass through all data points; it represents the general trend of the data
2. Interpolation
• Data is known to be precise
• Curve passes through all data points
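As a quick illustration of the interpolation idea, here is a minimal sketch of a Lagrange interpolating polynomial (one of the methods covered later in the chapter); the data points below are made up for illustration only.

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Illustrative data lying on y = x**2 + x + 1:
xs = [0.0, 1.0, 2.0]
ys = [1.0, 3.0, 7.0]

# The interpolating curve passes through every data point ...
print(lagrange_interp(xs, ys, 1.0))   # -> 3.0
# ... and provides estimates at intermediate points:
print(lagrange_interp(xs, ys, 1.5))   # -> 4.75
```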
5. CURVE FITTING
INTRODUCTION
• Typical data
• is discrete, but we are often interested in intermediate values
• these intermediate values need to be estimated
6. LEAST-SQUARES
REGRESSION
INTRODUCTION
Regression?
• modeling of the relationship between dependent and independent
variables
• finding a curve that gives the best approximation to a series of
data points
• the curve is an estimate of the trend of the dependent variable
• How to find the curve?
• by deriving the function of the curve
• the function can be linear, polynomial, or exponential
8. LINEAR REGRESSION
• The linear model is y = a0 + a1·x + e, so each data point has a residual
e_i = y_i − a0 − a1·x_i.
• Ideally, if all the residuals were zero, we would have found an equation on
which all the points lie.
• Thus, minimizing the residuals is the objective when obtaining the regression coefficients.
9. LINEAR REGRESSION
• One obvious criterion is to minimize the sum of the residuals,
Σ e_i (i = 1, …, n), where n = total number of points.
• This is an inadequate criterion: it does not yield a unique model.
10. LINEAR REGRESSION
• Examples of some criteria for “best fit” that are
inadequate for regression:
a) minimizes the sum of the residuals,
b) minimizes the sum of the absolute values
of the residuals, and
c) minimizes the maximum error of any
individual point.
• However, a more practical criterion, the
least-squares approach, is to minimize the sum
of the squares of the residuals, that is:
S_r = Σ e_i² = Σ (y_i − a0 − a1·x_i)², with the sums taken over i = 1, …, n.
11. LINEAR REGRESSION
• Best strategy! Yields a unique line for a given set of data.
• Using the regression model y = a0 + a1·x,
• the slope and intercept producing the best fit can be found using:
a1 = (n·Σx_i·y_i − Σx_i·Σy_i) / (n·Σx_i² − (Σx_i)²)
a0 = ȳ − a1·x̄
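The formulas above can be sketched directly in code; this is a minimal illustration (the helper name linear_fit and the sample data are mine, not from the notes).

```python
def linear_fit(x, y):
    """Least-squares slope a1 and intercept a0 for y = a0 + a1*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a0 = sy / n - a1 * sx / n                       # intercept = ybar - a1*xbar
    return a0, a1

# Points lying exactly on y = 1 + 2x are recovered exactly:
a0, a1 = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
print(a0, a1)   # -> 1.0 2.0
```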
12. EXAMPLE 1 - LINEAR REGRESSION
Fit the best straight line to the following set of x and y values
using the method of least-squares.
x: 0  1  2  3   4   5   6
y: 2  5  9  15  17  24  25

Solution:

x_i | y_i | x_i² | x_i·y_i
 0  |  2  |  0   |   0
 1  |  5  |  1   |   5
 2  |  9  |  4   |  18
 3  | 15  |  9   |  45
 4  | 17  | 16   |  68
 5  | 24  | 25   | 120
 6  | 25  | 36   | 150
Σ = 21 | Σ = 97 | Σ = 91 | Σ = 406
13. EXAMPLE 1 - LINEAR REGRESSION
Knowing the linear equation y = a0 + a1·x and using the known values
n = 7, Σx_i = 21, Σy_i = 97, Σx_i² = 91, Σx_i·y_i = 406:
a1 = (7(406) − (21)(97)) / (7(91) − (21)²) = 805/196 ≈ 4.1071
a0 = 97/7 − (4.1071)(21/7) ≈ 1.5357
14. EXAMPLE 1 - LINEAR REGRESSION
The least-squares fit is given by:
y = 1.5357 + 4.1071·x
15. ERROR QUANTIFICATION IN
LINEAR REGRESSION
• For a straight line, the sum of the squares of the estimate residuals is:
S_r = Σ (y_i − a0 − a1·x_i)²
• Standard error of the estimate:
s_y/x = √( S_r / (n − 2) )
• It quantifies the spread of the data around the
regression line
• and is used to quantify the ‘goodness’ of a fit
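A short sketch of these error measures, applied to the Example 1 data (the coefficients are recomputed from the least-squares formulas rather than taken as given):

```python
import math

x = [0, 1, 2, 3, 4, 5, 6]
y = [2, 5, 9, 15, 17, 24, 25]
n = len(x)

# Least-squares coefficients for y = a0 + a1*x.
a1 = (n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)) / (
    n * sum(xi * xi for xi in x) - sum(x) ** 2)
a0 = sum(y) / n - a1 * sum(x) / n

# Sum of squared residuals and standard error of the estimate.
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
s_yx = math.sqrt(Sr / (n - 2))
print(round(Sr, 4), round(s_yx, 4))   # -> 8.5357 1.3066
```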
18. EXAMPLE 2 - LINEAR REGRESSION
• Determine the correlation coefficient for the linear regression
line obtained in Example 1.
• The coefficient of determination is r² = (S_t − S_r)/S_t, where
S_t = Σ (y_i − ȳ)² is the total sum of squares about the mean; the
correlation coefficient r is its square root.
x_i | y_i
 0  |  2
 1  |  5
 2  |  9
 3  | 15
 4  | 17
 5  | 24
 6  | 25
Σ = 21 | Σ = 97
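One way to work Example 2 is sketched below; everything is computed from the tabulated data, using r² = (S_t − S_r)/S_t.

```python
import math

# Example 2 sketch: correlation coefficient of the Example 1 line fit.
x = [0, 1, 2, 3, 4, 5, 6]
y = [2, 5, 9, 15, 17, 24, 25]
n = len(x)

# Least-squares coefficients for y = a0 + a1*x.
a1 = (n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)) / (
    n * sum(xi * xi for xi in x) - sum(x) ** 2)
a0 = sum(y) / n - a1 * sum(x) / n

ybar = sum(y) / n
St = sum((yi - ybar) ** 2 for yi in y)                      # spread about the mean
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread about the line
r2 = (St - Sr) / St
r = math.sqrt(r2)
print(round(r2, 4), round(r, 4))   # -> 0.9822 0.9911
```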
19. POLYNOMIAL REGRESSION
NON-LINEAR MODEL
• The linear least-squares regression procedure
can be readily extended to fit data to a
higher-order polynomial.
• Again, the idea is to minimize the sum of the
squares of the estimate residuals.
• The figure shows the same data fit with:
a) a first-order polynomial
b) a second-order polynomial
• For second-order polynomial regression, the model is:
y = a0 + a1·x + a2·x² + e
20. POLYNOMIAL REGRESSION
NON-LINEAR MODEL
• For a second-order polynomial, the best fit means minimizing:
S_r = Σ e_i² = Σ (y_i − a0 − a1·x_i − a2·x_i²)²
• In general, for an mth-order polynomial, this means minimizing:
S_r = Σ (y_i − a0 − a1·x_i − a2·x_i² − … − a_m·x_i^m)²
• The standard error for fitting an mth-order polynomial to n data points is:
s_y/x = √( S_r / (n − (m + 1)) )
because the mth-order polynomial has (m + 1) coefficients.
• The coefficient of determination r² is still found using:
r² = (S_t − S_r) / S_t
21. POLYNOMIAL REGRESSION
NON-LINEAR MODEL
• To find the constants of the polynomial model, we partially
differentiate S_r with respect to each of the unknown coefficients and
set the derivatives equal to zero. For a second-order polynomial this
yields the normal equations:
n·a0 + (Σx_i)·a1 + (Σx_i²)·a2 = Σy_i
(Σx_i)·a0 + (Σx_i²)·a1 + (Σx_i³)·a2 = Σx_i·y_i
(Σx_i²)·a0 + (Σx_i³)·a1 + (Σx_i⁴)·a2 = Σx_i²·y_i
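Setting the partial derivatives to zero gives a linear system in the coefficients; the sketch below forms and solves that system for a second-order polynomial (pure Python; the function name and data are my own illustration).

```python
def poly2_fit(x, y):
    """Fit y = a0 + a1*x + a2*x**2 by solving the normal equations."""
    n = len(x)

    def sx(p):   # sum of x_i**p
        return sum(xi ** p for xi in x)

    def sxy(p):  # sum of (x_i**p) * y_i
        return sum((xi ** p) * yi for xi, yi in zip(x, y))

    # Normal equations (partial derivatives of S_r set to zero).
    A = [[n,     sx(1), sx(2)],
         [sx(1), sx(2), sx(3)],
         [sx(2), sx(3), sx(4)]]
    b = [sxy(0), sxy(1), sxy(2)]

    # Gaussian elimination (no pivoting; adequate for this small system).
    for k in range(3):
        for j in range(k + 1, 3):
            f = A[j][k] / A[k][k]
            for c in range(3):
                A[j][c] -= f * A[k][c]
            b[j] -= f * b[k]
    # Back-substitution.
    a = [0.0, 0.0, 0.0]
    for k in (2, 1, 0):
        a[k] = (b[k] - sum(A[k][j] * a[j] for j in range(k + 1, 3))) / A[k][k]
    return a  # [a0, a1, a2]

# Points lying exactly on y = 1 + 2x + 3x**2 are recovered:
print(poly2_fit([0, 1, 2, 3, 4], [1, 6, 17, 34, 57]))   # -> [1.0, 2.0, 3.0]
```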
25. QUESTION 1
• Use least-squares regression to fit a straight
line to the respective data.
• Along with the slope and the intercept,
compute the standard error of the estimate
and the correlation coefficient. Plot the data
and the regression line.
• Recompute, but use polynomial regression to
fit a parabola to the data.
• Compare the results.
x y
1 1
2 1.5
3 2
4 3
5 4
6 5
7 8
8 10
9 13
26. LINEARIZATION OF
NONLINEAR RELATIONSHIP
• Linear regression provides a powerful technique for fitting a best
line to data.
• In some cases, techniques such as polynomial regression are
appropriate.
• For others, transformations can be used to express the data in a
form that is compatible with linear regression.
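As a sketch of the transformation idea: a power model y = a·x^b becomes linear after taking logarithms, ln y = ln a + b·ln x, so ordinary linear least squares can be applied to the transformed data (the data here is synthetic, generated from a known power law):

```python
import math

def linear_fit(x, y):
    """Least-squares slope and intercept for y = a0 + a1*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n
    return a0, a1

# Synthetic data from y = 2 * x**1.5:
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0 * xi ** 1.5 for xi in x]

# Fit a straight line to (ln x, ln y), then transform back.
ln_a, b = linear_fit([math.log(v) for v in x], [math.log(v) for v in y])
a = math.exp(ln_a)
print(round(a, 6), round(b, 6))   # recovers a = 2, b = 1.5
```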
28. QUESTION 2: EXCEL
EXERCISE
LINEARIZATION OF NONLINEAR RELATIONSHIP
Fit the following data with:
a) a saturation-growth-rate model,
b) a power equation, and
c) a parabola.
In each case, plot the data and the
equation.

x    | y
0.75 | 1.2
2    | 1.95
3    | 2
4    | 2.4
6    | 2.4
8    | 2.7
8.5  | 2.6