AITKEN'S METHOD
In numerical analysis, Aitken's delta-squared
process or Aitken Extrapolation is a series
acceleration method, used for accelerating the rate
of convergence of a sequence. It is named
after Alexander Aitken, who introduced this method
in 1926. Its early form was known to Seki Kōwa (end
of 17th century) and was found for rectification of
the circle, i.e. the calculation of π. It is most useful
for accelerating the convergence of a sequence that
is converging linearly.
Aitken's Algorithm
Aitken's Delta Squared Process
Also called Aitken extrapolation: an algorithm that
extrapolates the partial sums of a series whose
convergence is approximately geometric and accelerates its
rate of convergence. A simple nonlinear sequence
transformation is the Aitken extrapolation, or delta-squared,
method.
This transformation is commonly used to improve the rate of
convergence of a slowly converging sequence; heuristically, it
eliminates the largest part of the absolute error.
FORMULA:
x̂n = xn - (xn+1 - xn)^2 / (xn+2 - 2 xn+1 + xn) = xn - (∆xn)^2 / ∆^2 xn
Derivation:
Assume the sequence xn converges linearly to a limit x, so that for large n
(xn+1 - x)/(xn - x) ≈ (xn+2 - x)/(xn+1 - x).
Solving this relation for x gives
x ≈ (xn xn+2 - xn+1^2) / (xn+2 - 2 xn+1 + xn).
Adding and subtracting xn^2 and 2 xn xn+1 in the numerator,
then grouping terms:
x ≈ (xn (xn+2 - 2 xn+1 + xn) - (xn+1 - xn)^2) / (xn+2 - 2 xn+1 + xn).
Finally:
x ≈ xn - (xn+1 - xn)^2 / (xn+2 - 2 xn+1 + xn),
which is the Aitken value x̂n.
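As a quick illustration (a minimal Python sketch of the formula above; the function and variable names are our own, not from the slides), one Aitken step applied to an exactly geometric sequence recovers its limit immediately:

```python
def aitken(x0, x1, x2):
    """One Aitken delta-squared step: x0 - (x1 - x0)^2 / (x2 - 2*x1 + x0)."""
    denom = x2 - 2.0 * x1 + x0
    if denom == 0.0:
        return x2  # differences vanished: sequence has (numerically) converged
    return x0 - (x1 - x0) ** 2 / denom

# A linearly (here exactly geometrically) convergent sequence: x_n = 2^-n -> 0
seq = [2.0 ** -n for n in range(10)]
accelerated = [aitken(seq[n], seq[n + 1], seq[n + 2]) for n in range(len(seq) - 2)]
```

Because the convergence ratio here is exactly constant, every accelerated term equals the limit 0; for sequences that are only approximately geometric, the transformation instead removes the dominant part of the error.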
ACCELERATING CONVERGENCE:
Aitken's ∆2 process is used to accelerate
linearly convergent sequences,
regardless of the method that produced them;
the acceleration is not limited
to root-finding algorithms.
Aitken's process is an extrapolation:

  r       ∆r             ∆^2 r
  rk-2
  rk-1    rk-1 - rk-2
  rk      rk - rk-1      (rk - rk-1) - (rk-1 - rk-2)
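The difference table above can be built directly; this short Python sketch (the helper name is illustrative, not from the slides) computes ∆r and ∆^2 r for a few iterates:

```python
def diff_table(r):
    """First and second forward differences of the iterates r, as in the table."""
    dr = [r[k] - r[k - 1] for k in range(1, len(r))]      # delta r column
    d2r = [dr[k] - dr[k - 1] for k in range(1, len(dr))]  # delta^2 r column
    return dr, d2r

r = [1.0, 0.5, 0.25, 0.125]
dr, d2r = diff_table(r)
```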
Steffensen’s Method:
a modified Aitken’s delta-squared process
applied to fixed point iteration.
In numerical analysis, Steffensen's method is
a root-finding technique similar to Newton's
method, named after Johan Frederik
Steffensen. Steffensen's method also
achieves quadratic convergence, but without
using derivatives as Newton's method does.
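A minimal Python sketch of Steffensen's method, as one Aitken delta-squared step per cycle of fixed-point iteration (the function names and tolerances are our own choices, not from the slides):

```python
import math

def steffensen(g, x0, tol=1e-12, max_iter=100):
    """Fixed-point iteration x = g(x) accelerated by one Aitken
    delta-squared step per cycle (Steffensen's method)."""
    x = x0
    for _ in range(max_iter):
        x1 = g(x)
        x2 = g(x1)
        denom = x2 - 2.0 * x1 + x
        if denom == 0.0:
            return x2  # differences vanished; accept the latest iterate
        x_next = x - (x1 - x) ** 2 / denom
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Fixed point of g(x) = cos(x): the solution of x = cos(x), near 0.739085
fp = steffensen(math.cos, 1.0)
```

Each cycle costs two evaluations of g but no derivatives, and near the fixed point the convergence is quadratic, as with Newton's method.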
Derivation using Aitken's delta-squared process
Steffensen's method can be derived (and is how it is typically
implemented, e.g. in MATLAB) by applying Aitken's delta-squared
process to the fixed-point iterates, accelerating the convergence of
the sequence. The method assumes we start with a linearly
convergent sequence and increases its rate of convergence. If the
signs of xn - xn+1 and xn+1 - xn+2 agree, and xn is sufficiently close
to the desired limit x of the sequence, we can assume the following:
(xn+1 - x)/(xn - x) ≈ (xn+2 - x)/(xn+1 - x)
Example: Find a root of cos[x] - x * exp[x] = 0 with
x0 = 0.0.
Let the linear iterative process be
xi+1 = xi + (1/2)(cos[xi] - xi * exp[xi]),  i = 0, 1, 2, . . .
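To see the acceleration on this example, here is a Python sketch (the iteration is the one above; the helper names are ours) that runs the plain iteration and then Aitken-extrapolates the same iterates:

```python
import math

def g(x):
    # the linear iterative process from the example
    return x + 0.5 * (math.cos(x) - x * math.exp(x))

xs = [0.0]                    # x0 = 0.0
for _ in range(20):
    xs.append(g(xs[-1]))      # plain, linearly convergent iteration

def aitken(x0, x1, x2):
    return x0 - (x1 - x0) ** 2 / (x2 - 2.0 * x1 + x0)

# Aitken extrapolation of the same iterates
acc = [aitken(xs[n], xs[n + 1], xs[n + 2]) for n in range(len(xs) - 2)]

def residual(x):
    return math.cos(x) - x * math.exp(x)
```

The extrapolated values home in on the root near 0.5178 far faster than the raw iterates, illustrating the claimed acceleration.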
General Comment
Aitken extrapolation can greatly accelerate the convergence
of a linearly convergent iteration
xn+1 = g(xn).
This shows the power of understanding the behaviour
of the error in a numerical process. From that understanding,
we can often improve the accuracy, through
extrapolation or some other procedure.
This is a justification for using mathematical analyses
to understand numerical methods. We will see this
repeated at later points in the course, and it holds
for many different types of problems and the numerical
methods for their solution.