Orthogonal projection in statistics
Orthogonal projection in statistics a power point lecture (Rajshahi University)

    Orthogonal projection in statistics: Presentation Transcript

    • Projection. Md. Sahidul Islam Ripon, Department of Statistics, Rajshahi University. Email: ripon.ru.statistics@gmail.com
    • Content
      • Orthogonal vector
      • Orthonormal vector
      • Projection
      • Gram-Schmidt orthogonalization
    • Orthogonal vector
      • In mathematics, two vectors are orthogonal if they are perpendicular, i.e., they form a right angle. The relation is clearly symmetric; that is, if x is orthogonal to y then ⟨x, y⟩ = 0 = ⟨y, x⟩, and so y is orthogonal to x.
      • Two vectors, x and y, in an inner product space, V, are orthogonal if their inner product is zero.
      Fig : The line segments AB and CD are orthogonal to each other
    • Orthogonal vector: Pythagoras states that for vectors x and y, ‖x‖² + ‖y‖² = ‖x + y‖². This is not always true; since ‖x + y‖² = ‖x‖² + 2 x·y + ‖y‖², it holds exactly when x·y = 0, i.e. when x and y are orthogonal.
    • Example: Are the vectors (1,2,2)ᵀ and (2,3,−4)ᵀ orthogonal? Are the vectors (4,2,3)ᵀ and (7,3,−4)ᵀ orthogonal?
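    The orthogonality test in the example above can be sketched in plain Python (a minimal illustration, not part of the original slides; `dot` and `are_orthogonal` are hypothetical helper names):

    ```python
    def dot(x, y):
        """Inner product of two vectors given as lists."""
        return sum(a * b for a, b in zip(x, y))

    def are_orthogonal(x, y):
        """Two vectors are orthogonal iff their inner product is zero."""
        return dot(x, y) == 0

    print(are_orthogonal([1, 2, 2], [2, 3, -4]))  # True:  1*2 + 2*3 + 2*(-4) = 0
    print(are_orthogonal([4, 2, 3], [7, 3, -4]))  # False: 4*7 + 2*3 + 3*(-4) = 22
    ```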
    • Theorem 1: An orthogonal set of nonzero vectors in a vector space is linearly independent.
    • Orthogonal subspaces: Subspace S is orthogonal to subspace T when every vector in S is orthogonal to every vector in T. How can this happen? For example, the row space of a matrix is orthogonal to its null space.
    • Orthonormal vector
      • Definition: A set of vectors is orthonormal if the vectors are mutually orthogonal and each has unit length.
      Example: the standard basis vectors (1,0) and (0,1) are orthonormal.
    • Orthonormal vector. Theorem: Let {u_1, …, u_n} be an orthonormal basis for a vector space V. Then any vector v in V can be written as the linear combination v = (v·u_1) u_1 + … + (v·u_n) u_n. Proof: Since {u_1, …, u_n} is a basis, there exist scalars c_1, …, c_n such that v = c_1 u_1 + … + c_n u_n. Taking the inner product of both sides with u_i, orthonormality (u_i·u_j = 0 for i ≠ j and u_i·u_i = 1) gives c_i = v·u_i.
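    The theorem above can be checked numerically: with an orthonormal basis, the coefficients are just inner products. A minimal sketch, assuming the rotated basis of R² below as an illustrative example (not from the slides):

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # An orthonormal basis of R^2: the standard basis rotated by 45 degrees.
    u1 = [1 / math.sqrt(2),  1 / math.sqrt(2)]
    u2 = [-1 / math.sqrt(2), 1 / math.sqrt(2)]

    v = [3.0, 1.0]
    c1, c2 = dot(v, u1), dot(v, u2)     # coefficients c_i = v . u_i
    rebuilt = [c1 * a + c2 * b for a, b in zip(u1, u2)]
    print(rebuilt)  # recovers v (up to rounding)
    ```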
    • Projection
      • In linear algebra and functional analysis, a projection is a linear transformation P from a vector space to itself such that P 2 = P . It leaves its image unchanged. Though abstract, this definition of "projection" formalizes and generalizes the idea of graphical projection. One can also consider the effect of a projection on a geometrical object by examining the effect of the projection on points in the object.
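    The defining property P² = P can be verified numerically for the orthogonal projection onto span{u}. A minimal sketch with plain-Python list vectors; `project_onto` is a hypothetical helper, not from the slides:

    ```python
    def project_onto(u):
        """Return the map P(v) = ((v.u)/(u.u)) u, orthogonal projection onto span{u}."""
        uu = sum(a * a for a in u)
        def P(v):
            c = sum(a * b for a, b in zip(v, u)) / uu
            return [c * a for a in u]
        return P

    P = project_onto([3, 4])
    v = [5, 1]
    Pv = P(v)        # image of v under P
    PPv = P(Pv)      # applying P again leaves the image unchanged: P^2 = P
    print(Pv, PPv)
    ```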
    • Projection
      • Orthogonal Projection
      • Oblique projection
    • Orthogonal projection
      • Let V be an inner-product space and let u ∈ V be a nonzero vector.
      • Let proj_u : V → V be defined by proj_u(v) = (⟨v, u⟩ / ⟨u, u⟩) u.
      • The vector proj_u(v) is called the orthogonal projection of v onto u.
      • This simplifies to proj_u(v) = ⟨v, u⟩ u if the vector u is a unit vector, i.e. ⟨u, u⟩ = 1.
    • Projection Graph
    • Orthogonal projection
      • The projection (or shadow) of a vector x on a vector y is (x·y / y·y) y.
      • Why projection?
      • Because Ax = b may have no solution.
      • That is, when the system of equations is inconsistent; we then project b onto the column space of A and solve with the closest attainable right-hand side.
    • Example
      Find the orthogonal projection of Y = (2,7,1) onto the vector X = (5,6,4).
      • Solution: Let ŷ be the projection vector. Its magnitude is the scalar component of Y along X, ⟨Y, X⟩ / ‖X‖ = 56 / √77,
      • with direction given by the unit vector X / ‖X‖.
      • So ŷ can be expressed as,
      • ŷ = (⟨Y, X⟩ / ⟨X, X⟩) X = (56/77)(5, 6, 4) = (40/11, 48/11, 32/11).
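    The example can be reproduced with the projection formula (x·y / y·y) y from the previous slide; `project` is a hypothetical helper name:

    ```python
    def project(y, x):
        """Orthogonal projection of y onto x: ((y.x) / (x.x)) * x."""
        c = sum(a * b for a, b in zip(y, x)) / sum(a * a for a in x)
        return [c * a for a in x]

    # Y = (2,7,1), X = (5,6,4): Y.X = 56, X.X = 77, so proj = (8/11) * X
    print(project([2, 7, 1], [5, 6, 4]))  # [40/11, 48/11, 32/11]
    ```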
    • Projection onto a plane
    • Find the orthogonal projection of Y = (7,7,8) onto the plane spanned by the vectors X_1 = (5,6,4) and X_2 = (9,5,1). Solution: Since the projection ŷ must lie in the plane spanned by X_1 and X_2, write ŷ = b_1 X_1 + b_2 X_2.
      • Requiring Y − ŷ to be orthogonal to the plane, and forming inner products with X_1 and X_2, we have the equations
        b_1 (X_1·X_1) + b_2 (X_1·X_2) = X_1·Y
        b_1 (X_2·X_1) + b_2 (X_2·X_2) = X_2·Y
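    The two normal equations above form a 2×2 linear system; a minimal sketch solving it by Cramer's rule (`project_onto_plane` is a hypothetical helper, not from the slides):

    ```python
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def project_onto_plane(y, x1, x2):
        """Project y onto span{x1, x2} by solving the 2x2 normal equations."""
        a11, a12, a22 = dot(x1, x1), dot(x1, x2), dot(x2, x2)
        r1, r2 = dot(x1, y), dot(x2, y)
        det = a11 * a22 - a12 * a12          # Gram determinant (nonzero if independent)
        b1 = (r1 * a22 - a12 * r2) / det
        b2 = (a11 * r2 - r1 * a12) / det
        return [b1 * u + b2 * v for u, v in zip(x1, x2)]

    yhat = project_onto_plane([7, 7, 8], [5, 6, 4], [9, 5, 1])
    resid = [a - b for a, b in zip([7, 7, 8], yhat)]
    # The residual Y - yhat is orthogonal to both spanning vectors.
    print(dot(resid, [5, 6, 4]), dot(resid, [9, 5, 1]))
    ```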
    • Application
      • 1. Gram–Schmidt orthogonalization
      • 2. Curve fitting by the ordinary least squares method.
      • 3. The area of a parallelogram
    • Gram–Schmidt Orthogonalization
        • The Gram–Schmidt orthogonalization process allows us to turn any set of linearly independent vectors into an orthonormal set of the same cardinality. In particular, this holds for a basis of an inner product space: if you feed the machine any basis for the space, the process cranks out an orthonormal basis.
      Jørgen Pedersen Gram (1850–1916) and Erhard Schmidt (1876–1959)
    • Gram–Schmidt Orthogonalization: Let {v_1, …, v_n} be a basis for a vector space V. The set of vectors {u_1, …, u_n} defined as follows is orthogonal:
        u_1 = v_1
        u_2 = v_2 − proj_{u_1}(v_2)
        …
        u_n = v_n − proj_{u_1}(v_n) − … − proj_{u_{n−1}}(v_n),
      where proj_u(v) = (⟨v, u⟩ / ⟨u, u⟩) u. To obtain an orthonormal basis for V, normalize each of the vectors: e_i = u_i / ‖u_i‖.
    • Geometric Interpretation. Fig: first two steps of Gram–Schmidt orthogonalization.
    • Example: Consider a set of linearly independent vectors {v_1, v_2} in R^2 (with the conventional inner product). Perform Gram–Schmidt to obtain an orthogonal set: set u_1 = v_1 and u_2 = v_2 − proj_{u_1}(v_2). We check that the vectors u_1 and u_2 are indeed orthogonal by noting that their dot product is 0. We can then normalize the vectors by dividing each by its length.
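    The classical Gram–Schmidt recipe can be sketched in plain Python. The input vectors (3,1) and (2,2) below are an illustrative assumption, since the slide's own numbers were in images and are not recoverable:

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def gram_schmidt(vs):
        """Classical Gram-Schmidt: orthogonalize each v_k against all earlier u_i,
        then normalize to get an orthonormal set."""
        us = []
        for v in vs:
            u = list(v)
            for w in us:
                c = dot(v, w) / dot(w, w)          # coefficient of proj_w(v)
                u = [a - c * b for a, b in zip(u, w)]
            us.append(u)
        return [[a / math.sqrt(dot(u, u)) for a in u] for u in us]

    # illustrative input (assumed, not the slide's own example)
    e1, e2 = gram_schmidt([[3, 1], [2, 2]])
    print(dot(e1, e2))  # ~0: the resulting vectors are orthogonal
    ```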
    • Example
      • The set {(1,2,0,3), (4,0,5,8), (8,1,5,6)} is linearly independent in R^4. The vectors form a basis for a three-dimensional subspace V of R^4. Construct an orthonormal basis for V.
      • Solution:
      • Let v_1 = (1, 2, 0, 3), v_2 = (4, 0, 5, 8), v_3 = (8, 1, 5, 6). We now use the Gram–Schmidt process to construct an orthogonal set {u_1, u_2, u_3} from these vectors.
    • Modified Gram–Schmidt orthogonalization. When this process is implemented on a computer, the vectors u_k are often not quite orthogonal, due to rounding errors. For the Gram–Schmidt process as described above (sometimes referred to as "classical Gram–Schmidt") this loss of orthogonality is particularly bad; therefore, the classical Gram–Schmidt process is said to be numerically unstable. It can be stabilized by a small modification, sometimes referred to as modified Gram–Schmidt or MGS. This approach gives the same result as the original formula in exact arithmetic and introduces smaller errors in finite-precision arithmetic. Instead of computing the vector u_k as
        u_k = v_k − proj_{u_1}(v_k) − proj_{u_2}(v_k) − … − proj_{u_{k−1}}(v_k),
      it is computed as
        u_k^(1) = v_k − proj_{u_1}(v_k),
        u_k^(2) = u_k^(1) − proj_{u_2}(u_k^(1)),
        …
        u_k = u_k^(k−1) = u_k^(k−2) − proj_{u_{k−1}}(u_k^(k−2)).
      Each step finds a vector u_k^(i) orthogonal to u_i. Thus u_k^(i) is also orthogonalized against any errors introduced in the computation of u_i.
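    A minimal sketch of modified Gram–Schmidt in plain Python: as soon as u_k is fixed, every remaining vector is orthogonalized against it, which is what limits the accumulation of rounding error (`modified_gram_schmidt` is a hypothetical name):

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def modified_gram_schmidt(vs):
        """MGS: normalize u_k, then immediately subtract its component
        from every remaining vector, one u_k at a time."""
        us = [list(v) for v in vs]
        for k in range(len(us)):
            norm = math.sqrt(dot(us[k], us[k]))
            us[k] = [a / norm for a in us[k]]          # normalize u_k
            for j in range(k + 1, len(us)):
                c = dot(us[j], us[k])
                us[j] = [a - c * b for a, b in zip(us[j], us[k])]
        return us

    q = modified_gram_schmidt([[1.0, 2.0, 0.0], [4.0, 0.0, 5.0]])
    print(round(dot(q[0], q[1]), 12))  # ~0: orthogonal in finite precision
    ```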
    • Modified Gram–Schmidt orthogonalization
      • Compare classical and modified Gram–Schmidt for a set of nearly dependent vectors, making a small floating-point approximation, and then check the orthogonality of the resulting vectors.
      [Fig: Classical vs Modified comparison slides]
    • Curve fitting by the ordinary least squares method
      • One of the most widely used methods of fitting a straight line to data points is ordinary least squares, which makes use of orthogonal projection: the fitted values are the orthogonal projection of the observation vector onto the column space of the design matrix.
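    A minimal sketch of fitting a line y = a + b·x by the normal equations, i.e. projecting the observations onto the column space spanned by the all-ones vector and x (`fit_line` and the sample data are illustrative assumptions):

    ```python
    def fit_line(xs, ys):
        """Least-squares line y = a + b*x via the 2x2 normal equations."""
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        det = n * sxx - sx * sx              # nonzero unless all xs are equal
        a = (sy * sxx - sx * sxy) / det      # intercept
        b = (n * sxy - sx * sy) / det        # slope
        return a, b

    # data lying exactly on y = 1 + 2x, so the projection recovers it exactly
    a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
    print(a, b)  # 1.0 2.0
    ```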
    • References
      • 1. Applied Linear Algebra in the Statistical Sciences, by Alexander Basilevsky.
      • 2. Lectures on Linear Algebra, by Gilbert Strang, MIT.
      • 3. Applied Multivariate Analysis, by R. A. Johnson and D. W. Wichern.
      • 4. Linear Algebra: Theory and Applications (2003), by Ward Cheney and David Kincaid.
      • 5. Linear Algebra with Applications (2007), by Gareth Williams.
      • Thank You