Orthogonal projection in statistics


Orthogonal projection in statistics: a PowerPoint lecture (Rajshahi University)



  1. 1. Projection. Md. Sahidul Islam Ripon, Department of Statistics, Rajshahi University. Email: ripon.ru.statistics@gmail.com
  2. 2. Contents: orthogonal vectors; orthonormal vectors; projection; Gram-Schmidt orthogonalization.
  3. 3. Orthogonal vectors. In mathematics, two vectors are orthogonal if they are perpendicular, i.e., they form a right angle. The relation is clearly symmetric: if x is orthogonal to y, then y is orthogonal to x. Equivalently, two vectors x and y in an inner product space V are orthogonal if their inner product is zero, <x, y> = 0. Fig: the line segments AB and CD are orthogonal to each other.
  4. 4. Orthogonal vectors and Pythagoras. For any vectors x and y, ||x + y||^2 = ||x||^2 + ||y||^2 + 2<x, y>. The Pythagorean identity ||x + y||^2 = ||x||^2 + ||y||^2 is therefore not always true; it holds exactly when <x, y> = 0, i.e. when x and y are orthogonal.
  5. 5. Example. Are the vectors (1,2,2)^T and (2,3,-4)^T orthogonal? Are the vectors (4,2,3)^T and (7,3,-4)^T orthogonal?
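Both questions reduce to one dot product each; a minimal NumPy sketch (the helper name `are_orthogonal` is my own) with the vectors taken from the slide:

```python
import numpy as np

def are_orthogonal(x, y):
    """Two vectors are orthogonal exactly when their inner product is zero."""
    return np.dot(x, y) == 0

# (1,2,2).(2,3,-4) = 2 + 6 - 8 = 0   -> orthogonal
print(are_orthogonal([1, 2, 2], [2, 3, -4]))
# (4,2,3).(7,3,-4) = 28 + 6 - 12 = 22 -> not orthogonal
print(are_orthogonal([4, 2, 3], [7, 3, -4]))
```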
  6. 6. Theorem 1: An orthogonal set of nonzero vectors in a vector space is linearly independent.
  7. 7. Orthogonal subspaces. A subspace S is orthogonal to a subspace T if every vector in S is orthogonal to every vector in T. For example, the row space of a matrix is orthogonal to its null space.
  8. 8. Orthonormal vectors. Definition: Two vectors are said to be orthonormal if they are orthogonal and each has unit length.
  9. 9. Orthonormal vectors. Theorem: Let {u_1, ..., u_n} be an orthonormal basis for a vector space V and let v be a vector in V. Then v can be written as the linear combination v = (v.u_1)u_1 + ... + (v.u_n)u_n. Proof: Since {u_1, ..., u_n} is a basis, there exist scalars c_1, ..., c_n such that v = c_1 u_1 + ... + c_n u_n. Taking the inner product of both sides with u_i, and using u_i.u_j = 0 for i != j and u_i.u_i = 1, gives c_i = v.u_i.
  10. 10. Projection. In linear algebra and functional analysis, a projection is a linear transformation P from a vector space to itself such that P^2 = P; it leaves its image unchanged. Though abstract, this definition of "projection" formalizes and generalizes the idea of graphical projection. One can also consider the effect of a projection on a geometrical object by examining its effect on the points of the object.
  11. 11. Two kinds of projection: orthogonal projection and oblique projection.
  12. 12. Orthogonal projection. Let V be an inner-product space and let u, v be vectors in V with u nonzero. The orthogonal projection of v onto u is defined by proj_u(v) = (<v, u> / <u, u>) u. If u is a unit vector, i.e. <u, u> = 1, this simplifies to proj_u(v) = <v, u> u.
  13. 14. Projection Graph
  14. 15. Orthogonal projection
  15. 16. The projection (or shadow) of a vector x on a vector y is proj_y(x) = (x.y / y.y) y.
  16. 17. Why projection? Because Ax = b may have no solution, that is, when the system of equations is inconsistent. Projection then lets us solve the nearest solvable problem.
  17. 18. Example: Find the orthogonal projection of Y = (2,7,1) onto the vector X = (5,6,4). Solution: Let p be the projection vector. Its magnitude is Y.X / ||X||, with direction given by the unit vector X / ||X||. Then p = (Y.X / X.X) X, so p can be expressed as p = (56/77)(5,6,4) = (8/11)(5,6,4), since Y.X = 10 + 42 + 4 = 56 and X.X = 25 + 36 + 16 = 77.
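The formula and the worked example can be verified numerically; a minimal sketch (the function name `project` is my own):

```python
import numpy as np

def project(x, y):
    """Orthogonal projection of x onto y: (x.y / y.y) y."""
    y = np.asarray(y, dtype=float)
    return (np.dot(x, y) / np.dot(y, y)) * y

p = project([2, 7, 1], [5, 6, 4])   # (56/77)(5,6,4) = (8/11)(5,6,4)
# The residual Y - p is orthogonal to X, as it must be:
print(np.dot(np.array([2, 7, 1]) - p, [5, 6, 4]))
```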
  18. 19. Projection onto a plane
  19. 20. Example: Find the orthogonal projection of Y = (7,7,8) onto the plane spanned by the vectors X_1 = (5,6,4) and X_2 = (9,5,1). Solution: Since the projection p must lie in the plane spanned by X_1 and X_2, we can write p = c_1 X_1 + c_2 X_2 for some scalars c_1 and c_2.
  20. 21. Requiring the residual Y - p to be orthogonal to the plane, we form inner products with X_1 and X_2 and obtain the normal equations c_1(X_1.X_1) + c_2(X_2.X_1) = Y.X_1 and c_1(X_1.X_2) + c_2(X_2.X_2) = Y.X_2.
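The two normal equations can be solved numerically; a sketch with NumPy for the numbers on the slide (the variable names are mine):

```python
import numpy as np

Y = np.array([7.0, 7.0, 8.0])
A = np.column_stack([[5, 6, 4], [9, 5, 1]])   # columns are X1 and X2

# Normal equations (A^T A) c = A^T Y give the coefficients c1, c2
c = np.linalg.solve(A.T @ A, A.T @ Y)
p = A @ c                                     # projection of Y onto span{X1, X2}

# The residual Y - p is orthogonal to both X1 and X2
print(np.round(A.T @ (Y - p), 10))
```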
  21. 22. Applications: 1. Gram-Schmidt orthogonalization; 2. Curve fitting by the ordinary least squares method; 3. The area of a parallelogram.
  22. 23. Gram-Schmidt orthogonalization. The Gram-Schmidt orthogonalization process allows us to turn any set of linearly independent vectors into an orthonormal set of the same cardinality. In particular, this holds for a basis of an inner product space: if you feed the machine any basis for the space, the process cranks out an orthonormal basis. Jorgen Pedersen Gram (1850-1916), Erhard Schmidt (1876-1959).
  23. 24. Gram-Schmidt orthogonalization. Let {v_1, ..., v_n} be a basis for a vector space V. The set of vectors defined by u_1 = v_1 and u_k = v_k - sum_{j=1}^{k-1} proj_{u_j}(v_k) for k = 2, ..., n is orthogonal. To obtain an orthonormal basis for V, normalize each of the vectors u_k.
  24. 25. Geometric interpretation. Fig: the first two steps of Gram-Schmidt orthogonalization.
  25. 26. Example: Consider a set of two vectors v_1 and v_2 in R^2 (with the conventional inner product). Perform Gram-Schmidt to obtain an orthogonal set: u_1 = v_1 and u_2 = v_2 - proj_{u_1}(v_2). We check that u_1 and u_2 are indeed orthogonal by computing their dot product and noting that if the dot product of two vectors is 0 then they are orthogonal. We can then normalize the vectors by dividing each by its length.
  26. 27. Example: The set {(1,2,0,3), (4,0,5,8), (8,1,5,6)} is linearly independent in R^4. The vectors form a basis for a three-dimensional subspace V of R^4. Construct an orthonormal basis for V. Solution: Let v_1 = (1,2,0,3), v_2 = (4,0,5,8), v_3 = (8,1,5,6). We now use the Gram-Schmidt process to construct an orthogonal set {u_1, u_2, u_3} from these vectors.
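A sketch of the classical process applied to these three vectors (the function name is mine); it divides each u_k by its length, so it also performs the normalization step:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: returns an orthonormal basis for span(vectors)."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        w = v.copy()
        for u in basis:
            w -= np.dot(v, u) * u          # subtract the projection of v onto u
        basis.append(w / np.linalg.norm(w))
    return basis

u1, u2, u3 = gram_schmidt([(1, 2, 0, 3), (4, 0, 5, 8), (8, 1, 5, 6)])
# All pairwise dot products are (numerically) zero and all lengths are one.
```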
  27. 31. Modified Gram-Schmidt orthogonalization. When this process is implemented on a computer, the vectors u_k are often not quite orthogonal, due to rounding errors. For the Gram-Schmidt process as described above (sometimes referred to as "classical Gram-Schmidt") this loss of orthogonality is particularly bad; therefore, the (classical) Gram-Schmidt process is said to be numerically unstable. The process can be stabilized by a small modification; this version is sometimes referred to as modified Gram-Schmidt, or MGS. It gives the same result as the original formula in exact arithmetic but introduces smaller errors in finite-precision arithmetic. Instead of computing u_k = v_k - sum_{j=1}^{k-1} proj_{u_j}(v_k) in one step, it is computed in stages: u_k^{(1)} = v_k - proj_{u_1}(v_k), then u_k^{(i)} = u_k^{(i-1)} - proj_{u_i}(u_k^{(i-1)}) for i = 2, ..., k-1, with u_k = u_k^{(k-1)}. Each step finds a vector orthogonal to u_k^{(i-1)}; thus u_k^{(i)} is also orthogonalized against any errors introduced in the computation of u_k^{(i-1)}.
  28. 32. Modified Gram-Schmidt orthogonalization
  29. 33. Compare classical and modified Gram-Schmidt for the given vectors, working in finite precision (i.e. making approximations).
  30. 34. Classical vs. modified
  31. 35. Classical vs. modified
  32. 36. Classical vs. modified. To check orthogonality, examine the inner products of the computed vectors.
  33. 37. Curve fitting by the ordinary least squares method. One of the most widely used methods of fitting a straight line to data points is ordinary least squares, which makes use of orthogonal projections: the fitted values are the orthogonal projection of the observed values onto the column space of the design matrix.
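A sketch of a straight-line fit viewed as a projection (the data points here are made up purely for illustration):

```python
import numpy as np

# Hypothetical data; fit y = a + b*x by projecting y onto the column
# space of the design matrix A = [1, x] via the normal equations.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

A = np.column_stack([np.ones_like(x), x])
a, b = np.linalg.solve(A.T @ A, A.T @ y)   # intercept and slope
fitted = A @ np.array([a, b])              # orthogonal projection of y
# The residual y - fitted is orthogonal to both columns of A.
```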
  34. 38. References: 1. Applied Linear Algebra in the Statistical Sciences, by Alexander Basilevsky; 2. Lectures on Linear Algebra, by Gilbert Strang (MIT); 3. Applied Multivariate Analysis, by R. A. Johnson and D. W. Wichern; 4. Linear Algebra: Theory and Applications (2003), by Ward Cheney and David Kincaid; 5. Linear Algebra with Applications (2007), by Gareth Williams.
  35. 39. Thank you.