Eigenvectors and Eigenvalues

By Christopher Gratton
cg317@exeter.ac.uk
Introduction: Diagonal Matrices

Before beginning this topic, we must first clarify the definition of a “Diagonal Matrix”.

A Diagonal Matrix is an n by n Matrix whose non-diagonal entries are all zero.
Introduction: Diagonal Matrices

In this presentation, all Diagonal Matrices will be denoted as:

diag(d11, d22, ..., dnn)

where dnn is the entry in the n-th row and the n-th column of the Diagonal Matrix.
Introduction: Diagonal Matrices

For example, the previously given 4 by 4 Diagonal Matrix, with diagonal entries 5, 4, 1 and 9, can be written in the form:

diag(5, 4, 1, 9)
Introduction: Diagonal Matrices

The Effects of a Diagonal Matrix

The Identity Matrix is an example of a Diagonal Matrix; it has the effect of leaving any Vector unchanged within a given System. For example, Iv = v for every Vector v.
Introduction: Diagonal Matrices

The Effects of a Diagonal Matrix

However, any other Diagonal Matrix will have the effect of scaling a Vector along given axes. For example, the Diagonal Matrix

diag(2, -1, 3)

has the effect of stretching a Vector by a Scale Factor of 2 in the x-Axis, 3 in the z-Axis and reflecting the Vector in the y-Axis.
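As a rough numerical check of this effect, here is a minimal Python sketch, assuming the matrix diag(2, -1, 3) described above and an arbitrary sample vector:

```python
import numpy as np

# Diagonal matrix assumed from the description above:
# scale factor 2 in x, reflection in y, scale factor 3 in z.
D = np.diag([2.0, -1.0, 3.0])

v = np.array([1.0, 1.0, 1.0])  # an arbitrary sample vector

print(D @ v)  # [ 2. -1.  3.] : each component is scaled independently
```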
The Goal
By the end of this PowerPoint, we should be
 able to understand and apply the idea of
   Diagonalisation, using Eigenvalues and
                Eigenvectors.
The Goal

The Matrix Point of View

By the end, we should be able to understand how, given an n by n Matrix, A, we can say that A is Diagonalisable if and only if there is a Matrix, δ, that makes the following Matrix Diagonal:

δ⁻¹Aδ

And why this knowledge is significant.
The Points of View

The Square Matrix, A, may be seen as a Linear Operator, F, defined by:

F(X) = AX

where X is a Column Vector.
The Points of View

Furthermore, the Matrix δ⁻¹Aδ represents the Linear Operator, F, relative to the Basis, or Coordinate System, S, whose Elements are the Columns of δ.
The Effects of a Coordinate System

If we are given A, an n by n Matrix of any kind, then it is possible to interpret it as a Linear Transformation in a given Coordinate System of n Dimensions.

For example, in two Dimensions, the Matrix

[cos 45°  -sin 45°]
[sin 45°   cos 45°]

has the effect of a 45 degree Anticlockwise Rotation, in this case applied to the Identity Matrix (that is, to the standard Basis Vectors).
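As a quick numerical illustration, here is a minimal Python sketch of the 45 degree anticlockwise rotation described above, applied to the x-axis basis vector (the choice of test vector is an assumption):

```python
import numpy as np

theta = np.pi / 4  # 45 degrees, anticlockwise
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

e_x = np.array([1.0, 0.0])  # x-axis basis vector (assumed test input)
print(R @ e_x)              # approximately [0.7071 0.7071]: rotated by 45 degrees
```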
The Effects of a Coordinate System

However, it may also be possible to represent this Linear Transformation as a Diagonal Matrix within another, different Coordinate System.

We define the effect upon a given Vector in this new Coordinate System as: a scalar multiplication of the Vector, relative to all the axes, by an unknown Scale Factor, without affecting its direction or other properties.
The Effects of a Coordinate System

This process can be summarised by the following definition:

Av = λv
(Av in the current Coordinate System; λv in the new Coordinate System)

Where:
A is the Transformation Matrix
v is a non-Zero Vector to be Transformed
λ is a Scalar in this new Coordinate System that has the same effect on v as A.
The Effects of a Coordinate System

This process can be summarised by the following definition:

Av = λv
(Av in the current Coordinate System; λv in the new Coordinate System)

Where:
Av: the Matrix A acts as a Linear Transformation upon the Vector v.
λv: λ is the Scalar which results in the same Transformation on v as A does.
The Effects of a Coordinate System

This can be applied in the following example, with a Matrix, A, and a Vector, v.
The Effects of a Coordinate System

This can be applied in the following example, with a Matrix, A, and a Vector, v.

Thus, when Av = 2v, A is equivalent to the Diagonal Matrix with 2 in every diagonal entry, which, as discussed previously, has the effect of enlarging the Vector by a Scale Factor of 2 in each Dimension.
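The slide's specific Matrix and Vector are not reproduced here, so the Python sketch below uses a hypothetical stand-in pair that shows the same behaviour: A acts on v exactly like the Scalar λ = 2.

```python
import numpy as np

# Hypothetical stand-ins for the slide's A and v.
A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])
v = np.array([1.0, 1.0])

print(A @ v)  # [2. 2.]
print(2 * v)  # [2. 2.] : the matrix A has the same effect on v as the scalar 2
```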
Definitions

Thus, if Av = λv is true, we call:

v the Eigenvector, and
λ the Eigenvalue of A corresponding to v.
Exceptions and Additions

• We do not count the Zero Vector, v = 0, as an Eigenvector, as A0 = λ0 holds for all values of λ.
• λ = 0, however, is allowed as an accepted Eigenvalue.

• If v is a known Eigenvector of a Matrix, then so is cv, for all non-Zero values of the Scalar c.
• If Vectors u and v are both Eigenvectors of a given Matrix, and both have the same resultant Eigenvalue, then u + v (provided it is non-Zero) will also be an Eigenvector of the Matrix.
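These rules can be checked numerically. The sketch below uses a hypothetical Matrix with a repeated Eigenvalue of 2, since the slides' own examples are not reproduced here:

```python
import numpy as np

# Hypothetical matrix with a repeated eigenvalue of 2 (and one eigenvalue of 5).
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

u = np.array([1.0, 0.0, 0.0])  # eigenvector for eigenvalue 2
v = np.array([0.0, 1.0, 0.0])  # a different eigenvector for the same eigenvalue 2

print(np.allclose(A @ (7 * u), 2 * (7 * u)))  # True: any non-zero multiple of u works
print(np.allclose(A @ (u + v), 2 * (u + v)))  # True: the sum is also an eigenvector
```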
Characteristic Polynomials

Establishing the Essentials

λ is an Eigenvalue for the Matrix A, relative to the Eigenvector v. I is the Identity Matrix of the same Dimensions as A.

Thus:

Av = λv  is equivalent to  (λI − A)v = 0

This rewriting is possible as λv = λIv. For a non-Zero Vector v to satisfy it, the Matrix (λI − A) cannot be Invertible, which is why we examine its Determinant.
Characteristic Polynomials

Application of the Knowledge

What this essentially leads to is finding all of the Eigenvalues and Eigenvectors of a specific Matrix.

This is done by considering the Matrix, A, alongside the Identity Matrix, I. We then multiply the Identity Matrix by the unknown quantity, λ.
Characteristic Polynomials

Application of the Knowledge

Following this, we take λ lots of the Identity Matrix and subtract the Matrix A from it, giving λI − A.

We then take the Determinant of the result, which ends up as a Polynomial equation, in order to find the possible values of λ, the Eigenvalues.

This is illustrated by the following example:
Characteristic Polynomials

Calculating Eigenvalues from a Matrix

To find the Eigenvalues of the example Matrix, A, we must consider det(λI − A):
Characteristic Polynomials

Calculating Eigenvalues from a Matrix

Then, det(λI − A) equals a cubic Polynomial in λ, which factorises to:

(λ − 4)(λ − 2)(λ − 6)

Therefore, the Eigenvalues of the Matrix are λ = 4, λ = 2 and λ = 6.
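The slide's own 3 by 3 Matrix is not reproduced here, so the sketch below uses a hypothetical stand-in that also has Eigenvalues 4, 2 and 6, and recovers them from the Characteristic Polynomial det(λI − A) with sympy:

```python
import sympy as sp

lam = sp.symbols('lambda')

# Hypothetical stand-in for the slide's matrix; being triangular, its
# eigenvalues are its diagonal entries: 4, 2 and 6.
A = sp.Matrix([[4, 1, 0],
               [0, 2, 5],
               [0, 0, 6]])
I = sp.eye(3)

char_poly = sp.factor((lam * I - A).det())
print(char_poly)                 # factors as (lambda - 4)*(lambda - 2)*(lambda - 6)
print(sp.solve(char_poly, lam))  # the eigenvalues: 2, 4 and 6 (in some order)
```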
Characteristic Polynomials

Calculating Eigenvectors from the Values

With λ = 4, 2 and 6, we need to solve (λI − A)v = 0 for each given value of λ.

This is done by solving a Homogeneous System of Linear Equations. In other words, we must turn (λI − A) into Echelon Form and find the values of the components of v, one for each Column of the Matrix.
Characteristic Polynomials

Calculating Eigenvectors from the Values

For example, we will take λ = 4 from the previous example:
Characteristic Polynomials

Calculating Eigenvectors from the Values

For example, we will take λ = 4 from the previous example.

Therefore, the result is that:
Characteristic Polynomials

Calculating Eigenvectors from the Values

For example, we will take λ = 4 from the previous example.

Therefore, this is the set of general Eigenvectors for the Eigenvalue of 4.
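Continuing with the same hypothetical stand-in Matrix, the general Eigenvectors for λ = 4 can be found by reducing λI − A to (Reduced) Echelon Form and reading off the general solution of the Homogeneous System; sympy's rref and nullspace do this directly:

```python
import sympy as sp

# Same hypothetical stand-in matrix as before (eigenvalues 4, 2 and 6).
A = sp.Matrix([[4, 1, 0],
               [0, 2, 5],
               [0, 0, 6]])
I = sp.eye(3)

lam = 4
M = lam * I - A       # the homogeneous system M v = 0

print(M.rref())       # reduced echelon form of (4I - A), with its pivot columns
print(M.nullspace())  # general eigenvectors for lambda = 4: multiples of (1, 0, 0)
```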
Diagonalisation

Mentioned earlier was the ultimate goal of Diagonalisation; that is to say, finding a Matrix, δ, such that the following can be applied to a given Matrix, A:

δ⁻¹Aδ

where the result is a Diagonal Matrix.
Diagonalisation

There are a few rules that can be derived from this:

Firstly, δ must be an Invertible Matrix, as its Inverse is necessary to the calculation.

Secondly, the Eigenvectors of A must necessarily be Linearly Independent for this to work. Linear Independence will be covered later.
Diagonalisation

Eigenvectors, Eigenvalues & Diagonalisation

It turns out that the Columns of the Matrix δ are the Eigenvectors of the Matrix A. This is why they must be Linearly Independent: the Matrix δ must be Invertible.

Furthermore, the Diagonal Entries of the resultant Matrix δ⁻¹Aδ are the Eigenvalues associated with the corresponding Columns of Eigenvectors.
Diagonalisation

Eigenvectors, Eigenvalues & Diagonalisation

For example, from the previous example, we can create a Matrix, δ, whose Columns are the Eigenvectors corresponding to the Eigenvalues 4, 2 and 6, respectively. It is as follows:
Diagonalisation

Eigenvectors, Eigenvalues & Diagonalisation

Furthermore, we can calculate the Inverse, δ⁻¹:
Diagonalisation

Eigenvectors, Eigenvalues & Diagonalisation

Thus, the Diagonalisation of A can be created by computing:

δ⁻¹Aδ

Solving this gives diag(4, 2, 6): the Eigenvalues, in the order given!
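Again with the hypothetical stand-in Matrix, the sketch below builds δ from the Eigenvectors (as Columns) and checks that δ⁻¹Aδ is the Diagonal Matrix of the corresponding Eigenvalues:

```python
import numpy as np

# Hypothetical stand-in matrix with eigenvalues 4, 2 and 6.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 5.0],
              [0.0, 0.0, 6.0]])

eigenvalues, delta = np.linalg.eig(A)  # columns of delta are the eigenvectors
D = np.linalg.inv(delta) @ A @ delta   # delta^-1 A delta

print(np.round(D, 10))   # diagonal matrix of the eigenvalues
print(eigenvalues)       # the same eigenvalues, in the order of delta's columns
```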
Linear Independence

Introduction

This will be a brief section on Linear Independence, to reinforce that the Eigenvectors of A must be Linearly Independent for Diagonalisation to be implemented.
Linear Independence

Linear Independency in x-Dimensions

The Vectors v1, v2, ..., vn are classified as a Linearly Independent set of Vectors if the following rule applies:

The only values of the Scalars, αi, which make the equation

α1v1 + α2v2 + ... + αnvn = 0

true are αi = 0 for all instances of i.
Linear Independence

Linear Independency in x-Dimensions

If there are any non-zero values of αi, at any instance of i, that satisfy the equation, then this set of Vectors, v1, v2, ..., vn, is considered Linearly Dependent.

Note that only one non-zero instance of αi is needed to make the set Dependent.
Linear Independence

Linear Independency in x-Dimensions

Therefore, if, say, at i = 3, the value of α3 can be non-zero, then the Vector set is Linearly Dependent. But if v3 were to be omitted from the set, given that all other instances of αi had to be zero, then the remaining set would, therefore, become Linearly Independent.
Linear Independence

Implications of Linear Independence

If the set of Vectors, v1, v2, ..., vn, is Linearly Independent, then it is not possible to write any of the Vectors in the set in terms of any of the other Vectors within the same set.

Conversely, if a set of Vectors is Linearly Dependent, then it is possible to write at least one Vector in terms of at least one other Vector.
Linear Independence

Implications of Linear Independence

For example, a Vector set in which one Vector can be written in terms of the others is Linearly Dependent.
Linear Independence

Implications of Linear Independence

We can say, however, that such a Vector set may be considered Linearly Independent if the Vector in question were omitted from the set.
Linear Independence

Finding Linear Independency

The previous equation can be more usefully written as a Matrix equation: the Matrix whose Columns are the Vectors v1, v2, ..., vn, multiplied by the Column Vector of Scalars (α1, ..., αn), equals the Zero Vector.

More significantly, this can be translated into a Homogeneous System of x Linear Equations, where x is the number of Dimensions of the System.
Linear Independence

Finding Linear Independency

Therefore, the Matrix of Coefficients is an x by n Matrix, where x is the number of Dimensions of the System and n is the number of Vectors in the System.

The Columns of this Matrix are equivalent to the Vectors of the System, v1, v2, ..., vn.
Linear Independence

Finding Linear Independency

To observe whether the set of Vectors is Linearly Independent or not, we need to put the Matrix into Echelon Form.

If, when in Echelon Form, we can observe that each Column of Unknowns has a Leading Entry, then the set of Vectors is Linearly Independent.
Linear Independence

Finding Linear Independency

If not, then the set of Vectors is Linearly Dependent.

To find the Coefficients, we can put the Matrix into Reduced Echelon Form to consider the general solutions.
Linear Independence
Finding Linear Independency: Example

Let us consider whether the following set of Vectors is Linearly Independent:
Linear Independence
Finding Linear Independency: Example

These Vectors can be written in the following
form:
Linear Independence
Finding Linear Independency: Example

The following Elementary Row Operations (EROs) put this Matrix into Echelon Form:

As this Matrix has a Leading Entry for every Column, we can conclude that the set of Vectors is Linearly Independent.
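The example's specific Vectors are not reproduced here, so the sketch below checks a hypothetical set of three 3-dimensional Vectors: they are Linearly Independent exactly when the Echelon Form has a Leading Entry in every Column, i.e. when every Column is a pivot Column.

```python
import sympy as sp

# Hypothetical set of vectors (the slide's own example is not reproduced here).
v1 = [1, 0, 2]
v2 = [0, 1, 1]
v3 = [1, 1, 4]

V = sp.Matrix([v1, v2, v3]).T  # columns of V are the vectors
echelon, pivots = V.rref()     # reduced echelon form and its pivot columns

print(echelon)
print(len(pivots) == V.cols)   # True -> every column has a leading entry,
                               #         so the vectors are linearly independent
```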
Summary

Thus, to conclude:

Av = λv is the formula for Eigenvectors and Eigenvalues.

A is a Matrix that has Eigenvectors and Eigenvalues to be calculated.
v is an Eigenvector of A.
λ is an Eigenvalue of A, corresponding to v.
Summary

Thus, to conclude:

Av = λv is the formula for Eigenvectors and Eigenvalues.

Given A and v, we can find λ by Matrix Multiplying Av and observing what multiple of v the result is.
Summary

Thus, to conclude:

det(λI − A) is the Characteristic Polynomial of A. This is used to find the general set of Eigenvalues of A, and thus, its Eigenvectors.

This is done by finding the Determinant of λI − A and solving the resultant Polynomial equation to isolate the Eigenvalues.
Summary

Thus, to conclude:

det(λI − A) is the Characteristic Polynomial of A. This is used to find the general set of Eigenvalues of A, and thus, its Eigenvectors.

Then, by substituting the Eigenvalues back into λI − A and reducing the Matrix to Echelon Form, we can find the general set of Eigenvectors for each Eigenvalue.
Summary

Thus, to conclude:

δ⁻¹Aδ is the Diagonalisation of A.

δ is a Matrix created from the Eigenvectors of A, where each Column is an Eigenvector.

In order for δ⁻¹ to exist, δ must necessarily be Invertible, which requires the Eigenvectors of A to be Linearly Independent.
Summary

Thus, to conclude:

δ⁻¹Aδ is the Diagonalisation of A.

δ is a Matrix created from the Eigenvectors of A, where each Column is an Eigenvector.

The resultant δ⁻¹Aδ is a Diagonal Matrix whose diagonal values are the Eigenvalues associated with the Eigenvectors in the same Columns of δ.
