The presentation discusses eigenvalues and eigenvectors, their mathematical significance, and their applications in fields such as physics, engineering, and machine learning. Key points include their use for dimensionality reduction in data analysis, their role in the stability of structures such as bridges, and their importance in communication systems. It concludes that eigenvalues and eigenvectors are crucial for feature extraction and have widespread implications across multiple disciplines.
Basic Information
• Eigenvalues are a special set of scalars associated with a linear system of
equations (i.e., a matrix equation). They are sometimes also known as characteristic
roots or characteristic values.
• The determination of the eigenvalues and eigenvectors of a system is extremely
important in physics and engineering, where it is equivalent to matrix
diagonalization and arises in:
• stability analysis
• rotating bodies
• small oscillations of vibrating systems
• Each eigenvalue is paired with a corresponding eigenvector
• The decomposition of a square matrix into eigenvalues and eigenvectors is known
as eigen decomposition
Shweta Kanhere 3
Why use Eigenvalues & Eigenvectors?
• They are used to determine a set of important variables (in the form of vectors),
along with their scale along different dimensions (key dimensions based on variance),
for analysing the data in a better manner.
• In PCA, these concepts help reduce the dimensionality of the data
(mitigating the curse of dimensionality), resulting in a simpler model that is
computationally efficient and provides greater generalization accuracy.
• Eigenvalue and eigenvector concepts are key to training computationally
efficient and high-performing machine learning models.
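As a rough illustration of how these concepts drive PCA-style dimensionality reduction, here is a minimal NumPy sketch; the toy data and the choice of keeping two components are made up for illustration:

```python
import numpy as np

# Toy data: 200 samples, 3 features, with very different variance per direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.1])

# Centre the data, then eigendecompose the covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order

# Keep the eigenvectors with the largest eigenvalues (directions of most variance).
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:2]]       # top-2 principal axes
X_reduced = Xc @ components              # 3-D data projected to 2-D

print(X_reduced.shape)                   # (200, 2)
```

The eigenvalues tell us how much variance each principal axis carries, which is exactly why discarding the small ones loses little information.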
Deriving Special Relativity is more natural in the language of linear
algebra: Einstein's second postulate essentially states that "light is an
eigenvector of the Lorentz transformation."
Eigenvalues and eigenvectors feature prominently in the analysis
of linear transformations.
In this shear mapping, the red arrow changes direction but the blue
arrow does not. The blue arrow is an eigenvector of this shear mapping
because its direction does not change, and since its length is unchanged,
its eigenvalue is 1.
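The shear mapping described above can be checked numerically; a small NumPy sketch, assuming a horizontal shear with factor 1:

```python
import numpy as np

# Horizontal shear: x' = x + y, y' = y (shear factor 1 assumed for illustration).
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

blue = np.array([1.0, 0.0])   # horizontal vector: unchanged by the shear
red = np.array([0.0, 1.0])    # vertical vector: tilted by the shear

print(S @ blue)  # [1. 0.] -- same vector, so blue is an eigenvector with eigenvalue 1
print(S @ red)   # [1. 1.] -- not a scalar multiple of red, so red is not an eigenvector
```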
• Pre-processing images before image classification, such as face
recognition. Imagine these actual images need to be classified:
• We can find a small number of shared elements, and then identify
each face by its variance from these elements, as in a regression.
The reduction is done by finding eigenvectors of the input images'
pixels. These eigenvectors can be seen as basis images, from which
the complete (actually nearly complete) images can be reconstructed.
PCA is used to perform this dimensionality reduction.
A 2×2 real symmetric matrix represents a stretching and shearing of the plane. The
eigenvectors of the matrix (red lines) are the two special directions such that every point on them
slides along them.
Applications of the Eigenvalues and Eigenvectors of a square matrix
1. Communication systems
• Eigenvalues were used by Claude Shannon to determine the theoretical limit
to how much information can be transmitted through a communication
medium such as a telephone line or the air.
• This is done by calculating the eigenvectors and eigenvalues of the
communication channel (expressed as a matrix), and then water-filling on the
eigenvalues. The eigenvalues are then the gains of the fundamental modes of
the channel, which themselves are captured by the eigenvectors.
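A hedged NumPy sketch of this idea; the channel matrix, power budget, and noise level are made-up values, and the water-filling step uses a simple bisection rather than any particular textbook routine:

```python
import numpy as np

# Hypothetical 3x3 channel matrix (values made up for illustration).
H = np.array([[0.9, 0.2, 0.1],
              [0.3, 0.7, 0.2],
              [0.1, 0.1, 0.5]])

# Eigenvalues of H^T H (squared singular values of H) are the power gains of
# the channel's independent "modes"; the eigenvectors are those modes.
gains, modes = np.linalg.eigh(H.T @ H)
gains, modes = gains[::-1], modes[:, ::-1]       # strongest mode first

# Water-filling: pour a total power budget over the modes; stronger modes
# (higher gain, hence lower "floor" noise/gain) receive more power.
P_total, noise = 3.0, 1.0
floors = noise / gains
lo, hi = 0.0, floors.max() + P_total             # bracket the water level mu
for _ in range(100):                             # bisection on the water level
    mu = (lo + hi) / 2
    if np.maximum(mu - floors, 0).sum() > P_total:
        hi = mu
    else:
        lo = mu
powers = np.maximum(mu - floors, 0)
capacity = np.log2(1 + powers * gains / noise).sum()
print(powers.round(3), capacity.round(3))
```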
2. Designing bridges
• The natural frequency of the bridge is the eigenvalue of smallest
magnitude of a system that models the bridge. Engineers exploit
this knowledge to ensure the stability of their constructions.
3. Designing car stereo systems
• Eigenvalue analysis is also used in the design of car stereo
systems, where it helps to reproduce the vibration of the car due to
the music.
4. Electrical Engineering
• The application of eigenvalues and eigenvectors is useful for
decoupling three-phase systems through symmetrical component
transformation.
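A minimal NumPy sketch of this decoupling; the impedance values are made up, while the transformation matrix is the standard Fortescue (symmetrical-component) matrix:

```python
import numpy as np

a = np.exp(2j * np.pi / 3)        # 120-degree rotation operator

# Fortescue matrix: columns are the zero-, positive-, and negative-sequence
# phasor patterns, which are eigenvectors of any balanced three-phase system.
A = np.array([[1, 1,    1],
              [1, a**2, a],
              [1, a,    a**2]], dtype=complex)

# Balanced three-phase impedance matrix (self Zs, mutual Zm -- made-up values).
Zs, Zm = 2.0 + 1.0j, 0.5 + 0.2j
Z = np.array([[Zs, Zm, Zm],
              [Zm, Zs, Zm],
              [Zm, Zm, Zs]])

# In the sequence domain the impedance matrix becomes diagonal: the three
# coupled phase equations decouple into independent single-phase equations.
Z_seq = np.linalg.inv(A) @ Z @ A
print(np.round(Z_seq, 10))        # diagonal: Zs+2Zm, Zs-Zm, Zs-Zm
```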
5. Mechanical Engineering
Eigenvalues and eigenvectors allow us to "reduce" a linear operation to
separate, simpler problems. For example, if a stress is applied to a "plastic"
solid, the deformation can be dissected into "principal directions", those
directions in which the deformation is greatest. Vectors in the principal
directions are the eigenvectors, and the percentage deformation in each
principal direction is the corresponding eigenvalue.
6. Oil Companies
Oil companies frequently use eigenvalue analysis to explore land for oil. Oil, dirt, and other
substances all give rise to linear systems which have different eigenvalues, so
eigenvalue analysis can give a good indication of where oil reserves are
located. Oil companies place probes around a site to pick up the waves that
result from a huge truck used to vibrate the ground. The waves are changed as
they pass through the different substances in the ground. The analysis of these
waves directs the oil companies to possible drilling sites.
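The principal-direction decomposition described in item 5 above can be sketched in NumPy; the tensor values are illustrative:

```python
import numpy as np

# A symmetric 2-D stress/strain tensor (made-up values).
T = np.array([[5.0, 2.0],
              [2.0, 1.0]])

# Eigenvectors give the principal directions; eigenvalues give the
# deformation along each of them.
vals, vecs = np.linalg.eigh(T)
for lam, v in zip(vals[::-1], vecs.T[::-1]):
    print(f"principal value {lam:.3f} along direction {np.round(v, 3)}")

# In the principal basis the tensor is diagonal: the problem decouples
# into independent 1-D problems, one per principal direction.
assert np.allclose(vecs.T @ T @ vecs, np.diag(vals))
```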
Uses in the field of AI
Using singular value decomposition for image compression
This explains how we can compress an image by throwing away the small
eigenvalues of AAᵀ. It takes an 8-megapixel image of an Allosaurus and shows
how the image looks after compressing by
selecting 1, 10, 25, 50, 100, and 200 of the largest singular values.
Low-rank factorization for collaborative prediction
This is what Netflix does (or once did) to predict what rating you will give a movie
you have not yet watched. It uses the SVD and throws away the smallest
eigenvalues of AᵀA.
The Google PageRank algorithm
The eigenvector corresponding to the largest eigenvalue of the web's link graph determines how pages are ranked.
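A toy sketch of this idea; the four-page link graph and damping factor 0.85 are illustrative, and real PageRank additionally handles dangling pages and enormous scale:

```python
import numpy as np

# Tiny web: links[i, j] = 1 if page j links to page i (4 hypothetical pages).
links = np.array([[0, 0, 1, 0],
                  [1, 0, 0, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

# Column-stochastic transition matrix plus damping (the "Google matrix").
M = links / links.sum(axis=0)
d = 0.85
G = d * M + (1 - d) / 4 * np.ones((4, 4))

# Power iteration converges to the eigenvector of the largest eigenvalue (1),
# which is exactly the ranking vector.
rank = np.full(4, 0.25)
for _ in range(100):
    rank = G @ rank
print(rank.round(3))   # the page with the most incoming weight ranks highest
```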
Spectral Clustering
Whether it's in plants and biology, medical imaging, business and
marketing, understanding the connections between fields on Facebook, or
even criminology, clustering is an extremely important part of modern
data analysis. It allows people to find important subsystems or patterns
inside noisy data sets. One such method is spectral clustering, which uses
the eigenvalues of the graph of a network. The eigenvector of the
second-smallest eigenvalue of the Laplacian matrix alone allows us to find the
two largest clusters in a network.
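A minimal NumPy sketch of this, on a made-up network of two triangles joined by a single edge:

```python
import numpy as np

# Adjacency matrix: two triangles {0,1,2} and {3,4,5} joined by edge 2-3.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

# Graph Laplacian L = D - A (D is the diagonal matrix of node degrees).
L = np.diag(A.sum(axis=1)) - A

# The eigenvector of the second-smallest eigenvalue (the Fiedler vector):
# its sign pattern splits the network into its two main clusters.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]
clusters = (fiedler > 0).astype(int)
print(clusters)   # nodes 0-2 land in one cluster, nodes 3-5 in the other
```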
Dimensionality Reduction / PCA
The principal components correspond to the largest eigenvalues of AᵀA,
and this yields the least-squares projection onto a lower-dimensional
hyperplane; the eigenvectors become the axes of the hyperplane.
Dimensionality reduction is extremely useful in machine learning and data
analysis, as it allows one to understand where most of the variation in the
data comes from.
The eigenvalues of a linear mapping are a measure of the distortion induced
by the transformation, and the eigenvectors tell us how the distortion
is oriented. It is precisely this rough picture which makes PCA (Principal
Component Analysis, a statistical procedure) very useful.
Consider the matrix A = [2 1; 1 2]. Setting its characteristic polynomial
det(A − λI) = (2 − λ)² − 1 equal to zero, it has roots at λ = 1 and λ = 3, which
are the two eigenvalues of A.

For λ = 1, the eigenvector equation (A − λI)v = 0 becomes
[1 1; 1 1] [v1; v2] = [0; 0].
Any nonzero vector with v1 = −v2 solves this equation. Therefore,
vλ=1 = [1; −1]
is an eigenvector of A corresponding to λ = 1, as is any scalar multiple of this vector.

For λ = 3, the equation becomes
[−1 1; 1 −1] [v1; v2] = [0; 0].
Any nonzero vector with v1 = v2 solves this equation. Therefore,
vλ=3 = [1; 1]
is an eigenvector of A corresponding to λ = 3, as is any scalar multiple of this vector.
Thus, the vectors vλ=1 and vλ=3 are eigenvectors of A associated with the
eigenvalues λ = 1 and λ = 3, respectively.
The transformation matrix A preserves the direction of purple vectors parallel
to vλ=1 = [1 −1]ᵀ and blue vectors parallel to vλ=3 = [1 1]ᵀ. The red vectors are
not parallel to either eigenvector, so their directions are changed by the
transformation. The lengths of the purple vectors are unchanged after the
transformation (due to their eigenvalue of 1), while the blue vectors are three times
the length of the originals (due to their eigenvalue of 3).
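The eigenpairs above (λ = 1 with [1 −1]ᵀ, λ = 3 with [1 1]ᵀ) correspond to the symmetric matrix A = [2 1; 1 2]; they can be verified numerically:

```python
import numpy as np

# The matrix behind the worked example.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(A)
print(vals)                          # [1. 3.]

# numpy returns unit eigenvectors; compare against [1, -1] and [1, 1] normalized.
v1 = np.array([1.0, -1.0]) / np.sqrt(2)
v3 = np.array([1.0, 1.0]) / np.sqrt(2)
assert np.allclose(A @ v1, 1 * v1)   # direction preserved, length unchanged
assert np.allclose(A @ v3, 3 * v3)   # direction preserved, length tripled
```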
Conclusions
• An eigenvector is a vector which, when multiplied by a transformation
matrix, results in a scalar multiple of itself, i.e., a vector with the
same direction as the eigenvector. This scalar multiple is known as the
eigenvalue.
• Eigenvectors and eigenvalues are key concepts used in feature extraction
techniques such as Principal Component Analysis, an algorithm
used to reduce dimensionality while training a machine learning
model.
• Eigenvalue and eigenvector concepts are used in several fields
including machine learning, quantum computing, communication system
design, construction design, and electrical and mechanical engineering.