Linear Algebra Review
n-dimensional vector
• An n-dimensional vector v is denoted as
follows: v = (x1, x2, . . . , xn)T, a column vector
with components x1, x2, . . . , xn
• The transpose vT is denoted as follows:
vT = (x1, x2, . . . , xn), the corresponding row vector
Inner (or dot) product
• Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn),
their dot product is defined as follows:
v . w = vTw = x1y1 + x2y2 + . . . + xnyn (a scalar)
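A minimal NumPy sketch of the definition (the array values here are illustrative, not from the slides):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])   # vT = (x1, x2, x3)
w = np.array([4.0, 5.0, 6.0])   # wT = (y1, y2, y3)

# Dot product: sum of element-wise products, yielding a scalar.
print(v @ w)                    # 1*4 + 2*5 + 3*6 = 32.0
print(np.dot(v, w))             # equivalent formulation
```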
Orthogonal / Orthonormal vectors
• A set of vectors x1, x2, . . . , xn is orthogonal if
xi . xj = 0 for all i ≠ j
• A set of vectors x1, x2, . . . , xn is orthonormal if,
in addition, every xk has unit length: xk . xk = 1
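As a quick check of these conditions, a small NumPy sketch (the vectors x1 and x2 below are illustrative):

```python
import numpy as np

# Candidate set: the columns of Q are x1 and x2 (illustrative values).
Q = np.column_stack([
    np.array([1.0,  1.0]) / np.sqrt(2),   # x1
    np.array([1.0, -1.0]) / np.sqrt(2),   # x2
])

G = Q.T @ Q   # Gram matrix: G[i, j] = xi . xj

# Orthogonal: all off-diagonal dot products are zero.
print(np.allclose(G - np.diag(np.diag(G)), 0.0))   # True
# Orthonormal: the Gram matrix is the identity (each xk . xk = 1).
print(np.allclose(G, np.eye(2)))                   # True
```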
Linear combinations
• A vector v is a linear combination of the
vectors v1, ..., vk if:
v = c1v1 + c2v2 + . . . + ckvk
where c1, ..., ck are constants.
Example: any vector in R3 can be expressed
as a linear combination of the unit vectors
i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1)
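The R3 example can be reproduced directly in NumPy; the vector v below is an arbitrary illustration:

```python
import numpy as np

i = np.array([1.0, 0.0, 0.0])
j = np.array([0.0, 1.0, 0.0])
k = np.array([0.0, 0.0, 1.0])

v = np.array([2.0, -1.0, 5.0])   # any vector in R3
c1, c2, c3 = v                   # the coefficients are simply the components

# v as a linear combination of the unit vectors
print(np.allclose(c1 * i + c2 * j + c3 * k, v))   # True
```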
Space spanning
• A set of vectors S = (v1, v2, . . . , vk) spans a
space W if every vector v in W can be written
as a linear combination of the vectors in S.
Example: the unit vectors i, j, and k span R3
Linear dependence
• A set of vectors v1, ..., vk is linearly dependent
if at least one of them (e.g., vj) can be written
as a linear combination of the rest:
vj = c1v1 + . . . + cj-1vj-1 + cj+1vj+1 + . . . + ckvk
(i.e., vj does not appear on the
right side of the above equation)
Linear independence
• A set of vectors v1, ..., vk is linearly independent
if no vector vj can be represented as a linear
combination of the remaining vectors, i.e.:
c1v1 + c2v2 + . . . + ckvk = 0 only if c1 = c2 = . . . = ck = 0
Example: i = (1, 0, 0) and j = (0, 1, 0) are linearly independent,
since c1i + c2j = (c1, c2, 0) = (0, 0, 0) forces c1 = c2 = 0
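One practical way to test independence is the rank criterion that appears later in these slides: stack the vectors as columns and check for full column rank. A sketch with illustrative vectors:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])   # v3 = v1 + v2, so the set is dependent

V = np.column_stack([v1, v2, v3])

# Independent iff the only solution of c1*v1 + c2*v2 + c3*v3 = 0
# is c1 = c2 = c3 = 0, i.e. iff V has full column rank.
print(np.linalg.matrix_rank(V) == V.shape[1])   # False: linearly dependent
```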
Vector basis
• A set of vectors v1, ..., vk forms a basis in
some vector space W if:
(1) (v1, ..., vk) span W
(2) (v1, ..., vk) are linearly independent
Some standard bases:
R2: (1, 0), (0, 1)    R3: i, j, k    Rn: e1 = (1, 0, . . . , 0), . . . , en = (0, . . . , 0, 1)
Orthogonal vector basis
• Basis vectors might not be orthogonal.
• Any set of basis vectors (v1, ..., vk) can be
transformed into an orthogonal basis using the
Gram-Schmidt orthogonalization algorithm (see the sketch below).
• Normalizing the basis vectors to “unit” length will
yield an orthonormal basis.
• Orthonormal bases are more useful in practice
since they simplify calculations.
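A minimal sketch of Gram-Schmidt, assuming the input vectors are linearly independent; in numerical practice one would typically rely on a QR factorization (np.linalg.qr) instead:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for span(vectors).

    Assumes the input vectors are linearly independent.
    """
    basis = []
    for v in vectors:
        u = v.astype(float)
        for q in basis:
            u = u - (u @ q) * q               # remove the component along q
        basis.append(u / np.linalg.norm(u))   # normalize to unit length
    return basis

q1, q2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
print(np.isclose(q1 @ q2, 0.0))   # True: orthogonal
print(np.isclose(q1 @ q1, 1.0))   # True: unit length
```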
Vector Expansion/Projection
• Suppose v1, v2, . . . , vn is an orthogonal basis
of W; then any v ∈ W can be represented in
this basis as follows:
v = x1v1 + x2v2 + . . . + xnvn (vector expansion or projection)
• The coefficients xi of the expansion can be computed as follows:
xi = (v . vi) / (vi . vi) (coefficients of expansion or projection)
Note: if the basis is orthonormal, then vi . vi = 1 and xi = v . vi
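A short NumPy illustration of the expansion, using an (illustrative) orthogonal but not orthonormal basis of R2:

```python
import numpy as np

v1 = np.array([1.0,  1.0])   # orthogonal basis vectors (not unit length)
v2 = np.array([1.0, -1.0])

v = np.array([3.0, 5.0])

# Coefficients of the expansion: xi = (v . vi) / (vi . vi)
x1 = (v @ v1) / (v1 @ v1)
x2 = (v @ v2) / (v2 @ v2)

# Reconstruct v from its expansion x1*v1 + x2*v2
print(np.allclose(x1 * v1 + x2 * v2, v))   # True
```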
Vector basis (cont’d)
• Why do we care about sets of basis vectors?
– Given a set of basis vectors, each vector can be
represented (i.e., projected) “uniquely” in this basis.
• Do vector spaces have a unique vector basis?
– No; simply translate/rotate the basis vectors to obtain a
new basis!
– Some sets of basis vectors are preferred over others,
though.
– We will see this when we discuss Principal Components
Analysis (PCA).
Matrix Operations
• Matrix addition/subtraction
– Add/subtract corresponding elements.
– Matrices must be of the same size.
• Matrix multiplication
– The product AB of an m x n matrix A and a q x p matrix B
is defined only when n = q; the result C = AB is m x p,
with cij = Σk aik bkj (see the sketch below).
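A quick NumPy check of the size condition and the element-wise definition (shapes chosen arbitrarily for illustration):

```python
import numpy as np

A = np.arange(6).reshape(2, 3)     # m x n = 2 x 3
B = np.arange(12).reshape(3, 4)    # q x p = 3 x 4, and n = q = 3

C = A @ B                          # defined because the inner dimensions match
print(C.shape)                     # (2, 4): an m x p result

# Element-wise definition: C[i, j] = sum_k A[i, k] * B[k, j]
print(C[0, 1] == sum(A[0, k] * B[k, 1] for k in range(3)))   # True
```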
Diagonal Matrices
• A diagonal matrix has non-zero entries only on its main diagonal: A = diag(a11, a22, . . . , ann).
• Special case: the identity matrix I = diag(1, 1, . . . , 1), which satisfies AI = IA = A.
Matrix Transpose
• The transpose AT is obtained by interchanging rows and columns: (AT)ij = aji.
Symmetric Matrices
• A matrix A is symmetric if A = AT, i.e., aij = aji for all i, j.
Example: the 2 x 2 matrix with rows (1, 2) and (2, 3) is symmetric, since a12 = a21 = 2.
Determinants
2 x 2: det(A) = a11a22 - a12a21
3 x 3: det(A) = a11(a22a33 - a23a32) - a12(a21a33 - a23a31) + a13(a21a32 - a22a31)
n x n: det(A) = Σi (-1)^(i+1) ai1 Mi1 (expanded along 1st column)
= Σi (-1)^(i+k) aik Mik (expanded along kth column)
where Mik is the determinant of the (n-1) x (n-1) sub-matrix obtained by deleting row i and column k
Properties: det(AB) = det(A)det(B), det(AT) = det(A), det(cA) = c^n det(A) for an n x n matrix A
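A NumPy sketch that checks the 2 x 2 formula and the cofactor expansion along the first column (matrices are illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# 2 x 2 formula: a11*a22 - a12*a21
print(np.isclose(np.linalg.det(A), 1 * 4 - 2 * 3))   # True (both are -2)

B = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 2.0],
              [0.0, 1.0, 4.0]])

def minor(M, i, j):
    """Delete row i and column j."""
    return np.delete(np.delete(M, i, axis=0), j, axis=1)

# Cofactor expansion along the 1st column (0-based row index i)
expansion = sum((-1) ** i * B[i, 0] * np.linalg.det(minor(B, i, 0))
                for i in range(3))
print(np.isclose(expansion, np.linalg.det(B)))        # True
```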
Matrix Inverse
• The inverse of a matrix A, denoted as A-1, has the
property:
A A-1 = A-1A = I
• A-1 exists only if det(A) ≠ 0
• Definitions
– Singular matrix: A-1 does not exist
– Ill-conditioned matrix: A is “close” to being singular
Matrix Inverse (cont’d)
• Properties of the inverse:
– (A-1)-1 = A
– (AB)-1 = B-1A-1
– (AT)-1 = (A-1)T
– det(A-1) = 1 / det(A)
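A brief NumPy check of the inverse and one of its properties (matrices chosen to be non-singular for illustration):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])        # det(A) = 10, so A is invertible
B = np.array([[1.0, 2.0],
              [3.0, 5.0]])        # det(B) = -1

A_inv = np.linalg.inv(A)          # raises LinAlgError if A is singular
print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A-1 = I

# Property check: (AB)-1 = B-1 A-1
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))   # True
```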
Matrix trace
• The trace of an n x n matrix A is the sum of its diagonal elements: tr(A) = a11 + a22 + . . . + ann.
Properties: tr(A + B) = tr(A) + tr(B), tr(cA) = c tr(A), tr(AB) = tr(BA)
Rank of matrix
• Defined as the size of the largest square sub-matrix
of A that has a non-zero determinant.
Example: has rank 3
Rank of matrix (cont’d)
• Alternatively, it can be defined as the maximum
number of linearly independent columns (or
rows) of A.
i.e., rank is not 4!
Example:
Rank of matrix (cont’d)
• Useful properties:
– rank(A) ≤ min(m, n) for an m x n matrix A, and rank(A) = rank(AT)
– rank(AB) ≤ min(rank(A), rank(B))
– an n x n matrix A is invertible (det(A) ≠ 0) if and only if rank(A) = n
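The two definitions of rank agree, as a small NumPy sketch with an illustrative matrix shows:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # row 2 = 2 * row 1 (dependent)
              [0.0, 1.0, 1.0]])

# Maximum number of linearly independent rows/columns
print(np.linalg.matrix_rank(A))            # 2

# Consistent with the determinant definition: the full 3 x 3
# determinant vanishes, so the rank must be less than 3.
print(np.isclose(np.linalg.det(A), 0.0))   # True
```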
Eigenvalues and Eigenvectors
• The vector v is an eigenvector of matrix A and
λ is an eigenvalue of A if:
Av = λv (assume v is non-zero)
Geometric interpretation: the linear transformation
implied by A cannot change the direction of the
eigenvectors v, only their magnitude.
Computing λ and v
• To compute the eigenvalues λ of a matrix A,
find the roots of the characteristic polynomial det(A - λI) = 0.
• The eigenvectors can then be computed by solving
(A - λI)v = 0 for each eigenvalue λ.
Example: see the NumPy sketch below.
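A minimal NumPy sketch (np.linalg.eig solves the characteristic equation numerically; the matrix below is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Roots of det(A - lambda*I) = 0 and the corresponding eigenvectors
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the eigenvectors
print(eigvals)                        # eigenvalues, here 3 and 1

# Verify A v = lambda v for each eigenpair
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True, True
```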
Properties of λ and v
• Eigenvalues and eigenvectors are only
defined for square matrices.
• Eigenvectors are not unique: if v is an
eigenvector, so is kv for any non-zero scalar k.
• Suppose λ1, λ2, ..., λn are the eigenvalues of
A; then:
det(A) = λ1λ2 · · · λn and tr(A) = λ1 + λ2 + . . . + λn
Matrix diagonalization
• Given an n x n matrix A, find P such that:
P-1AP=Λ where Λ is diagonal
• Solution: set P = [v1 v2 . . . vn], where v1, v2, . . . ,
vn are the eigenvectors of A; then
P-1AP = Λ = diag(λ1, λ2, . . . , λn), whose diagonal entries are the eigenvalues of A
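The construction can be checked numerically; a sketch with an illustrative matrix that has distinct eigenvalues:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # eigenvalues 5 and 2 (distinct)

eigvals, P = np.linalg.eig(A)         # P = [v1 v2], columns are eigenvectors
Lam = np.linalg.inv(P) @ A @ P        # P-1 A P

# Lam is (numerically) diagonal, with the eigenvalues on its diagonal
print(np.allclose(Lam, np.diag(eigvals)))   # True
```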
Matrix diagonalization (cont’d)
Example:
P-1AP=Λ
• If A is diagonalizable, then the corresponding
eigenvectors v1,v2 ,. . . vn form a basis in Rn
• If A is also symmetric, its eigenvalues are real
and the corresponding eigenvectors are
orthogonal.
Matrix diagonalization (cont’d)
• An n x n matrix A is diagonalizable iff
rank(P)=n, where P-1AP=Λ.
– i.e., A has n linearly independent eigenvectors.
• Theorem: If the eigenvalues of A are all
distinct, then the corresponding eigenvectors
are linearly independent (i.e., A is
diagonalizable).
Are all n x n matrices
diagonalizable?
Matrix decomposition
• If A is diagonalizable, then A can be
decomposed as follows:
A = PΛP-1
Matrix decomposition (cont’d)
• Matrix decomposition can be simplified in
the case of symmetric matrices (i.e.,
orthogonal eigenvectors):
P-1 = PT (since the eigenvectors are orthonormal), so
A = PDPT = λ1v1v1T + λ2v2v2T + . . . + λnvnvnT
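For the symmetric case, np.linalg.eigh returns an orthogonal P directly; a short sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric: A = AT

eigvals, P = np.linalg.eigh(A)      # for symmetric A, P has orthonormal columns
D = np.diag(eigvals)

print(np.allclose(np.linalg.inv(P), P.T))   # True: P-1 = PT
print(np.allclose(A, P @ D @ P.T))          # True: A = P D PT

# Equivalently, A is a sum of rank-one terms lambda_i * vi viT
outer_sum = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, P.T))
print(np.allclose(A, outer_sum))            # True
```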