Linear Algebra
Manjul Krishna Gupta
(RVU- March – June 2023)
Outline
• Syllabus and Curriculum
• Assessment
• Introduction
• Why?
• Applications
Syllabus (1 of 3)
Introduction to vectors (definition, types, representations); linear combinations; vector algebra,
operations (dot products, cross products); projection; 1D and 2D motion
Vector Geometry - lines (Parametric and normal form); planes (parametric and normal forms)
Matrices; rank; minor; Systems of linear equations - augmented matrix; row echelon form;
Gaussian elimination; pivoting; reduced row echelon form; Gauss-Jordan elimination
Homogeneous equations; Span; linear independence
Matrix algebra; addition and scalar multiplication, matrix multiplication; rank of a matrix,
determinants-I,
Syllabus (2 of 3)
Inverse; invertibility; LU factorization
vector spaces-I; linear dependence of basis;
subspaces; dimension, rank and nullity; rank-nullity theorem
Linear transformations (maps), domains; codomains; image; range and kernel of a linear map;
composition of linear maps; Inverse of a linear transformation;
Matrix associated with a linear map. Linear Transformations: Rotations, Reflections, Scaling,
Shearing, Projection.
Homogeneous coordinates; Affine transformations; composite transformations; change of
coordinates
Eigenvalues, eigenvectors,
Cramer’s rule; Determinants-II; similarity and diagonalization;
orthogonality; orthogonal/orthonormal basis, matrices, projections, decomposition;
Syllabus (3 of 3)
Gram-Schmidt orthogonalization;
Spectral theory; QR decomposition;
Orthogonal diagonalization;
quadratic forms; principal axis theorem; constrained optimization;
Principal Component Analysis
linear vector spaces-II; change of bases;
Inner product spaces; norm and distance;
least squares approximation,
Introduction to neural networks; Covariance matrix;
Tensor; Hadamard product;
Pseudoinverse; convolution
Linear regression
Revision and make up
Assessment
• Tutorials (10 %)
• Lab Activity (Implement concepts using Python) (40 %)
• Quiz 1 (10 %)
• Quiz 2 (10 %)
• Final Exam (30 %)
Prescribed Textbook
• Linear Algebra – A Modern Introduction 3rd Edition (David Poole)
What is Linear Algebra?
Where is it used? / Applications
• Image Filters / Altering Colors
• Cryptography
• Computer Graphics / Animations
• GPS System
• Traffic / Network Flows
• Machine Learning
• Data Representation / Vectors, Feature Vectors
• Vector Embeddings
• Dimension Reduction Techniques (Singular Value Decomposition / Principal Component Analysis)
• Chemical Reactions
• Economics
• Structural Engineering
• Electrical, Mechanical Engineering
• Genetics (gene expression analysis)/Medical (CT Scan /MRI)
• Robotics
• Quantum Physics / Quantum Computing
Linear Equation
• Row Picture
• Column Picture
• Matrix Form
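The three views above can be sketched in a few lines of NumPy (a minimal illustration with a made-up 2x2 system, not an example from the textbook):

```python
import numpy as np

# Matrix form: A @ x = b for the system
#   2x + y = 5   (row picture: each equation is a line in the plane)
#    x - y = 1
A = np.array([[2.0, 1.0],
              [1.0, -1.0]])
b = np.array([5.0, 1.0])

x = np.linalg.solve(A, b)            # solve A x = b
print(x)                             # [2. 1.]

# Column picture: b is a linear combination of the columns of A,
# with the solution entries as the combination weights
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(np.allclose(combo, b))         # True
```

The row picture asks where the lines intersect; the column picture asks which combination of the columns reaches b; the matrix form packages both as one equation.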
Geometrical Perspective to Linear Equations
• A linear equation in 3D represents a plane
• A linear equation in 2D represents a line
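One way to check these dimensions numerically is rank-nullity: the solution set of a single equation a·x = d in R^n has dimension n - rank(a). A small sketch (illustrative coefficient rows chosen here, not from the slides):

```python
import numpy as np

# A single linear equation in R^n cuts out an (n-1)-dimensional flat:
# a line in 2D, a plane in 3D. Verify via nullity = n - rank.
a3 = np.array([[1.0, 1.0, 1.0]])   # x + y + z = 1 in R^3
a2 = np.array([[2.0, -3.0]])       # 2x - 3y = 6 in R^2

nullity3 = a3.shape[1] - np.linalg.matrix_rank(a3)   # 2 -> a plane
nullity2 = a2.shape[1] - np.linalg.matrix_rank(a2)   # 1 -> a line
print(nullity3, nullity2)          # 2 1
```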
Tensor
Reference: "Real-World Examples of 0D, 1D, 2D, 3D, 4D and 5D Tensors" by Rukshan Pramoditha, Data Science 365 (Medium)
Solutions
• Unique Solution (Consistent System)
• Infinite Solutions (Consistent System)
• No solution (Inconsistent System)
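The three cases can be distinguished by comparing the rank of A with the rank of the augmented matrix [A | b]. A minimal sketch (the `classify` helper and its example systems are illustrative, not from the slides):

```python
import numpy as np

def classify(A, b):
    """Classify A x = b by comparing rank(A) with rank of [A | b]."""
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rA < rAb:
        return "no solution (inconsistent)"
    if rA == A.shape[1]:                  # rank equals number of unknowns
        return "unique solution (consistent)"
    return "infinitely many solutions (consistent)"

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])                         # two parallel/identical lines
print(classify(A, np.array([1.0, 3.0])))           # parallel lines -> no solution
print(classify(A, np.array([1.0, 2.0])))           # same line -> infinitely many
print(classify(np.eye(2), np.array([1.0, 2.0])))   # crossing lines -> unique
```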


Editor's Notes

  • #9 Working with systems of linear equations: explain what a linear equation is; give a 2-3 variable example and solve it by back substitution; then ask how we could solve 10 million equations like this.
  • #10 Relevant data often occupies only a small portion of the ambient high-dimensional vector space; working within that smaller space increases efficiency, saves resources, and reveals relevant structure in the data. Matrix transformations are used in computer graphics (axis rotation, flipping, etc.). GPS systems solve for (x, y, z, t) using signals from 4 satellites.
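Note #9 mentions solving a small system by back substitution; a minimal sketch on an upper-triangular system (the example system is made up for illustration):

```python
import numpy as np

# Back substitution on an upper-triangular system U x = c:
#   x + y +  z = 6
#       y + 2z = 5
#           3z = 3    ->  z = 1, then y = 3, then x = 2
U = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0]])
c = np.array([6.0, 5.0, 3.0])

n = len(c)
x = np.zeros(n)
for i in range(n - 1, -1, -1):     # solve from the last row upward
    x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
print(x)                           # [2. 3. 1.]
```

Gaussian elimination first reduces a general system to this triangular form; back substitution then recovers the unknowns one at a time.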