This document provides an overview of key linear algebra concepts:
1) It defines vectors, inner products, orthogonality, linear combinations, and vector spaces.
2) It covers matrix operations like addition, multiplication, transposes, and properties of diagonal, symmetric, and identity matrices.
3) It discusses matrix inverses, determinants, ranks, eigenvalues/eigenvectors, matrix diagonalization, and decomposition.
4) Key concepts are illustrated with examples, including how vectors can be expressed as linear combinations of basis vectors and how matrices can be diagonalized using eigenvectors.
Lecture 9: Dimensionality Reduction, Singular Value Decomposition (SVD), Principal Component Analysis (PCA). (ppt,pdf)
Appendices A, B from the book “Introduction to Data Mining” by Tan, Steinbach, Kumar.
2. n-dimensional vector
• An n-dimensional vector v is denoted as follows:
  v = (x1, x2, ..., xn)T, a column vector with components x1, x2, ..., xn.
• The transpose vT is denoted as follows:
  vT = (x1, x2, ..., xn), the corresponding row vector.
3. Inner (or dot) product
• Given vT = (x1, x2, ..., xn) and wT = (y1, y2, ..., yn),
  their dot product is defined as follows:
  v . w = x1y1 + x2y2 + ... + xnyn
  or
  v . w = vTw (a scalar)
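The two equivalent forms of the dot product can be checked numerically; a minimal NumPy sketch (the vectors v and w are arbitrary illustrative examples):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Elementwise definition: x1*y1 + x2*y2 + ... + xn*yn
dot_manual = sum(x * y for x, y in zip(v, w))

# Equivalent matrix form: vT w
dot_numpy = v @ w

print(dot_manual, dot_numpy)  # both 32.0
```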
4. Orthogonal / Orthonormal vectors
• A set of vectors x1, x2, ..., xn is orthogonal if
  xi . xj = 0 for all i ≠ j
• A set of vectors x1, x2, ..., xn is orthonormal if
  xi . xj = 0 for i ≠ j, and xk . xk = 1 for every k
5. Linear combinations
• A vector v is a linear combination of the
  vectors v1, ..., vk if:
  v = c1v1 + c2v2 + ... + ckvk
  where c1, ..., ck are constants.
Example: any vector in R3 can be expressed
as a linear combination of the unit vectors
i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1).
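Finding the coefficients of a linear combination amounts to solving a linear system; a sketch in NumPy (the basis here is i, j, k, so the coefficients are just the components of v, but `np.linalg.solve` works for any invertible basis matrix):

```python
import numpy as np

# Basis vectors stacked as columns of B; here the unit vectors i, j, k.
B = np.eye(3)
v = np.array([2.0, -1.0, 4.0])

c = np.linalg.solve(B, v)        # c solves B @ c = v
assert np.allclose(B @ c, v)     # i.e., v = c1*i + c2*j + c3*k
print(c)                         # [ 2. -1.  4.]
```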
6. Space spanning
• A set of vectors S = (v1, v2, ..., vk) spans a
  space W if every vector v in W can be written
  as a linear combination of the vectors in S.
Example: the unit vectors i, j, and k span R3.
7. Linear dependence
• A set of vectors v1, ..., vk is linearly dependent
  if at least one of them (e.g., vj) can be written
  as a linear combination of the rest:
  vj = c1v1 + ... + cj-1vj-1 + cj+1vj+1 + ... + ckvk
  (i.e., vj does not appear on the
  right side of the above equation)
8. Linear independence
• A set of vectors v1, ..., vk is linearly independent
  if no vector vj can be represented as a linear
  combination of the remaining vectors; equivalently:
  c1v1 + c2v2 + ... + ckvk = 0 only if c1 = c2 = ... = ck = 0
Example: for v1 = (1, 0) and v2 = (0, 1),
c1v1 + c2v2 = 0 implies c1 = c2 = 0.
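Linear independence can be tested numerically: stack the vectors as columns and compare the matrix rank to the number of vectors. A sketch (the helper name `independent` is mine):

```python
import numpy as np

def independent(*vectors):
    """Vectors are linearly independent iff the rank of the matrix
    with those vectors as columns equals the number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = np.array([2.0, 2.0])       # v3 = 2*v1 + 2*v2, so adding it breaks independence

print(independent(v1, v2))      # True
print(independent(v1, v2, v3))  # False
```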
9. Vector basis
• A set of vectors v1, ..., vk forms a basis in
  some vector space W if:
  (1) (v1, ..., vk) span W
  (2) (v1, ..., vk) are linearly independent
Some standard bases:
  R2: (1, 0), (0, 1)
  R3: (1, 0, 0), (0, 1, 0), (0, 0, 1)
  Rn: (1, 0, ..., 0), (0, 1, ..., 0), ..., (0, 0, ..., 1)
10. Orthogonal vector basis
• Basis vectors need not be orthogonal.
• Any set of basis vectors (v1, ..., vk) can be
  transformed into an orthogonal basis using the
  Gram-Schmidt orthogonalization algorithm.
• Normalizing the basis vectors to unit length
  yields an orthonormal basis.
• Orthonormal bases are more useful in practice
  since they simplify calculations.
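Gram-Schmidt can be sketched in a few lines: subtract from each vector its projections onto the basis vectors found so far, then normalize. An illustrative NumPy implementation (the function name `gram_schmidt` is mine, and the input vectors are assumed linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn linearly independent vectors into an orthonormal basis
    spanning the same space."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w = w - (w @ u) * u       # remove the component along u
        w = w / np.linalg.norm(w)     # normalize to unit length
        basis.append(w)
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0])]
q1, q2 = gram_schmidt(vs)
assert abs(q1 @ q2) < 1e-12           # orthogonal
assert np.isclose(q1 @ q1, 1.0)       # unit length
```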
11. Vector Expansion/Projection
• Suppose v1, v2, ..., vn is an orthogonal basis
  in W; then any v ∈ W can be represented in
  this basis as follows:
  v = x1v1 + x2v2 + ... + xnvn   (vector expansion or projection)
• The coefficients xi of the expansion can be computed as follows:
  xi = (v . vi) / (vi . vi)   (coefficients of expansion or projection)
Note: if the basis is orthonormal, then vi . vi = 1, so xi = v . vi.
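The coefficient formula above can be verified directly; a small NumPy sketch using an orthogonal (but not orthonormal) basis of R2:

```python
import numpy as np

v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])       # orthogonal to v1, but not unit length
v = np.array([3.0, 1.0])

# Coefficients of the expansion: xi = (v . vi) / (vi . vi)
x1 = (v @ v1) / (v1 @ v1)        # 4/2 = 2
x2 = (v @ v2) / (v2 @ v2)        # 2/2 = 1

reconstructed = x1 * v1 + x2 * v2
assert np.allclose(reconstructed, v)   # v = x1*v1 + x2*v2
```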
12. Vector basis (cont’d)
• Why do we care about sets of basis vectors?
  – Given a set of basis vectors, each vector can be
    represented (i.e., projected) uniquely in that basis.
• Do vector spaces have a unique vector basis?
  – No: simply translate/rotate the basis vectors to obtain a
    new basis!
  – Some sets of basis vectors are preferred over others,
    though.
  – We will see this when we discuss Principal Components
    Analysis (PCA).
13. Matrix Operations
• Matrix addition/subtraction
  – Add/subtract corresponding elements.
  – Matrices must be of the same size.
• Matrix multiplication
  – The product AB of an m x n matrix A and a q x p matrix B
    is defined only under the condition n = q; the result is m x p.
  – Each entry is a row-by-column dot product:
    (AB)ij = ai1b1j + ai2b2j + ... + ainbnj
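The shape condition n = q and the resulting m x p shape are easy to confirm with NumPy (the matrices here are arbitrary examples):

```python
import numpy as np

A = np.arange(6).reshape(2, 3)    # m x n = 2 x 3
B = np.arange(12).reshape(3, 4)   # q x p = 3 x 4; n = q = 3, so AB is defined

C = A @ B
print(C.shape)                    # (2, 4): m x p

# Each entry is the dot product of a row of A with a column of B:
assert C[0, 0] == A[0, :] @ B[:, 0]
```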
17. Determinants
• 2 x 2: det(A) = a11a22 - a12a21
• 3 x 3: det(A) = a11(a22a33 - a23a32) - a21(a12a33 - a13a32) + a31(a12a23 - a13a22)
  (expanded along 1st column)
• n x n: det(A) = Σi (-1)^(i+k) aik det(Mik)
  (expanded along kth column, where Mik is the minor obtained by
  deleting row i and column k of A)
Properties: det(AB) = det(A)det(B); det(AT) = det(A); det(A) = 0 iff A is singular.
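The 2 x 2 closed form and the product property can be checked against NumPy's general determinant routine; a minimal sketch:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# 2 x 2 closed form: a11*a22 - a12*a21
det_formula = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # 1*4 - 2*3 = -2
assert np.isclose(np.linalg.det(A), det_formula)

# Product property: det(AB) = det(A) det(B)
B = np.array([[2.0, 0.0],
              [1.0, 3.0]])
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))
```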
18. Matrix Inverse
• The inverse of a matrix A, denoted A-1, has the
  property:
  A A-1 = A-1A = I
• A-1 exists only if det(A) ≠ 0.
• Definitions
  – Singular matrix: A-1 does not exist (det(A) = 0).
  – Ill-conditioned matrix: A is “close” to being singular.
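The defining property A A-1 = A-1 A = I can be verified numerically; a short sketch (the matrix A is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
assert not np.isclose(np.linalg.det(A), 0.0)   # det = 10, so A-1 exists

A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))       # A A-1 = I
assert np.allclose(A_inv @ A, np.eye(2))       # A-1 A = I
```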
21. Rank of matrix
• Defined as the size of the largest square sub-matrix
  of A that has a non-zero determinant.
Example: if the largest sub-matrix of a 4 x 4 matrix A
with non-zero determinant is 3 x 3, then A has rank 3.
22. Rank of matrix (cont’d)
• Alternatively, the rank can be defined as the maximum
  number of linearly independent columns (or
  rows) of A.
Example: if one column of a 4 x 4 matrix is a linear
combination of the other three, the rank is not 4.
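The column-independence definition of rank is what NumPy's rank routine computes (via an SVD); a small sketch with a matrix whose third column is deliberately the sum of the first two:

```python
import numpy as np

# Third column = first column + second column, so only 2 columns
# are linearly independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.matrix_rank(A))   # 2
```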
24. Eigenvalues and Eigenvectors
• The vector v is an eigenvector of matrix A, and
  λ is the corresponding eigenvalue of A, if:
  Av = λv   (assume v is non-zero)
Geometric interpretation: the linear transformation
implied by A cannot change the direction of an
eigenvector v, only its magnitude (scaling it by λ).
25. Computing λ and v
• To compute the eigenvalues λ of a matrix A,
  find the roots of the characteristic polynomial:
  det(A - λI) = 0
• The eigenvectors can then be computed by solving:
  (A - λI)v = 0
Example: for the diagonal matrix with entries 2 and 3,
the characteristic polynomial is (2 - λ)(3 - λ) = 0,
giving λ1 = 2 and λ2 = 3.
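In practice, eigenvalues and eigenvectors are computed numerically rather than by factoring the characteristic polynomial; a NumPy sketch that also verifies Av = λv for each pair:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

# Verify A v = lambda v for each eigenpair:
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigvals))                # [2.0, 3.0]
```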
26. Properties of λ and v
• Eigenvalues and eigenvectors are only
  defined for square matrices.
• Eigenvectors are not unique: if v is an
  eigenvector, so is kv for any non-zero scalar k.
• Suppose λ1, λ2, ..., λn are the eigenvalues of
  A; then:
  det(A) = λ1λ2...λn and tr(A) = λ1 + λ2 + ... + λn
27. Matrix diagonalization
• Given an n x n matrix A, find P such that:
  P-1AP = Λ, where Λ is diagonal
• Solution: set P = [v1 v2 ... vn], where v1, v2, ...,
  vn are the eigenvectors of A; then Λ holds the
  eigenvalues of A on its diagonal:
  P-1AP = Λ = diag(λ1, λ2, ..., λn)
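The construction P = [v1 v2 ... vn] can be checked numerically: build P from the eigenvectors and confirm that P-1AP is diagonal. A sketch (the matrix A is an arbitrary example with distinct eigenvalues, so it is diagonalizable):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)     # P = [v1 v2]: eigenvectors as columns

Lam = np.linalg.inv(P) @ A @ P    # should equal diag(lambda1, lambda2)
assert np.allclose(Lam, np.diag(eigvals))
```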
29. Matrix diagonalization (cont’d)
• If A is diagonalizable, then the corresponding
  eigenvectors v1, v2, ..., vn form a basis in Rn.
• If A is also symmetric, its eigenvalues are real
  and the corresponding eigenvectors are
  orthogonal.
30. Matrix diagonalization (cont’d)
• An n x n matrix A is diagonalizable iff
  rank(P) = n, where P-1AP = Λ.
  – i.e., A has n linearly independent eigenvectors.
• Theorem: if the eigenvalues of A are all
  distinct, then the corresponding eigenvectors
  are linearly independent (i.e., A is
  diagonalizable).
• Are all n x n matrices diagonalizable?
  No: a matrix with repeated eigenvalues may have fewer
  than n linearly independent eigenvectors.
32. Matrix decomposition (cont’d)
• Matrix decomposition can be simplified in
  the case of symmetric matrices (i.e.,
  orthogonal eigenvectors); with normalized
  eigenvectors, P is orthogonal:
  P-1 = PT
  A = PDPT = λ1v1v1T + λ2v2v2T + ... + λnvnvnT
  where D is the diagonal matrix of eigenvalues.
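For symmetric matrices, NumPy's `eigh` routine returns real eigenvalues and orthonormal eigenvectors directly, so the decomposition A = PDPT can be verified in a few lines (the matrix A is an arbitrary symmetric example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric

eigvals, P = np.linalg.eigh(A)   # eigh: for symmetric/Hermitian matrices

# Eigenvectors are orthonormal, so P is orthogonal: P-1 = PT
assert np.allclose(P.T @ P, np.eye(2))

D = np.diag(eigvals)
assert np.allclose(P @ D @ P.T, A)   # A = P D PT
```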