1/45
Ch. 12 Review of
Matrices and Vectors
2/45
Definition of Vector: A collection of complex or real numbers,
generally put in a column:

v = \begin{bmatrix} v_1 \\ \vdots \\ v_N \end{bmatrix} = [v_1 \; \cdots \; v_N]^T \quad \text{(the superscript } T \text{ denotes transpose)}

Definition of Vector Addition: Add element-by-element:

a + b = \begin{bmatrix} a_1 \\ \vdots \\ a_N \end{bmatrix} + \begin{bmatrix} b_1 \\ \vdots \\ b_N \end{bmatrix} = \begin{bmatrix} a_1 + b_1 \\ \vdots \\ a_N + b_N \end{bmatrix}
Vectors & Vector Spaces
3/45
Definition of Scalar: A real or complex number.
If the vectors of interest are complex valued then the set of
scalars is taken to be complex numbers; if the vectors of
interest are real valued then the set of scalars is taken to be
real numbers.
Multiplying a Vector by a Scalar:

\alpha a = \alpha \begin{bmatrix} a_1 \\ \vdots \\ a_N \end{bmatrix} = \begin{bmatrix} \alpha a_1 \\ \vdots \\ \alpha a_N \end{bmatrix}
ā€¦changes the vectorā€™s length if |Ī±| ā‰  1
ā€¦ ā€œreversesā€ its direction if Ī± < 0
4/45
Arithmetic Properties of Vectors: vector addition and
scalar multiplication exhibit the following properties pretty
much like the real numbers do
Let x, y, and z be vectors of the same dimension and let Ī±
and Ī² be scalars; then the following properties hold:
1. Commutativity:
   x + y = y + x
2. Associativity:
   (x + y) + z = x + (y + z), \qquad (\alpha\beta)x = \alpha(\beta x)
3. Distributivity:
   \alpha(x + y) = \alpha x + \alpha y, \qquad (\alpha + \beta)x = \alpha x + \beta x
4. Scalar Unity & Scalar Zero:
   1x = x, \qquad 0x = \mathbf{0}, \quad \text{where } \mathbf{0} \text{ is the zero vector of all zeros}
5/45
Definition of a Vector Space: A set V of N-dimensional vectors
(with a corresponding set of scalars) such that the set of vectors
is:
(i) ā€œclosedā€ under vector addition
(ii) ā€œclosedā€ under scalar multiplication
In other words:
ā€¢ addition of vectors ā€“ gives another vector in the set
ā€¢ multiplying a vector by a scalar ā€“ gives another vector in the
set
Note: this means that ANY ā€œlinear combinationā€ of vectors in the
space results in a vector in the spaceā€¦
If v1, v2, and v3 are all vectors in a given vector space V, then
\alpha_1 v_1 + \alpha_2 v_2 + \alpha_3 v_3 = \sum_{i=1}^{3} \alpha_i v_i
is also in the vector space V.
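A quick numerical sketch of the closure idea (NumPy, with made-up vectors and weights): any linear combination of vectors in R^3 is again a vector in R^3.

```python
import numpy as np

# Three arbitrary vectors in R^3 and arbitrary scalar weights
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
v3 = np.array([3.0, 3.0, 3.0])
alphas = [2.0, -1.0, 0.5]

# The linear combination alpha1*v1 + alpha2*v2 + alpha3*v3
v = alphas[0] * v1 + alphas[1] * v2 + alphas[2] * v3
print(v)  # another length-3 vector, i.e., still in R^3
```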
6/45
Axioms of Vector Space: If V is a set of vectors satisfying the above
definition of a vector space then it satisfies the following axioms:
Soā€¦ a vector space is nothing more than a set of vectors with
an ā€œarithmetic structureā€
1. Commutativity (see above)
2. Associativity (see above)
3. Distributivity (see above)
4. Unity and Zero Scalar (see above)
5. Existence of an Additive Identity ā€“ any vector space V must
have a zero vector
6. Existence of Negative Vector: For every vector v in V its
negative must also be in V
7/45
Def. of Subspace: Given a vector space V, a subset of vectors in V
that itself is closed under vector addition and scalar
multiplication (using the same set of scalars) is called a
subspace of V.
Examples:
1. The set of vectors in R3 whose third component is zero (a copy of R2) is a subspace of R3.
2. Any plane in R3 that passes through the origin is a subspace
3. Any line passing through the origin in R2 is a subspace of R2
4. The set R2 is NOT a subspace of C2 because R2 isnā€™t closed
under complex scalars (a subspace must retain the original
spaceā€™s set of scalars)
8/45
Length of a Vector (Vector Norm): For any vector v in CN
we define its length (or ā€œnormā€) to be
\|v\|_2 = \sqrt{\sum_{i=1}^{N} |v_i|^2} \qquad\Longleftrightarrow\qquad \|v\|_2^2 = \sum_{i=1}^{N} |v_i|^2

Properties of Vector Norm:

\|v\|_2 < \infty \quad \forall\, v \in C^N

\|v\|_2 = 0 \;\text{ iff }\; v = \mathbf{0}

\|\alpha v\|_2 = |\alpha|\,\|v\|_2

\|\alpha v_1 + \beta v_2\|_2 \le |\alpha|\,\|v_1\|_2 + |\beta|\,\|v_2\|_2 \quad \text{(triangle inequality)}
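The definition and the scaling property can be checked numerically; a minimal NumPy sketch with random complex data:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)
alpha = 2.0 - 1.0j

# ||v||_2 from the definition vs. NumPy's built-in norm
norm_def = np.sqrt(np.sum(np.abs(v) ** 2))
print(np.isclose(norm_def, np.linalg.norm(v)))  # True

# Scaling property: ||alpha v||_2 = |alpha| ||v||_2
print(np.isclose(np.linalg.norm(alpha * v),
                 abs(alpha) * np.linalg.norm(v)))  # True
```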
Geometric Structure of Vector Space
9/45
Distance Between Vectors: the distance between two
vectors in a vector space with the two norm is defined by:
d(v_1, v_2) = \|v_1 - v_2\|_2

Note that:

d(v_1, v_2) = 0 \;\text{ iff }\; v_1 = v_2

(Figure: vectors v1 and v2 drawn from the origin, with the difference vector v1 āˆ’ v2 connecting their tips.)
10/45
Angle Between Vectors & Inner Product:
Motivate the idea in R2:
(Figure: vector u along the horizontal axis and vector v of length A at angle Īø from u.)

u = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \qquad v = A \begin{bmatrix} \cos\theta \\ \sin\theta \end{bmatrix}

\sum_{i=1}^{2} u_i v_i = 1 \cdot A\cos\theta + 0 \cdot A\sin\theta = A\cos\theta
Note that this sum equals A cos Īø, so it gives a measure of the angle
between the vectors.
Now we generalize this idea!
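A small NumPy check of the R^2 picture above (the angle and length here are arbitrary choices):

```python
import numpy as np

u = np.array([1.0, 0.0])
theta = np.pi / 3   # 60 degrees, an arbitrary choice
A = 2.5             # arbitrary length for v
v = A * np.array([np.cos(theta), np.sin(theta)])

# The inner product recovers A*cos(theta)
ip = np.dot(u, v)
print(np.isclose(ip, A * np.cos(theta)))   # True

# Recover the angle itself by normalizing out the lengths
recovered = np.arccos(ip / (np.linalg.norm(u) * np.linalg.norm(v)))
print(np.isclose(recovered, theta))        # True
```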
11/45
Inner Product Between Vectors :
Define the inner product between two complex vectors in CN by:
\langle u, v \rangle = \sum_{i=1}^{N} u_i v_i^*

Properties of Inner Products:

1. Impact of Scalar Multiplication:
\langle \alpha u, v \rangle = \alpha \langle u, v \rangle, \qquad \langle u, \beta v \rangle = \beta^* \langle u, v \rangle

2. Impact of Vector Addition:
\langle u + w, v \rangle = \langle u, v \rangle + \langle w, v \rangle, \qquad \langle u, v + z \rangle = \langle u, v \rangle + \langle u, z \rangle

3. Linking Inner Product to Norm:
\|v\|_2^2 = \langle v, v \rangle

4. Schwarz Inequality:
|\langle u, v \rangle| \le \|u\|_2\,\|v\|_2

5. Inner Product and Angle: (Look back on previous page!)
\langle u, v \rangle = \|u\|_2\,\|v\|_2 \cos(\theta)
12/45
Inner Product, Angle, and Orthogonality :
\cos(\theta) = \frac{\langle u, v \rangle}{\|u\|_2\,\|v\|_2}
(i) This lies between ā€“1 and 1;
(ii) It measures directional alikeness of u and v
= +1 when u and v point in the same direction
= 0 when u and v are at a ā€œright angleā€
= ā€“1 when u and v point in opposite directions
Two vectors u and v are said to be orthogonal when <u,v> = 0
If in addition, they each have unit length they are orthonormal
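A two-line NumPy check of orthonormality (the vectors here are a made-up example):

```python
import numpy as np

u = np.array([1.0, 1.0]) / np.sqrt(2)    # unit length
v = np.array([1.0, -1.0]) / np.sqrt(2)   # unit length

# <u, v> = 0  ->  orthogonal (np.vdot conjugates its first argument)
print(np.isclose(np.vdot(v, u), 0.0))    # True
# unit norms as well  ->  orthonormal
print(np.isclose(np.linalg.norm(u), 1.0) and
      np.isclose(np.linalg.norm(v), 1.0))  # True
```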
13/45
Can we find a set of ā€œprototypeā€ vectors {v1, v2, ā€¦, vM} from
which we can build all other vectors in some given vector space V
by using linear combinations of the vi?
Same ā€œIngredientsā€ā€¦ just different amounts of them!!!
v = \sum_{k=1}^{M} \alpha_k v_k, \qquad u = \sum_{k=1}^{M} \beta_k v_k
What we want to be able to do is get any vector just by changing the
amountsā€¦ To do this requires that the set of ā€œprototypeā€
vectors {v1, v2, ā€¦, vM} satisfy certain conditions.
Weā€™d also like to have the smallest number of members in the
set of ā€œprototypeā€ vectors.
Building Vectors From Other Vectors
14/45
Span of a Set of Vectors: A set of vectors {v1, v2, ā€¦, vM} is said to
span the vector space V if it is possible to write each vector v in V as
a linear combination of vectors from the set:
v = \sum_{k=1}^{M} \alpha_k v_k
This property establishes if there are enough vectors in the
proposed prototype set to build all possible vectors in V.
It is clear that:
1. We need at least N vectors to
span CN or RN but not just any N
vectors.
2. Any set of N mutually orthogonal
vectors spans CN or RN (a set of
vectors is mutually orthogonal if all
pairs are orthogonal).
(Examples in R2: two non-collinear vectors span R2; a set of collinear
vectors does not span R2.)
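A numerical way to test the span condition (illustrative vectors; the rank test is a standard linear-algebra check, not from the slides): N vectors span R^N exactly when the matrix having them as columns has rank N.

```python
import numpy as np

# Two candidate "prototype" sets in R^2
spanning = np.column_stack([[1.0, 0.0], [1.0, 1.0]])      # non-collinear
non_spanning = np.column_stack([[1.0, 2.0], [2.0, 4.0]])  # collinear

print(np.linalg.matrix_rank(spanning))      # 2 -> spans R^2
print(np.linalg.matrix_rank(non_spanning))  # 1 -> does not span R^2
```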
15/45
Linear Independence: A set of vectors {v1, v2, ā€¦, vM} is said to
be linearly independent if none of the vectors in it can be written as
a linear combination of the others.
If a set of vectors is linearly dependent then there is ā€œredundancyā€
in the setā€¦it has more vectors than needed to be a ā€œprototypeā€ set!
For example, say that we have a set of four vectors {v1, v2, v3, v4}
and letā€™s say that we know that we can build v2 from v1 and v3ā€¦
then every vector we can build from {v1, v2, v3, v4} can also be built
from only {v1, v3, v4}.
It is clear that:
1. In CN or RN we can have no more than N linearly independent
vectors.
2. Any set of nonzero mutually orthogonal vectors is linearly
independent (a set of vectors is mutually orthogonal if all pairs
are orthogonal).
(Examples in R2: two non-collinear vectors are linearly independent;
collinear vectors are not.)
16/45
Basis of a Vector Space: A basis of a vector space is a set of
linearly independent vectors that span the space.
ā€¢ ā€œSpanā€ says there are enough vectors to build everything
ā€¢ ā€œLinear Indepā€ says there are no more vectors than needed
Orthonormal (ON) Basis: A basis whose vectors are orthonormal to
each other (all pairs of basis vectors are orthogonal and each basis
vector has unit norm).
Fact: Any set of N linearly independent vectors in CN (RN) is a
basis of CN (RN).
Dimension of a Vector Space: The number of vectors in any
basis for a vector space is said to be the dimension of the space.
Thus, CN and RN each have dimension of N.
17/45
Fact: For a given basis {v1, v2, ā€¦, vN}, the expansion of a vector v
in V is unique. That is, for each v there is only one, unique set of
coefficients {Ī±1, Ī±2, ā€¦ , Ī±N} such that
v = \sum_{k=1}^{N} \alpha_k v_k
In other words, this ā€œexpansionā€ or ā€œdecompositionā€ is unique.
Thus, for a given basis we can make a 1-to-1 correspondence
between vector v and the coefficients {Ī±1, Ī±2, ā€¦ , Ī±N}.
We can write the coefficients as a vector, too: \alpha = [\alpha_1 \; \cdots \; \alpha_N]^T

v = \begin{bmatrix} v_1 \\ \vdots \\ v_N \end{bmatrix} \;\xleftrightarrow{\;1\text{-to-}1\;}\; \alpha = \begin{bmatrix} \alpha_1 \\ \vdots \\ \alpha_N \end{bmatrix}
Expansion can be viewed as a mapping (or
transformation) from vector v to vector Ī±.
We can view this transform as taking us
from the original vector space into a new
vector space made from the coefficient
vectors of all the original vectors.
Expansion and Transformation
18/45
Fact: For any given vector space there are an infinite number of
possible basis sets.
The coefficients with respect to any of them provide complete
information about a vectorā€¦
some bases provide more insight into the vector and are therefore
more useful for certain signal processing tasks than others.
Often the key to solving a signal processing problem lies in finding
the correct basis to use for expandingā€¦ this is equivalent to finding
the right transform. See discussion coming next linking DFT to
these ideas!!!!
19/45
DFT from Basis Viewpoint:
If we have a discrete-time signal x[n] for n = 0, 1, ā€¦ N-1
Define vector: x = [x[0] \; x[1] \; \cdots \; x[N-1]]^T

Define an orthogonal basis from the exponentials used in the IDFT:

d_k = \begin{bmatrix} 1 \\ e^{j2\pi k/N} \\ e^{j2\pi 2k/N} \\ \vdots \\ e^{j2\pi (N-1)k/N} \end{bmatrix}, \qquad k = 0, 1, \ldots, N-1

e.g.,

d_0 = \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}, \quad d_1 = \begin{bmatrix} 1 \\ e^{j2\pi(1)(1)/N} \\ \vdots \\ e^{j2\pi(N-1)(1)/N} \end{bmatrix}, \quad d_2 = \begin{bmatrix} 1 \\ e^{j2\pi(1)(2)/N} \\ \vdots \\ e^{j2\pi(N-1)(2)/N} \end{bmatrix}, \;\ldots,\; d_{N-1} = \begin{bmatrix} 1 \\ e^{j2\pi(1)(N-1)/N} \\ \vdots \\ e^{j2\pi(N-1)(N-1)/N} \end{bmatrix}

Then the IDFT equation can be viewed as an expansion of the
signal vector x in terms of this complex sinusoid basis:

x = \sum_{k=0}^{N-1} \underbrace{\tfrac{1}{N} X[k]}_{\alpha_k \;=\; k\text{th coefficient}} d_k

coefficient vector: \alpha = \tfrac{1}{N}\,[X[0] \; X[1] \; \cdots \; X[N-1]]^T
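The basis view of the IDFT can be verified directly with NumPy (a short sketch; `np.fft.fft` supplies the DFT coefficients X[k]):

```python
import numpy as np

# Rebuild x as the basis expansion x = sum_k (X[k]/N) * d_k
N = 8
rng = np.random.default_rng(1)
x = rng.standard_normal(N)
X = np.fft.fft(x)                  # DFT coefficients X[k]

n = np.arange(N)
d = [np.exp(1j * 2 * np.pi * k * n / N) for k in range(N)]  # basis vectors d_k

x_rebuilt = sum((X[k] / N) * d[k] for k in range(N))
print(np.allclose(x_rebuilt, x))   # True
```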
20/45
Whatā€™s So Good About an ON Basis?: Given any basis
{v1, v2, ā€¦, vN} we can write any v in V as
v = \sum_{k=1}^{N} \alpha_k v_k
Given the vector v how do we find the Ī±ā€™s?
ā€¢ In general ā€“ hard! But for ON basis ā€“ easy!!
If {v1, v2, ā€¦, vN} is an ON basis then
\langle v, v_i \rangle = \Big\langle \sum_{j=1}^{N} \alpha_j v_j, \; v_i \Big\rangle = \sum_{j=1}^{N} \alpha_j \langle v_j, v_i \rangle = \sum_{j=1}^{N} \alpha_j \,\delta[j-i] = \alpha_i

\alpha_i = \langle v, v_i \rangle
ith coefficient = inner product with ith ON basis vector
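A sketch of this "coefficient = inner product" shortcut in NumPy (the rotated basis and the coefficients 3, āˆ’2 are made-up illustration values):

```python
import numpy as np

# An ON basis for R^2: the standard basis rotated by an arbitrary angle
t = 0.7
v1 = np.array([np.cos(t), np.sin(t)])
v2 = np.array([-np.sin(t), np.cos(t)])

v = 3.0 * v1 - 2.0 * v2      # build v with known coefficients (3, -2)

# For an ON basis, each coefficient is just an inner product
a1, a2 = np.dot(v, v1), np.dot(v, v2)
print(np.isclose(a1, 3.0), np.isclose(a2, -2.0))   # True True
```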
Usefulness of an ON Basis
21/45
Another Good Thing About an ON Basis: They preserve inner
products and normsā€¦ (called ā€œisometricā€):
If {v1, v2, ā€¦, vN} is an ON basis and u and v are vectors
expanded as
Thenā€¦.
1. < v ,u > = < Ī± , Ī² > (Preserves Inner Prod.)
2. ||v||2 = ||Ī±||2 and ||u||2 = ||Ī²||2 (Preserves Norms)
v = \sum_{k=1}^{N} \alpha_k v_k, \qquad u = \sum_{k=1}^{N} \beta_k v_k
Soā€¦ using an ON basis provides:
ā€¢ Easy computation via inner products
ā€¢ Preservation of geometry (closeness, size, orientation, etc.)
22/45
Example: DFT Coefficients as Inner Products:
Recall: N-pt. IDFT is an expansion of the signal vector in terms of
N Orthogonal vectors. Thus
X[k] = \langle x, d_k \rangle = \sum_{n=0}^{N-1} x[n]\, d_k^*[n] = \sum_{n=0}^{N-1} x[n]\, e^{-j2\pi kn/N}
See ā€œreading notesā€ for some details about normalization issues in this case
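This inner-product view of a DFT coefficient checks out numerically (a quick sketch; one coefficient index is picked arbitrarily):

```python
import numpy as np

N = 8
rng = np.random.default_rng(2)
x = rng.standard_normal(N)

n = np.arange(N)
k = 3                                  # pick one coefficient to check
d_k = np.exp(1j * 2 * np.pi * k * n / N)

# X[k] = <x, d_k> = sum_n x[n] * conj(d_k[n])
X_k = np.sum(x * np.conj(d_k))
print(np.isclose(X_k, np.fft.fft(x)[k]))   # True
```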
23/45
Matrix: An array of (real or complex) numbers organized in
rows and columns. Here is a 3x4 example:

A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \\ a_{31} & a_{32} & a_{33} & a_{34} \end{bmatrix}

Weā€™ll sometimes view a matrix as being built from its columns;
the 3x4 example above could be written as:

A = [\, a_1 \;|\; a_2 \;|\; a_3 \;|\; a_4 \,], \qquad a_k = [a_{1k} \; a_{2k} \; a_{3k}]^T

Weā€™ll take two views of a matrix:
1. ā€œStorageā€ for a bunch of related numbers (e.g., Cov. Matrix)
2. A transform (or mapping, or operator) acting on a vector
(e.g., DFT, observation matrix, etcā€¦. as weā€™ll see)
Matrices
24/45
Matrix as Transform: Our main view of matrices will be as
ā€œoperatorsā€ that transform one vector into another vector.
Consider the 3x4 example matrix above. We could use that matrix
to transform the 4-dimensional vector v into a 3-dimensional
vector u:
u = Av = [\, a_1 \;|\; a_2 \;|\; a_3 \;|\; a_4 \,] \begin{bmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{bmatrix} = v_1 a_1 + v_2 a_2 + v_3 a_3 + v_4 a_4
Clearly u is built from the columns
of matrix A; therefore, it must lie
in the span of the set of vectors that
make up the columns of A.
Note that the columns of A are
3-dimensional vectorsā€¦ so is u.
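The "u is built from the columns of A" view can be demonstrated in a few lines of NumPy (random 3x4 example, mirroring the slide's dimensions):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 4))    # a 3x4 matrix, as in the example
v = rng.standard_normal(4)

u = A @ v
# The same u built explicitly as a weighted sum of the columns of A
u_cols = sum(v[k] * A[:, k] for k in range(4))
print(np.allclose(u, u_cols))      # True
```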
25/45
Transforming a Vector Space: If we apply A to all the vectors in
a vector space V we get a collection of vectors that are in a
new space called U.
In the 3x4 example matrix above we transformed a 4-dimensional
vector space V into a 3-dimensional vector space U
A 2x3 real matrix A would transform R3 into R2 :
Facts: If the mapping matrix A is square and its columns are
linearly independent then
(i) the space that vectors in V get mapped to (i.e., U) has the
same dimension as V (due to ā€œsquareā€ part)
(ii) this mapping is reversible (i.e., invertible); there is an inverse
matrix A-1 such that v = A-1u (due to ā€œsquareā€ & ā€œLIā€ part)
26/45
(Figure: A maps x1 to y1 and x2 to y2; A^{-1} maps y1 and y2 back to
x1 and x2.)
Transform = Matrix Ɨ Vector: a VERY useful viewpoint for all
sorts of signal processing scenarios. In general we can view many
linear transforms (e.g., DFT, etc.) in terms of some invertible
matrix A operating on a signal vector x to give another vector y:
y_i = A x_i, \qquad x_i = A^{-1} y_i
We can think of A and A-1 as
mapping back and forth
between two vector spaces
27/45
Basis Matrix & Coefficient Vector:
Suppose we have a basis {v1, v2, ā€¦, vN} for a vector space V.
Then a vector v in space V can be written as:
Another view of this:
v = \sum_{k=1}^{N} \alpha_k v_k
v = \underbrace{[\, v_1 \;|\; v_2 \;|\; \cdots \;|\; v_N \,]}_{N \times N \text{ matrix } V} \begin{bmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_N \end{bmatrix} = V\alpha
The ā€œBasis Matrixā€ V transforms
the coefficient vector into the
original vector v
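A short NumPy sketch of the basis-matrix mapping (the basis columns and coefficients are made-up values):

```python
import numpy as np

# A (made-up) basis for R^3 stored as the columns of V
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
alpha = np.array([2.0, -1.0, 3.0])

v = V @ alpha                         # v = V alpha builds the vector
alpha_back = np.linalg.solve(V, v)    # applying V^{-1} recovers alpha
print(np.allclose(alpha_back, alpha))  # True
```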
Matrix View & Basis View
28/45
Three Views of Basis Matrix & Coefficient Vector:
View #1
Vector v is a linear combination of the columns of basis matrix V.
View #2
Matrix V maps vector Ī± into vector v.
View #3
There is a matrix, V-1, that maps vector v into vector Ī±.
v = \sum_{k=1}^{N} \alpha_k v_k \qquad \text{(View \#1)}

v = V\alpha \qquad \text{(View \#2)}

\alpha = V^{-1} v \qquad \text{(View \#3)}
Aside: If a matrix A is square and has linearly independent columns, then A
is ā€œinvertibleā€ and A-1 exists such that A A-1 = A-1A = I where I is the identity
matrix having 1ā€™s on the diagonal and zeros elsewhere.
Now we have a way to go back-and-forth between vector v and its
coefficient vector Ī±.
29/45
Basis Matrix for ON Basis: We get a special structure!!!
Result: For an ON basis matrix Vā€¦ V-1 = VH
(the superscript H denotes ā€œhermitian transposeā€, which consists
of transposing the matrix and conjugating the elements)
To see this:
V^H V = \begin{bmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \cdots & \langle v_N, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_N, v_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_1, v_N \rangle & \langle v_2, v_N \rangle & \cdots & \langle v_N, v_N \rangle \end{bmatrix} = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix} = I
Inner products are 0 or 1
because this is an ON basis
30/45
A unitary matrix is a complex matrix A whose inverse is A^{-1} = A^H.
For the real-valued matrix case we get a special case of ā€œunitaryā€:
the idea of ā€œunitary matrixā€ becomes ā€œorthogonal matrixā€,
for which A^{-1} = A^T.
Two Properties of Unitary Matrices: Let U be a unitary matrix
and let y1 = Ux1 and y2 = Ux2
1. They preserve norms: ||yi|| = ||xi||.
2. They preserve inner products: < y1, y2 > = < x1, x2 >
That is the ā€œgeometryā€ of the old space is preserved by the unitary
matrix as it transforms into the new space.
(These are the same as the preservation properties of ON basis.)
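Both preservation properties can be checked numerically; a sketch that manufactures a unitary matrix via QR factorization (a standard construction, not from the slides):

```python
import numpy as np

# Build a unitary U from the QR factorization of a random complex matrix
rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(M)            # U satisfies U^H U = I

x1 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
x2 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y1, y2 = U @ x1, U @ x2

# 1. norms preserved, 2. inner products preserved
print(np.isclose(np.linalg.norm(y1), np.linalg.norm(x1)))  # True
print(np.isclose(np.vdot(y2, y1), np.vdot(x2, x1)))        # True
```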
Unitary and Orthogonal Matrices
31/45
DFT from Unitary Matrix Viewpoint:
Consider a discrete-time signal x[n] for n = 0, 1, ā€¦ N-1.
Weā€™ve already seen the DFT in a basis viewpoint:
Now we can view the DFT as a transform from the Unitary matrix
viewpoint:
x = \sum_{k=0}^{N-1} \underbrace{\tfrac{1}{N} X[k]}_{\alpha_k} d_k \qquad \text{(IDFT)}

Collect the basis vectors into the DFT matrix D = [\, d_0 \;|\; d_1 \;|\; \cdots \;|\; d_{N-1} \,]:

D = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ 1 & e^{j2\pi(1)(1)/N} & \cdots & e^{j2\pi(1)(N-1)/N} \\ \vdots & \vdots & & \vdots \\ 1 & e^{j2\pi(N-1)(1)/N} & \cdots & e^{j2\pi(N-1)(N-1)/N} \end{bmatrix}

Then, with \tilde{x} = [X[0] \; X[1] \; \cdots \; X[N-1]]^T:

\tilde{x} = D^H x \quad \text{(DFT)}, \qquad x = \tfrac{1}{N}\, D\, \tilde{x} \quad \text{(IDFT)}

(Actually D is not unitary, but N^{-1/2}D is unitaryā€¦ see reading notes)
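A NumPy sketch of the matrix view, including the N^{-1/2}D normalization note:

```python
import numpy as np

N = 8
n = np.arange(N)
D = np.exp(1j * 2 * np.pi * np.outer(n, n) / N)  # D[n, k] = e^{j 2 pi k n / N}

rng = np.random.default_rng(5)
x = rng.standard_normal(N)

X = D.conj().T @ x                 # DFT as a matrix-vector product: D^H x
print(np.allclose(X, np.fft.fft(x)))           # True
print(np.allclose(x, (D @ X) / N))             # IDFT: x = (1/N) D X -> True

# D itself is not unitary, but N^{-1/2} D is:
U = D / np.sqrt(N)
print(np.allclose(U.conj().T @ U, np.eye(N)))  # True
```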
32/45
Geometry Preservation of Unitary Matrix Mappings
Recallā€¦ unitary matrices map in such a way that the sizes of
vectors and the orientation between vectors is not changed.
(Figure: A and A^{-1} map x1 ā†” y1 and x2 ā†” y2 with lengths and angles
preserved.)

Unitary mappings just ā€œrigidly rotateā€ the space.
33/45
(Figure: a non-unitary A maps x1 ā†” y1 and x2 ā†” y2 while stretching and
skewing the space.)
Effect of Non-Unitary Matrix Mappings
34/45
More on Matrices as Transforms
y = Ax
(mƗ1) = (mƗn)(nƗ1)
Weā€™ll limit ourselves here to real-valued vectors and matrices
A maps any vector x in Rn
into some vector y in Rm
Rn Rm
x y
A
Range(A): ā€œRange Space of Aā€ =
set of all vectors in Rm that can be
reached by mapping
vector y = weighted sum of columns of A
ā‡’ may only be able to reach certain yā€™s
Mostly interested in two cases:
1. ā€œTall Matrixā€ m > n
2. ā€œSquare Matrixā€ m = n
35/45
Range of a ā€œTall Matrixā€ (m > n)
Rn Rm
x y
A
The range(A) āŠ‚ Rm
ā€œProofā€: Since y is ā€œbuiltā€ from the n columns of A there are
not enough to form a basis for Rm (they donā€™t span Rm)
Range of a ā€œSquare Matrixā€ (m = n)
If the columns of A are linearly indepā€¦.The range(A) = Rm
ā€¦because the columns form a basis for Rm
Otherwiseā€¦.The range(A) āŠ‚ Rm
ā€¦because the columns donā€™t span Rm
36/45
Rank of a Matrix: rank(A) = largest # of linearly independent
columns (or rows) of matrix A
For an mƗn matrix we have that rank(A) ā‰¤ min(m,n)
An mƗn matrix A has ā€œfull rankā€ when rank(A) = min(m,n)
A = \begin{bmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 2 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}
Example: This matrix has rank of 3 because the 4th column can be
written as a combination of the first 3 columns
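A quick NumPy check of this example (assuming the slide's matrix reconstructs as below, with the 4th column equal to col1 + 2Ā·col2 + col3):

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 2.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0]])

print(np.linalg.matrix_rank(A))   # 3
# The 4th column is a combination of the first three
print(np.allclose(A[:, 3], A[:, 0] + 2 * A[:, 1] + A[:, 2]))  # True
```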
37/45
Characterizing ā€œTall Matrixā€ Mappings
We are interested in answering: Given a vector y, what vector x
gets mapped into it via matrix A?
ā€œTall Matrixā€ (m > n) Case
If y does not lie in range(A), then there is No Solution
If y lies in range(A), then there is a solution (but not
necessarily just one unique solution)
y = Ax
ā€¢ y āˆ‰ range(A): No Solution
ā€¢ y āˆˆ range(A) and A full rank: One Solution
ā€¢ y āˆˆ range(A) and A not full rank: Many Solutions
38/45
Full-Rank ā€œTall Matrixā€ (m > n) Case
(Figure: A maps x āˆˆ Rn to y = Ax inside range(A) āŠ‚ Rm.)
For a given yāˆˆrange(A)ā€¦
there is only one x that maps to it.
By looking at y we can determine which x gave rise to it
This is because the columns of A are linearly independent
and we know from our studies of vector spaces that the
coefficient vector of y is uniqueā€¦ x is that coefficient
vector
39/45
Non-Full-Rank ā€œTall Matrixā€ (m > n) Case
(Figure: two different vectors x1 and x2 in Rn both map under A to the
same y in range(A) āŠ‚ Rm.)

For a given yāˆˆrange(A) there may be more than one x that
maps to it.
By looking at y we can not determine which x gave rise to it
This is because the columns of A are linearly dependent
and that redundancy provides several ways to combine
them to create y
40/45
Characterizing ā€œSquare Matrixā€ Mappings
Q: Given any yāˆˆRn can we find an xāˆˆRn that maps to it?
A: Not always!!!
y = Ax (Careful!!! This is quite a different flow diagram here!!!)
ā€¢ A full rank: One Solution (every y is in range(A))
ā€¢ A not full rank and y āˆˆ range(A): Many Solutions
ā€¢ A not full rank and y āˆ‰ range(A): No Solution
When a square A is full rank then its range covers the
complete new spaceā€¦ then, y must be in range(A) and
because the columns of A are a basis there is a way to
build y
41/45
A Full-Rank Square Matrix is Invertible
A square matrix that has full rank is said to beā€¦.
ā€œnonsingularā€, ā€œinvertibleā€
Then we can find the x that mapped to y using x = A-1y
(Venn diagram: the set of positive definite matrices lies inside the
set of invertible matrices.)
Several ways to check if nƗn A is invertible:
1. A is invertible if and only if (iff) its columns (or rows) are
linearly independent (i.e., if it is full rank)
2. A is invertible iff det(A) ā‰  0
3. A is invertible if (but not only if) it is ā€œpositive definiteā€ (see
later)
4. A is invertible iff all its eigenvalues are nonzero
( becauseā€¦ det(A) = product of eigenvalues )
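Checks 1, 2, and 4 above are easy to run in NumPy (the 2x2 matrix here is a made-up positive definite example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # symmetric, positive definite example

print(np.linalg.matrix_rank(A) == 2)                 # full rank -> True
print(not np.isclose(np.linalg.det(A), 0.0))         # det != 0 -> True
print(np.all(np.abs(np.linalg.eigvals(A)) > 1e-12))  # nonzero e-values -> True
# det(A) = product of eigenvalues
print(np.isclose(np.linalg.det(A), np.prod(np.linalg.eigvals(A))))  # True
```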
42/45
Eigenvalues and Eigenvectors of Square Matrices
If matrix A is nƗn, then A maps Rn ā†’ Rn
Q: For a given nƗn matrix A, which vectors get mapped into
being almost themselves???
More preciselyā€¦ which vectors get mapped to a scalar multiple
of themselves???
Even more preciselyā€¦ which vectors v satisfy the following:

A v = \lambda v \qquad \text{(Input: } v \text{, Output: } Av\text{)}

These vectors are ā€œspecialā€ and are called the eigenvectors of A.
The scalar Ī» is that e-vectorā€™s corresponding eigenvalue.
43/45
ā€œEigen-Facts for Symmetric Matricesā€
ā€¢ If nƗn real matrix A is symmetric, then
ā€“ e-vectors corresponding to distinct e-values are orthogonal
(and can be normalized to be orthonormal)
ā€“ e-values are real valued
ā€“ can decompose A as

A = V \Lambda V^T, \qquad V = [\, v_1 \;|\; v_2 \;|\; \cdots \;|\; v_n \,], \quad VV^T = I, \quad \Lambda = \mathrm{diag}\{\lambda_1, \lambda_2, \ldots, \lambda_n\}

ā€¢ If, further, A is pos. def. (semi-def.), then
ā€“ e-values are positive (non-negative)
ā€“ rank(A) = # of non-zero e-values
ā€¢ Pos. Def. ā‡’ Full Rank (and therefore invertible)
ā€¢ Pos. Semi-Def. ā‡’ Not Full Rank (and therefore not invertible)
ā€“ When A is P.D., then we can write

A^{-1} = V \Lambda^{-1} V^T, \qquad \Lambda^{-1} = \mathrm{diag}\{1/\lambda_1, 1/\lambda_2, \ldots, 1/\lambda_n\}

For P.D. A, A^{-1} has the same e-vectors and has reciprocal e-values.
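These eigen-facts can be verified with NumPy's symmetric eigensolver (a sketch on a made-up positive definite matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])                 # symmetric positive definite

lam, V = np.linalg.eigh(A)                 # eigh is for symmetric matrices
Lam = np.diag(lam)

print(np.allclose(V @ Lam @ V.T, A))       # A = V Lambda V^T -> True
print(np.allclose(V @ V.T, np.eye(2)))     # V V^T = I -> True
print(np.all(lam > 0))                     # P.D. -> positive e-values -> True
# A^{-1} = V Lambda^{-1} V^T (reciprocal e-values, same e-vectors)
print(np.allclose(np.linalg.inv(A), V @ np.diag(1 / lam) @ V.T))  # True
```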
44/45
Weā€™ll limit our discussion to real-valued matrices and vectors
Quadratic Forms and Positive-(Semi)Definite Matrices
Quadratic Form = Matrix form for a 2nd-order multivariate
polynomial.
The quadratic form of matrix A is:

Q_A(x) = x^T A x \qquad (1\times n)(n\times n)(n\times 1) = \text{scalar}

where x is variable and A is fixed.

Example:

x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}, \qquad A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}

Q_A(x_1, x_2) = x^T A x = \sum_{i=1}^{2}\sum_{j=1}^{2} a_{ij}\, x_i x_j = a_{11} x_1^2 + a_{22} x_2^2 + (a_{12} + a_{21})\, x_1 x_2
Other Matrix Issues
45/45
ā€¢ Values of the elements of matrix A determine the characteristics
of the quadratic form QA(x)
ā€“ If QA(x) ā‰„ 0 āˆ€x ā‰  0ā€¦ then say that QA(x) is ā€œpositive semi-definiteā€
ā€“ If QA(x) > 0 āˆ€x ā‰  0ā€¦ then say that QA(x) is ā€œpositive definiteā€
ā€“ Otherwise say that QA(x) is ā€œnon-definiteā€
ā€¢ These terms carry over to the matrix that defines the Quad Form
ā€“ If QA(x) ā‰„ 0 āˆ€x ā‰  0ā€¦ then say that A is ā€œpositive semi-definiteā€
ā€“ If QA(x) > 0 āˆ€x ā‰  0ā€¦ then say that A is ā€œpositive definiteā€
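A small NumPy sketch tying the two definitions together: for a symmetric matrix, positive eigenvalues and a positive quadratic form go hand in hand (the example matrix is made up; the random-sampling check is illustrative, not a proof).

```python
import numpy as np

def quad_form(A, x):
    """Q_A(x) = x^T A x (a scalar)."""
    return float(x @ A @ x)

A_pd = np.array([[2.0, 0.5],
                 [0.5, 1.0]])   # a made-up positive definite matrix

# Symmetric A is positive definite iff all eigenvalues are positive
print(np.all(np.linalg.eigvalsh(A_pd) > 0))   # True

# Equivalently, Q_A(x) > 0 for (randomly sampled) nonzero x
rng = np.random.default_rng(6)
print(all(quad_form(A_pd, x) > 0 for x in rng.standard_normal((100, 2))))  # True
```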

OS-operating systems- ch04 (Threads) ...
Ā 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
Ā 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized Group
Ā 
Model Call Girl in Bikash Puri Delhi reach out to us at šŸ”9953056974šŸ”
Model Call Girl in Bikash Puri  Delhi reach out to us at šŸ”9953056974šŸ”Model Call Girl in Bikash Puri  Delhi reach out to us at šŸ”9953056974šŸ”
Model Call Girl in Bikash Puri Delhi reach out to us at šŸ”9953056974šŸ”
Ā 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
Ā 
ā€œOh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
ā€œOh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...ā€œOh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
ā€œOh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
Ā 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Ā 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
Ā 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
Ā 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
Ā 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Ā 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
Ā 

Ch_12 Review of Matrices and Vectors (PPT).pdf

  • 1. 1/45 Ch. 12 Review of Matrices and Vectors
  • 2. 2/45 Vectors & Vector Spaces. Definition of Vector: a collection of complex or real numbers, generally put in a column: $\mathbf{v} = [v_1 \;\cdots\; v_N]^T$. Definition of Vector Addition: add element-by-element: $\mathbf{a} + \mathbf{b} = [a_1 + b_1 \;\cdots\; a_N + b_N]^T$.
  • 3. 3/45 Definition of Scalar: a real or complex number. If the vectors of interest are complex-valued, the set of scalars is taken to be the complex numbers; if the vectors of interest are real-valued, the set of scalars is taken to be the real numbers. Multiplying a Vector by a Scalar: $\alpha\mathbf{a} = [\alpha a_1 \;\cdots\; \alpha a_N]^T$. This changes the vector's length if $|\alpha| \neq 1$ and ā€œreversesā€ its direction if $\alpha < 0$.
  • 4. 4/45 Arithmetic Properties of Vectors: vector addition and scalar multiplication exhibit the following properties, much like the real numbers do. Let x, y, and z be vectors of the same dimension and let α and β be scalars; then the following properties hold: 1. Commutativity: $\mathbf{x} + \mathbf{y} = \mathbf{y} + \mathbf{x}$ and $\alpha\mathbf{x} = \mathbf{x}\alpha$. 2. Associativity: $(\mathbf{x} + \mathbf{y}) + \mathbf{z} = \mathbf{x} + (\mathbf{y} + \mathbf{z})$ and $\alpha(\beta\mathbf{x}) = (\alpha\beta)\mathbf{x}$. 3. Distributivity: $\alpha(\mathbf{x} + \mathbf{y}) = \alpha\mathbf{x} + \alpha\mathbf{y}$ and $(\alpha + \beta)\mathbf{x} = \alpha\mathbf{x} + \beta\mathbf{x}$. 4. Scalar Unity & Scalar Zero: $1\mathbf{x} = \mathbf{x}$ and $0\mathbf{x} = \mathbf{0}$, where $\mathbf{0}$ is the zero vector of all zeros.
  • 5. 5/45 Definition of a Vector Space: a set V of N-dimensional vectors (with a corresponding set of scalars) such that the set of vectors is (i) ā€œclosedā€ under vector addition and (ii) ā€œclosedā€ under scalar multiplication. In other words: addition of vectors gives another vector in the set, and multiplying a vector by a scalar gives another vector in the set. Note: this means that ANY ā€œlinear combinationā€ of vectors in the space results in a vector in the spaceā€¦ If v1, v2, and v3 are all vectors in a given vector space V, then $\mathbf{v} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \alpha_3\mathbf{v}_3 = \sum_{i=1}^{3}\alpha_i\mathbf{v}_i$ is also in the vector space V.
  • 6. 6/45 Axioms of Vector Space: if V is a set of vectors satisfying the above definition of a vector space, then it satisfies the following axioms: 1. Commutativity (see above). 2. Associativity (see above). 3. Distributivity (see above). 4. Unity and Zero Scalar (see above). 5. Existence of an Additive Identity: any vector space V must have a zero vector. 6. Existence of a Negative Vector: for every vector v in V, its negative must also be in V. Soā€¦ a vector space is nothing more than a set of vectors with an ā€œarithmetic structureā€.
  • 7. 7/45 Def. of Subspace: given a vector space V, a subset of vectors in V that is itself closed under vector addition and scalar multiplication (using the same set of scalars) is called a subspace of V. Examples: 1. The set of vectors in R3 whose third component is zero (a copy of R2) is a subspace of R3. 2. Any plane in R3 that passes through the origin is a subspace. 3. Any line passing through the origin in R2 is a subspace of R2. 4. The set R2 is NOT a subspace of C2 because R2 isn't closed under complex scalars (a subspace must retain the original space's set of scalars).
  • 8. 8/45 Geometric Structure of Vector Space. Length of a Vector (Vector Norm): for any vector v in $C^N$ we define its length (or ā€œnormā€) to be $\|\mathbf{v}\|_2 = \sqrt{\sum_{i=1}^{N}|v_i|^2}$, so that $\|\mathbf{v}\|_2^2 = \sum_{i=1}^{N}|v_i|^2$. Properties of the Vector Norm: $\|\alpha\mathbf{v}\|_2 = |\alpha|\,\|\mathbf{v}\|_2$; $\|\mathbf{v}\|_2 < \infty \;\forall\, \mathbf{v} \in C^N$; $\|\mathbf{v}\|_2 = 0$ iff $\mathbf{v} = \mathbf{0}$; and the triangle inequality $\|\alpha\mathbf{v}_1 + \beta\mathbf{v}_2\|_2 \leq |\alpha|\,\|\mathbf{v}_1\|_2 + |\beta|\,\|\mathbf{v}_2\|_2$.
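The norm properties on this slide are easy to verify numerically. A minimal NumPy sketch (NumPy and the particular vector values are my choices for illustration, not part of the slides):

```python
import numpy as np

v = np.array([3.0 + 4.0j, 0.0, 1.0j])          # arbitrary vector in C^3
norm_v = np.sqrt(np.sum(np.abs(v) ** 2))       # ||v||_2 = sqrt(sum |v_i|^2)

# np.linalg.norm computes the same two-norm
assert np.isclose(norm_v, np.linalg.norm(v))

# scaling property: ||alpha v|| = |alpha| ||v||
alpha = -2.0 + 1.0j
assert np.isclose(np.linalg.norm(alpha * v), abs(alpha) * norm_v)

# triangle inequality: ||u + v|| <= ||u|| + ||v||
u = np.array([1.0, 2.0, 3.0])
assert np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v)
```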
  • 9. 9/45 Distance Between Vectors: the distance between two vectors in a vector space with the two-norm is defined by $d(\mathbf{v}_1, \mathbf{v}_2) = \|\mathbf{v}_1 - \mathbf{v}_2\|_2$. Note that $d(\mathbf{v}_1, \mathbf{v}_2) = 0$ iff $\mathbf{v}_1 = \mathbf{v}_2$. (Figure: vectors v1 and v2 with the difference vector v1 - v2 between their tips.)
  • 10. 10/45 Angle Between Vectors & Inner Product: motivate the idea in R2. Let $\mathbf{v} = A[\cos\theta \;\; \sin\theta]^T$ and $\mathbf{u} = [1 \;\; 0]^T$, where θ is the angle between them. Note that $\sum_{i=1}^{2} u_i v_i = 1 \cdot A\cos\theta + 0 \cdot A\sin\theta = A\cos\theta$. Clearly we see that this gives a measure of the angle between the vectors. Now we generalize this idea!
  • 11. 11/45 Inner Product Between Vectors: define the inner product between two complex vectors in $C^N$ by $\langle\mathbf{u}, \mathbf{v}\rangle = \sum_{i=1}^{N} u_i v_i^*$. Properties of Inner Products: 1. Impact of scalar multiplication: $\langle\alpha\mathbf{u}, \mathbf{v}\rangle = \alpha\langle\mathbf{u}, \mathbf{v}\rangle$ and $\langle\mathbf{u}, \beta\mathbf{v}\rangle = \beta^*\langle\mathbf{u}, \mathbf{v}\rangle$. 2. Impact of vector addition: $\langle\mathbf{u} + \mathbf{w}, \mathbf{v}\rangle = \langle\mathbf{u}, \mathbf{v}\rangle + \langle\mathbf{w}, \mathbf{v}\rangle$ and $\langle\mathbf{u}, \mathbf{v} + \mathbf{z}\rangle = \langle\mathbf{u}, \mathbf{v}\rangle + \langle\mathbf{u}, \mathbf{z}\rangle$. 3. Linking inner product to norm: $\|\mathbf{v}\|_2^2 = \langle\mathbf{v}, \mathbf{v}\rangle$. 4. Schwarz inequality: $|\langle\mathbf{u}, \mathbf{v}\rangle| \leq \|\mathbf{u}\|_2\,\|\mathbf{v}\|_2$. 5. Inner product and angle: $\langle\mathbf{u}, \mathbf{v}\rangle = \|\mathbf{u}\|_2\,\|\mathbf{v}\|_2\cos(\theta)$ (look back on the previous page!).
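These properties can be checked directly. A small sketch using the slide's convention (conjugate on the second argument); the helper `inner` and the test vectors are my own, chosen only to exercise the properties:

```python
import numpy as np

# inner product with the slide's convention: <u,v> = sum_i u_i * conj(v_i)
def inner(u, v):
    return np.sum(u * np.conj(v))

u = np.array([1.0 + 1.0j, 2.0, 0.5j])
v = np.array([0.0, 1.0 - 1.0j, 3.0])

# linking inner product to norm: <v,v> = ||v||^2
assert np.isclose(inner(v, v), np.linalg.norm(v) ** 2)

# Schwarz inequality: |<u,v>| <= ||u|| ||v||
assert np.abs(inner(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v)

# scalar rules: <a u, v> = a <u,v>  and  <u, b v> = conj(b) <u,v>
a, b = 2.0 - 1.0j, 0.5 + 3.0j
assert np.isclose(inner(a * u, v), a * inner(u, v))
assert np.isclose(inner(u, b * v), np.conj(b) * inner(u, v))
```

Note that NumPy's `np.vdot(a, b)` conjugates its first argument, so it matches the slide's $\langle\mathbf{u},\mathbf{v}\rangle$ only as `np.vdot(v, u)`; computing the sum explicitly avoids the confusion.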
  • 12. 12/45 Inner Product, Angle, and Orthogonality: $\cos(\theta) = \langle\mathbf{u}, \mathbf{v}\rangle / (\|\mathbf{u}\|_2\,\|\mathbf{v}\|_2)$. (i) This lies between ā€“1 and 1; (ii) it measures the directional alikeness of u and v: +1 when u and v point in the same direction, 0 when u and v are at a ā€œright angleā€, ā€“1 when u and v point in opposite directions. Two vectors u and v are said to be orthogonal when <u,v> = 0. If in addition they each have unit length, they are orthonormal.
  • 13. 13/45 Building Vectors From Other Vectors. Can we find a set of ā€œprototypeā€ vectors {v1, v2, ā€¦, vM} from which we can build all other vectors in some given vector space V by using linear combinations of the vi? That is, $\mathbf{v} = \sum_{k=1}^{M}\alpha_k\mathbf{v}_k$ and $\mathbf{u} = \sum_{k=1}^{M}\beta_k\mathbf{v}_k$: same ā€œingredientsā€ā€¦ just different amounts of them!!! What we want to be able to do is get any vector just by changing the amounts. To do this requires that the set of ā€œprototypeā€ vectors {v1, v2, ā€¦, vM} satisfy certain conditions. We'd also like to have the smallest number of members in the set of ā€œprototypeā€ vectors.
  • 14. 14/45 Span of a Set of Vectors: a set of vectors {v1, v2, ā€¦, vM} is said to span the vector space V if it is possible to write each vector v in V as a linear combination of vectors from the set: $\mathbf{v} = \sum_{k=1}^{M}\alpha_k\mathbf{v}_k$. This property establishes whether there are enough vectors in the proposed prototype set to build all possible vectors in V. It is clear that: 1. We need at least N vectors to span $C^N$ or $R^N$, but not just any N vectors will do. 2. Any set of N mutually orthogonal vectors spans $C^N$ or $R^N$ (a set of vectors is mutually orthogonal if all pairs are orthogonal). (Examples in R2: one set that spans R2 and one that does not.)
  • 15. 15/45 Linear Independence: a set of vectors {v1, v2, ā€¦, vM} is said to be linearly independent if none of the vectors in it can be written as a linear combination of the others. If a set of vectors is linearly dependent, then there is ā€œredundancyā€ in the set: it has more vectors than needed to be a ā€œprototypeā€ set! For example, say that we have a set of four vectors {v1, v2, v3, v4} and that we know we can build v2 from v1 and v3ā€¦ then every vector we can build from {v1, v2, v3, v4} can also be built from only {v1, v3, v4}. It is clear that: 1. In $C^N$ or $R^N$ we can have no more than N linearly independent vectors. 2. Any set of mutually orthogonal vectors is linearly independent (a set of vectors is mutually orthogonal if all pairs are orthogonal). (Examples in R2: one linearly independent set and one that is not.)
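A practical way to test linear independence (not on the slide, but a standard trick): stack the vectors as the columns of a matrix and compare its rank to the number of vectors. A sketch, with illustrative vectors of my choosing:

```python
import numpy as np

# Stack candidate vectors as columns; they are linearly independent
# exactly when the matrix rank equals the number of vectors.
def linearly_independent(*vectors):
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])

assert linearly_independent(v1, v2)          # two independent vectors in R^2
assert not linearly_independent(v1, 2 * v1)  # second vector is a multiple of the first
```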
  • 16. 16/45 Basis of a Vector Space: a basis of a vector space is a set of linearly independent vectors that span the space. ā€œSpanā€ says there are enough vectors to build everything; ā€œlinear independenceā€ says there are not more than needed. Orthonormal (ON) Basis: a basis whose vectors are orthonormal to each other (all pairs of basis vectors are orthogonal and each basis vector has unit norm). Fact: any set of N linearly independent vectors in $C^N$ ($R^N$) is a basis of $C^N$ ($R^N$). Dimension of a Vector Space: the number of vectors in any basis for a vector space is said to be the dimension of the space. Thus, $C^N$ and $R^N$ each have dimension N.
  • 17. 17/45 Expansion and Transformation. Fact: for a given basis {v1, v2, ā€¦, vN}, the expansion of a vector v in V is unique. That is, for each v there is only one, unique set of coefficients {α1, α2, ā€¦, αN} such that $\mathbf{v} = \sum_{k=1}^{N}\alpha_k\mathbf{v}_k$. In other words, this ā€œexpansionā€ or ā€œdecompositionā€ is unique. Thus, for a given basis we can make a 1-to-1 correspondence between the vector v and the coefficients {α1, α2, ā€¦, αN}. We can write the coefficients as a vector, too: $\boldsymbol{\alpha} = [\alpha_1 \;\cdots\; \alpha_N]^T$, with $\mathbf{v} = [v_1 \;\cdots\; v_N]^T$ in 1-to-1 correspondence with $\boldsymbol{\alpha}$. Expansion can be viewed as a mapping (or transformation) from vector v to vector α. We can view this transform as taking us from the original vector space into a new vector space made from the coefficient vectors of all the original vectors.
  • 18. 18/45 Fact: For any given vector space there are an infinite number of possible basis sets. The coefficients with respect to any of them provides complete information about a vectorā€¦ some of them provide more insight into the vector and are therefore more useful for certain signal processing tasks than others. Often the key to solving a signal processing problem lies in finding the correct basis to use for expandingā€¦ this is equivalent to finding the right transform. See discussion coming next linking DFT to these ideas!!!!
  • 19. 19/45 DFT from Basis Viewpoint: suppose we have a discrete-time signal x[n] for n = 0, 1, ā€¦, N-1. Define the vector $\mathbf{x} = [x[0] \;\; x[1] \;\cdots\; x[N-1]]^T$. Define an orthogonal basis from the complex exponentials used in the IDFT: $\mathbf{d}_k = [1 \;\; e^{j2\pi k/N} \;\; e^{j2\pi k\cdot 2/N} \;\cdots\; e^{j2\pi k(N-1)/N}]^T$ for k = 0, 1, ā€¦, N-1 (so $\mathbf{d}_0$ is the all-ones vector). Then the IDFT equation can be viewed as an expansion of the signal vector x in terms of this complex sinusoid basis: $\mathbf{x} = \sum_{k=0}^{N-1}\tfrac{1}{N}X[k]\,\mathbf{d}_k$, where $\tfrac{1}{N}X[k]$ is the kth coefficient and the coefficient vector is $\boldsymbol{\alpha} = \left[\tfrac{X[0]}{N} \;\; \tfrac{X[1]}{N} \;\cdots\; \tfrac{X[N-1]}{N}\right]^T$.
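This expansion view can be verified against NumPy's FFT, whose conventions match the slide (forward transform unnormalized, inverse carrying the 1/N). The test signal is arbitrary:

```python
import numpy as np

N = 8
n = np.arange(N)
x = np.cos(2 * np.pi * n / N) + 0.5 * n      # arbitrary test signal
X = np.fft.fft(x)                            # DFT coefficients X[k]

# basis vectors d_k[n] = exp(j 2 pi k n / N)
d = [np.exp(2j * np.pi * k * n / N) for k in range(N)]

# IDFT as a basis expansion: x = sum_k (X[k]/N) d_k
x_rebuilt = sum((X[k] / N) * d[k] for k in range(N))
assert np.allclose(x_rebuilt, x)
```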
  • 20. 20/45 Whatā€™s So Good About an ON Basis?: Given any basis {v1, v2, ā€¦, vN} we can write any v in V as āˆ‘ = Ī± = N k k k 1 v v Given the vector v how do we find the Ī±ā€™s? ā€¢ In general ā€“ hard! But for ON basis ā€“ easy!! If {v1, v2, ā€¦, vN} is an ON basis then i i j i j N j j i j N j j i Ī± = Ī± = āŽ„ āŽ¦ āŽ¤ āŽ¢ āŽ£ āŽ” Ī± = āˆ’ Ī“ = = āˆ‘ āˆ‘ ] [ 1 1 v , v v , v v v, i i v v, = Ī± ith coefficient = inner product with ith ON basis vector Usefulness of an ON Basis
  • 21. 21/45 Another Good Thing About an ON Basis: it preserves inner products and norms (it is ā€œisometricā€). If {v1, v2, ā€¦, vN} is an ON basis and u and v are vectors expanded as $\mathbf{v} = \sum_{k=1}^{N}\alpha_k\mathbf{v}_k$ and $\mathbf{u} = \sum_{k=1}^{N}\beta_k\mathbf{v}_k$, thenā€¦ 1. $\langle\mathbf{v}, \mathbf{u}\rangle = \langle\boldsymbol{\alpha}, \boldsymbol{\beta}\rangle$ (preserves inner products). 2. $\|\mathbf{v}\|_2 = \|\boldsymbol{\alpha}\|_2$ and $\|\mathbf{u}\|_2 = \|\boldsymbol{\beta}\|_2$ (preserves norms). Soā€¦ using an ON basis provides: easy computation via inner products, and preservation of geometry (closeness, size, orientation, etc.).
  • 22. 22/45 Example: DFT Coefficients as Inner Products. Recall: the N-pt. IDFT is an expansion of the signal vector in terms of N orthogonal vectors. Thus $X[k] = \langle\mathbf{x}, \mathbf{d}_k\rangle = \sum_{n=0}^{N-1} x[n]\,d_k^*[n] = \sum_{n=0}^{N-1} x[n]\,e^{-j2\pi kn/N}$. See the ā€œreading notesā€ for some details about normalization issues in this case.
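The inner-product formula for X[k] can be checked against `np.fft.fft` directly (the test signal is arbitrary; the forward FFT in NumPy carries no normalization, matching the slide):

```python
import numpy as np

N = 8
n = np.arange(N)
x = np.sin(2 * np.pi * 3 * n / N) + 1.0      # arbitrary test signal

# X[k] = <x, d_k> = sum_n x[n] conj(d_k[n]) = sum_n x[n] e^{-j 2 pi k n / N}
X_inner = np.array([np.sum(x * np.exp(-2j * np.pi * k * n / N))
                    for k in range(N)])

# matches the FFT exactly (no extra normalization on the forward transform)
assert np.allclose(X_inner, np.fft.fft(x))
```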
  • 23. 23/45 Weā€™ll sometimes view a matrix as being built from its columns; The 3x4 example above could be written as: [ ] 4 3 2 1 | | | a a a a A = [ ]T k k k k a a a 3 2 1 = a Weā€™ll take two views of a matrix: 1. ā€œStorageā€ for a bunch of related numbers (e.g., Cov. Matrix) 2. A transform (or mapping, or operator) acting on a vector (e.g., DFT, observation matrix, etcā€¦. as weā€™ll see) Matrix: Is an array of (real or complex) numbers organized in rows and columns. Here is a 3x4 example: āŽ„ āŽ„ āŽ„ āŽ„ āŽ„ āŽ„ āŽ¦ āŽ¤ āŽ¢ āŽ¢ āŽ¢ āŽ¢ āŽ¢ āŽ¢ āŽ£ āŽ” = 34 33 32 31 24 23 22 21 14 13 12 11 a a a a a a a a a a a a A Matrices
  • 24. 24/45 Matrix as Transform: our main view of matrices will be as ā€œoperatorsā€ that transform one vector into another vector. Consider the 3x4 example matrix above. We could use that matrix to transform the 4-dimensional vector v into a 3-dimensional vector u: $\mathbf{u} = \mathbf{A}\mathbf{v} = [\mathbf{a}_1 \,|\, \mathbf{a}_2 \,|\, \mathbf{a}_3 \,|\, \mathbf{a}_4]\begin{bmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{bmatrix} = v_1\mathbf{a}_1 + v_2\mathbf{a}_2 + v_3\mathbf{a}_3 + v_4\mathbf{a}_4$. Clearly u is built from the columns of matrix A; therefore, it must lie in the span of the set of vectors that make up the columns of A. Note that the columns of A are 3-dimensional vectorsā€¦ so is u.
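The ā€œmatrix-vector product = weighted sum of columnsā€ identity is worth seeing once in code. A sketch with an arbitrary 3x4 matrix (random entries are my choice):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))              # a 3x4 matrix, as in the slide
v = np.array([2.0, -1.0, 0.0, 3.0])

u = A @ v

# u is exactly the same linear combination of the columns of A
u_cols = sum(v[k] * A[:, k] for k in range(4))
assert np.allclose(u, u_cols)
assert u.shape == (3,)                       # u lives in R^3, like the columns
```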
  • 25. 25/45 Transforming a Vector Space: if we apply A to all the vectors in a vector space V, we get a collection of vectors that are in a new space called U. In the 3x4 example matrix above we transformed a 4-dimensional vector space V into a 3-dimensional vector space U; similarly, a 2x3 real matrix A would transform R3 into R2. Facts: if the mapping matrix A is square and its columns are linearly independent, then (i) the space that vectors in V get mapped to (i.e., U) has the same dimension as V (due to the ā€œsquareā€ part), and (ii) this mapping is reversible (i.e., invertible); there is an inverse matrix A-1 such that v = A-1u (due to the ā€œsquareā€ & ā€œLIā€ parts).
  • 26. 26/45 Transform = Matrix Ɨ Vector: a VERY useful viewpoint for all sorts of signal processing scenarios. In general we can view many linear transforms (e.g., DFT, etc.) in terms of some invertible matrix A operating on a signal vector x to give another vector y: $\mathbf{y}_i = \mathbf{A}\mathbf{x}_i$ and $\mathbf{x}_i = \mathbf{A}^{-1}\mathbf{y}_i$. We can think of A and A-1 as mapping back and forth between two vector spaces.
  • 27. 27/45 Matrix View & Basis View. Basis Matrix & Coefficient Vector: suppose we have a basis {v1, v2, ā€¦, vN} for a vector space V. Then a vector v in V can be written as $\mathbf{v} = \sum_{k=1}^{N}\alpha_k\mathbf{v}_k$. Another view of this: $\mathbf{v} = \underbrace{[\mathbf{v}_1 \,|\, \mathbf{v}_2 \,|\cdots|\, \mathbf{v}_N]}_{N \times N \text{ matrix}}\begin{bmatrix}\alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_N\end{bmatrix} = \mathbf{V}\boldsymbol{\alpha}$. The ā€œbasis matrixā€ V transforms the coefficient vector α into the original vector v.
  • 28. 28/45 Three Views of Basis Matrix & Coefficient Vector. View #1: vector v is a linear combination of the columns of basis matrix V: $\mathbf{v} = \sum_{k=1}^{N}\alpha_k\mathbf{v}_k$. View #2: matrix V maps vector α into vector v: $\mathbf{v} = \mathbf{V}\boldsymbol{\alpha}$. View #3: there is a matrix V-1 that maps vector v into vector α: $\boldsymbol{\alpha} = \mathbf{V}^{-1}\mathbf{v}$. Aside: if a matrix A is square and has linearly independent columns, then A is ā€œinvertibleā€ and A-1 exists such that A A-1 = A-1A = I, where I is the identity matrix having 1's on the diagonal and zeros elsewhere. We now have a way to go back and forth between the vector v and its coefficient vector α.
  • 29. 29/45 Basis Matrix for an ON Basis: we get a special structure!!! Result: for an ON basis matrix Vā€¦ V-1 = VH (the superscript H denotes ā€œHermitian transposeā€, which consists of transposing the matrix and conjugating the elements). To see this: $\mathbf{V}^H\mathbf{V} = \begin{bmatrix}\langle\mathbf{v}_1,\mathbf{v}_1\rangle & \langle\mathbf{v}_2,\mathbf{v}_1\rangle & \cdots & \langle\mathbf{v}_N,\mathbf{v}_1\rangle \\ \langle\mathbf{v}_1,\mathbf{v}_2\rangle & \langle\mathbf{v}_2,\mathbf{v}_2\rangle & \cdots & \langle\mathbf{v}_N,\mathbf{v}_2\rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle\mathbf{v}_1,\mathbf{v}_N\rangle & \langle\mathbf{v}_2,\mathbf{v}_N\rangle & \cdots & \langle\mathbf{v}_N,\mathbf{v}_N\rangle\end{bmatrix} = \begin{bmatrix}1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1\end{bmatrix} = \mathbf{I}$. The inner products are 0 or 1 because this is an ON basis.
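This V-1 = VH structure is easy to confirm numerically. A sketch using QR factorization of a random complex matrix to manufacture an ON basis matrix (my construction, not the slide's):

```python
import numpy as np

rng = np.random.default_rng(2)

# orthonormalize a random complex matrix to get an ON basis matrix V
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
V, _ = np.linalg.qr(M)

# for an ON basis matrix, V^H V = I, so the inverse is just the
# Hermitian transpose
VH = V.conj().T
assert np.allclose(VH @ V, np.eye(4))
assert np.allclose(VH, np.linalg.inv(V))
```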
  • 30. 30/45 Unitary and Orthogonal Matrices. A unitary matrix is a complex matrix A whose inverse is A-1 = AH. For the real-valued matrix case we get a special case of ā€œunitaryā€: the idea of a ā€œunitary matrixā€ becomes an ā€œorthogonal matrixā€, for which A-1 = AT. Two Properties of Unitary Matrices: let U be a unitary matrix and let y1 = Ux1 and y2 = Ux2. 1. They preserve norms: ||yi|| = ||xi||. 2. They preserve inner products: < y1, y2 > = < x1, x2 >. That is, the ā€œgeometryā€ of the old space is preserved by the unitary matrix as it transforms into the new space. (These are the same as the preservation properties of an ON basis.)
  • 31. 31/45 DFT from Unitary Matrix Viewpoint: consider a discrete-time signal x[n] for n = 0, 1, ā€¦, N-1. We've already seen the DFT in a basis viewpoint: $\mathbf{x} = \sum_{k=0}^{N-1}\tfrac{1}{N}X[k]\,\mathbf{d}_k$ (IDFT). Now we can view the DFT as a transform from the unitary matrix viewpoint. Collect the basis vectors into the matrix $\mathbf{D} = [\mathbf{d}_0 \,|\, \mathbf{d}_1 \,|\cdots|\, \mathbf{d}_{N-1}]$, whose (n,k) element is $e^{j2\pi kn/N}$. Then DFT: $\tilde{\mathbf{x}} = \mathbf{D}^H\mathbf{x}$ and IDFT: $\mathbf{x} = \tfrac{1}{N}\mathbf{D}\tilde{\mathbf{x}}$. (Actually D is not unitary, but $N^{-1/2}\mathbf{D}$ is unitaryā€¦ see the reading notes.)
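The matrix form of the DFT, and the normalization caveat, can both be checked in a few lines (test signal arbitrary):

```python
import numpy as np

N = 8
n = np.arange(N)
# D[n, k] = exp(+j 2 pi k n / N): columns are the IDFT basis vectors d_k
D = np.exp(2j * np.pi * np.outer(n, n) / N)

x = np.cos(2 * np.pi * n / N) + 0.25 * n
X = D.conj().T @ x                            # DFT:  X~ = D^H x
assert np.allclose(X, np.fft.fft(x))
assert np.allclose(D @ X / N, x)              # IDFT: x = (1/N) D X~

# D itself is not unitary, but N^{-1/2} D is
U = D / np.sqrt(N)
assert np.allclose(U.conj().T @ U, np.eye(N))
```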
  • 32. 32/45 Geometry Preservation of Unitary Matrix Mappings. Recallā€¦ unitary matrices map in such a way that the sizes of vectors and the orientation between vectors are not changed. Unitary mappings just ā€œrigidly rotateā€ the space. (Figure: A and A-1 mapping back and forth between x1, x2 in one space and y1, y2 in the other.)
  • 34. 34/45 More on Matrices as Transforms: $\mathbf{y} = \mathbf{A}\mathbf{x}$, with y of size mƗ1, A of size mƗn, and x of size nƗ1. We'll limit ourselves here to real-valued vectors and matrices. A maps any vector x in Rn into some vector y in Rm. Range(A): the ā€œrange space of Aā€ = the set of all vectors in Rm that can be reached by the mapping. Since y is a weighted sum of the columns of A, we may only be able to reach certain y's. We are mostly interested in two cases: 1. ā€œTall matrixā€ m > n. 2. ā€œSquare matrixā€ m = n.
  • 35. 35/45 Range of a ā€œTall Matrixā€ (m > n) Rn Rm x y A The range(A) āŠ‚ Rm ā€œProofā€: Since y is ā€œbuiltā€ from the n columns of A there are not enough to form a basis for Rm (they donā€™t span Rm) Range of a ā€œSquare Matrixā€ (m = n) If the columns of A are linearly indepā€¦.The range(A) = Rm ā€¦because the columns form a basis for Rm Otherwiseā€¦.The range(A) āŠ‚ Rm ā€¦because the columns donā€™t span Rm
  • 36. 36/45 Rank of a Matrix: rank(A) = the largest number of linearly independent columns (or rows) of matrix A. For an mƗn matrix we have rank(A) ≤ min(m,n). An mƗn matrix A has ā€œfull rankā€ when rank(A) = min(m,n). Example: $\mathbf{A} = \begin{bmatrix}1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 2 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0\end{bmatrix}$. This matrix has rank 3 because the 4th column can be written as a combination of the first 3 columns.
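The rank claim on this slide can be verified directly with `np.linalg.matrix_rank` (the matrix below is my reconstruction of the slide's example, in which the 4th column equals col1 + 2·col2 + col3):

```python
import numpy as np

# the slide's 5x4 example: 4th column = col1 + 2*col2 + col3
A = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 2.],
              [0., 0., 1., 1.],
              [0., 0., 0., 0.],
              [0., 0., 0., 0.]])

assert np.linalg.matrix_rank(A) == 3          # not full rank (min(5,4) = 4)
assert np.allclose(A[:, 3], A[:, 0] + 2 * A[:, 1] + A[:, 2])
```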
  • 37. 37/45 Characterizing ā€œTall Matrixā€ Mappings We are interested in answering: Given a vector y, what vector x gets mapped into it via matrix A? ā€œTall Matrixā€ (m > n) Case If y does not lie in range(A), then there is No Solution If y lies in range(A), then there is a solution (but not necessarily just one unique solution) yāˆˆrange(A) yāˆ‰range(A) No Solution One Solution Many Solutions A full rank A not full rank Ax y =
  • 38. 38/45 Full-Rank ā€œTall Matrixā€ (m > n) Case Rn Rm x y A Range(A) Ax y = For a given yāˆˆrange(A)ā€¦ there is only one x that maps to it. By looking at y we can determine which x gave rise to it This is because the columns of A are linearly independent and we know from our studies of vector spaces that the coefficient vector of y is uniqueā€¦ x is that coefficient vector
  • 39. 39/45 NonFull-Rank ā€œTall Matrixā€ (m > n) Case Rn Rm x1 y A Range(A) Ax y = For a given yāˆˆrange(A) there may be more than one x that maps to it x2 A By looking at y we can not determine which x gave rise to it This is because the columns of A are linearly dependent and that redundancy provides several ways to combine them to create y
  • 40. 40/45 Characterizing ā€œSquare Matrixā€ Mappings Q: Given any yāˆˆRn can we find an xāˆˆRn that maps to it? A: Not always!!! yāˆˆrange(A) yāˆ‰range(A) No Solution One Solution Many Solutions A full rank A not full rank Ax y = Careful!!! This is quite a different flow diagram here!!! When a square A is full rank then its range covers the complete new spaceā€¦ then, y must be in range(A) and because the columns of A are a basis there is a way to build y
  • 41. 41/45 A Full-Rank Square Matrix is Invertible. A square matrix that has full rank is said to be ā€œnonsingularā€ or ā€œinvertibleā€. Then we can find the x that mapped to y using x = A-1y. Several ways to check if an nƗn A is invertible: 1. A is invertible if and only if (iff) its columns (or rows) are linearly independent (i.e., if it is full rank). 2. A is invertible iff det(A) ≠ 0. 3. A is invertible if (but not only if) it is ā€œpositive definiteā€ (see later). 4. A is invertible iff all its eigenvalues are nonzero (becauseā€¦ det(A) = product of the eigenvalues). (Diagram: the positive definite matrices are a subset of the invertible matrices.)
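The invertibility checks above can be run on a small example. A sketch with a symmetric positive definite matrix of my choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                    # symmetric, positive definite

# check 2: nonzero determinant
assert not np.isclose(np.linalg.det(A), 0.0)

# check 4: all eigenvalues nonzero (here all positive, so A is also P.D.)
eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals > 0)
assert np.isclose(np.linalg.det(A), np.prod(eigvals))

# so the inverse exists, and x = A^{-1} y recovers x from y = A x
x = np.array([1.0, -2.0])
y = A @ x
assert np.allclose(np.linalg.inv(A) @ y, x)
```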
  • 42. 42/45 Eigenvalues and Eigenvectors of Square Matrices: if matrix A is nƗn, then A maps Rn → Rn. Q: For a given nƗn matrix A, which vectors get mapped into being almost themselves??? More preciselyā€¦ which vectors get mapped to a scalar multiple of themselves??? Even more preciselyā€¦ which vectors v satisfy $\mathbf{A}\mathbf{v} = \lambda\mathbf{v}$? These vectors are ā€œspecialā€ and are called the eigenvectors of A. The scalar λ is that eigenvector's corresponding eigenvalue.
  • 43. 43/45 ā€¢ If nƗn real matrix A is symmetric, then ā€“ e-vectors corresponding to distinct e-values are orthonormal ā€“ e-values are real valued ā€“ can decompose A as ā€œEigen-Facts for Symmetric Matricesā€ T V VĪ› A = [ ] { } n T n diag Ī» Ī» Ī» , , , 2 1 2 1 ā€¦ = = = Ī› I VV v v v V T V VĪ› A 1 1 āˆ’ āˆ’ = { } n diag Ī» Ī» Ī» 1 1 1 1 , , , 2 1 ā€¦ = āˆ’ Ī› For P.D. A, A-1 has the same e-vectors and has reciprocal e-values ā€¢ If, further, A is pos. def. (semi-def.), then ā€“ e-values are positive (non-negative) ā€“ rank(A) = # of non-zero e-values ā€¢ Pos. Def. ā‡’ Full Rank (and therefore invertible) ā€¢ Pos. Semi-Def. ā‡’ Not Full Rank (and therefore not invertible) ā€“ When A is P. D., then we can write
  • 44. 44/45 Weā€™ll limit our discussion to real-valued matrices and vectors Quadratic Forms and Positive-(Semi)Definite Matrices Quadratic Form = Matrix form for a 2nd-order multivariate polynomial Example: āŽ„ āŽ„ āŽ„ āŽ¦ āŽ¤ āŽ¢ āŽ¢ āŽ¢ āŽ£ āŽ” = āŽ„ āŽ„ āŽ„ āŽ¦ āŽ¤ āŽ¢ āŽ¢ āŽ¢ āŽ£ āŽ” = 22 21 12 11 2 1 a a a a x x A x fixed variable 2 1 21 12 2 2 22 2 1 11 2 1 2 1 2 1 ) ( scalar ) 1 1 ( ) 1 2 ( ) 2 2 ( ) 2 1 ( ) , ( x x a a x a x a x x a x x Q i j j i ij T + + + = = Ɨ = Ɨ ā‹… Ɨ ā‹… Ɨ = āˆ‘āˆ‘ = = Ax x A scalar The quadratic form of matrix A is: Other Matrix Issues
  • 45. 45/45 ā€¢ Values of the elements of matrix A determine the characteristics of the quadratic form QA(x) ā€“ If QA(x) ā‰„ 0 āˆ€x ā‰  0ā€¦ then say that QA(x) is ā€œpositive semi-definiteā€ ā€“ If QA(x) > 0 āˆ€x ā‰  0ā€¦ then say that QA(x) is ā€œpositive definiteā€ ā€“ Otherwise say that QA(x) is ā€œnon-definiteā€ ā€¢ These terms carry over to the matrix that defines the Quad Form ā€“ If QA(x) ā‰„ 0 āˆ€x ā‰  0ā€¦ then say that A is ā€œpositive semi-definiteā€ ā€“ If QA(x) > 0 āˆ€x ā‰  0ā€¦ then say that A is ā€œpositive definiteā€