6. Orthogonal basis: A basis consisting of orthogonal vectors in an inner product space.
7. ORTHOGONAL BASIS
If S = {v1, v2, ..., vn} is an orthogonal basis of W, then for any w ∈ W,

    w = Σᵢ₌₁ⁿ (⟨w, vi⟩ / ⟨vi, vi⟩) vi
      = (⟨w, v1⟩ / ⟨v1, v1⟩) v1 + (⟨w, v2⟩ / ⟨v2, v2⟩) v2 + ... + (⟨w, vn⟩ / ⟨vn, vn⟩) vn,

where the scalars ⟨w, vi⟩ / ⟨vi, vi⟩ are called the Fourier coefficients.
So the coordinate vector of w,

    (w)S = ( ⟨w, v1⟩ / ⟨v1, v1⟩, ⟨w, v2⟩ / ⟨v2, v2⟩, ..., ⟨w, vn⟩ / ⟨vn, vn⟩ ).
8. How to Find the Coordinate Vector with Respect to a Given Orthogonal Basis?
Example: Compute the coefficients and determine the coordinate vector in Example 1 for u = (10, 3).
From Example 1, we have v1 = (5, 0), v2 = (0, −3), ‖v1‖ = 5, and ‖v2‖ = 3.
In this case, the coefficients are:

    ⟨u, v1⟩ / ⟨v1, v1⟩ = (u · v1) / ‖v1‖² = ((10)(5) + (3)(0)) / 5² = 50/25 = 2
    ⟨u, v2⟩ / ⟨v2, v2⟩ = (u · v2) / ‖v2‖² = ((10)(0) + (3)(−3)) / 3² = −9/9 = −1
9. How to Find the Coordinate Vector with Respect to a Given Orthogonal Basis?
So the coordinate vector of u,

    (u)S = ( ⟨u, v1⟩ / ⟨v1, v1⟩, ⟨u, v2⟩ / ⟨v2, v2⟩ ) = (2, −1).

We can see that a nice advantage of working with an orthogonal basis is that the coefficients in any basis representation for a vector are immediately known; they are called Fourier coefficients.
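The computation above can be checked numerically; a minimal NumPy sketch, using the standard dot product as the inner product:

```python
import numpy as np

# Orthogonal basis from the example: v1 = (5, 0), v2 = (0, -3).
v1 = np.array([5.0, 0.0])
v2 = np.array([0.0, -3.0])
u = np.array([10.0, 3.0])

# Fourier coefficients <u, vi> / <vi, vi> under the standard dot product.
c1 = np.dot(u, v1) / np.dot(v1, v1)   # (10*5 + 3*0) / 25 = 2
c2 = np.dot(u, v2) / np.dot(v2, v2)   # (10*0 + 3*(-3)) / 9 = -1

print((c1, c2))                       # the coordinate vector (2.0, -1.0)
```

The coefficients immediately reconstruct u: 2·v1 + (−1)·v2 = (10, 3).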
10. Properties of orthogonal matrices:
If Q ∈ ℝⁿˣⁿ is orthogonal, then
(i) The column vectors of Q form an orthonormal basis for ℝⁿ.
(ii) QᵀQ = QQᵀ = I
(iii) Qᵀ = Q⁻¹
(iv) ⟨Qx, Qy⟩ = ⟨x, y⟩ ← preserves inner product
(v) ‖Qx‖ = ‖x‖ ← preserves norm
(vi) ∠(Qx, Qy) = ∠(x, y) ← preserves angle
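These properties are easy to verify numerically; a small sketch using a 2×2 rotation matrix as the orthogonal Q (an illustrative choice, not taken from the slides):

```python
import numpy as np

# A rotation matrix is a standard example of an orthogonal matrix.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# (ii) Q^T Q = Q Q^T = I
assert np.allclose(Q.T @ Q, np.eye(2)) and np.allclose(Q @ Q.T, np.eye(2))
# (iii) Q^T = Q^{-1}
assert np.allclose(Q.T, np.linalg.inv(Q))

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])
# (iv) inner products are preserved, (v) norms are preserved
assert np.isclose((Q @ x) @ (Q @ y), x @ y)
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```

Since inner products and norms are preserved, angles (property vi) are preserved as well.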
13. Given x1, ..., xn, let

    u1 = (1/‖x1‖) x1.

Clearly, span{u1} = span{x1}.
Let p1 = ⟨x2, u1⟩ u1 and u2 = (1/‖x2 − p1‖)(x2 − p1).
Clearly, u2 ⊥ u1 & span{x1, x2} = span{u1, u2}.
Similarly, let p2 = ⟨x3, u1⟩ u1 + ⟨x3, u2⟩ u2 and u3 = (1/‖x3 − p2‖)(x3 − p2).
Clearly, u3 ⊥ u1, u3 ⊥ u2 & span{x1, x2, x3} = span{u1, u2, u3}.
We have the next result.
[Figure: the projection p1 of x2 onto u1.]
14. Theorem: (The Gram-Schmidt process)
H. (i) Let {x1, ..., xn} be a basis for an inner product space V.
   (ii) Define

    u1 = (1/‖x1‖) x1,
    u_{K+1} = (1/‖x_{K+1} − p_K‖)(x_{K+1} − p_K), K = 1, ..., n − 1,
    where p_K = Σⱼ₌₁ᴷ ⟨x_{K+1}, uⱼ⟩ uⱼ.

C. {u1, ..., un} is an orthonormal basis.
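The recursion in the theorem translates directly to code; a minimal NumPy sketch (the function name and the sample basis are illustrative, not from the slides):

```python
import numpy as np

def gram_schmidt(X):
    """Orthonormalize the columns of X (assumed linearly independent),
    exactly as in the theorem: u1 = x1/||x1||; for each later x_{K+1},
    subtract the projection p_K onto the previous u's, then normalize."""
    X = np.asarray(X, dtype=float)
    U = np.zeros_like(X)
    for k in range(X.shape[1]):
        p = U[:, :k] @ (U[:, :k].T @ X[:, k])  # projection p_{k-1}
        w = X[:, k] - p
        U[:, k] = w / np.linalg.norm(w)
    return U

# Columns are the basis vectors x1, x2, x3.
X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
U = gram_schmidt(X)
# The columns of U are orthonormal: U^T U = I.
assert np.allclose(U.T @ U, np.eye(3))
```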
15. Example: Find an orthonormal basis for P3 with inner product given by

    ⟨P, g⟩ = Σᵢ₌₁³ P(xᵢ) g(xᵢ), where x1 = −1, x2 = 0 & x3 = 1.

Sol: Starting with a basis {1, x, x²}.
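The worked solution is cut off on the slide; one way to carry the computation out numerically (an illustrative sketch, not the slide's solution) is to represent each polynomial by its values at x1, x2, x3, so that the given inner product becomes the ordinary dot product:

```python
import numpy as np

pts = np.array([-1.0, 0.0, 1.0])  # x1, x2, x3 from the example

# Value vectors of 1, x, x^2 at the sample points; under this encoding
# <P, g> = sum_i P(x_i) g(x_i) is the ordinary dot product.
X = np.stack([pts**0, pts**1, pts**2], axis=1)

# Gram-Schmidt on the value vectors.
U = np.zeros_like(X)
for k in range(3):
    w = X[:, k] - U[:, :k] @ (U[:, :k].T @ X[:, k])
    U[:, k] = w / np.linalg.norm(w)

# Columns of U hold the orthonormal polynomials' values at -1, 0, 1.
assert np.allclose(U.T @ U, np.eye(3))
```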
16. Theorem: (QR Factorization)
If A is an m×n matrix of rank n, then A
can be factored into a product QR, where Q
is an m×n matrix with orthonormal columns
and R is an n×n matrix that is upper triangular
and invertible.
17. Proof of QR-Factorization
Let p1, p2, ..., p_{n−1} be the projection vectors defined in Thm. 5.6.1, and let q1, q2, ..., qn be the orthonormal basis of R(A) derived from the Gram-Schmidt process.
Define

    r11 = ‖a1‖,
    r_kk = ‖a_k − p_{k−1}‖ for k = 2, ..., n,
    and r_ik = qᵢᵀ a_k for i = 1, ..., k − 1.

By the Gram-Schmidt process,

    a1 = r11 q1
    a2 = r12 q1 + r22 q2
    ⋮
    a_n = r_1n q1 + ... + r_nn qn
18. Proof of QR-Factorization
If we set Q = (q1, q2, ..., qn) and define R to be the upper triangular matrix

    R = | r11  r12  ...  r1n |
        |  0   r22  ...  r2n |
        |  :    :         :  |
        |  0    0   ...  rnn |

then the jth column of the product QR will be

    Q r_j = r_1j q1 + r_2j q2 + ... + r_jj qj = a_j, for j = 1, ..., n.

Therefore, QR = (a1, a2, ..., an) = A.
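The construction in the proof runs directly as code; a minimal NumPy sketch (the matrix A is an illustrative example, not from the slides):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])  # 3x2, rank 2
m, n = A.shape

# Build Q and R exactly as in the proof.
Q = np.zeros((m, n))
R = np.zeros((n, n))
for k in range(n):
    R[:k, k] = Q[:, :k].T @ A[:, k]    # r_ik = q_i^T a_k for i < k
    w = A[:, k] - Q[:, :k] @ R[:k, k]  # a_k minus projection p_{k-1}
    R[k, k] = np.linalg.norm(w)        # r_kk = ||a_k - p_{k-1}||
    Q[:, k] = w / R[k, k]

assert np.allclose(Q.T @ Q, np.eye(n))  # orthonormal columns
assert np.allclose(np.triu(R), R)       # R is upper triangular
assert np.allclose(Q @ R, A)            # A = QR
```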
19. Theorem:
If A is an m×n matrix of rank n, then the solution x̂ to the least squares problem Ax = b is given by x̂ = R⁻¹Qᵀb, where Q and R are the matrices obtained from Thm. 5.6.2. The solution x̂ may be obtained by using back substitution to solve Rx̂ = Qᵀb.
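A minimal NumPy sketch of the theorem (A and b are illustrative; `np.linalg.qr` supplies the reduced QR factorization, and the loop performs the back substitution for Rx̂ = Qᵀb):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
b = np.array([2.0, 1.0, 1.0])

Q, R = np.linalg.qr(A)  # Q: 3x2 orthonormal columns, R: 2x2 upper triangular
c = Q.T @ b

# Back substitution for R x = c, solving from the last row upward.
x_hat = np.zeros(2)
for i in range(1, -1, -1):
    x_hat[i] = (c[i] - R[i, i + 1:] @ x_hat[i + 1:]) / R[i, i]

# Agrees with NumPy's own least-squares solver.
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```

Here b happens to lie in the column space of A, so the least squares solution solves Ax = b exactly.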
20. Proof of Thm.
Let x̂ be the solution to the least squares problem Ax = b. Then x̂ satisfies the normal equations:

    AᵀA x̂ = Aᵀb
    ⇒ (QR)ᵀ(QR) x̂ = (QR)ᵀ b        (QR-Factorization)
    ⇒ Rᵀ(QᵀQ)R x̂ = RᵀQᵀ b, and since QᵀQ = I,
    ⇒ RᵀR x̂ = RᵀQᵀ b
    ⇒ R x̂ = Qᵀ b                    (Rᵀ is invertible)
    ⇒ x̂ = R⁻¹Qᵀ b