# Prove that every symmetric matrix is diagonalisable by an orthogonal matrix

Prove that every symmetric matrix is diagonalisable by an orthogonal matrix.

Solution
(i)
Let $S$ be a real symmetric $n \times n$ matrix, let $v$ be an eigenvector of $S$ with $Sv = \lambda v$, and let $W = \{x : \langle x, v\rangle = 0\}$ be the orthogonal complement of $v$.
For any $x$ we have $\langle Sx, v\rangle = (Sx)^T v = x^T S^T v = x^T S v = \langle x, Sv\rangle = \lambda \langle x, v\rangle$.
So if $x \in W$, then $\langle x, v\rangle = 0$, hence $\langle Sx, v\rangle = 0$, i.e. $Sx \in W$.
Thus the linear transformation $x \mapsto Sx$ takes vectors in $W$ to $W$.
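The invariance of $W$ can be checked numerically; a minimal sketch with NumPy, where the matrix and eigenpair are illustrative choices, not taken from the proof:

```python
import numpy as np

# An illustrative symmetric matrix with a known eigenpair: S v = 3 v.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = 3.0, np.array([1.0, 1.0])

# Take an x in W = v-perp, i.e. <x, v> = 0.
x = np.array([1.0, -1.0])
assert abs(x @ v) < 1e-12

# Then <Sx, v> = lambda <x, v> = 0, so Sx stays in W.
Sx = S @ x
print(abs(Sx @ v) < 1e-12)  # True
```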

(ii)
Define $T : W \to W$ by $T(w) = Sw$; this is well defined by (i).
Since $W$ has dimension $n-1$, let $B = \{b_1, \dots, b_{n-1}\}$ be an orthonormal basis for $W$.

$Sb_j \in W$, so there exist coefficients $(a_k)_{1 \le k \le n-1}$ such that $Sb_j = \sum_{k=1}^{n-1} a_k b_k$.

Then, since the basis $B$ is orthonormal, $\langle b_i, b_k\rangle = \delta_{ik}$, so taking the inner product with $b_i$ gives $\langle b_i, Sb_j\rangle = a_i$.

Hence $T(b_j) = Sb_j = \sum_{k=1}^{n-1} \langle b_k, Sb_j\rangle\, b_k$.

This identifies the $B$-matrix for $T$: its columns are the coefficient vectors of the $T(b_j)$, so its $(i,j)$-th entry is exactly $\langle b_i, Sb_j\rangle$.

To conclude: $\langle b_i, Sb_j\rangle = \langle Sb_j, b_i\rangle = (Sb_j)^T b_i = b_j^T S^T b_i = b_j^T S b_i = \langle b_j, Sb_i\rangle$, so the $B$-matrix for $T$ is symmetric.
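The symmetry of the $B$-matrix can also be verified numerically; a sketch, assuming an illustrative $3 \times 3$ symmetric $S$ and obtaining an orthonormal basis of $W = v^\perp$ from an SVD null-space computation:

```python
import numpy as np

# Illustrative symmetric matrix with eigenvector v: S v = 3 v.
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
v = np.array([1.0, 1.0, 0.0])

# Orthonormal basis of W = v-perp: the SVD of the row v^T yields
# orthonormal rows spanning its null space.
_, _, Vt = np.linalg.svd(v.reshape(1, -1))
B = Vt[1:].T                 # columns b_1, ..., b_{n-1} span W

# B-matrix of T, with entries <b_i, S b_j>.
M = B.T @ S @ B
print(np.allclose(M, M.T))   # True: the B-matrix is symmetric
```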

(iii)
We proceed by induction on the size $n$ of $S$. Note that a real symmetric matrix always has a real eigenvalue (every root of its characteristic polynomial is real), so an eigenvector $v$ exists.
Suppose $n = 1$. Then $\{b_1 = v/\sqrt{\langle v, v\rangle}\}$ is an orthonormal basis of eigenvectors for $S$.
Now suppose the statement is true for $n-1$. We will prove it for $n$.

Suppose $S$ is of size $n$.
We take $b_1 = v/\sqrt{\langle v, v\rangle}$, a unit eigenvector of $S$.
Then $W$ as defined in (ii) has dimension $n-1$, and by (ii) the $B$-matrix of $T$ is symmetric, so we can apply the induction hypothesis to it.
Hence we can find an orthonormal basis $\{b_2, \dots, b_n\}$ of $W$ consisting of eigenvectors of $T$.
Note that the eigenvectors of $T$ are eigenvectors of $S$, since $T(w) = Sw$.
We have thus found an orthonormal basis of eigenvectors of $S$: $\{b_1, \dots, b_n\}$. Taking these vectors as the columns of $Q$, the matrix $Q$ is orthogonal and $Q^{-1} S Q = Q^T S Q$ is diagonal, which completes the proof.
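The conclusion can be checked numerically with NumPy's `eigh`, which diagonalises a symmetric matrix by an orthogonal matrix; a minimal sketch on a random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2                    # a random symmetric matrix

# eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors.
eigvals, Q = np.linalg.eigh(S)

print(np.allclose(Q.T @ Q, np.eye(4)))             # True: Q is orthogonal
print(np.allclose(Q.T @ S @ Q, np.diag(eigvals)))  # True: Q^T S Q is diagonal
```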
