Principal Component Analysis Well Explained With an Example in MATLAB
X = [1 2 4 3 5 9 4 2;
5 4 7 4 3 2 1 3;
3 2 4 5 6 2 1 2;
4 1 3 2 2 1 3 4]
Each column represents one feature vector with four dimensions, and there are 8 feature vectors.
Here, dimensions < number of feature vectors.
[vecs,val]=eigs(X*X',2); % X*X' is a 4x4 matrix (4x8 * 8x4). eigs(X*X',2) keeps only the two eigenvectors with the largest eigenvalues
%So, instead of storing 4 vectors, you store 2 vectors
wt1=vecs'*X(:,1); % project the first feature vector onto the 2 eigenvectors
reconstructed=vecs*wt1; % approximation of the original first vector
wt2=vecs'*X(:,2); % same for the second feature vector
reconstructed=vecs*wt2;
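The same steps can be sketched in NumPy as a quick check; this is a minimal translation of the MATLAB above, with `np.linalg.eigh` standing in for `eigs` (assumptions: same data, two leading eigenvectors kept):

```python
import numpy as np

# Same 4x8 data as the MATLAB example: each column is a feature vector.
X = np.array([[1, 2, 4, 3, 5, 9, 4, 2],
              [5, 4, 7, 4, 3, 2, 1, 3],
              [3, 2, 4, 5, 6, 2, 1, 2],
              [4, 1, 3, 2, 2, 1, 3, 4]], dtype=float)

# Eigendecomposition of the small 4x4 matrix X @ X.T.
# eigh returns eigenvalues in ascending order, so flip and keep
# the two eigenvectors with the largest eigenvalues (as eigs(X*X',2) does).
vals, vecs = np.linalg.eigh(X @ X.T)
vecs = vecs[:, ::-1][:, :2]

# Project the first feature vector onto the 2-D subspace, then reconstruct.
wt1 = vecs.T @ X[:, 0]           # 2 weights instead of 4 numbers
reconstructed = vecs @ wt1       # approximation of X[:, 0]
print(np.round(reconstructed, 2))
```

Because the reconstruction is an orthogonal projection onto the span of the two eigenvectors, its error can never exceed the norm of the original vector.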
For example, if you have 4 feature vectors and each vector has 8 dimensions, as shown below:
X =[ 1 5 3 4;
2 4 2 1;
4 7 4 3;
3 4 5 2;
5 3 6 2;
9 2 2 1;
4 1 1 3;
2 3 2 4];
[vecs,val]=eigs(X'*X,2); % X'*X is only 4x4, whereas X*X' would be 8x8; working with the smaller matrix simplifies computation. This is the trick used by Pentland and Turk
ef=X*vecs; % to get the eigenvectors of X*X', multiply X by the eigenvectors of X'*X
for i=1:size(ef,2)
ef(:,i)=ef(:,i)./norm(ef(:,i)); % normalize each eigenvector to unit length
end
wt1=ef'*X(:,1); % each 8-dimensional shape is now represented in 2 dimensions, since 2 eigenvectors are kept
reconstructed=ef*wt1; % approximately recovers the first shape
wt2=ef'*X(:,2); % note: use ef here, not vecs
reconstructed=ef*wt2;
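The Pentland-Turk trick rests on a simple identity: if v is an eigenvector of X'*X with eigenvalue λ, then (X*X')(X*v) = X(X'*X)v = λ(X*v), so X*v is an eigenvector of the big matrix with the same eigenvalue. A NumPy sketch of this second example, verifying that identity (again a translation, not the original code):

```python
import numpy as np

# Same 8x4 data: each column is an 8-dimensional feature vector,
# so X @ X.T would be 8x8; we diagonalize the smaller 4x4 X.T @ X instead.
X = np.array([[1, 5, 3, 4],
              [2, 4, 2, 1],
              [4, 7, 4, 3],
              [3, 4, 5, 2],
              [5, 3, 6, 2],
              [9, 2, 2, 1],
              [4, 1, 1, 3],
              [2, 3, 2, 4]], dtype=float)

vals, vecs = np.linalg.eigh(X.T @ X)   # eigenpairs of the small matrix
vecs = vecs[:, ::-1][:, :2]            # two leading eigenvectors
lam = vals[::-1][:2]                   # their eigenvalues

# Lift them to eigenvectors of X @ X.T and normalize each column.
ef = X @ vecs
ef /= np.linalg.norm(ef, axis=0)

# Each 8-dimensional shape is now stored as 2 weights.
wt1 = ef.T @ X[:, 0]
reconstructed = ef @ wt1               # approximate first shape
```

The check `(X @ X.T) @ ef == ef * lam` (up to floating-point tolerance) confirms that the lifted, normalized columns of `ef` really are eigenvectors of the 8x8 matrix, without ever decomposing it.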
