Part 1
Introduction
• The idea of the multivariate Gaussian distribution is presented.
• The relation between the covariance matrix of a multivariate distribution and its eigenvalues and eigenvectors is demonstrated thoroughly using figures and specific cases.
• The important case of a rank-deficient covariance matrix is also presented.
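The link between a covariance matrix and its eigenvalues/eigenvectors can be sketched as below. The matrix `Sigma` is an illustrative assumption, not the one used in this report: the eigenvectors of the (estimated) covariance give the axes of the elliptical density contours, and the eigenvalues give the variance along each axis.

```python
import numpy as np

# Hypothetical 2-D Gaussian; Sigma is chosen for illustration only.
rng = np.random.default_rng(0)
Sigma = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
mean = np.zeros(2)

# Draw samples and estimate the covariance from data.
X = rng.multivariate_normal(mean, Sigma, size=50_000)
Sigma_hat = np.cov(X, rowvar=False)

# Eigendecomposition: eigenvectors are the axes of the elliptical
# contours, eigenvalues the variances along those axes (ascending).
eigvals, eigvecs = np.linalg.eigh(Sigma_hat)
print(eigvals)   # variances along the principal axes
print(eigvecs)   # columns are the principal axes
```

With enough samples the estimated eigenvalues approach those of the true `Sigma`, which is what the figures in this part visualize.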
Question 2: In the first matrix only one variance (along the z-axis) is large, while in the second matrix two variances (along the y- and z-axes) are large, as is evident in the respective figures.
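A minimal sketch of this comparison, using illustrative stand-ins for the two matrices (the actual values from the assignment are not reproduced here): the sample spread mirrors the diagonal entries, so the first cloud is stretched along z only and the second along y and z.

```python
import numpy as np

# Illustrative diagonal covariance matrices (assumed values).
Sigma1 = np.diag([0.1, 0.1, 4.0])   # only the z-variance is large
Sigma2 = np.diag([0.1, 4.0, 4.0])   # y- and z-variances are large

rng = np.random.default_rng(1)
X1 = rng.multivariate_normal(np.zeros(3), Sigma1, size=20_000)
X2 = rng.multivariate_normal(np.zeros(3), Sigma2, size=20_000)

# Sample standard deviations along (x, y, z): X1 spreads along z only,
# X2 along both y and z.
print(X1.std(axis=0))
print(X2.std(axis=0))
```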
Question 3: The required changes for this part are made in the code and the results are plotted.
Question 4: The required changes for this part are made in the code and the results are plotted.
Since 0.01 is nearly zero compared to 1, and as is evident in Figures 16 to 19, the dimension of the data is effectively 1 in this case, even though the covariance matrix has rank 2.
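The distinction between formal rank and effective dimension can be checked numerically. This sketch uses a diagonal covariance with eigenvalues 1 and 0.01, matching the case above; the 5% variance threshold is an arbitrary choice for illustration.

```python
import numpy as np

# Covariance with eigenvalues 1 and 0.01: formally rank 2, but almost
# all of the variance lies along a single direction.
Sigma = np.diag([1.0, 0.01])
eigvals = np.linalg.eigvalsh(Sigma)

print(np.linalg.matrix_rank(Sigma))   # 2: formally full rank

# "Effective" dimension: count eigenvalues carrying a non-negligible
# share of the total variance (5% threshold is an arbitrary choice).
share = eigvals / eigvals.sum()
effective_dim = int((share > 0.05).sum())
print(effective_dim)                  # 1
```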
Part 2
Introduction
• The concept of principal component analysis (PCA) is studied in depth.
• After loading the training data, each given test image is estimated by adding principal components one at a time, from the direction with the most variation down to those with less. The estimate gets closer to the original image, but beyond some stage adding further principal components makes little difference.
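The reconstruction loop described above can be sketched as follows. This uses synthetic low-rank data rather than the face images of the report, and the shapes (360 training vectors of dimension 64) are placeholder assumptions; the point is that the reconstruction error drops steeply for the first few components and then plateaus.

```python
import numpy as np

# Synthetic low-rank data standing in for the training images.
rng = np.random.default_rng(2)
n, d = 360, 64
X = rng.normal(size=(n, 5)) @ rng.normal(size=(5, d))  # rank-5 data
mean = X.mean(axis=0)
Xc = X - mean

# Principal components = right singular vectors of the centered data.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

x = Xc[0]                    # one "test" vector to reconstruct
errors = []
for k in range(1, d + 1):
    V = Vt[:k].T             # first k principal components
    x_hat = V @ (V.T @ x)    # projection onto the k-dim subspace
    errors.append(np.linalg.norm(x - x_hat))

# Error falls quickly at first; after the data's intrinsic dimension
# is reached, adding components changes almost nothing.
print(errors[0], errors[4], errors[-1])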
After obtaining the reconstruction with 360 principal components for all 40 images, the results are compared and closely examined for the best and the worst estimates. It is found that image no. 33 is estimated best by this estimator. The result for image 33 is shown in the figure.
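The comparison step can be made precise by ranking images by reconstruction error. The helper below is hypothetical (not the report's code), and the toy indices are arbitrary; it simply picks the best and worst reconstructed images by mean squared error.

```python
import numpy as np

# Hypothetical helper: rank images by reconstruction error (MSE).
def best_and_worst(originals: np.ndarray, reconstructions: np.ndarray):
    """Return (best_index, worst_index) by mean squared error."""
    mse = ((originals - reconstructions) ** 2).mean(axis=1)
    return int(mse.argmin()), int(mse.argmax())

# Toy check: 40 flattened "images"; image 7 is reconstructed exactly,
# image 3 very poorly (these indices are arbitrary, not the report's).
rng = np.random.default_rng(3)
orig = rng.normal(size=(40, 100))
recon = orig + rng.normal(scale=0.1, size=orig.shape)
recon[7] = orig[7]
recon[3] = orig[3] + 5.0
best, worst = best_and_worst(orig, recon)
print(best, worst)   # 7 3
```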
Conclusion
• In Part 1, we learnt about the multivariate Gaussian distribution and the important concept that the covariance matrix of such a distribution can be well approximated by a covariance matrix that is rank deficient.
• In Part 2, the important concept of principal component analysis was studied, along with the significance of subspace identification and subspace approximation using PCA.