Biometric sensor image fusion for identity verification: A case study with wavelet-based fusion rules and graph matching


  1. Biometric Sensor Image Fusion for Identity Verification: A Case Study with Wavelet-based Fusion Rules and Graph Matching. Authors: Dakshina Ranjan Kisku*, Ajita Rattani, Phalguni Gupta, Jamuna Kanta Sing. Presented by: Dakshina Ranjan Kisku. (*Contact person)
  2. Agenda: <ul><li>Introduction </li></ul><ul><li>Biometric systems </li></ul><ul><li>Modality based categorization and fusion levels in multibiometrics </li></ul><ul><li>Wavelet decomposition and biometric image fusion </li></ul><ul><li>Fusion rules applied </li></ul><ul><li>Overview of SIFT features </li></ul><ul><li>Graph matching technique and verification </li></ul><ul><li>Experimental results </li></ul><ul><li>Conclusion </li></ul><ul><li>Bibliography </li></ul>
  3. Introduction: <ul><li>Biometric sensor image fusion refers to a process that fuses multispectral images, captured at different resolutions and by different biometric sensors, to acquire richer, complementary information and produce a new, spatially enhanced fused image. </li></ul><ul><li>The fused image depicts spatially enhanced information of one or more biometric characteristics that is more understandable for human perception. </li></ul><ul><li>Fusing biometric images at a low level (i.e., the sensor level) removes several inconsistencies, less-relevant edge artifacts and noise from the fused images. </li></ul>
  4. Biometric systems: <ul><li>In computer vision and machine vision applications, a biometric system can be thought of as an automatic identity verification system, in which a user is automatically recognized by his/her physiological or behavioral characteristics. </li></ul><ul><li>Biometrics can be used for establishing the identity of a person, where identity can be defined as follows: </li></ul><ul><li>Identity - the quality or condition of being the same in substance, composition, nature, properties, or in particular qualities under consideration ( Oxford English Dictionary, 2004 ) </li></ul>
  5. Contd…Biometric systems <ul><li>People are identified by three basic means: </li></ul><ul><li>Something you have (passport, voter ID card, driving license, etc.) </li></ul><ul><li>Something you know (password, PIN, etc.) </li></ul><ul><li>Something you are (the human body) </li></ul>
  6. Contd…Biometric systems <ul><li>Means of identity verification can be divided into three groups: </li></ul><ul><li>Possession-based (credit card, smart card) - something you have </li></ul><ul><li>Knowledge-based (password, PIN) - something you know </li></ul><ul><li>Biometrics-based (biometric characteristics) - something you are </li></ul>
  7. Modality based categorization: <ul><li>Modality based categorization of biometric systems can be made on the basis of the biometric traits used. </li></ul><ul><li>Uni-biometric systems: when a single acquired biometric characteristic is used for verification or identification, the system is called uni-biometric (face, fingerprint, palmprint, etc.). </li></ul><ul><li>Multi-biometric systems: when more than one biometric trait is used for identification or verification by fusing those traits, the system is called multimodal biometric (face and fingerprint, face and iris, etc.). </li></ul>
  8. Fusion levels in multibiometrics: <ul><li>Various levels of fusion in multibiometrics: </li></ul><ul><li>Feature level fusion </li></ul><ul><li>- face and fingerprint, face and iris biometrics, etc. </li></ul><ul><li>Match score level fusion </li></ul><ul><li>- face and voice, face and fingerprint, etc. </li></ul><ul><li>Rank level fusion </li></ul><ul><li>- face and fingerprint, etc. </li></ul><ul><li>Decision level fusion </li></ul><ul><li>- face and voice, etc. </li></ul><ul><li>Sensor level fusion (proposed) </li></ul><ul><li>- face and palmprint, fusion of gray and thermogram image, etc. </li></ul>
  9. Wavelet decomposition and biometric image fusion: <ul><li>Multisensor biometric image fusion is performed with face and palmprint images, and the fused image represents a unique pattern. </li></ul><ul><li>Wavelet decomposition can be applied to the face and palmprint images independently, decomposing each into multiple channels depending on local frequency. </li></ul><ul><li>The wavelet transform provides an integrated framework to decompose biometric images into a number of new images, each having a different degree of resolution. </li></ul>
  10. Contd…wavelet decomposition <ul><li>Prior to image fusion, wavelet transforms are computed from the face and palmprint images. </li></ul><ul><li>The wavelet transform contains low-high, high-low and high-high bands at different scales, together with the low-low band of the images at the coarsest level. </li></ul><ul><li>The low-low band has all positive transform values, while the remaining bands have transform values that fluctuate around zero. </li></ul><ul><li>The wavelet transform decomposes an image recursively into several frequency levels, and each level contains transform values. </li></ul>
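The four sub-bands named above (low-low, low-high, high-low, high-high) can be produced by filtering first along rows and then along columns. Below is a minimal single-level 2D Haar decomposition in pure Python, offered only as an illustrative sketch of the band structure, not as the authors' implementation:

```python
def haar_pairs(row):
    """One Haar step on a sequence: pairwise averages (low-pass) and
    pairwise differences (high-pass)."""
    low = [(row[i] + row[i + 1]) / 2.0 for i in range(0, len(row), 2)]
    high = [(row[i] - row[i + 1]) / 2.0 for i in range(0, len(row), 2)]
    return low, high

def haar_rows(img):
    """Apply haar_pairs to every row; return the low and high half-images."""
    lows, highs = [], []
    for row in img:
        lo, hi = haar_pairs(row)
        lows.append(lo)
        highs.append(hi)
    return lows, highs

def transpose(m):
    return [list(col) for col in zip(*m)]

def haar_2d(img):
    """Single-level 2D Haar DWT of an even-sized grayscale image,
    returning the (LL, LH, HL, HH) sub-bands."""
    L, H = haar_rows(img)             # filter along rows
    LL, LH = haar_rows(transpose(L))  # then along columns of the low half
    HL, HH = haar_rows(transpose(H))  # and of the high half
    return tuple(transpose(b) for b in (LL, LH, HL, HH))
```

On a constant image only the LL band is non-zero, which matches the slide's remark that the low-low band carries the positive approximation values while the other bands fluctuate around zero.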
  11. Contd…wavelet decomposition <ul><li>The sub-image sequences are then fused by applying different wavelet fusion rules to the low- and high-frequency parts. </li></ul><ul><li>Finally, the inverse wavelet transform is performed to obtain the fused image. </li></ul>
  12. Contd… Fig. 1. A generic structure of the wavelet based fusion approach.
  13. Fusion rules applied: <ul><li>The input images are decomposed by a discrete wavelet transform (DWT) and the wavelet coefficients are then selected using a set of fusion rules: </li></ul><ul><li>- “Maximum” fusion rule: at each position, the larger of the corresponding wavelet coefficients is selected. </li></ul><ul><li>- “Mean” fusion rule: the mean of the corresponding wavelet coefficients is taken. </li></ul><ul><li>- “Up-down” fusion rule </li></ul><ul><li>- “Down-up” fusion rule </li></ul>
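The "Maximum" and "Mean" rules above can be sketched as element-wise operations on corresponding DWT coefficients, using a 1D Haar transform for brevity. (The "Up-down" and "Down-up" rules are not specified in enough detail on this slide to reproduce, so they are omitted.) The band assignment in `fuse_images`, mean on the low band and maximum on the high band, is an illustrative assumption, not the authors' exact pipeline:

```python
def haar(signal):
    """Single-level 1D Haar DWT: pairwise averages (low) and differences (high)."""
    low = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    high = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return low, high

def inv_haar(low, high):
    """Inverse of haar(): from l=(a+b)/2 and h=(a-b)/2 recover a=l+h, b=l-h."""
    out = []
    for l, h in zip(low, high):
        out += [l + h, l - h]
    return out

def fuse_mean(coeffs_a, coeffs_b):
    """'Mean' rule: average corresponding coefficients."""
    return [(x + y) / 2.0 for x, y in zip(coeffs_a, coeffs_b)]

def fuse_max(coeffs_a, coeffs_b):
    """'Maximum' rule: keep the coefficient of larger magnitude."""
    return [x if abs(x) >= abs(y) else y for x, y in zip(coeffs_a, coeffs_b)]

def fuse_images(sig_a, sig_b):
    """Decompose both inputs, fuse the bands, and inverse-transform (slide 11)."""
    la, ha = haar(sig_a)
    lb, hb = haar(sig_b)
    return inv_haar(fuse_mean(la, lb), fuse_max(ha, hb))
```

Fusing a signal with itself returns it unchanged, which is a convenient sanity check that the inverse transform really restores the fused image.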
  14. Contd… Fig. 2. Haar wavelet based fusion of face and palmprint images, with the “Maximum” fusion rule applied.
  15. Contd… Fig. 3. Haar wavelet based fusion of face and palmprint images, with the “Mean” fusion rule applied.
  16. Contd… Fig. 4. Haar wavelet based fusion with the “Up-down” fusion rule applied.
  17. Contd… Fig. 5. Haar wavelet based fusion with the “Down-up” fusion rule applied.
  18. Overview of SIFT features: <ul><li>The scale invariant feature transform (SIFT), proposed by David Lowe [6], has proved to be invariant to image rotation, scaling, partial illumination changes and camera viewpoint. </li></ul><ul><li>Local keypoints are detected through the following steps: </li></ul><ul><li>- select candidate feature points by searching for peaks in the scale space of a difference-of-Gaussian (DoG) function; </li></ul><ul><li>- localize feature points by measuring their stability; </li></ul><ul><li>- assign orientations based on local image properties; </li></ul>
  19. Contd… <ul><li>- calculate the feature descriptors. </li></ul>Fig. 6. SIFT feature extraction from the fused image.
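The first SIFT step, finding candidate keypoints as peaks of a difference-of-Gaussian scale space, can be sketched in 1D. Everything below is illustrative: a real SIFT implementation works on 2D images, compares each point against all scale-space neighbors, and refines locations to sub-pixel accuracy, none of which is attempted here:

```python
import math

def gaussian_blur(signal, sigma):
    """Blur a 1D signal with a normalized Gaussian kernel, clamping at borders."""
    radius = max(1, int(3 * sigma))
    kernel = [math.exp(-(i * i) / (2 * sigma * sigma))
              for i in range(-radius, radius + 1)]
    norm = sum(kernel)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - radius, 0), len(signal) - 1)  # clamp index
            acc += w * signal[j]
        out.append(acc / norm)
    return out

def dog_peaks(signal, sigmas=(1.0, 1.6, 2.56)):
    """Candidate keypoints: local maxima of |DoG| between successive scales."""
    blurred = [gaussian_blur(signal, s) for s in sigmas]
    dogs = [[a - b for a, b in zip(blurred[i + 1], blurred[i])]
            for i in range(len(blurred) - 1)]
    peaks = set()
    for d in dogs:
        for i in range(1, len(d) - 1):
            if abs(d[i]) > abs(d[i - 1]) and abs(d[i]) > abs(d[i + 1]):
                peaks.add(i)
    return sorted(peaks)
```

An isolated spike in the input shows up as a strong DoG response at its position, which is the intuition behind using DoG extrema as keypoint candidates.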
  20. Graph matching technique and verification: <ul><li>To interpret the fused image with its keypoint descriptors, an attributed probabilistic graph G = {N, E, K, ζ} is considered for representation, </li></ul><ul><li>where N and E denote the nodes and edges, respectively, and K and ζ are the attributes associated with the nodes and edges of the graph. </li></ul><ul><li>The nodes correspond to fused-image primitives, such as keypoint descriptors, and the edges link these nodes. </li></ul><ul><li>Authentication then becomes a graph-matching problem over a pair of fused images, where the probe fused image is matched against the gallery fused image. </li></ul>
  21. Contd… <ul><li>Based on the gallery model graph, a search is initiated in the probe graph for the labels that maximize the posterior probabilities. </li></ul><ul><li>To measure the similarity of nodes and edges for a pair of graphs drawn on fused images, consider two graphs G’ = {N’, E’, K’, ζ’} and G” = {N”, E”, K”, ζ”}. </li></ul><ul><li>Thus, for each node we search for the most probable label, i.e., the corresponding node in the probe graph. Hence, it can be stated as </li></ul>
  22. Contd… [The maximum a posteriori labelling equation appears only as an image on this slide and is not reproduced in the transcript.]
  23. Contd… Hence, the matching between a pair of graphs is established by using the posterior probabilities and assigning labels from the gallery graph to the points of the probe graph.
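The node-labelling step above can be sketched as follows: each gallery node is assigned the probe node whose descriptor maximizes a similarity score standing in for the posterior probability. The Gaussian similarity and greedy per-node assignment below are assumptions for illustration; the slides' actual posterior model (and any edge-attribute term) is not reproduced here:

```python
import math

def similarity(desc_a, desc_b, sigma=1.0):
    """Gaussian similarity on squared Euclidean descriptor distance,
    used here as a stand-in for the posterior probability."""
    d2 = sum((a - b) ** 2 for a, b in zip(desc_a, desc_b))
    return math.exp(-d2 / (2 * sigma * sigma))

def match_nodes(gallery, probe):
    """Assign to every gallery node the most probable probe-node label.

    gallery, probe: dicts mapping node id -> keypoint descriptor (list of floats).
    Returns a dict mapping gallery node id -> best probe node id.
    """
    labels = {}
    for g_id, g_desc in gallery.items():
        labels[g_id] = max(probe, key=lambda p_id: similarity(g_desc, probe[p_id]))
    return labels
```

A verification decision could then threshold, for example, the number or total score of confidently matched node pairs; that decision rule is likewise a hypothetical illustration.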
  24. Experimental results: <ul><li>The experiment is conducted on the IITK multimodal database of face and palmprint images. The database consists of 400 face images and 400 palmprint images from 200 individuals. </li></ul><ul><li>For evidence fusion, different wavelet fusion rules are applied, namely the “Maximum”, “Mean”, “Up-down (UD)” and “Down-up (DU)” fusion rules. </li></ul><ul><li>Multisensor biometric fusion based on the “Maximum” fusion rule produces 98.81% accuracy, while fusion based on the “Mean”, “DU” and “UD” rules produces 97.43%, 96.27% and 89.93% accuracy, respectively, as shown in the ROC curves. </li></ul>
  25. Contd… Figure. Performance shown through ROC curves for the different wavelet based fusion techniques: the “Down-up (DU)”, “Maximum”, “Mean” and “Up-down (UD)” wavelet fusion rules.
  26. Conclusion: <ul><li>In this paper, a multisensor biometric image fusion scheme has been addressed for multibiometric user authentication. </li></ul><ul><li>The proposed technique efficiently minimizes the irrelevant variability and the inconsistencies that exist among different biometric modalities and their characteristics by fusing the biometric images at a low level. </li></ul><ul><li>The results show that the proposed sensor-level fusion method is robust, computationally efficient and less sensitive to unwanted noise, which confirms the validity and efficacy of the system. </li></ul>
  27. Bibliography: <ul><li>A. K. Jain and A. Ross, “Multibiometric systems,” Communications of the ACM, vol. 47, no. 1, pp. 34-40, 2004. </li></ul><ul><li>A. Ross, A. K. Jain, and J. K. Qian, “Information fusion in biometrics,” Pattern Recognition Letters, vol. 24, no. 13, pp. 2115-2125, 2003. </li></ul><ul><li>A. Ross and R. Govindarajan, “Feature level fusion using hand and face biometrics,” Proceedings of SPIE Conference on Biometric Technology for Human Identification II, 2005, pp. 196-204. </li></ul><ul><li>T. Stathaki, Image Fusion: Algorithms and Applications, Academic Press, 2008. </li></ul><ul><li>D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” International Journal of Computer Vision, vol. 60, no. 2, 2004. </li></ul><ul><li>U. Park, S. Pankanti, and A. K. Jain, “Fingerprint verification using SIFT features,” Proceedings of SPIE Defense and Security Symposium, 2008. </li></ul><ul><li>A. Rattani, D. R. Kisku, M. Bicego, and M. Tistarelli, “Robust feature-level multibiometric classification,” Proceedings of the Biometric Consortium Conference - A Special Issue in Biometrics, pp. 1-6, 2006. </li></ul>
  28. Bibliography: <ul><li>D. R. Kisku, A. Rattani, E. Grosso, and M. Tistarelli, “Face identification by SIFT-based complete graph topology,” Proceedings of the IEEE Workshop on Automatic Identification Advanced Technologies, 2007, pp. 63-68. </li></ul><ul><li>H. Yaghi and H. Krim, “Probabilistic graph matching by canonical decomposition,” Proceedings of the International Conference on Image Processing, 2008, pp. 2368-2371. </li></ul><ul><li>R. Sitaraman and A. Rosenfeld, “Probabilistic analysis of two stage matching,” Pattern Recognition, vol. 22, no. 3, pp. 331-343, 1989. </li></ul><ul><li>L. S. Davis, “Shape matching using relaxation techniques,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 1, pp. 60-72, Jan. 1979. </li></ul><ul><li>A. Rattani, D. R. Kisku, M. Bicego, and M. Tistarelli, “Feature level fusion of face and fingerprint biometrics,” Proceedings of Biometrics: Theory, Applications and Systems, 2007. </li></ul><ul><li>C. Hsu and R. Beuker, “Multiresolution feature-based image registration,” Proceedings of Visual Communications and Image Processing, 2000, pp. 1-9. </li></ul><ul><li>A. K. Jain, A. Ross, and S. Prabhakar, “An introduction to biometric recognition,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 4-20, 2004. </li></ul><ul><li>A. K. Jain, A. Ross, and S. Pankanti, “Biometrics: A tool for information security,” IEEE Transactions on Information Forensics and Security, vol. 1, no. 2, pp. 125-143, 2006. </li></ul>
  29. Questions?
  30. <ul><li>Contact person: </li></ul><ul><li>[email_address] </li></ul>