
Iris & Peri-ocular Recognition


  1. IRIS RECOGNITION. Shashank Dhariwal, Aisha Jabeen
  2. Scheme: Iris Recognition (Fig. 1: iris region) and Periocular Recognition (Fig. 2: periocular region)
  3. Final Detection: the iris recognition output image and the periocular recognition output image are compared via a sum of differences. A sum of 0 means the images are matched; a sum greater than 0 means the images are not matched.
  4. Iris Recognition pipeline: Segmentation, Normalization, Feature Extraction, Template Matching. (Fig. 3: segmented image [1]; Fig. 4: normalized image [1])
  5. Segmentation: Localization
     - Circular Hough Transform, with Canny edge detection to generate the edge map (see the voting sketch below).
     - Gradients are biased in the vertical direction for the outer iris/sclera boundary.
     - Vertical and horizontal gradients are weighted equally for the inner iris/pupil boundary.
     - Six parameters are stored at the end: centre coordinates and radius for each of the two circles. (Fig. 5: localised image)
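To make the localization step concrete, below is a minimal sketch of circular Hough voting over a Canny edge map. It is an assumed illustration rather than the authors' code; the image file name, radius range and angular sampling are placeholder values.

    % Minimal circular Hough transform over a Canny edge map (illustrative).
    % Assumes a grayscale eye image; 'eye.png' and the radii are placeholders.
    I = im2double(imread('eye.png'));
    E = edge(I, 'canny');                          % binary edge map
    [ey, ex] = find(E);                            % edge pixel coordinates
    radii = 80:150;                                % candidate radii (assumed)
    acc = zeros(size(E,1), size(E,2), numel(radii));
    for k = 1:numel(radii)
        for i = 1:numel(ey)
            for t = linspace(0, 2*pi, 90)
                % every edge pixel votes for the centres of circles through it
                x0 = round(ex(i) - radii(k)*cos(t));
                y0 = round(ey(i) - radii(k)*sin(t));
                if x0 >= 1 && x0 <= size(E,2) && y0 >= 1 && y0 <= size(E,1)
                    acc(y0, x0, k) = acc(y0, x0, k) + 1;
                end
            end
        end
    end
    [~, m] = max(acc(:));                          % strongest circle wins
    [yc, xc, rk] = ind2sub(size(acc), m);
    fprintf('centre (%d, %d), radius %d\n', xc, yc, radii(rk));

Running this once with vertically biased gradients (outer boundary) and once with equally weighted gradients (inner boundary) yields two circles of three parameters each, i.e. the six stored parameters.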
  6. Segmentation: Eyelid/Eyelash detection
     - A Linear Hough Transform is used to fit a line to each eyelid, and a second horizontal line is then drawn.
     - Canny edge detection builds the edge map, taking only horizontal gradient information.
     - Simple thresholding isolates the eyelashes. (Fig. 6: eyelid/eyelash occlusion)
  7. Normalization
     - Counters dimensional inconsistencies between eye images.
     - Produces iris regions with constant dimensions.
     - Remaps each point within the iris region to a pair of polar coordinates. (Fig. 7: Daugman's rubber sheet model [2])
  8. Normalization
     - Radial and angular resolution are fixed; the pupil can be non-concentric with the iris.
     - The normalized pattern is created by backtracking to the Cartesian data points (see the sketch below).
     - 2D arrays hold the polar coordinates and mark reflections, eyelashes and eyelids.
     - Data points occurring along the pupil border are discarded. (Fig. 8: result of normalization)
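A minimal sketch of the rubber-sheet remap follows, assuming the pupil circle (xp, yp, rp) and iris circle (xi, yi, ri) come from segmentation; the 20 x 240 sampling resolution is an illustrative choice, not necessarily the one used in the project.

    % Daugman's rubber-sheet model [2]: remap the iris annulus to a
    % fixed-size polar grid. I is the grayscale eye image.
    radialRes = 20; angularRes = 240;              % assumed resolutions
    theta = linspace(0, 2*pi, angularRes);         % 1 x angularRes
    r = linspace(0, 1, radialRes)';                % radialRes x 1
    % boundary points on the pupil and iris circles for every angle
    xP = xp + rp*cos(theta);  yP = yp + rp*sin(theta);
    xI = xi + ri*cos(theta);  yI = yi + ri*sin(theta);
    % linear interpolation between the two boundaries; the products are
    % outer products, so each entry blends pupil and iris boundary points
    X = (1 - r)*xP + r*xI;                         % radialRes x angularRes
    Y = (1 - r)*yP + r*yI;
    polarIris = interp2(im2double(I), X, Y);       % constant-size pattern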
  9. Feature Extraction: represent the iris texture as a binary vector, the 2048-bit iris code. (Fig. 9: iris code and textured region [2])
  10. 8 bands × 128 textures give 1024 samples, two phase bits each. (Fig. 10: textures [2])
  11. Feature Encoding
  12. Feature Encoding (continued)
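The encoding slides were presented as figures; as a stand-in, here is a sketch of the phase-quantization step, assuming `resp` holds the complex Gabor (or log-Gabor) filter responses of the normalized pattern. The filtering itself is omitted.

    % Phase quantization: each complex filter response contributes two bits,
    % one per quadrature component, producing the binary iris code.
    bit1 = real(resp) > 0;                 % first bit: sign of the real part
    bit2 = imag(resp) > 0;                 % second bit: sign of imaginary part
    irisCode = false(size(resp,1), 2*size(resp,2));
    irisCode(:, 1:2:end) = bit1;           % interleave the two phase bits
    irisCode(:, 2:2:end) = bit2;           % e.g. 1024 samples -> 2048 bits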
  13. Matching
     - Bit-wise comparison of iris codes, measured by the Hamming distance (sketched below).
     - K-nearest-neighbour classification.
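A sketch of the masked Hamming distance of Daugman [2], assuming codeA/codeB are logical iris codes and maskA/maskB flag the bits unaffected by eyelids, eyelashes and reflections:

    % Fractional Hamming distance over the bits valid in both codes.
    validBits = maskA & maskB;
    hd = nnz(xor(codeA, codeB) & validBits) / nnz(validBits);
    % hd near 0: same iris; hd near 0.5: statistically independent irises.
    % For k-nearest-neighbour classification, the gallery code(s) with the
    % smallest hd decide the identity.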
  14. Periocular Biometrics
  15. Definition: the process of identifying a person based on the study of the area around the eye, namely the edges of the eye, the eyebrows, the eyelashes and the skin [3]. The region of interest defines the method to be used for feature extraction and matching; methods are broadly classified as global matchers (using colour, texture and shape information) or local matchers (using the information contained in key points). (Fig. 1: area of interest [3]; Fig. 2: key points obtained using SIFT [3])
  16. Why Periocular?
     - Iris: a moving object located in another moving object (the eyeball), which sits in a moving head attached to a moving body (a lot of movement!); a small surface area (difficult to capture); typically imaged in NIR (appropriate illumination required); requires subject cooperation (as if thugs would cooperate!); occlusion by eyelids and flowing hair degrades the results.
     - Retina: typically requires a coherent light source for illumination; again, subject cooperation is required.
     - Face: there is a trade-off between recognition based on the iris and on the face. Iris capture requires the subject to be close to the camera, so the facial information is missed; face capture requires the subject to be at some distance, so a high-resolution image of the iris cannot be obtained.
     - Periocular: can use both colour and NIR, distance to the camera is not a problem and, best of all, no subject cooperation is required.
  17. Periocular Success
     - Periocular recognition was introduced by Park in 2009 [3], using 958 colour images obtained from an off-the-shelf Canon camera. Accuracy: 77% (local + global combined).
     - Later, in 2010, Woodard [4] combined iris with periocular on a database of 520 NIR images, with Local Binary Patterns as the global matching method. Accuracy: 96.5% combined, 13.8% iris alone, 92.5% periocular alone.
     - The present project uses NIR images (three databases of 40 to 77 images) and SIFT [5] as the local matching technique. Accuracy: 100%!
  18. Extraction and Matching methods
     - Global feature extraction and matching: gradient-based (GO) histograms and Local Binary Patterns (LBP); Euclidean distance is used to calculate the match. (Fig. 3: global descriptor construction [3]; see the LBP sketch below)
     - Local feature extraction and matching: SIFT and SURF; distance-ratio based matching on the squared Euclidean distance. (Fig. 4: local descriptor construction [3])
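As a flavour of the global side, here is a minimal 8-neighbour LBP sketch; it is an assumed illustration of the operator named above, not code from [3] or [4]. A histogram of the resulting codes over the periocular region forms the global descriptor.

    function codes = lbp8(I)
    % Basic local binary pattern: threshold each interior pixel's eight
    % neighbours against the pixel itself and weight the results by powers
    % of two, giving one code in [0, 255] per pixel.
    I = double(I);
    dy = [-1 -1 -1  0  1  1  1  0];        % neighbour offsets, clockwise
    dx = [-1  0  1  1  1  0 -1 -1];
    centre = I(2:end-1, 2:end-1);
    codes = zeros(size(centre));
    for k = 1:8
        nb = I(2+dy(k):end-1+dy(k), 2+dx(k):end-1+dx(k));
        codes = codes + (nb >= centre) * 2^(k-1);
    end
    end
    % Usage: descriptor = histc(reshape(lbp8(I), [], 1), 0:255);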
  19. Implementation
     Datasets used:
     - DB1: 40 NIR images from CASIA V3_2 Iris-Twins
     - DB2: 36 NIR images from CASIA V3_2 Iris-Lamp
     - DB3: 77 NIR images from CASIA V3_2 Iris-Interval
     SIFT parameters, detection (each key point: centre coordinates, size/scale, orientation/theta):
     - Octaves: set dynamically from the image size as log2(min(width, height)) (inversely proportional to image resolution)
     - Scales: 3 (smoothing levels per octave)
     - Peak threshold: 0 (a higher value eliminates more key points)
     - Edge threshold: 10 (a lower value eliminates more key points)
     Description:
     - Magnification factor: 3 (the descriptor size is the key point scale multiplied by this factor)
     - Gaussian window size: 1.5 x key point scale (smaller values make the centre of the descriptor count more)
     Matching:
     - Threshold: 1.5, measured by the L2 norm of the minimum difference between two descriptors (squared Euclidean distance ratio); see the VLFeat sketch after this slide.
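The parameters above map directly onto the VLFeat toolbox [5]; here is a minimal sketch of extraction and distance-ratio matching with those values. The image names are placeholders, and the images are assumed to be grayscale NIR captures.

    % SIFT key point extraction and matching with VLFeat [5].
    I1 = single(imread('query.png'));      % vl_sift expects single precision
    I2 = single(imread('gallery.png'));
    opts = {'Levels', 3, 'PeakThresh', 0, 'EdgeThresh', 10, ...
            'Magnif', 3, 'WindowSize', 1.5};   % octaves left at the default,
                                               % which VLFeat derives from size
    [f1, d1] = vl_sift(I1, opts{:});       % f: frames (x, y, scale, theta)
    [f2, d2] = vl_sift(I2, opts{:});       % d: 128-dimensional descriptors
    % a match is kept when the nearest descriptor beats the second nearest
    % by the 1.5 distance-ratio threshold (squared Euclidean distance)
    [matches, scores] = vl_ubcmatch(d1, d2, 1.5);
    fprintf('No of Matches = %d\n', size(matches, 2));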
  20. Implementation
     System details:
     - Processor: Intel Xeon, 2.67 GHz, 64-bit
     - RAM: 12.0 GB
     - Matlab version: R2011a
     Tests performed:
     - Test 1: query image from the same dataset
     - Test 2: query image from another dataset
     - Test 3: effect on identification when noise is added to the query image
     - Test 4: effect on identification when blur is added to the query image
     - Test 5: effect on identification when the query image is rotated
     - Test 6: effect on identification when the query image is scaled
  21. Results, Test 1 (a): database DB1, query images from DB1 (40 queries). Accuracy: 100%. (Chart: matches found and time taken to find the match(es), per query image.)
  22. Results, Test 1 (b): database DB2, query images from DB2 (36 queries). Accuracy: 100%. (Chart: matches found and time taken per query image.)
  23. Results, Test 1 (c): database DB3, query images from DB3 (77 queries). Accuracy: 100%. (Chart: matches found and time taken per query image.)
  24. Results, Test 2 (a): database DB1, query images from DB2 (36 queries); accuracy 100%. Test 2 (b): database DB2, query images from DB1 (40 queries); accuracy 100%.
  25. Results, Test 3 (a): salt & pepper noise, 20%. Query image 27: 25 key point matches; correctly matched to image 27; elapsed time 1.404492 seconds. (Figures: noisy query image; matched image.)
  26. Results, Test 3 (b): Gaussian noise, mean 0, variance 0.05. Query image 10: 30 key point matches; correctly matched to image 10; elapsed time 1.357519 seconds. (Figures: query image; matched image.)
  27. Results, Test 4: motion blurring, linear = 20, theta = 35. Query image 15: 137 key point matches; correctly matched to image 15; elapsed time 3.862147 seconds. (Figures: query image; matched image.)
  28. Results, Test 5: rotation by 80 degrees. Query image 20: 1034 key point matches; correctly matched to image 20; elapsed time 4.084018 seconds. (Figures: query image; matched image.)
  29. Results, Test 6: scaling, ratio = 0.5. Query image 5: 127 key point matches; correctly matched to image 5; elapsed time 0.646144 seconds. (Figures: query image; matched image.)
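For reference, the degraded query images of Tests 3 to 6 can be reproduced with standard Image Processing Toolbox calls using the parameters reported on the slides; this is a plausible sketch, not the authors' test script, with `I` standing for a query image.

    q_sp    = imnoise(I, 'salt & pepper', 0.20);   % Test 3(a): 20% salt & pepper
    q_gauss = imnoise(I, 'gaussian', 0, 0.05);     % Test 3(b): mean 0, var 0.05
    psf     = fspecial('motion', 20, 35);          % Test 4: linear 20, theta 35
    q_blur  = imfilter(I, psf, 'replicate');
    q_rot   = imrotate(I, 80);                     % Test 5: rotate by 80 degrees
    q_scale = imresize(I, 0.5);                    % Test 6: scale by ratio 0.5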
  30. Analysis
     - With databases of this size, the method achieved 100% accuracy even when noise, blur and geometric transformations were introduced (however, it would be prudent to test the method on a larger database).
     - The time taken to match is proportional to the number of key points selected, and hence to the number of matches.
     - Time taken is also proportional to the image size: in Test 1 (c) the image size is 280 x 320 as against 480 x 640 in Tests 1 (a) and (b), and the time taken per match is about a quarter of that in Tests 1 (a) and (b).
  31. References
     [1] L. Masek, "Recognition of Human Iris Patterns for Biometric Identification", Bachelor's thesis, The University of Western Australia, 2003.
     [2] J. Daugman, "How iris recognition works", IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 21-30, 2004.
     [3] U. Park, A. Ross, and A. K. Jain, "Periocular biometrics in the visible spectrum: a feasibility study", in Proceedings of the 3rd IEEE International Conference on Biometrics: Theory, Applications and Systems, 2009, pp. 153-158.
     [4] D. Woodard, S. Pundlik, and P. Miller, "On the Fusion of Periocular and Iris Biometrics in Non-Ideal Imagery", in Proceedings of the International Conference on Pattern Recognition, 2010.
     [5] A. Vedaldi and B. Fulkerson, VLFeat, http://www.vlfeat.org
  32. DEMO
