The vulnerabilities of biometric sensors have been discussed extensively in the literature and popularized in films and television shows. This research examines the image quality of an artificial print as compared to a genuine finger, and the characteristics of the two, including minutiae counts and image quality, as repeated samples are taken.
This study investigated the effect of force levels (3, 5, 7, 9 and 11N) on fingerprint matching performance, image quality scores, and minutiae count between optical and capacitance sensors. Three images were collected from the right index finger of each of 75 participants for each sensing technology. Descriptive statistics, analysis of variance, and Kruskal-Wallis non-parametric tests were conducted to assess significant differences in minutiae counts and image quality scores by force level. The results reveal a significant difference in image quality score by force level and sensor technology, in contrast to minutiae count for the capacitance sensor. The image quality score is one of many factors that influence system matching performance, yet the removal of low-quality images does not improve system performance at each force level. Further research is needed to identify other manipulable factors to improve the interaction between a user and the device, and the subsequent matching performance.
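A minimal sketch of the kind of Kruskal-Wallis comparison the abstract describes, computed by hand on synthetic stand-in data (the group sizes mirror the study's 75 participants x 3 images, but the values, means, and spread below are invented, not the study's measurements; continuous values are used so the basic H formula applies without a tie correction):

```python
import random

random.seed(0)
force_levels = [3, 5, 7, 9, 11]  # Newtons, as in the study
# Synthetic stand-in measurements per force level (225 = 75 x 3 samples each)
groups = {f: [random.gauss(35 + f, 5) for _ in range(225)] for f in force_levels}

# Rank all observations jointly (1-based ranks across the pooled sample)
all_obs = sorted((v, f) for f, vals in groups.items() for v in vals)
rank_sums = {f: 0.0 for f in force_levels}
for rank, (_, f) in enumerate(all_obs, start=1):
    rank_sums[f] += rank

# Kruskal-Wallis statistic: H = 12 / (N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)
n = len(all_obs)
h = 12.0 / (n * (n + 1)) * sum(r * r / 225 for r in rank_sums.values()) - 3 * (n + 1)
print(f"Kruskal-Wallis H = {h:.2f} on {len(force_levels) - 1} degrees of freedom")
# Compare H against the chi-square critical value (9.49 at alpha=0.05, df=4)
```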
IMAGE QUALITY ASSESSMENT FOR FAKE BIOMETRIC DETECTION: APPLICATION TO IRIS, F... - ijiert bestjournal
The actual presence of a real, legitimate trait, in contrast to a fake self-manufactured synthetic or reconstructed sample, is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. In this paper, we present a novel software-based fake detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of biometric recognition frameworks by adding liveness assessment in a fast, user-friendly, and non-intrusive manner, through the use of image quality assessment. The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image (i.e., the same one acquired for authentication purposes) to distinguish between legitimate and impostor samples. The experimental results, obtained on publicly available datasets of fingerprint, iris, and 2D face, show that the proposed method is highly competitive with other state-of-the-art approaches, and that the analysis of the general image quality of real biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake traits.
Seminar report on Error Handling methods used in bio-cryptography - kanchannawkar
Detail information about the real time errors in the biometrics devices and also how to secure encryption keys. To make authentication systems more secure. In this seminar report describe about the combination of the biometrics with the cryptography. and also describe the methods that are used to handle the real time error like fault accept and fault reject and also describe their their rates.i,e FRR and FAR by the biometrics systems.
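The FAR and FRR named above can be sketched directly from match scores: FAR counts impostor scores that clear the decision threshold, FRR counts genuine scores that fall below it. The scores and threshold below are illustrative inventions, not values from the report:

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FAR: fraction of impostor attempts accepted (score >= threshold).
       FRR: fraction of genuine attempts rejected (score < threshold)."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.85, 0.78, 0.88, 0.60]   # same-person match scores
impostor = [0.20, 0.35, 0.55, 0.10, 0.42]  # different-person match scores
far, frr = far_frr(genuine, impostor, threshold=0.5)
print(f"FAR = {far:.2f}, FRR = {frr:.2f}")  # FAR = 0.20, FRR = 0.00
```

Raising the threshold trades FAR for FRR; the operating point where the two are equal is the usual equal error rate summary.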
IRJET-Gaussian Filter based Biometric System Security Enhancement - IRJET Journal
M. Selvi, T. Manickam, C. N. Marimuthu, "Gaussian Filter based Biometric System Security Enhancement", International Research Journal of Engineering and Technology (IRJET), Volume 2, Issue 01, April 2015. e-ISSN: 2395-0056, p-ISSN: 2395-0072. www.irjet.net
Abstract
This paper presents a novel software-based fake detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. Ensuring the actual presence of a real, legitimate trait, in contrast to a fake self-manufactured synthetic or reconstructed sample, is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. The aim is to enhance the security of biometric recognition frameworks by adding liveness assessment in a fast, user-friendly, and non-intrusive manner, through the use of image quality assessment.
The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image (i.e., the same one acquired for authentication purposes) to distinguish between legitimate and impostor samples. It is a multi-biometric, multi-attack protection method that aims to overcome part of these limitations through the use of Image Quality Assessment (IQA).
Moreover, being software-based, it presents the usual advantages of this type of approach: it is fast, as it needs only one image (i.e., the same sample acquired for biometric recognition) to detect whether the sample is real or fake; it is non-intrusive and user-friendly (transparent to the user); and it is cheap and easy to embed in already functional systems, requiring no additional hardware.
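The image-quality idea above can be sketched with one common full-reference trick: compare the input image against a smoothed copy of itself and derive quality measures from the difference. The `box_blur` helper, the toy image, and the two features (MSE, PSNR) below are illustrative stand-ins, not the paper's 25 measures:

```python
import numpy as np

def box_blur(img, k=3):
    """Mean filter with an odd kernel size k, edge-padded."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def quality_features(img):
    """Two toy full-reference quality measures of img vs. its blurred copy."""
    ref = box_blur(img)
    mse = float(np.mean((img - ref) ** 2))
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    return {"MSE": mse, "PSNR": psnr}

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
print(quality_features(img))
```

A full system would extract many such measures per image and feed the feature vector to a trained real-vs-fake classifier.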
An SVM based Statistical Image Quality Assessment for Fake Biometric Detection - IJTET Journal
Abstract
A biometric system is a computer-based system used to identify a person by their behavioral and physiological characteristics (for example fingerprint, face, iris, keystroke, signature, voice, etc.). A typical biometric system consists of feature extraction and pattern matching. Nowadays, however, biometric systems are attacked using fake biometric samples. This paper describes fingerprint biometric techniques, introduces the attacks on such systems, shows how Image Quality Assessment for liveness detection can protect the system from fake biometrics, and explains how a multi-biometric system is more secure than a uni-biometric system. The Support Vector Machine (SVM) classification technique is used for training and testing the fingerprint images. The test input fingerprint image is classified as real or fake by matching its quality score against the trained real and fake fingerprint samples.
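The train-then-classify step the abstract describes can be sketched with scikit-learn's `SVC` as a stand-in. The two-dimensional feature vectors and cluster parameters below are synthetic placeholders for the quality measures a full system would extract:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Synthetic quality-feature vectors: real samples cluster high, fakes low
real = rng.normal(loc=[0.8, 0.7], scale=0.05, size=(50, 2))
fake = rng.normal(loc=[0.4, 0.3], scale=0.05, size=(50, 2))
X = np.vstack([real, fake])
y = np.array([1] * 50 + [0] * 50)  # 1 = real, 0 = fake

clf = SVC(kernel="rbf").fit(X, y)
test_sample = np.array([[0.78, 0.69]])  # near the "real" cluster
print("real" if clf.predict(test_sample)[0] == 1 else "fake")
```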
A Study of Approaches and Measures aimed at Securing Biometric Fingerprint Te... - Editor IJCATR
The need for foolproof authentication procedures beyond traditional mechanisms such as passwords and security PINs has led to the advent of biometric authentication in information systems. Biometric data extracted from physiological features of a person, including but not limited to fingerprints, palm prints, face, or retina, is saved as biometric templates for purposes of verification and identification. The inception of biometrics in access control systems has not been without its hitches, and like other systems it has its fair share of challenges. Biometric fingerprints, being the most mature of all biometric spheres, are the most widely adopted biometric authentication systems. A biometric system's effectiveness lies in how secure it is at preventing inadvertent disclosure of biometric templates in an information system's archive. This, however, has not been the case, as biometric templates have been fraudulently accessed to gain unauthorized access in identification and verification systems. To achieve strong and secure biometric systems, developers need to build systems that properly secure biometric templates. Several biometric template protection schemes and approaches have been proposed and used to safeguard stored templates. Despite the various schemes and approaches in existence, none has provided the most authentic, reliable, efficient, and deterrent means to totally secure biometric fingerprint templates. This research sought to establish the status of current biometric template protection techniques and methods by conducting a survey and analyzing data gathered from a sample of seventy-eight (78) respondents. We report these results and give our conclusions based on the findings of the survey in this paper.
A novel approach to generate face biometric template using binary discriminat... - sipij
In identity management systems, the commonly used biometric recognition system needs attention to the issue of biometric template protection as far as a more reliable solution is concerned. In view of this, a biometric template protection algorithm should satisfy security, discriminability, and cancelability. As no single template protection method is capable of satisfying these basic requirements, a novel technique for face biometric template generation and protection is proposed. The approach is proposed to provide security and accuracy in new-user enrolment as well as in the verification process. This technique takes advantage of both the hybrid approach and the binary discriminant analysis algorithm, and is designed on the basis of random projection, binary discriminant analysis, and the fuzzy commitment scheme. Three publicly available benchmark face databases (FERET, FRGC, CMU-PIE) are used for evaluation. The proposed technique enhances discriminability and recognition accuracy in terms of the matching score of the face images, and provides high security. This paper discusses the corresponding results.
7 multi biometric fake detection system using image quality based liveness de... - INFOGAIN PUBLICATION
Biometric systems are popular all over the world because of their user-friendly and credible nature in security. In spite of these advantages, many attacks carried out through synthetic, self-manufactured, fake, or reconstructed samples affect the performance and accuracy of biometric systems, which has become a major problem in biometrics. Hence, new effective measures have to be taken to protect biometric systems. In this paper, we propose a novel software-based multi-biometric fake detection system to detect various types of attacks. The main motive of this system is to enhance the security level of biometric recognition systems through Image Quality Assessment (IQA), which is one of the liveness detection methods. Twenty-five image quality measures calculated from the test image are used to classify between real and fake traits using a Linear Discriminant Analysis (LDA) classifier. The experiments, performed on a database of 2D face and fingerprint modalities, show that the proposed system is easy to implement in real-time applications, as its complexity is very low because only one input image is needed. The system is also fast, user-friendly, and non-intrusive, and is competitive with other state-of-the-art approaches in classifying between real and fake traits.
Increasingly sophisticated biometric methods are being used for a
variety of applications in which accurate authentication of people is necessary.
Because all biometric methods require humans to interact with a device of some
type, effective implementation requires consideration of human factors issues.
One such issue is the training needed to use a particular device appropriately.
In this paper, we review human factors issues in general that are associated with
biometric devices and focus more specifically on the role of training.
An Approach to Speech and Iris based Multimodal Biometric System - IJEEE
Biometrics is the science and technology of human identification and verification through the use of a feature set extracted from the biological data of the individual to be recognized. Unimodal and multimodal systems are the two modalities of systems developed so far. Unimodal biometric systems use a single biometric trait, but they face limitations in system performance due to the presence of noise in the data, intra-class variations, and spoof attacks. These problems can be resolved by using multimodal biometrics, which rely on more than one source of biometric information to produce better recognition results. This paper presents an overview of multimodal biometrics and the various fusion levels used in them, and suggests the use of iris and speech with score-level fusion for a multimodal biometric system.
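Score-level fusion of the kind suggested above is typically a two-step recipe: normalize each matcher's raw score to a common range, then combine with a weighted sum. The score ranges, weights, and input values below are illustrative assumptions, not values from the paper:

```python
def min_max(score, lo, hi):
    """Min-max normalization of a raw matcher score into [0, 1]."""
    return (score - lo) / (hi - lo)

def fuse(iris_score, speech_score, w_iris=0.6, w_speech=0.4):
    # Assumed raw score ranges for each matcher (illustrative)
    s_iris = min_max(iris_score, lo=0.0, hi=100.0)
    s_speech = min_max(speech_score, lo=-1.0, hi=1.0)
    # Weighted-sum rule on the normalized scores
    return w_iris * s_iris + w_speech * s_speech

fused = fuse(iris_score=82.0, speech_score=0.35)
print(f"fused score = {fused:.3f}")  # accept if above a chosen threshold
```

Other combination rules (product, max, min) drop into the same place as the weighted sum; min-max is the simplest normalization and assumes the matchers' score ranges are known.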
Ensuring that the object presented to a biometric device is a real trait rather than a reconstructed sample is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. This paper presents a software-based fake biometric detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of biometric recognition devices through the use of image quality assessment in a fast and user-friendly manner. The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image to distinguish between real and impostor samples. The proposed method is highly competitive compared with other approaches, as the analysis of the general image quality of real biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake traits.
Role of fuzzy in multimodal biometrics systemKishor Singh
Person identification is possible through biometrics using physiological and behavioral characteristics such as face, ear, thumb print, voice, signature, and keystroke. Unimodal biometric systems face a range of problems, including noisy data, intra-class variations, restricted degrees of freedom, non-universality, spoof attacks, and unacceptable error rates. Some of these drawbacks can be overcome by multimodal biometric technologies, which incorporate data from various information sources. In this paper we work on multimodal biometrics using three modalities, face, ear, and foot, to find optimal results using a fuzzy fusion mechanism that produces the final identification decision via fuzzy rules, enhancing the quality of the multimodal biometric system.
Existing definitions for biometric testing and
evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human
Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of
presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract
(FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model.
Fingerprint recognition is an important
biometric technology, and its use is increasing day by day.
Fingerprint recognition is affected by several physiological
factors like age, wear and tear of skin, and technological
factors like sensor technologies. This paper builds on
previous research in the area of gender differences in
fingerprint features, and reports results of differences in
performance of fingerprints collected from males and
females. The researchers propose a fingerprint analysis
framework for testing differences in gender and apply the
framework to fingerprints collected from males and
females.
Feature Level Fusion of Multibiometric Cryptosystem in Distributed System - IJMER
ABSTRACT: Multibiometrics is the combination of two or more biometric traits (e.g., fingerprint, iris, and face). Researchers are focusing on how to provide security to the system: the template generated from the biometric needs to be protected. The problems of unimodal biometrics are solved by multibiometrics. The main objective is to provide security to the biometric template by generating a secure sketch using a multibiometric cryptosystem, which is stored in a database. Once a biometric template is stolen, it becomes a serious issue for the security of the system and for user privacy. In the existing approach, feature-level fusion is used to combine the features securely with well-known biometric cryptosystems, namely fuzzy vault and fuzzy commitment. The drawbacks of the existing system are that the accuracy of the biometrics needs to be improved and the noise in the biometrics needs to be reduced. The proposed work enhances security using a multibiometric cryptosystem in distributed system applications such as e-commerce transactions, e-banking, and ATMs.
Keywords: Biometric Cryptosystem, Error correcting code, Fingerprint, Iris, Multibiometrics, Unimodal biometrics.
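The fuzzy commitment scheme named in the keywords can be sketched in a few lines: a secret codeword is XOR-bound to the biometric feature bits, only the XOR is stored, and a fresh sample that is close enough to the enrolled one unlocks the secret after error correction. A trivial 3x repetition code stands in here for the real error-correcting code, and all bit strings are invented toy values:

```python
def repeat_encode(bits, r=3):
    """Repetition code: each bit repeated r times."""
    return [b for b in bits for _ in range(r)]

def repeat_decode(bits, r=3):
    """Majority vote within each group of r bits."""
    return [1 if sum(bits[i:i + r]) * 2 > r else 0
            for i in range(0, len(bits), r)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

secret = [1, 0, 1, 1]
codeword = repeat_encode(secret)                  # 12 code bits
template = [1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0]  # enrolled biometric bits
commitment = xor(codeword, template)             # stored instead of the template

query = template[:]   # fresh sample of the same finger, with two bit errors
query[2] ^= 1
query[7] ^= 1
recovered = repeat_decode(xor(commitment, query))
print(recovered == secret)  # True: errors within the code's capacity
```

The security intuition is that the commitment alone reveals neither the secret nor the template; a query too far from the template decodes to garbage.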
Robust Analysis of Multibiometric Fusion Versus Ensemble Learning Schemes: A ... - CSCJournals
Identification of a person using multiple biometrics is a very common approach in existing user validation systems. Most multibiometric systems depend on fusion schemes, and many fusion techniques have shown promising results in the literature by combining multiple biometric modalities with suitable fusion schemes. Similar practices are found in ensembles of classifiers, which increase classification accuracy by combining different types of classifiers. In this paper, we present a comparative study of traditional fusion methods, such as feature-level and score-level fusion, against well-known ensemble methods such as bagging and boosting. Specifically, for our framework experiments, we fused face and palmprint modalities and employed a probability model, Naive Bayes (NB); a neural network model, Multi-Layer Perceptron (MLP); and a supervised machine learning algorithm, Support Vector Machine (SVM), as classifiers. Although the machine learning ensemble approaches, boosting and bagging, are statistically well recognized, the experimental results show that in biometric fusion the traditional method of score-level fusion is the more highly recommended strategy than ensemble learning techniques.
An Investigation towards Effectiveness of Present State of Biometric-Based Au... - RSIS International
The adoption of biometric-based authentication mechanisms was initiated a decade back, but in real life we still see usage of only unimodal biometrics. Of all the different forms of biometrics, fingerprint is the dominant attribute, in contrast to other attributes such as teeth image, palm, facial geometry, retina network, and iris. Multimodal biometrics is believed to offer better security compared to unimodal biometrics. Although there have been some technical advancements in evolving new multimodal methodologies, commercial usage of these is yet to be seen. Therefore, this manuscript aims to explore the level of effectiveness of existing approaches to biometric-based authentication systems in order to further investigate the unaddressed solutions to this problem. This paper reviews the approaches used for addressing different problems associated with biometrics and discusses their technical methodologies as well as their limitations.
Research has shown that, for some age groups, the quality of fingerprints can impact the performance of biometric systems. A desirable feature of biometrics is that they are suitable for use across the population. This applied study examines the performance of a fingerprint recognition system in a healthcare environment. Anecdotal evidence suggested front-line healthcare workers may have lower image quality due to continued hand washing, which may remove oils from their skin. During training, individuals are told to add oil to their fingers by wiping oil from their foreheads to improve the resulting quality of the fingerprints. In the healthcare population the authors tested, compared to two general populations (collected on optical and capacitance sensors), there was a significant difference in skin oiliness, but not in image quality. There was a difference across healthcare and non-healthcare groups in the performance of the fingerprint algorithm when compared against the capacitance dataset.
Multimodal biometric systems are those that utilize more than one physical or behavioural characteristic for enrolment, verification, or identification.
Fake Multi Biometric Detection using Image Quality Assessment - ijsrd.com
In the recent era, where technology plays a prominent role, persons can be identified (for security reasons) based on their behavioral and physiological characteristics (for example fingerprint, face, iris, keystroke, signature, voice, etc.) through a computer system called a biometric system. In these kinds of systems, security is still a question mark because of various intruders and attacks. This problem can be solved by improving security using some of the efficient algorithms available. Hence, a fake person who uses any synthetic sample of an authenticated person, or who is otherwise trying to forge, can be identified and detected.
User verification systems that use a single source of biometric information are not sufficient to meet today's high security requirements for applications, because these systems must contend with noisy data, intra-class variations, spoof attacks, and non-universality. Therefore, there is a need to employ multiple sources of biometric information to provide better recognition performance compared to systems based on a single trait. This paper is an overview of the different categories of multibiometric systems, information fusion in multibiometric systems, and approaches to feature fusion at the feature selection phase.
This paper proposes a structured methodology following a full vulnerability analysis of the general biometric model outlined by Mansfield and Wayman (2002). Based on this analysis, a new multidimensional paradigm named the Biometric Architecture & System Security (BASS) model is proposed, which adds comprehensive security and management layers to the existing biometric model.
Abstract—Biometric systems are increasingly deployed in networked environments, and issues related to interoperability are bound to arise as single-vendor, monolithic architectures become less desirable. Interoperability issues affect every subsystem of the biometric system, and a statistical framework to evaluate interoperability is proposed. The framework was applied to the acquisition subsystem of a fingerprint recognition system: fingerprints were collected from 100 subjects on 6 fingerprint sensors, and the results were evaluated using the framework. The results show that the performance of interoperable fingerprint datasets is not easily predictable, and the proposed framework can aid in removing this unpredictability to some degree.
This study evaluated the performance of a
commercially available face recognition algorithm for the verification of an individual’s identity across three illumination levels
• The lack of research related to lighting conditions and face recognition was the driver for this evaluation
• This evaluation examined the influence of variations in illumination levels on the performance of a face recognition algorithm, specifically with respect to factors of:
– Age, gender, ethnicity, facial characteristics, and facial obstructions
A novel approach to generate face biometric template using binary discriminat...sipij
In identity management system, commonly used biometric recognition system needs attention towards issue
of biometric template protection as far as more reliable solution is concerned. In view of this biometric
template protection algorithm should satisfy security, discriminability and cancelability. As no single
template protection method is capable of satisfying the basic requirements, a novel technique for face
biometric template generation and protection is proposed. The novel approach is proposed to provide
security and accuracy in new user enrolment as well as verification process. This novel technique takes
advantage of both the hybrid approach and the binary discriminant analysis algorithm. This algorithm is
designed on the basis of random projection, binary discriminant analysis and fuzzy commitment scheme.
Three publicly available benchmark face databases (FERET, FRGC, CMU-PIE) are used for evaluation.
The proposed novel technique enhances the discriminability and recognition accuracy in terms of matching
score of the face images and provides high security. This paper discusses the corresponding results.
Multi biometric fake detection system using image quality based liveness de... (INFOGAIN PUBLICATION)
Biometric systems are popular all over the world because of their user-friendly and credible nature in security applications. Despite these advantages, attacks carried out with synthetic, self-manufactured, fake, or reconstructed samples degrade the performance and accuracy of biometric systems and have become a major problem in biometrics. Hence, new effective measures have to be taken to protect these systems. In this paper, we propose a novel software-based multi-biometric fake detection system to detect various types of attacks. The main goal of this system is to enhance the security level of biometric recognition systems through Image Quality Assessment (IQA), which is one of the liveness detection methods. Twenty-five image quality measures are calculated from the test image and used to classify between real and fake traits using a Linear Discriminant Analysis (LDA) classifier. Experiments on databases of 2D face and fingerprint modalities show that the proposed system is easy to implement in real-time applications, since its complexity is very low because only one input image is required. The system is also fast, user-friendly, and non-intrusive, and is competitive with other state-of-the-art approaches in classifying between real and fake traits.
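The IQA-plus-LDA pipeline described above can be sketched as follows, using three toy quality measures instead of the paper's 25 and a hand-rolled two-class Fisher LDA on synthetic "real" and "fake" images; all data, feature choices, and parameters here are illustrative, not the paper's:

```python
import numpy as np

def quality_features(img: np.ndarray) -> np.ndarray:
    """Three toy image-quality measures (a real system would use many more)."""
    gx = np.diff(img, axis=1)
    gy = np.diff(img, axis=0)
    sharpness = np.mean(np.abs(gx)) + np.mean(np.abs(gy))   # gradient energy
    contrast = img.std()
    hist, _ = np.histogram(img, bins=32, range=(0.0, 1.0))
    p = hist / hist.sum()
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))         # grey-level entropy
    return np.array([sharpness, contrast, entropy])

def fit_lda(X: np.ndarray, y: np.ndarray):
    """Two-class Fisher LDA: projection direction w and decision threshold."""
    mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)          # within-class scatter
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), mu1 - mu0)
    thr = w @ (mu0 + mu1) / 2
    return w, thr

rng = np.random.default_rng(0)
real = rng.normal(0.5, 0.2, size=(40, 16, 16)).clip(0, 1)   # noisier "live" captures
fake = rng.normal(0.5, 0.05, size=(40, 16, 16)).clip(0, 1)  # smoother replayed fakes
X = np.array([quality_features(i) for i in np.concatenate([real, fake])])
y = np.array([1] * 40 + [0] * 40)
w, thr = fit_lda(X, y)
pred = (X @ w > thr).astype(int)
print("training accuracy:", (pred == y).mean())
```

The key property exploited is that fake traits tend to have systematically different quality statistics (sharpness, contrast, entropy) than live ones, so a simple linear classifier over those statistics already separates the two classes.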
Increasingly sophisticated biometric methods are being used for a
variety of applications in which accurate authentication of people is necessary.
Because all biometric methods require humans to interact with a device of some
type, effective implementation requires consideration of human factors issues.
One such issue is the training needed to use a particular device appropriately.
In this paper, we review human factors issues in general that are associated with
biometric devices and focus more specifically on the role of training.
An Approach to Speech and Iris based Multimodal Biometric System (IJEEE)
Biometrics is the science and technology of human identification and verification using feature sets extracted from the biological data of the individual to be recognized. Unimodal and multimodal systems are the two classes of systems developed so far. Unimodal biometric systems use a single biometric trait, but their performance is limited by noise in the data, intra-class variations, and spoof attacks. These problems can be mitigated by multimodal biometrics, which rely on more than one source of biometric information to produce better recognition results. This paper presents an overview of multimodal biometrics and the various fusion levels used in them, and suggests the use of iris and speech with score-level fusion for a multimodal biometric system.
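Score-level fusion, as suggested above, combines normalized matcher scores into a single decision score. A minimal sketch, assuming min-max normalization and a weighted sum; the scores, score ranges, and the 0.6/0.4 weights are invented for illustration:

```python
def min_max_normalize(score: float, lo: float, hi: float) -> float:
    """Map a raw matcher score onto [0, 1] using bounds estimated from training data."""
    return min(max((score - lo) / (hi - lo), 0.0), 1.0)

def fuse(iris_score: float, speech_score: float, w_iris: float = 0.6) -> float:
    """Weighted-sum score-level fusion; in practice the weights are tuned
    on a validation set (0.6/0.4 here is arbitrary)."""
    return w_iris * iris_score + (1 - w_iris) * speech_score

# hypothetical raw scores and matcher score ranges
iris = min_max_normalize(72.0, lo=0.0, hi=100.0)    # iris matcher scores in [0, 100]
speech = min_max_normalize(0.31, lo=-1.0, hi=1.0)   # speech matcher scores in [-1, 1]
fused = fuse(iris, speech)
decision = "accept" if fused >= 0.5 else "reject"
print(round(fused, 3), decision)  # 0.694 accept
```

Normalizing before fusing matters because the two matchers emit scores on incompatible scales; without it, one modality would silently dominate the sum.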
To ensure that the object presented in front of biometric device is real or reconstructed sample is a significant
problem in biometric authentication, which requires the development of new and efficient protection measures. This
paper, presents a software-based fake biometric detection method that can be used in multiple biometric systems to
detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of
biometric recognition devices through the use of image quality assessment in a fast and user friendly manner. The
proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using
25 general image quality features extracted from one image to distinguish between real and imposed samples. The
proposed method is highly competitive compared with other as the analysis of the general image quality of real
biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake
traits.
Role of fuzzy in multimodal biometrics system (Kishor Singh)
Person identification is possible through biometrics using physiological and behavioral characteristics such as face, ear, thumb print, voice, signature, and keystroke. Unimodal biometric systems face a range of problems, including noisy data, intra-class variations, limited distinctiveness, non-universality, spoof attacks, and unacceptable error rates. Some of these drawbacks can be overcome by multimodal biometric technologies, which incorporate data from various information sources. In this paper we work on multimodal biometrics using three modalities, face, ear, and foot, to find optimal results using a fuzzy fusion mechanism, producing the final identification decision via fuzzy rules that enhance the quality of the multimodal biometric system.
Existing definitions for biometric testing and
evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human
Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of
presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract
(FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model.
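One way to read the six HBSI definitions above is as a decision tree over whether the presentation was correct, whether the system detected it, whether an erroneous presentation was flagged, and whether features were extracted. The sketch below encodes that reading; it is a simplification of our interpretation, and the HBSI papers remain the authoritative source for the decision logic:

```python
from enum import Enum

class HBSI(Enum):
    DI = "defective interaction"
    CI = "concealed interaction"
    FI = "false interaction"
    FTD = "failure to detect"
    FTX = "failure to extract"
    SAS = "successfully acquired sample"

def classify(correct_presentation: bool, detected: bool,
             flagged_as_error: bool = False, extracted: bool = False) -> HBSI:
    """Map one presentation attempt onto the six HBSI outcomes."""
    if not correct_presentation:
        if not detected:
            return HBSI.DI            # erroneous presentation, system never saw it
        return HBSI.FI if flagged_as_error else HBSI.CI
    if not detected:
        return HBSI.FTD               # correct presentation, system missed it
    return HBSI.SAS if extracted else HBSI.FTX

print(classify(True, True, extracted=True).name)          # SAS
print(classify(False, True, flagged_as_error=True).name)  # FI
```

Separating erroneous-presentation outcomes (DI, CI, FI) from correct-presentation outcomes (FTD, FTX, SAS) is what lets the model attribute each error to the human, the sensor, or the biometric system.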
Fingerprint recognition is an important
biometric technology, and its use is increasing day by day.
Fingerprint recognition is affected by several physiological
factors like age, wear and tear of skin, and technological
factors like sensor technologies. This paper builds on
previous research in the area of gender differences in
fingerprint features, and reports results of differences in
performance of fingerprints collected from males and
females. The researchers propose a fingerprint analysis
framework for testing differences in gender and apply the
framework to fingerprints collected from males and
females.
Feature Level Fusion of Multibiometric Cryptosystem in Distributed System (IJMER)
ABSTRACT: Multibiometrics is the combination of two or more biometrics (e.g., fingerprint, iris, and face). Researchers are focusing on how to secure such systems: the template generated from the biometric needs to be protected. Multibiometrics solves many of the problems of unimodal biometrics. The main objective is to secure the biometric template by generating a secure sketch using a multibiometric cryptosystem, which is then stored in a database. If a biometric template is stolen, it becomes a serious issue for the security of the system and for user privacy. In the existing approach, feature-level fusion is used to combine the features securely with well-known biometric cryptosystems, namely fuzzy vault and fuzzy commitment. The drawbacks of the existing system are that the accuracy of the biometric needs to be improved and noise in the biometrics needs to be reduced. The proposed work enhances security using a multibiometric cryptosystem in distributed-system applications such as e-commerce transactions, e-banking, and ATMs.
Keywords: Biometric Cryptosystem, Error correcting code, Fingerprint, Iris, Multibiometrics, Unimodal biometrics.
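As a rough illustration of the fuzzy commitment scheme mentioned above, the sketch below binds a random key to a binary biometric template using a rate-3 repetition code; real systems use stronger error-correcting codes such as BCH, and all sizes here are illustrative:

```python
import hashlib
import secrets

def repeat_encode(bits, r=3):
    """Repetition code: each bit becomes r copies."""
    return [b for b in bits for _ in range(r)]

def repeat_decode(bits, r=3):
    """Majority vote per r-bit block corrects up to r//2 flips per block."""
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def commit(biometric_bits, key_bits):
    """Store hash(key) plus the key's codeword masked by the template.
    Neither stored value reveals the template on its own."""
    helper = xor(repeat_encode(key_bits), biometric_bits)
    return hashlib.sha256(bytes(key_bits)).hexdigest(), helper

def decommit(noisy_bits, helper, key_hash):
    """Recover the key from a fresh, slightly different template."""
    key = repeat_decode(xor(helper, noisy_bits))
    return key if hashlib.sha256(bytes(key)).hexdigest() == key_hash else None

key = [secrets.randbelow(2) for _ in range(16)]
enroll = [secrets.randbelow(2) for _ in range(48)]   # 16 key bits * rate-3 code
key_hash, helper = commit(enroll, key)

verify = list(enroll)
verify[5] ^= 1   # one flipped bit, within the code's correction capacity
print(decommit(verify, helper, key_hash) == key)  # True
```

The key never needs to be stored: a sufficiently close fresh template regenerates it, while a dissimilar one fails the hash check.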
Robust Analysis of Multibiometric Fusion Versus Ensemble Learning Schemes: A ... (CSCJournals)
Identifying a person using multiple biometrics is a very common approach in existing user validation systems. Most multibiometric systems depend on fusion schemes, and many fusion techniques have shown promising results in the literature by combining multiple biometric modalities with suitable fusion schemes. Similar practices are found in ensembles of classifiers, which increase classification accuracy by combining different types of classifiers. In this paper, we present a comparative study of traditional fusion methods, such as feature-level and score-level fusion, against well-known ensemble methods such as bagging and boosting. Specifically, for our experiments we fused face and palmprint modalities and employed a probability model, Naive Bayes (NB); a neural network model, Multi Layer Perceptron (MLP); and a supervised machine learning algorithm, Support Vector Machine (SVM), as classifiers. The ensemble approaches, boosting and bagging, are statistically well recognized. From the experimental results, the traditional method of score-level fusion is the recommended strategy for biometric fusion over ensemble learning techniques.
An Investigation towards Effectiveness of Present State of Biometric-Based Au... (RSIS International)
The adoption of biometric-based authentication mechanisms began a decade ago, but in real life we still see only unimodal biometrics in use. Of all the different forms of biometrics, fingerprint is the dominant attribute, in contrast to other attributes such as teeth image, palm, facial geometry, retina network, and iris. Multimodal biometrics is believed to offer better security than unimodal biometrics. Although there has been some technical progress on new multimodal methodologies, commercial usage is yet to be seen. Therefore, this manuscript aims to explore the effectiveness of existing approaches to biometric-based authentication in order to investigate unaddressed solutions to this problem. This paper reviews the approaches used to address different problems associated with biometrics and discusses their technical methodologies as well as their limitations.
Research has shown that, for some age groups, the quality of fingerprints can impact the performance of biometric systems. A desirable feature of biometrics is suitability for use across the population. This applied study examines the performance of a fingerprint recognition system in a healthcare environment. Anecdotal evidence suggested front-line healthcare workers may have lower image quality due to frequent hand washing, which may remove oils from their skin. During training, individuals are told to add oil to their fingers by wiping oil from their foreheads to improve the resulting quality of the fingerprints. In the healthcare population the authors tested, compared to two general populations (collected on optical and capacitance sensors), there was a significant difference in skin oiliness but not in image quality. There was a difference between the healthcare and non-healthcare groups in the performance of the fingerprint algorithm when compared against the capacitance dataset.
Multimodal biometric systems are those that utilize more than one physical or behavioural characteristic for enrolment, verification, or identification.
Fake Multi Biometric Detection using Image Quality Assessment (ijsrd.com)
In the current era, where technology plays a prominent role, persons can be identified (for security reasons) based on their behavioral and physiological characteristics (for example fingerprint, face, iris, keystroke, signature, voice, etc.) through a computer system called a biometric system. In these kinds of systems, security remains an open question because of various intruders and attacks. This problem can be addressed by improving security using efficient algorithms, so that a fake person who presents a synthetic sample of an authenticated person, or who attempts to forge one, can be identified.
User verification systems that use a single source of biometric information are not sufficient to meet today's high security requirements for applications. This is because these systems have to contend with noisy data, intra-class variations, spoof attacks, and non-universality. Therefore, there is a need to employ multiple sources of biometric information to provide better recognition performance compared to systems based on a single trait. This paper is an overview of different categories of multibiometric systems, information fusion in multibiometric systems, and approaches to feature fusion at the feature selection phase.
This paper proposes a structured methodology following a full vulnerability analysis of the general biometric model outlined by Mansfield and Wayman (2002). Based on this analysis, a new multidimensional paradigm named the Biometric Architecture & System Security (BASS) model is proposed, which adds comprehensive security and management layers to the existing biometric model.
Biometric research centers on five fundamental areas: data collection, signal processing, decision-making, transmission, and storage. Traditionally, research occurred in subsets of the discipline in separate departments within universities, such as algorithm development in computer science, and speech and computer vision in electrical engineering. In the fall semester of 2002, a class in Biometric Technology and Applications was developed to encourage cross-disciplinary education, where all areas of the biometric model would come together and address issues such as research methodologies and the implementation of biometrics in society at large. The course has been modified to accommodate a wider audience and incorporate graduate student research, which is the foundation for modular mini-courses tailored to specific majors and issues. Having an interdisciplinary group of students better mirrors the makeup of jobs involved in biometrics, such as management, marketing, or research. The challenge lies in providing a course that accounts for these diverse needs.
The proliferation of networked authentication
systems has put focus on the issue of interoperability.
Fingerprint sensors are based on a variety of different technologies that introduce inconsistent distortions and variations in the feature set of the captured image, which makes the goal of interoperability challenging. The motivation of this
research was to examine the effect of fingerprint sensor interoperability on the performance of a minutiae based matcher. A statistical analysis framework for testing
interoperability was formulated to test similarity of minutiae count, image quality and similarity of performance between
native and interoperable datasets. False non-match rate (FNMR) was used as the performance metric in this research.
Interoperability performance analysis was conducted on each sensor dataset and also by grouping datasets based on the
acquisition technology and interaction type of the acquisition sensor. The lowest interoperable FNMR observed was 0.12%.
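FNMR, the performance metric used here, is simply the fraction of genuine (mated) comparisons that score below the decision threshold. A toy computation, with made-up scores and threshold:

```python
def fnmr(genuine_scores, threshold):
    """False non-match rate: fraction of genuine comparisons
    falling below the decision threshold."""
    rejected = sum(1 for s in genuine_scores if s < threshold)
    return rejected / len(genuine_scores)

# hypothetical genuine (mated) comparison scores from a minutiae matcher
scores = [88, 91, 74, 95, 35, 82, 90, 77, 86, 93]
print(fnmr(scores, threshold=40))  # 0.1
```

In an interoperability study, the same computation is repeated per sensor pairing (native versus interoperable datasets) at a fixed threshold, so the FNMR values are directly comparable.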
Intelligent multimodal identification system based on local feature fusion be... (nooriasukmaningtyas)
Biometric identification systems, which use physical features to check a person's identity, ensure much higher security than password and number systems. Biometric features such as the face or a fingerprint can be stored on a microchip in a credit card, for example. A single-modal biometric identification system fails to extract enough features for identification; another disadvantage of using only one feature is that it is not always readable. In this article, a smart multimodal biometric verification model for identifying and verifying a person's identity is proposed based on artificial intelligence methods. The proposed model identifies the unique iris and finger vein patterns of each individual to overcome challenges such as identity fraud, poor image quality, noise, and instability of the surrounding environment. Several experiments were performed on a dataset containing 50 people using many matching methods. The proposed model achieved an accuracy of 98%, with FAR and FRR of 0.0015% and 0.025%, respectively.
This is a complete report on biometrics and fingerprint detection. It includes what a fingerprint is, how to scan and refine a fingerprint, how its detection mechanism works, applications, etc.
In the age of biometric security taking over traditional security features, this is a short introduction to the biometric features one can use to enhance security. The various modalities are explained.
Antispoofing in face biometrics: A comprehensive study on software-based tech... (CSITiaesprime)
The vulnerability of the face recognition system to spoofing attacks has piqued the biometric community's interest, motivating them to develop anti-spoofing techniques to secure it. Photo, video, or mask attacks can compromise face biometric systems (types of presentation attacks). Spoofing attacks are detected using liveness detection techniques, which determine whether the facial image presented at a biometric system is a live face or a fake version of it. We discuss the classification of face anti-spoofing techniques in this paper. Anti-spoofing techniques are divided into two categories: hardware and software methods. Hardware-based techniques are summarized briefly. A comprehensive study on software-based countermeasures for presentation attacks is discussed, which are further divided into static and dynamic methods. We cited a few publicly available presentation attack datasets and calculated a few metrics to demonstrate the value of anti-spoofing techniques.
This study evaluated fingerprint quality across two populations, elderly and young, in order to assess age and moisture as potential factors affecting image quality. Specifically, the examination of these variables was conducted on a population over the age of 62 and a population between the ages of 18 and 25, using two fingerprint recognition devices (capacitance and optical). Collected individual variables included: age, gender, ethnic background, handedness, moisture content of each index finger, occupation(s), subject's use of hand moisturizer, and prior usage of fingerprint devices. Computed performance measures included failure to enroll and quality scores. The results indicated statistically significant evidence that both age and moisture affected the image quality of each index finger at α = 0.01 on the optical device, and statistically significant evidence that age affected the image quality of each index finger on the capacitance device, while moisture was significant only for the right index finger at α = 0.01.
Biometrics is the science and technology of
measuring and analyzing biological data. In information
technology, biometrics refers to technologies that measure and
analyze human body characteristics, such as DNA, fingerprints,
eye retinas and irises, voice patterns, facial patterns and hand
measurements, for authentication purposes. This paper discusses the applications of biometrics, especially in the field of healthcare, and their future uses.
Biometric technologies focus on verifying and identifying people using their biological (anatomical, physiological and behavioral) features. Biometric technologies are generally considered to be the implementation of pattern recognition algorithms because they are intended to identify people.
With the growth of technology comes a growing threat to our data, which is often secured only by passwords; biometrics came into existence to make it more secure. Biometric systems are now adopted and accepted for security purposes in various information and security systems, but they are not immune to attacks. This paper deals with the security of individuals' biometric details. We discuss biometrics and its types, along with the threats and security issues that are not usually talked about, as well as the different technologies that have contributed to biometrics in the long run and their effects. Sushmita Raulo | Saurabh Gawade "Security Issues Related to Biometrics" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5 | Issue-5, August 2021, URL: https://www.ijtsrd.com/papers/ijtsrd44951.pdf Paper URL: https://www.ijtsrd.com/computer-science/computer-security/44951/security-issues-related-to-biometrics/sushmita-raulo
The increasing use of distributed authentication architectures has made interoperability of systems an important issue. Interoperability affects the maturity of the technology and also improves users' confidence in it. Biometric systems are not immune to interoperability concerns. The interoperability of fingerprint sensors and its effect on the overall performance of the recognition system is an area of interest with a considerable amount of work directed towards it. This research analyzed the effects of interoperability on error rates for fingerprint datasets captured from two optical sensors and a capacitive sensor when using a single commercially available fingerprint matching algorithm. The main aim of this research was to emulate a centralized storage and matching architecture with multiple acquisition stations. Fingerprints were collected from 44 individuals on all three sensors, and interoperable False Reject Rates of less than 0.31% were achieved using two different enrolment strategies.
BIOMETRICS AUTHENTICATION TECHNIQUE FOR INTRUSION DETECTION SYSTEMS USING FIN... (IJCSEIT Journal)
Identifying attackers is a major concern for both organizations and governments. Recently, the most widely used applications for the prevention or detection of attacks are intrusion detection systems. Biometrics technology is simply the measurement and use of the unique characteristics of living humans to distinguish them from one another. It is more useful than passwords and tokens, which can be lost or stolen, so we have chosen biometric authentication. Biometric authentication provides the ability to require more instances of authentication in such a quick and easy manner that users are not bothered by the additional requirements. In this paper, we give a brief introduction to biometrics, provide information about intrusion detection systems, and finally propose a method based on fingerprint recognition that would allow us to detect more efficiently any abuse of the computer system being run.
This research focused on classifying Human-Biometric Sensor Interaction errors in real time. The Kinect 2 was used as a measuring device to track the position and movements of the subject through a simulated border control environment. Knowing, in detail, the state of the subject ensures that the human element of the HBSI model is analyzed accurately. A network connection was established with the iris device to know the state of the sensor and biometric system elements of the model. Information such as detection rate, extraction rate, quality, capture type, and other metrics was available for use in classifying HBSI errors. A Federal Inspection Station (FIS) booth was constructed to simulate a U.S. border control setting in an international airport. Subjects were taken through the process of capturing iris and fingerprint samples in an immigration setting. If errors occurred, the Kinect 2 program would classify the error and save it for further analysis.
IT 34500 is an undergraduate course offered to Purdue West Lafayette students. The course gives an introduction into biometrics and automatic identification and data capture technologies
The human signature provides a natural, publicly accepted, and legally admissible method for providing authentication to a process. Automatic biometric signature systems assess both the drawn image and the temporal aspects of signature construction, providing enhanced verification rates over and above conventional outcome assessment. Capturing these constructional data requires the use of specialist 'tablet' devices. In this paper we explore the enrolment performance of a range of common signature capture devices and investigate the reasons behind user preference. The results show that writing feedback and familiarity with conventional 'paper and pen' donation configurations are the primary motivations for user preference. These results inform the choice of signature device from both technical performance and user acceptance viewpoints.
The inherent differences between secret-based authentication (such as passwords and PINs) and biometric authentication have left gaps in the credibility of biometrics. These gaps are due, in large part, to the inability to adequately cross-compare the two types of authentication. This paper provides a comparison between the two types of authentication by equating biometric entropy in the same way the entropy of secrets is represented. Similar to the method used by Ratha, Connell, and Bolle [1], the x and y dimensions of the fingerprints were examined to determine all possible locations of minutiae. These locations were then examined based on the observed probability of minutiae occurring in each of the designated locations. The results of this work show statistically significant differences in the frequencies and probabilities of occurrence for minutiae location, type, and angle, across all possible minutiae locations. These components were applied to Shannon's Information Theory [2] to determine the entropy of fingerprint biometrics, which was estimated to be equivalent to an 8.3-character, randomly chosen password.
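The entropy estimate described above can be illustrated with plain Shannon entropy over observed minutiae events. The grid cells, types, and angles below are invented for illustration; the paper's estimate is built from empirical probabilities over all possible minutiae locations:

```python
import math
from collections import Counter

def shannon_entropy(events):
    """Shannon entropy in bits, estimated from observed event frequencies:
    H = -sum(p_i * log2(p_i))."""
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# hypothetical minutiae described by (grid cell, type, quantized angle)
minutiae = [
    (12, "ridge_ending", 45), (12, "bifurcation", 90),
    (30, "ridge_ending", 0), (7, "bifurcation", 135),
    (30, "ridge_ending", 0), (19, "ridge_ending", 45),
]
print(round(shannon_entropy(minutiae), 3))
```

Summing such per-minutia entropy contributions over a full template is what allows the result to be expressed as an equivalent random-password length.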
In this research, intra-visit match score stability was examined for the human iris. Scores were found to be statistically stable in this short time frame.
A lot of work done in the Center recently has focused on different topics concerning "time". Iris stability across different "times" has been at the forefront due to work in the undergraduate class IT345, the graduate class IT545, and Ben Petry's thesis. Of course, "time" is a fairly inaccurate word to use; assessing stability over time is very ambiguous as a research question. For example, time may mean milliseconds, months, years, or even the life of the user. Upon further examination of the academic literature, the reporting of research duration, collection interval, and the specific time frame of interest is sporadic at best and missing completely at worst. To solve this issue, the Center has created the biometric duration scale (BDS) model with associated suggested best practices for reporting time duration in biometrics.
The BDS model marries the general biometric model with the HBSI model to create a logical flow of five phases: the presentation definition phase, sample phase, processing phase, and enrollment or matching phase. By tracking information through this progression, such as specific subject presentations made, HBSI errors, and FTE/enrollment score (to name a few), performance within the general biometric model can be examined. The BDS model goes one step further by creating specific durations for reporting research-specific metrics. With this model, outcomes that affect yearly performance metrics can be examined through monthly performance, daily performance, or even specific user presentations, and how those subcomponents affect the whole system.
Additionally, best practices for the reporting of duration are also included. The reporting methodology stems from ISO 8601 and is in compliance with ISO 21920. In the common reporting structure, the start date, duration, number of visits at how many intervals, and time scope of interest for the specific research are given in a logical, readily available format along with the very specific, detailed ISO 8601 methodology. The goal of creating a formal script for reporting research duration was to eliminate ambiguity and create an environment where replication and drawing parallels between research efforts are encouraged.
The stability score index, conceptualized in 2013, was designed to address the weaknesses of the zoo menagerie and other performance metrics by quantifying the relative stability of a user from one condition to another. In this paper, the measure of interoperability is the stability score obtained by enrolling on one sensor and verifying on multiple sensors. The results showed that, like performance, individual stability was not consistent across these sensors. When examining stability by sensor family (capacitance, optical, and thermal), we find that capacitive enrollment sensors were the least stable. When both enrolling and verifying on a thermal sensor, individuals were the most stable of the three family types. With respect to interaction type, enrolling on touch and verifying on swipe was more stable than enrolling on swipe and verifying on swipe, which was an interesting finding. Individuals using the thermal sensor generated the most stable stability scores.
According to a report by Frost and Sullivan in 2007, revenues for non-AFIS fingerprint devices in notebook PCs and wireless devices are anticipated to grow from $148.5 million to $1,588.0 million by 2014, a compound annual growth rate of 40.3% [1]. The AFIS market has a compound annual growth rate of 15.2%, with revenues of $445.0 million in 2007. With the development of mobile applications in a number of different market segments, such as healthcare, retail, and law enforcement, this paper analyzed the performance of fingerprints of different sizes, from different sensors...
This is a preview of the databases we use in the Center. The presentation overviews our data collection GUI, data storage (datawarehouse), and our project management database. Each of these databases work together to allow us to efficiently run our operations.
and decision) [4], another component is added to represent the transfer of the authentication decision to the application that relies on the decision from the biometric system. Such applications could be identity management systems (IDMS) or access control systems for logical and/or physical access to resources. These systems can vary in complexity and size, ranging from a local computer log-in all the way to the wide-scale distributed architectures seen in the cases of the U.S. Department of Transportation's Transportation Worker Identification Credential (TWIC) [5] or the Personal Identity Verification (PIV) of Federal Employees and Contractors [6]. The remaining points of vulnerability are the communication channels between these six modules. It is worth noting that not all 11 vulnerability points are unique to the biometric system. Many of the same points (e.g., storage and communication channels) are vulnerable in other authentication systems, and similar methods can be used to limit those particular vulnerabilities.

The most publicized vulnerability in biometric systems resides at the data collection module in the form of spoofing, or presenting artificial representations of biometric samples (see module #1, Fig. 1). If an artificial or fake biometric sample is accepted by the biometric system at this initial stage, the entire biometric system is corrupted and the system is compromised. Attacks on biometric systems are not new; popular culture seeks to circumvent security systems, and biometric systems are not immune. Several online resources are available that describe such attacks on the data collection module, and many movies and television shows have highlighted attacks on such systems. One such attack at this data collection module was outlined in the work of Matsumoto, Yamada, and Hoshino (2002) using "Gummy Fingers" [1].

The biometric research community, as well as industry, has focused its research on preventing such attacks by using the concept of "liveness" detection techniques. Today, newer sensors are improving their resilience against a spoofing attack at this module. In the past, acetate spoofing attacks - where an image of a fingerprint placed on acetate was accepted as a genuine live finger - were easy to reproduce. Now, such attacks are proving increasingly difficult to succeed, hence the more complicated approaches to attacks being waged on the vulnerable sensor. Techniques for liveness detection within the fingerprint modality focus on moisture content, temperature, electrical conductivity, and challenge response.

III. FINGERPRINT IMAGE QUALITY ANALYSIS

The purpose of this research paper was not to prove the vulnerability of the biometric system, but to examine the repeatability of the features of the gelatin fingerprint as compared to the live genuine sample once the image has been acquired. The research question is whether an artificial print captured on an optical sensor exhibits any of the same characteristics as a genuine fingerprint from the same individual captured on the same sensor, and whether any distinguishers might enable the artificial print to be excluded later on in the process, if the initial data collection module accepts it. The research also investigates whether, over time, the features of the two fingerprints remain consistent.

Repeatability of the extracted features is important for the matching process in any type of biometric technology [7]. The features to be examined include minutiae points and image quality. One of the challenges associated with this research is to ensure the image is of sufficient quality. A wide variety of factors can influence the quality of fingerprint samples. Non-uniform contact, inconsistent contact, or irreproducible contact with the fingerprint sensor can result in images with a low signal-to-noise ratio, which is not desirable for feature extraction and matching purposes [9]. Wear and tear of the skin and aging effects can semi-permanently alter ridge characteristics. These factors also affect acquisition of fingerprints by the fingerprint sensor. The importance of quality is widely acknowledged, but there is no standard means of assessing quality. The current standardization effort for assessing quality of biometric samples refers to three different connotations of quality:
* Character
* Fidelity
* Utility

These three connotations of biometric sample quality can be directly applied to fingerprint sample quality. Character is a description of quality based on inherent features from the source of the fingerprint. Individuals who have scarred fingerprints or dry or cracked skin on the fingertips will provide samples with poor character. Fidelity is a description of quality based on the degree of similarity between the actual fingerprint and the fingerprint image acquired by the sensor. Inconsistent contact with the fingerprint sensor can lead to fingerprint samples with poor fidelity. Utility is a description of quality based on the observed or predicted contribution of the fingerprint sample to the overall performance of the fingerprint recognition system. Utility of a fingerprint sample is directly affected by the character and fidelity of the fingerprint sample, and should be closely related to performance of the recognition system.

A substantial amount of research has been conducted in the area of quality assessment, all of which gives varying levels of importance to character, fidelity, and utility. Previous research in the field of fingerprint image quality assessment can be generalized into three categories: local features analysis, global features analysis, and quality analysis as a classification problem [10]. Features of the fingerprint image such as minutiae count, fidelity of minutiae, contrast ratio between ridges and valleys, capture area of the fingerprint, and determination of dominant direction are used by quality algorithms in varying capacities to make quality determinations. For the purpose of this research, fingerprints from live fingers and gelatin fingers were examined by two different image quality algorithms, one provided by Aware, Inc. and the other provided as a part of NIST Fingerprint Imaging Software (NFIS).

IV. METHODOLOGY

This research involved two separate experiments. The first
Authorized licensed use limited to: Purdue University. Downloaded on February 18,2010 at 15:00:20 EST from IEEE Xplore. Restrictions apply.
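One of the image features named above, the contrast ratio between ridges and valleys, can be illustrated with a toy sketch. This is not the Aware or NFIS algorithm; the fixed threshold, the 0-1 normalization, and the synthetic images are illustrative assumptions only:

```python
# Toy sketch of one quality feature: the contrast between ridges (dark
# pixels) and valleys (light pixels). NOT the Aware or NFIS algorithm;
# the threshold and synthetic images below are illustrative assumptions.

def ridge_valley_contrast(image, threshold=128):
    """Split grayscale pixels (0-255 ints) into ridge/valley sets by a
    fixed threshold and return the mean-intensity gap scaled to 0..1."""
    pixels = [p for row in image for p in row]
    ridges = [p for p in pixels if p < threshold]    # dark = ridge
    valleys = [p for p in pixels if p >= threshold]  # light = valley
    if not ridges or not valleys:
        return 0.0  # uniform image: no ridge/valley structure at all
    gap = sum(valleys) / len(valleys) - sum(ridges) / len(ridges)
    return gap / 255.0

# A crisp synthetic print (alternating dark/light columns) scores well;
# a faint one, as produced by inconsistent contact, scores poorly.
crisp = [[20 if c % 2 == 0 else 230 for c in range(8)] for _ in range(8)]
faint = [[110 if c % 2 == 0 else 140 for c in range(8)] for _ in range(8)]
print(round(ridge_valley_contrast(crisp), 3))  # 0.824
print(round(ridge_valley_contrast(faint), 3))  # 0.118
```

The intuition matches the text: poor contact lowers the ridge-to-valley signal, and a quality algorithm can penalize the resulting low contrast.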
experiment was conducted in two stages. Stage 1 of the first experiment involved creation of a set of images from an artificial gelatin finger that was crafted from the same subject who provided the genuine fingerprint. The procedures to accomplish this feat required adaptation of several different methodologies outlined in the literature for creating artificial fingerprints, including the work done by Matsumoto, Yamada, and Hoshino [1]. Prior to creating the mold, the following necessary ingredients and utensils were gathered: plastic clay, hot water, and a pair of plastic tongs. To create the mold, a quantity of plastic clay sufficient to cover the genuine finger was required. In order to make the plastic clay malleable, it was placed briefly in boiling water. In order to be utilized, the plastic clay had to have a consistency such that it enabled a mold to be created by placing the finger with only light pressure. When the plastic clay had attained a sufficient level of pliability, the clay was cooled. Once the clay had cooled enough to be touched, the finger was placed into the clay to a depth sufficient to create the mold to be used to craft the gelatin finger. The genuine finger was kept in the plastic clay until the clay had cooled enough to retain its shape and the details of the genuine finger. After the finger was removed, the mold was allowed to cure for an additional 10 minutes. The resulting mold for this study is shown in Fig. 2.

Fig. 2 Plastic mold formed to create gelatin finger

The next step was to create a gelatin mixture capable of producing artificial fingerprints from the mold that would be recognizable to the sensor. Two sheets of gelatin weighing 3.5g were soaked in cold water for five minutes. In order to remove the excess water, the gelatin sheets were dried until the gelatin weighed 14-16g. Next, a bowl was immersed in extremely hot water, and the gelatin was placed into the bowl to soften and melt. Once the gelatin had melted, it was poured into the clay mold. Immediately after placing the gelatin in the mold, the mold was placed in a refrigerator to cool for 10 minutes at a temperature of 1°C. Cooling transforms the gelatin to a state that is resistant to changes in shape when touched. Ambient room temperature, when this experiment was conducted, was 22°C.

After the gelatin finger had cured in the refrigerator for a period of approximately one hour, it was removed from the clay mold and placed on the sensor to determine whether it was actually able to produce images. Ten attempts to acquire an image were made; in all 10 instances, an image was produced. The software verified the mold. If the mold had not been verified by the software, then the mold would have been destroyed and the process started again.

The artificial finger was returned to the refrigerator in a simple (airtight, but not vacuum-sealed) plastic storage container for 48 hours at a temperature of 2°C. This procedure allowed the gelatin finger to completely solidify to its permanent state.

After removing the gelatin finger from the refrigerator, tests were conducted on it to estimate the optimal load required for acquiring images. In general terms, it is best to use the least weight possible to produce a scan in order to minimize the spreading and dissolution of the gelatin finger's valleys and ridges. Testing of loads ranging from 200g to 1,000g, as measured by a Tanita digital scale, was performed. Approximately 200g was determined to be the lower limit to produce an image, with 550g being the upper limit before distortion and inability to match occurred. The next stage of the experiment involved acquisition of a series of images from the gelatin finger. All of the images were acquired from the optical sensor using the gelatin finger over a 15-minute time period. After 15 minutes of acquiring images, the gelatin finger had degraded to the point at which it was no longer able to be accepted by the optical sensor. In all, 163 images were produced over the 15-minute time period. A detailed description of the 160 images utilized in the study is provided in the results section.

Stage 2 of the first experiment called for the collection of a series of live samples. One hundred sixty live samples were acquired from the same finger that was used to create the artificial gelatin finger. The live samples were collected over an 8-minute time period on a commercially available optical sensor. The authors chose to collect 160 live samples, as this was the same number of fingerprints collected from the gelatin finger. These 160 live images were all stored according to the time collected; they are used to provide a baseline quality assessment that will be compared against the samples generated by the gelatin finger.

After data collection, both sets of images (from the live finger and the gelatin finger) were processed through the NIST Fingerprint Image Software (NFIS) package. The MINDTCT function was used to count the number of minutiae present in each individual image. The NFIQ function was used to evaluate image quality, which is determined on a rating scale of 1 to 5, with 1 being the best and 5 being the worst. The results of MINDTCT and NFIQ from both groups (the live fingerprints and the gelatin fingerprints) were then compared by means of statistical t-tests (using an α level of 0.05) to determine if any statistical difference existed across the groups. Aware, Inc. offers commercially available image quality and minutiae count software; this software was used to extract image quality scores and minutiae counts for fingerprints from the live finger and gelatin finger groups. By utilizing two different software packages to analyze the fingerprints, we sought to eliminate bias that might have been generated by utilizing only a single application.

The second experiment involved collecting fingerprint images from the left index finger and the right thumb of 30 different subjects. Each subject was asked to provide 20
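The group comparison described above can be sketched in stdlib Python. The paper does not state which t-test variant was applied; this sketch assumes Welch's unequal-variance form, and the minutiae counts below are made-up illustrations, not the study's data:

```python
# Sketch of a two-sample t-test, as used above to compare the live and
# gelatin groups at an alpha level of 0.05. Welch's unequal-variance
# form is an assumption; the sample data are illustrative only.
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two samples."""
    va, vb = variance(a), variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb  # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

live_counts = [38, 40, 39, 41, 40]     # hypothetical minutiae counts
gelatin_counts = [45, 47, 50, 52, 48]  # hypothetical minutiae counts
t, df = welch_t(live_counts, gelatin_counts)
print(round(t, 2))  # -6.71; |t| well past ~2, so significant at 0.05
```

With 160 samples per group, as collected here, the critical value at α = 0.05 is close to 1.96, so any |t| beyond roughly 2 indicates a significant difference.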
images of each finger on three different optical sensors. A silicon mold was created of the left index finger and right thumb of each subject, and the molds were used to provide 20 fingerprint images on each of the three optical sensors. The procedures used in [1] were also employed in creating the silicon molds in the second part of the experiment. At the end of the data collection there were 3600 fingerprint images from live fingers, and 3600 fingerprint images from silicon fingers.

V. RESULTS

A. Experiment 1

Creation and acquisition of images from gelatin fingers can be problematic, as previous research has shown that gelatin fingers do not afford consistent repeatability. However, this study provides anecdotal evidence suggesting that better preparation and storage of the artificial finger can aid in the repeatability of the images produced. The first 39 samples provided consistently successful spoofing results; on the 40th presentation of the gelatin finger, a failure-to-acquire (FTA) resulted. Overall, 163 images were acquired, but only 160 images were used for the final study. Degradation on the final 3 images rendered the images unusable. The acquisition rate for this particular gelatin fingerprint was 90.7%, producing an FTA rate of 9.3%. Fig. 3 shows the gelatin print (left) and a live print (right). The FTA rate for the live finger was 0.0%.

Fig. 3 Gelatin finger (left) and live finger (right)

The minutiae count analysis on both of the fingerprint groups was performed. Fig. 4 and Fig. 6 show box plots of the minutiae count from the live finger and gelatin finger, respectively, generated from Aware, Inc.'s image quality tool. Fig. 5 and Fig. 7 show box plots of the minutiae count from the live finger and gelatin finger, respectively, generated from NFIS's MINDTCT.

Fig. 4 Box plot, live and gelatin finger minutiae count using Aware QualityCheck

Fig. 5 Box plot, live and gelatin finger minutiae count using NFIS MINDTCT

The results from the box plot graphs generated by both the Aware and NFIS software programs show that the live fingerprints have a lower minutiae count than the gelatin fingerprints, which is most likely a result of indirect and inconsistent contact with the optical sensor. In order to study the deterioration of the gelatin fingerprints, the first 16 and last 16 samples from the live and gelatin fingerprint groups were used. Fig. 6 shows a box plot of the live and gelatin fingerprint minutiae count for the first 16 prints, and Fig. 7 shows a box plot of the live and gelatin fingerprint minutiae count for the last 16 prints. An interesting observation is that the minutiae count increases for the gelatin fingerprint group, but stabilizes for live fingerprints.

The stabilization of minutiae count for the live fingerprints can probably be attributed to habituation or acclimation to the device. The subject has been acclimated to placing the sample finger on the optical sensor, which reduces the inconsistent contact of the finger surface with the platen of the sensor. Another interesting observation is the increase in minutiae count between the gelatin fingerprints over time. This suggests degradation of the gelatin finger and mold because of repetitive use and introduction of cracks in the mold used to create the gelatin finger. Evidence suggests that, over time, the number of minutiae for the gelatin fingerprint increases, while the number of minutiae for the live fingerprint stabilizes.
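The first-16 vs. last-16 comparison above can be sketched as a simple drift measure: the mean minutiae count of the last k samples minus the first k. A positive drift mirrors the spurious minutiae introduced as the gelatin finger cracks; the series below are illustrative numbers, not the study's measurements:

```python
# Sketch of the first-16 vs. last-16 degradation comparison. The
# example series are illustrative, not the study's data.

def degradation_delta(counts, k=16):
    """Mean minutiae count of the last k samples minus the first k."""
    first, last = counts[:k], counts[-k:]
    return sum(last) / k - sum(first) / k

live = [40] * 16 + [39] * 16  # stabilizes with habituation
gel = [45] * 16 + [60] * 16   # drifts upward as the mold degrades
print(degradation_delta(live))  # -1.0
print(degradation_delta(gel))   # 15.0
```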
Fig. 6 Box plot, live and gelatin finger minutiae count using Aware, first 16 prints

Fig. 7 Box plot, live and gelatin finger minutiae count using Aware QualityCheck, last 16 prints

Fig. 8 and Fig. 9 show the scatter plots for minutiae count versus sample number for the live and gelatin fingerprint groups. Both of these graphs give credence to the observations made from the box plots.

Fig. 8 Scatter plot, minutiae count vs. sample number using NFIS MINDTCT

Fig. 9 Scatter plot, minutiae count vs. sample number using Aware

Image quality is another metric that was considered important for this research. Fig. 10 shows the scatter plot of image quality scores obtained using Aware, Inc.'s quality algorithm on the live and gelatin fingerprints. The graph of image quality scores clearly indicates a degradation of the gelatin fingerprint. T-tests of image quality scores between the live and gelatin fingerprints showed a statistically significant difference. The severe decrease in image quality noticed with repeated use of the gelatin finger indicates that it would be of practical use only for the first 10 or so attempts. Table I shows the results from the t-tests. Interestingly, image quality scores might not provide a clear indication of a spoofing attempt, because in the initial nine samples there was no statistically significant difference between the image quality score means of the two groups.

Fig. 10 Scatter plot, image quality scores using Aware QualityCheck

TABLE I
T-TEST RESULTS

Groups             N     Mean    p-value
Aware_Live_IQ      160   79.22   <.05
Aware_Gelatin_IQ   160   61.0
NFIS_Live_IQ       160   1.88    <.05
NFIS_Gelatin_IQ    160   2.21

(Each p-value compares the live and gelatin rows of a pair.)
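The t-tests above treat the two groups as independent samples. When each artificial sample is instead matched to the same subject's live sample, a paired-sample t statistic on the per-subject differences applies. A minimal stdlib sketch; the image quality scores below are illustrative stand-ins, not values from the experiments:

```python
# Sketch of a paired-sample t-test, the form appropriate when each
# artificial print is paired with the same subject's live print. The
# scores below are illustrative, not the study's data.
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired-sample t statistic on element-wise differences x - y."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

live_iq = [80, 78, 82, 76, 79]     # hypothetical per-subject scores
silicon_iq = [62, 60, 66, 58, 64]  # hypothetical per-subject scores
print(round(paired_t(live_iq, silicon_iq), 2))  # 26.88
```

Pairing removes between-subject variation from the comparison, which is why it suits a design where each subject contributes both a live and an artificial sample.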
B. Experiment 2

Using the Aware software, the minutiae count and quality scores were computed for all the live and silicon fingerprints collected in the second experiment. Table II shows the summary statistics for image quality and minutiae count for the dataset of live and silicon fingerprints.

TABLE II
SUMMARY STATISTICS

Groups           N     Median Image Quality Score   Median Minutiae Count
Live Fingers     3600  80.0                         39.0
Silicon Fingers  3600  69.0                         39.0

Groups           N     Mean Image Quality Score     Mean Minutiae Count
Live Fingers     3600  76.46                        39.44
Silicon Fingers  3600  61.35                        38.98

Fig. 11 and 12 show the box plots of quality scores and minutiae count, respectively. The spread of the image quality scores is much larger for the silicone finger than for the live fingers.

A paired t-test for a statistically significant difference was conducted on the image quality scores. The test showed a p-value of < .05, which indicated that the difference in image quality scores was statistically significant. The minutiae count for the dataset of silicon fingerprints and the dataset of live fingerprints is very similar, but the quality scores for the two groups were significantly different. The results from this experiment indicate that image quality scores from the silicon fingerprints are of medium quality, but they are still significantly different from the image quality of live fingers. This provides an interesting observation: the extraction of minutiae points is very similar between silicon mold fingerprints and live fingerprint images, but the difference in quality scores points to noise in the ridge flow, the contrast levels between ridges and valleys, and other factors. This could also be due to the distortion and elastic deformation of the silicon fingerprint differing from the corresponding live fingerprint. This observation merits further research, since ridge flow and contrast analysis could be used as a differentiating factor. The spread of the image quality scores of the silicon mold is also more variable than the image quality scores of the live fingerprint images, which indicates degradation of the silicon fingerprints between successive attempts.
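The paired comparison described above can be sketched in a few lines. The per-attempt quality scores below are illustrative placeholders, not the study's data; the t statistic is computed directly from the paired differences using only the standard library:

```python
import math
from statistics import mean, stdev

def paired_t_statistic(a, b):
    """t statistic for a paired t-test: mean difference over its standard error."""
    diffs = [x - y for x, y in zip(a, b)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Hypothetical per-attempt image quality scores (0-100); not the paper's data.
live_scores = [80, 78, 82, 75, 79, 81, 77, 80]
silicone_scores = [69, 60, 72, 55, 65, 70, 58, 66]

t = paired_t_statistic(live_scores, silicone_scores)
print(f"t = {t:.2f}")  # a large positive t suggests live scores exceed silicone scores
```

For the actual significance decision, the t statistic would be compared against a t distribution with n - 1 degrees of freedom (or a library routine such as scipy.stats.ttest_rel used directly).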
Fig. 11 Box plot, image quality scores using Aware image quality software

Fig. 12 Box plot, minutiae count

VI. CONCLUSION

The danger with providing a recipe for spoofing is that an attack methodology for a biometric sensor is revealed. However, in this case, the attack is analogous to an individual revealing a PIN to a fraudster and accompanying the fraudster to the ATM to watch the fraudster withdraw the individual's money. The test was not designed as a spoofing enquiry to evaluate the security of the system, but rather to understand the characteristics of the gelatin finger and silicon finger compared to their live counterparts.

Some interesting results were obtained from the analysis in both experiments. In the first experiment, the gelatin finger was able to provide 163 samples with an acquisition rate of 90.7%. Further analysis of fingerprints from the live and gelatin fingers showed a considerable difference in the basic characteristics between the two groups. Repeated use of the gelatin finger resulted in a rapid degradation of the quality of the prints provided, which was reinforced by an increase in minutiae count with repeated use. Expecting a gelatin finger to survive over 100 attempts would be unreasonable, but our analysis showed that even after 10 uses the gelatin finger exhibited severe degradation in quality, even though the system matched the spoof. Stabilization of the minutiae count for the live fingerprint was an unexpected result of the experiment, but it reaffirms the notion of habituation and how it can be used to acquire fingerprint samples representative of their owner. Comparing the minutiae count results of the first and second experiments, the minutiae count of silicon mold fingerprints was more similar to the live fingerprints than that of the gelatin mold fingerprints. Future work for this study is to perform matching operations between the live fingerprints and silicon fingerprints to examine the effect of quality and minutiae count on the
matching error rates. Results from both experiments
showed a statistically significant difference in image quality
scores between the fake fingerprints and live fingerprints,
which could be used as an anti-spoofing mechanism.
Understanding the characteristics of fake fingerprints is
important in devising countermeasures, and extremely
important in increasing the security of fingerprint biometric
systems.
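As a rough illustration of the anti-spoofing idea above, a quality-score threshold placed between the live and fake score distributions could flag suspect presentations. The threshold of 70 is a hypothetical value chosen between the group means reported in Table II, not one proposed by the study:

```python
# Hypothetical quality threshold between the live mean (76.46) and the
# silicone mean (61.35) reported in Table II; a deployed system would
# tune this on validation data rather than hard-code it.
SPOOF_QUALITY_THRESHOLD = 70.0

def likely_spoof(image_quality_score: float) -> bool:
    """Flag a presentation whose image quality score falls below the threshold."""
    return image_quality_score < SPOOF_QUALITY_THRESHOLD

print(likely_spoof(76.46))  # score near the live-finger mean -> False
print(likely_spoof(61.35))  # score near the silicone-finger mean -> True
```

A single-score check like this would only be one signal among several; the paper's finding is that quality statistics separate fake from live samples, not that one fixed threshold suffices.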