This seminar report describes the combination of biometrics with cryptography to make authentication systems more secure. It details the real-time errors that occur in biometric devices, namely false accepts and false rejects, and their rates (FAR and FRR), the methods used to handle such errors, and how encryption keys can be secured.
Seminar report on Error Handling methods used in bio-cryptography
Dept Of I.T., SSGMCE, Shegaon Page 1
1. INTRODUCTION
The need for identity verification is more important than ever due to the development
of increasingly sophisticated and, more importantly, security-critical applications. Existing
authentication mechanisms can be divided into three categories: something that you know
(passwords), something that you have (tokens) and something that you are (biometrics).
Passwords were for a long time the most popular method of authentication due to their ease
of use. However, because of their vulnerability to guessing and dictionary attacks, there is
a need for strong random passwords which cannot be easily guessed. Though this increases
the level of security, the issue of secure storage of passwords comes to the forefront, since
they can no longer be remembered due to their associated randomness. As a result,
significant interest has been raised in recent years in using biometrics, which represent the
biological properties of humans such as fingerprints, finger veins, iris, ECG, keystroke
dynamics and palm [1 - 4]. Thus, users are relieved from remembering passwords or losing
tokens. However, the use of biometrics also raises critical security issues, since if a biometric is
compromised it is gone forever, unlike passwords, which can be reissued when
necessary.
Though cryptography has evolved in leaps and bounds, it still suffers from the critical issue
of key management. Cryptographic theories are built around the primitive assumption that the
keys are completely secure. However, as with passwords, secure storage of keys is a
major issue, since security depends completely upon the secrecy of the keys.
In the process of identifying solutions to the above-mentioned issues, researchers have
looked at possibilities to combine cryptography with biometrics. The main hurdle to be
overcome in biometric cryptosystems is the noisy nature of biometric information. Comparing
plaintext information used in traditional cryptography with biometric information,
there exists a significant difference, since biometric information can vary at each capture
due to acquisition noise. Thus, multiple templates captured from an individual can be
slightly different, meaning that they are not reproducible; this provides evidence that
biometric templates cannot simply be encrypted as in conventional cryptography. Therefore, it is
necessary to incorporate error-tolerant mechanisms when dealing with transformed biometric
templates to overcome the effect of noisy biometric inputs.
2. MOTIVATION
A biometric system performs automatic recognition of a person using his or her behavioural or
physical characteristics, such as voice, signature, face, fingerprints or palm. The processing
stages of a biometric system involve image capture, pre-processing, feature extraction and
template storage in the system database. During verification, the features of the input query
image are compared with the stored features for final authentication. Biometric systems have
various limitations, such as noisy sensor data, spoof attacks, inter-class similarity and
intra-class variations.
A primary motivation for using biometrics is to easily and repeatedly recognize an
individual so as to enable an automated action based on that recognition. The reasons for
wanting to automatically recognize individuals can vary a great deal; they include reducing
error rates and improving accuracy, reducing fraud and opportunities for circumvention,
reducing costs, improving scalability, increasing physical safety, and improving convenience.
Often some combination of these will apply. For example, almost all benefit and entitlement
programs that have adopted biometrics did so for a combination of these reasons.
To increase performance accuracy, and to design a biometric system or propose
a new approach to an existing system, one has to understand the basic biometric system, its
parameters, its limitations, the biometric scenario, the biometric characteristics used for an
application, the types of errors and the existing approaches. No biometric system is optimal;
there is always a need to improve the accuracy and performance of a biometric system. Hence
the motivating idea for this seminar: using cryptography to handle errors in biometrics.
3. RELATED WORK
3.1 A detailed survey of recent research on extracting biometric keys is as follows: -
Monrose, Reiter, Li and Wetzel were among the first: their system [11] is based on
key-stroke dynamics. A short binary string is derived from the user’s typing patterns and
then combined with her password to form a hardened password. Each keystroke feature is
discretized as a single bit, which allows some error tolerance for feature variation. The short
string is formed by concatenating the bits.
Monrose, Reiter and Wetzel proposed a more reliable implementation based on voice
biometrics, but with the same discretization methodology [8]. Voice biometrics brought an
improvement in performance: the entropy of the biometric key increased from 12 bits to 46
bits, while the false rejection rate fell from 48.4% to 20%.
Hao and Chan made use of handwritten signatures. They defined forty-three
signature features extracted from dynamic information like velocity, pressure, altitude and
azimuth. Feature coding was used to quantize each feature into bits, which were concatenated
to form a binary string. This achieved on average 40-bit key entropy with a 28% false
rejection rate; the false acceptance rate was about 1.2% [10].
Fingerprints are among the more reliable biometrics, and there is a long history of
their use in criminal cases [1]. Soutar, Roberge, Stoianov, Gilroy and Vijaya Kumar reported
a biometric-key system based on fingerprints and were the first to commercialize this
technology into a product, Bioscrypt. They extract phase information from the fingerprint
image using a Fourier transform and apply majority coding to reduce the feature variation.
Clancy, Kiyavash and Lin proposed a similar application based on fingerprints in [7]
and used a technique called a fuzzy vault, which had been first introduced by Juels and Sudan.
In Clancy’s work, the fingerprint minutiae locations are recorded as real points which form a
locking set. A secret key can be derived from this through polynomial reconstruction. In
addition, chaff points are added to the locking set to obscure the key. If a new biometric
sample has a substantial overlap with the locking set, the secret key can be recovered by a
Reed-Solomon code. This work is reported to derive a 69-bit biometric key but unfortunately
with a 30% false rejection rate.
Goh and Ngo combined some of the above techniques to build a system based on face
biometrics [9]. They adopted the biometric locking approach used by Soutar et al. Eigen-
projections are extracted from the face image as features, each of which is then mixed with a
random string and quantized into a single bit. A binary key is formed by concatenating these
bits, and majority-coding is added as suggested by Davida et al. Error correction involves
polynomial thresholding which further reduces feature variance. Goh and Ngo report
extracting 80-bit keys with a 0.93% false rejection rate. This is beginning to approach the
parameters needed for a practical system. However, the reported experiments are based on
images taken from a continuous video source with minor variations, rather than on a face
database, so doubts remain about the evaluation of this work.
In summary, a range of biometrics has been used in previous practical work. With the
exception of Goh and Ngo’s paper, the false rejection rates are over 20%, which is way
beyond the level acceptable for practical use. In addition, the key lengths are usually too
short. There is also some theoretical work on key extraction from noisy data.
However, biometric applications lie between the extremes of secret data and fully
public data: people leave behind fingerprints, and their irises can be photographed
surreptitiously, so a biometric sample stolen in this way will reveal most of its entropy to the
attacker. A related issue is issuing multiple keys for different applications.
4. PROBLEM STATEMENT
4.1 Biometric devices can make two kinds of errors:
1. The false accept: In which the device accepts an unauthorized person.
2. The false reject: In which the device falsely rejects an authorized person.
Picking the right biometric device requires careful analysis of which type of error will
have the greater impact on both security and everyday operations. Because different types of
biometric devices have widely differing false reject and false accept rates, the parameters of
the specific application must be considered. Where a device serves a small population or has
limited use, a higher false reject rate may not make much difference. In larger populations or
with frequent use, a high false reject rate can affect enough people to make the system
impractical.
A number of biometrics, such as keystroke patterns, voice, handwritten signatures,
fingerprints, iris, DNA and face images, have been studied for cryptographic key binding. The
major problem in binding a cryptographic key with biometrics is that biometric measurements
are not exactly reproducible: typically, two measurements of the same biometric will give two
similar but different results, which violates the exactness requirement of a cryptographic key.
To solve the problem, error tolerance techniques have to be applied. In this report we survey
the techniques proposed in the literature for different biometrics.
The performance of a bio-cryptosystem is usually measured using the parameters FAR
and FRR which are largely affected by the accuracy of the underlying error correcting method
used in the system. Therefore, it is crucial to examine the facts that must be considered when
choosing an appropriate error tolerance method. In the beginning, the types of errors and
related error patterns must be recognized in relation to the considered biometric. Furthermore,
with methods such as quantization the optimum selection of threshold levels should be made
based on the experimental analysis in order to have acceptable values for FAR and FRR.
However, with regard to ECC selection there are many more factors to be considered. One
of the most important aspects is the error correcting capability of the ECC with respect to the
bit error rate. Furthermore, the error correcting capability must be good enough to distinguish
between intra-domain comparisons and inter-domain comparisons. Therefore, before deciding
on an ECC scheme it is critical to analyze the genuine and imposter distance distributions of
the considered biometric information.
5. METHODOLOGY
5.1 Introduction to Biometrics:
Biometrics is the measurement of the characteristics of a person for authentication. The
various characteristics include the iris, face, fingerprint and palm print, and may be
physiological or behavioural; the measurements related to each characteristic differ. A
biometric system is analyzed based on cost, accuracy, complexity, inter-operability and system
security. Biometrics can be implemented in a wide range of categories, some of the most
widely used of which are elaborated below:-
Fig 5.1.1 Classification of biometrics
Fingerprint Recognition: - It takes into consideration the minutiae present on the tips of
the human fingers as a characteristic for authentication.
Face Recognition: - It assesses facial characteristics of humans for authentication.
Eye Recognition: - It includes two types of recognition, namely iris and retina
recognition. Iris recognition measures the iris of the eye as an authentication
facet, while retina recognition considers the veins present at the back of the eye.
Typing Recognition: - It measures the unique characteristics of a person’s typing for
establishing identity.
Ear Recognition: - It examines the shape of the ear for authentication purposes.
5.2 Introduction to Cryptography:
Cryptography involves creating written or generated codes that allow information to
be kept secret. Cryptography converts data into a format that is unreadable for an
unauthorized user, allowing it to be transmitted without anyone decoding it back into a
readable format and thus compromising the data.
Information security uses cryptography on several levels. The information cannot be
read without a key to decrypt it. The information maintains its integrity during transit and
while being stored. Cryptography also aids in non-repudiation. This means that neither the
creator nor the receiver of the information may claim they did not create or receive it.
Cryptography is also known as cryptology.
Fig: 5.2.1 Overview-of-the-encoding-method
Fig: 5.2.2 Cryptography
Cryptography also allows senders and receivers to authenticate each other through the
use of key pairs. There are various types of algorithms for encryption, some common
algorithms include:
Secret Key Cryptography (SKC): Here only one key is used for both encryption and
decryption. This type of encryption is also referred to as symmetric encryption.
Public Key Cryptography (PKC): Here two keys are used. This type of encryption is
also called asymmetric encryption. One key is the public key and anyone can have
access to it. The other key is the private key, and only the owner can access it. The
sender encrypts the information using the receiver’s public key. The receiver decrypts
the message using his/her private key. For non-repudiation, the sender encrypts plain
text using a private key, while the receiver uses the sender’s public key to decrypt it.
Thus, the receiver knows who sent it.
Hash Functions: These are different from SKC and PKC. They have no key at all and
are also called one-way encryption. Hash functions are mainly used to ensure that a
file has remained unchanged.
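The keyless, one-way nature of hash functions can be illustrated with a short Python sketch using the standard-library hashlib (SHA-256 is chosen arbitrarily; the function name is illustrative):

```python
import hashlib

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of the given bytes (no key involved)."""
    return hashlib.sha256(data).hexdigest()

original = b"seminar report on bio-cryptography"
tampered = b"Seminar report on bio-cryptography"  # one character changed

# Equal inputs always give equal digests; any change to the input
# changes the digest, which is how unchanged files are detected.
assert file_digest(original) == file_digest(original)
assert file_digest(original) != file_digest(tampered)
```

Because the digest cannot be inverted, storing it reveals nothing useful about the original data, which is why hash functions are described as one-way encryption.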
5.3 Difference between Biometric Recognition and Cryptography Authentication:
Biometric systems refer to the use of biometric recognition technology to authenticate
a person’s identity through his or her unique biological characteristics (e.g., fingerprints, palm
prints, iris, and personal signature) in lieu of a password. This approach can thus authenticate
the user’s identity without requiring the user to remember multiple passwords. This
authentication method usually first obtains a threshold range to discriminate between
acceptable and unacceptable inputs. However, repeated use, improper storage, or transmission
leaks may compromise security.
The difference from cryptographic technology is that these authentication rates do not
need to achieve 100% accuracy; that is, a certain degree of error in data matching is tolerated.
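The threshold-based matching described above can be sketched as follows; the bit-string templates and threshold values are purely illustrative:

```python
def hamming_distance(a: str, b: str) -> int:
    """Number of bit positions in which two equal-length bit strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def matches(enrolled: str, query: str, threshold: int) -> bool:
    """Accept the query if it lies within `threshold` bit errors of the
    enrolled template -- exact equality is NOT required, unlike the
    comparison of two cryptographic keys."""
    return hamming_distance(enrolled, query) <= threshold

enrolled = "1011001110001011"
query    = "1011001010001111"   # two noisy bits

print(matches(enrolled, query, threshold=3))  # True: within tolerance
print(matches(enrolled, query, threshold=1))  # False: too many errors
```

Raising the threshold admits more noise (lowering FRR) but also admits more impostors (raising FAR), which is exactly the trade-off discussed later in this report.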
5.4 Error Control Mechanism:
The most commonly adopted error tolerance methods are as follows:
A biometric system works in two steps: registration and verification. For registration, a
person provides a live biometric for measurement and the results are stored. For
verification, the person must provide the same biometric for new measurements. The output
of the new measurements is compared to the previously stored results. Biometric
measurements generate noisy data, and it is a challenging problem to achieve security with
noisy data [17]. Among other things, non-reproducibility is the most difficult problem to be
solved for biometric applications, including cryptographic key binding. In this section the
common methods found in the literature are briefly described:
Quantization:
An individual biometric image is quantized into a number of small units. The
biometric information inside each unit is assumed to be located at the centre of the
unit.
Subsetting:
Consider a scenario where a biometric template is represented with N symbols.
If, during verification, only a subset of the information, P (P < N), is sufficient to
reconstruct the raw biometric information, this is referred to as subsetting error correction.
The prime example of subsetting is the fuzzy vault scheme.
Error Correction Coding:
Error correction coding is often used for noisy data. Two common choices are Reed-
Solomon (RS) coding and Hadamard coding, the latter especially for iris encoding. During
registration, some redundant information (also called helper data) about the biometric
is also collected and stored to correct the erroneous bits at verification.
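The quantization method above can be sketched in a few lines; the step size and feature values are illustrative, not taken from any cited scheme:

```python
def quantize(value: float, step: float) -> float:
    """Snap a noisy measurement to the centre of its quantization cell.
    Two readings that fall inside the same cell become identical."""
    cell = int(value // step)
    return cell * step + step / 2

def quantize_vector(features, step):
    """Apply the same cell-centre mapping to a whole feature vector."""
    return tuple(quantize(v, step) for v in features)

# Two noisy captures of the same feature, differing by acquisition noise:
reading_a, reading_b = 13.2, 13.7
step = 2.0
assert quantize(reading_a, step) == quantize(reading_b, step)  # both -> 13.0
```

Deformations smaller than the cell size vanish entirely, which is why quantization is the first line of defence against intra-user variation; deformations that cross a cell boundary remain, and must be handled by the other methods below.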
6. FLOW DIAGRAM
6.1 Traditional Biometric Methods:
Fig 6.1.1 The processing of a conventional biometric method.
As shown in the figure above, the processing of traditional biometric methods includes the
following subsystems: data collection, signal processing, biometric feature extraction,
biometric feature registration/input, and matching and decision (i.e.,
comparing biometric features to determine whether they match). Generally speaking, one
first registers/stores the biometric feature data (in the registration phase) for matching.
Once this is completed, the biometric device allows the user to input his or her biometric
feature data (in the matching phase) for comparison against the features from the
registration phase, to determine whether they match. If the prestored biometric features from
the registration phase and those input by the user in the matching phase are found to match,
the device outputs the recognition result “Authentication Successful”; otherwise, it outputs
“Authentication Failed.” Generally speaking, the steps in the registration phase and
in the matching phase are processed similarly. For example, the matching phase is divided
into the following steps: data collection, signal processing, biometric feature extraction, and
biometric feature input. For the matching of the registered biometric feature data against the
input biometric feature data, biometric authentication usually determines acceptability based
on a threshold value.
6.2 Performance of biometric:
Fig 6.2.1 FAR & FRR
False acceptance rate (FAR): The false acceptance rate, or FAR, is the measure of
the likelihood that the biometric security system will incorrectly accept an access
attempt by an unauthorized user. A system’s FAR is typically stated as the ratio of the
number of false acceptances to the number of identification attempts.
False rejection rate (FRR): The false rejection rate, or FRR, is the measure
of the likelihood that the biometric security system will incorrectly reject an access
attempt by an authorized user. A system’s FRR is typically stated as the ratio of the
number of false rejections to the number of identification attempts.
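The two definitions above translate directly into code; the attempt counts below are made-up numbers used only for illustration:

```python
def far(false_accepts: int, impostor_attempts: int) -> float:
    """False Acceptance Rate: fraction of impostor attempts wrongly accepted."""
    return false_accepts / impostor_attempts

def frr(false_rejects: int, genuine_attempts: int) -> float:
    """False Rejection Rate: fraction of genuine attempts wrongly rejected."""
    return false_rejects / genuine_attempts

# Hypothetical trial: 3 impostors accepted out of 1000 impostor attempts,
# 25 genuine users rejected out of 500 genuine attempts.
print(f"FAR = {far(3, 1000):.3%}")   # 0.300%
print(f"FRR = {frr(25, 500):.3%}")   # 5.000%
```

Tightening the decision threshold moves errors from one column to the other: FAR falls while FRR rises, so the two rates are always reported together.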
7. MATHEMATICAL MODELLING
A few biometrics, including keystroke dynamics, voice, handwritten signatures, face,
iris, and fingerprint, have been proposed for cryptographic key binding. For different
biometrics, different techniques have to be chosen to solve the fuzzy measurement problems.
The error tolerance techniques, illustrated with examples from different biometrics, are as follows:
7. 1 Quantization
In order to generate feature vectors from the Delaunay net, for each minutia mi the set
of triangles which share the vertex mi is considered [11]; this is denoted the local structure
centered on mi, TSi. Thereafter, to make it cancelable, a non-invertible transformation is applied
using a pair-polar coordinate system. In order to tolerate deformations, each segment of the
feature vector is quantized.
Therefore, deformations that lie within a quantized interval have no effect,
since each quantized interval is represented with a unique symbol. Furthermore, the matching
between a saved template and a query is done in the transformed domain, by
considering the number of matched triangles in each local structure. In this scheme, error
tolerance is handled by quantization as well as by the properties of Delaunay triangles.
The Delaunay net is capable of keeping the same local structure, to a certain extent,
even under elastic distortion of a fingerprint. With regard to quantization, it is important to
select the quantization step appropriately, since a larger value would result in an increase in
the False Acceptance Rate (FAR).
A grid based quantization scheme was used to generate a cancelable fingerprint
template by Ahmad T. et al [12]. The scheme requires the identification of a singular point
(SP) which acts as the reference to represent minutiae in the fingerprint. In their construction,
the orientation of SP is taken as the x-axis and then a grid is formed in such a way that it
covers the complete fingerprint area. Furthermore, all the points that lie in a particular cell are
mapped to the center of the cell. Then to make the template non-invertible, minutiae
containing cell centers are projected to a line L which passes through SP at an angle of α to
the x-axis.
Finally, the line is partitioned into equal size (p) non-overlapping sectors and the
number of points in each sector is mapped to a vector v. The vector v acts as the transformed
template which is irreversible since the mapping is many to one. The main issue with this
construction is the use of a singular point as a reference. If the singular point cannot be
located properly, or does not exist, then the complete verification process fails.
Furthermore, quantization schemes have been widely adopted as a primary error tolerance
method to allow intra-user variations, and we will examine how this technique is incorporated
with most of the other error tolerance methods in what follows.
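A rough sketch of the grid-plus-projection construction described above is given below, assuming the singular point lies at the origin and using arbitrary choices for the cell size, angle α and sector size; the real scheme of Ahmad et al. differs in detail:

```python
import math

def to_cell_centre(point, cell_size):
    """Map a minutia (x, y) to the centre of its grid cell (quantization)."""
    x, y = point
    return ((x // cell_size) * cell_size + cell_size / 2,
            (y // cell_size) * cell_size + cell_size / 2)

def project_and_count(points, alpha, sector_size, num_sectors):
    """Project quantized points onto a line at angle alpha through the
    origin (standing in for the singular point SP) and count how many
    fall in each fixed-size sector. The counts form the transformed
    template; the mapping is many-to-one, hence hard to invert."""
    counts = [0] * num_sectors
    ca, sa = math.cos(alpha), math.sin(alpha)
    for x, y in points:
        t = x * ca + y * sa                  # signed distance along the line
        sector = int(t // sector_size) % num_sectors
        counts[sector] += 1
    return counts

# Two of these minutiae differ only by acquisition noise and quantize
# to the same cell, so they contribute identically to the template:
minutiae = [(12.3, 40.1), (55.9, 18.2), (13.1, 41.0)]
cells = [to_cell_centre(p, cell_size=8) for p in minutiae]
template = project_and_count(cells, alpha=math.pi / 6,
                             sector_size=10, num_sectors=8)
```

The vector of sector counts plays the role of v in the text: reversing it would require undoing both the projection and the counting, neither of which is one-to-one.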
7.2 Subsetting:
Fuzzy Vault Scheme:
Fuzzy vault is a bio-cryptographic construction initially introduced by Juels A. et al.
[13] which was evolved from the Shamir’s secret sharing scheme for binding cryptographic
keys or secrets. The security of this scheme depends completely upon the polynomial
reconstruction problem.
Let us consider a secret K; during vault construction it is encoded into the
coefficients of a polynomial P of degree D. Then, the vault V is constructed by projecting a
user-specific N-element bio-information set onto the polynomial, while including a C-element
chaff set which does not lie on P. The secret is recovered through polynomial
reconstruction after identifying a possible D + 1 out of the N original points by presenting the
query bio-information set.
In this method we have only considered the constructions which use error detection
methods to identify occurrences of errors. CRC is an error-detecting coding scheme which
uses polynomial division to attach a checksum to the original data. Upon reception, it is
possible to detect whether there are any errors in the received message by comparing the
received checksum with the re-calculated checksum of the received message.
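The checksum mechanism can be illustrated with the standard-library CRC-32; the actual CRC polynomial used by Uludag et al. may differ, so this is only a sketch of the principle:

```python
import zlib

def append_crc(secret: bytes) -> bytes:
    """Form the secret code SC = S || CRC(S) by appending a CRC-32 checksum."""
    return secret + zlib.crc32(secret).to_bytes(4, "big")

def verify_crc(secret_code: bytes) -> bool:
    """Recompute the checksum over the payload and compare with the tail."""
    payload, checksum = secret_code[:-4], secret_code[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == checksum

sc = append_crc(b"my-cryptographic-key")
assert verify_crc(sc)                 # clean: checksum matches
corrupted = b"MY" + sc[2:]            # alter some bytes
assert not verify_crc(corrupted)      # error detected
```

In the fuzzy vault, this check is what tells the decoder that a candidate polynomial reconstruction actually recovered the genuine secret rather than a mixture of genuine and chaff points.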
This phenomenon was used by Uludag et al. to introduce a fuzzy vault scheme for
establishing a secure biometric key release based on fingerprint minutiae features [14]. In
their construction, they have used the algorithm given to represent minutiae information of a
user as a feature vector triplet of (x,y,θ) in which x, y and θ represents the row index, column
index and the ridge orientation respectively. Then the CRC checksum of the secret S is
computed and appended to the tail of S to generate a new secret code SC, which is encoded in
the coefficients of a polynomial P of degree D. Then the minutiae information is quantized
and projected onto the polynomial P, and these genuine results are combined with a chaff set
to construct the vault V, which is randomized to ensure that any information vital for
separating chaff and genuine points remains hidden.
During the verification phase, first the minutiae information of the user who is trying
to be authenticated is extracted after aligning the templates with the help of higher curvature
points in the two fingerprints, and the query feature set u*_i, which is to be used in polynomial
reconstruction, is generated. Then, the subset of points that lie in both u*_i and v_j, the
abscissa of the vault V, is identified. Supposing K such points have been found, they are
divided into all possible (D + 1) combinations; for each such (D + 1) set the Lagrange
interpolation polynomial is obtained, and from its coefficients a possible secret code SC* can
be revealed. Then, the polynomial corresponding to SC* is used to evaluate the CRC
checksum, and thereby it is possible to confirm whether there are any errors in SC*.
The only difference in the encoding process is that the chaff points are added in such a
way that the minimum distance between the added chaff and the points in the abscissa is
greater than a threshold δ. The main issue with the discussed fuzzy vault schemes is that the
recovery complexity is only governed by the ability of separating the genuine points of the
vault from the chaff points.
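The locking and unlocking flow can be sketched in miniature. This toy version works over the rationals with a simple sum checksum standing in for the CRC, whereas real fuzzy vaults operate over finite fields with proper error detection; the secret, minutiae values and chaff range are all illustrative:

```python
import random
from fractions import Fraction
from itertools import combinations

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients constant-term first) at x."""
    return sum(c * Fraction(x) ** k for k, c in enumerate(coeffs))

def poly_mul_linear(p, root):
    """Multiply polynomial p by (x - root)."""
    out = [Fraction(0)] * (len(p) + 1)
    for k, a in enumerate(p):
        out[k + 1] += a
        out[k] -= a * root
    return out

def lagrange(points):
    """Reconstruct the unique degree-(n-1) polynomial through n points."""
    n = len(points)
    total = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(points):
        basis, denom = [Fraction(1)], Fraction(1)
        for j, (xj, _) in enumerate(points):
            if i != j:
                basis = poly_mul_linear(basis, xj)
                denom *= xi - xj
        for k, a in enumerate(basis):
            total[k] += a * yi / denom
    return total

def lock(secret, minutiae, chaff_count, rng):
    """Encode `secret` (plus a toy checksum) as polynomial coefficients,
    project the genuine minutiae x-values onto it, and mix in chaff
    points that deliberately lie OFF the polynomial."""
    coeffs = [Fraction(s) for s in secret + [sum(secret)]]  # toy checksum
    vault = [(x, poly_eval(coeffs, x)) for x in minutiae]
    used = set(minutiae)
    while len(vault) < len(minutiae) + chaff_count:
        x = rng.randrange(100, 10_000)      # outside the toy minutiae range
        if x not in used:
            used.add(x)
            vault.append((x, poly_eval(coeffs, x) + rng.randrange(1, 99)))
    rng.shuffle(vault)
    return vault, len(coeffs) - 1           # vault and polynomial degree D

def unlock(vault, query, degree):
    """Try every (D+1)-subset of vault points whose x-values overlap the
    query; the toy checksum signals when reconstruction succeeded."""
    overlap = [(x, y) for x, y in vault if x in set(query)]
    for combo in combinations(overlap, degree + 1):
        coeffs = lagrange(list(combo))
        if all(c.denominator == 1 for c in coeffs):
            ints = [int(c) for c in coeffs]
            if ints[-1] == sum(ints[:-1]):  # checksum holds
                return ints[:-1]
    return None

rng = random.Random(7)
minutiae = [3, 8, 21, 34, 55, 89]           # enrolled x-values (toy)
vault, degree = lock([42, 17, 99], minutiae, chaff_count=12, rng=rng)
query = [3, 8, 21, 34, 55, 90]              # noisy capture: one point differs
print(unlock(vault, query, degree))         # [42, 17, 99]
```

Note how the noisy query still succeeds: only D + 1 = 4 overlapping genuine points are needed, which is the subsetting property the text describes. An attacker who cannot tell chaff from genuine points faces a combinatorial search instead.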
Therefore, if an intruder can somehow separate them, the secret can be unravelled. As
a way of overcoming this, a cancelable fuzzy vault using Delaunay triangles was proposed by
Yang W. et al. [20]. In this solution the vault is constructed from the many-to-one transformed
local structures obtained through Delaunay triangulation. Moreover, this seminar discusses
several other constructions that are used to secure fuzzy vault embeddings by encrypting
vault points, when elaborating the influence of ECCs in bio-cryptosystems.
7.3 Error Correction Codes (ECCs):
Overviews of ECCs used in Bio-Cryptography are as follows:
• Reed-Solomon Codes:
RS codes are a set of algebraic codes which are used for error correction at
block level. Let us consider m blocks of information [x1, x2, ..., xm]; during encoding,
these blocks are embedded in the coefficients of a polynomial P of degree (m−1):
P(y) = xm·y^(m−1) + x(m−1)·y^(m−2) + ... + x1 -------(1)
Then, it is possible to generate n codewords from P(y) by substituting values y ∈ Z. From a
corrupted codeword, RS decoding can reconstruct the original polynomial given that
n = m + 2t, where t is the number of correctable errors. Afterwards, by extracting the
coefficients of the reconstructed polynomial, the original message can be unravelled.
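The evaluation/interpolation core of RS coding can be sketched as follows. For simplicity this toy recovers the message from erasures (corrupted positions that are already flagged) rather than performing full algebraic error correction, and it works over the rationals instead of a finite field:

```python
from fractions import Fraction

def rs_encode(message, n):
    """Treat the m message symbols as coefficients of a degree-(m-1)
    polynomial P and publish its value at n distinct points x = 1..n."""
    def p(x):
        return sum(c * x ** k for k, c in enumerate(message))
    return [(x, p(x)) for x in range(1, n + 1)]

def rs_recover(points, m):
    """Recover the m coefficients from any m intact codeword symbols by
    Lagrange interpolation (erasure decoding)."""
    coeffs = [Fraction(0)] * m
    for i, (xi, yi) in enumerate(points):
        basis, denom = [Fraction(1)], Fraction(1)
        for j, (xj, _) in enumerate(points):
            if i != j:
                denom *= xi - xj
                new = [Fraction(0)] * (len(basis) + 1)
                for k, a in enumerate(basis):
                    new[k + 1] += a     # multiply basis by x
                    new[k] -= a * xj    # ... and by (-xj)
                basis = new
        for k, a in enumerate(basis):
            coeffs[k] += a * yi / denom
    return [int(c) for c in coeffs]

message = [7, 3, 5]                    # m = 3 symbols
codeword = rs_encode(message, n=7)     # n = m + 2t with t = 2
# Suppose the symbols at x = 2 and x = 5 were corrupted and flagged;
# any 3 of the remaining clean symbols suffice to recover the message:
clean = [pt for pt in codeword if pt[0] not in (2, 5)][:3]
print(rs_recover(clean, m=3))          # [7, 3, 5]
```

Correcting t errors at *unknown* positions is harder and is what full RS decoders (e.g. via the Berlekamp-Massey algorithm) provide; the redundancy budget n = m + 2t in the text accounts for that case.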
• Hadamard Codes:
Hadamard codes are constructed from the Hadamard matrix, which is a square
orthogonal matrix with elements +1 or −1 [21]. If Hc(k) and H(k) denote a set of Hadamard
codes and a Hadamard matrix respectively, then
Hc(k) = [ H(k) ; −H(k) ]
Therefore, if we consider a Hadamard code of size n = 2^k bits, then there are 2n codewords
generated in total. Furthermore, the code has a minimum distance of 2^(k−1), which means
that it can correct up to 2^(k−2) − 1 errors.
When encoding, the message m should be of k+1 bits and, depending on the value of m, the
m-th row codeword of the matrix Hc(k) is allocated as the codeword of the message w.
When decoding a corrupted codeword w′, w′·Hc(k)^T is computed, and the position of the
row of the matrix which has the highest value gives the original message.
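The construction and correlation decoding just described can be sketched directly; the Sylvester construction with ±1 entries is used, and k and the flipped bit positions are illustrative:

```python
def hadamard(k):
    """Sylvester construction: a 2^k x 2^k matrix with entries +1/-1."""
    H = [[1]]
    for _ in range(k):
        H = ([row + row for row in H] +
             [row + [-v for v in row] for row in H])
    return H

def encode(m, Hc):
    """The (k+1)-bit message m selects one of the 2^(k+1) codeword rows."""
    return Hc[m]

def decode(word, Hc):
    """Correlate the (possibly corrupted) word against every row of Hc;
    the row with the largest inner product is the most likely codeword,
    i.e. the argmax of w' * Hc^T."""
    scores = [sum(a * b for a, b in zip(word, row)) for row in Hc]
    return max(range(len(scores)), key=lambda i: scores[i])

k = 4                                        # codewords of n = 2^k = 16 bits
H = hadamard(k)
Hc = H + [[-v for v in row] for row in H]    # Hc(k) = [H(k); -H(k)]

msg = 13
word = list(encode(msg, Hc))
for i in (0, 5, 9):                          # flip 3 bits: within 2^(k-2)-1 = 3
    word[i] = -word[i]
print(decode(word, Hc))                      # 13
```

Because distinct rows are orthogonal, a few flipped bits only dent the correlation with the true row while barely raising the others, which is why up to 2^(k−2) − 1 errors are always decoded correctly.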
8. CONCLUSION
Error control methods play a vital role in bio-cryptosystems to compensate for the
fuzziness associated with biometric inputs. In this report, a thorough investigation was carried
out on existing error tolerance techniques, including score-based methods, quantization,
subsetting and ECCs, and importantly on how they are deployed to achieve the requirements
of bio-cryptosystems. Most of the initial biometric-based constructions used singular-point-
based verification processes, which led to increased error rates. Verification solutions based
on relative biometric information helped improve the error rates by localizing the errors,
which were further smoothed through a combination of error correction methods.
Furthermore, depending upon the type of biometric information to be protected and its
associated error patterns, the error correcting methods must be selected accordingly.
Especially with ECCs, the error correcting capability and the granularity of the ECC must be
considered when choosing the optimal coding scheme, to achieve a better trade-off between
the performance parameters FAR and FRR. Thus, it is vital to select appropriate error
tolerance methods, considering the security requirements of the underlying bio-cryptosystem,
since eliminating the errors completely is impossible in practice.
REFERENCES
[1] F. Sufi and I. Khalil, “Faster Person Identification Using Compressed ECG in Time
Critical Wireless Telecardiology Applications”, in Journal of Network and Computer
Applications, vol. 34(1), pp. 282-293, 2011.
[2] F. Sufi and I. Khalil, “An Automated Patient Authentication System For Remote
Telecardiology”, in Proceedings of the IEEE International Conference on Intelligent
Sensors, Sensor Networks and Information Processing, pp. 279-284, Sydney,
Australia, Dec. 2008.
[3] F. Sufi, I. Khalil, and J. Hu, “ECG-based Authentication”, in P. Stavroulakis, M.
Stamp, eds.: Handbook of Information and Communication Security, Springer, pp.
309-331, Berlin, Germany, Dec. 2010.
[4] K. Xi, Y. Tang, and J. Hu, “Correlation Keystroke Verification Scheme for User
Access Control in Cloud Computing Environment”, in The Computer Journal, vol.
54(10), pp. 1632-1644, Jul. 2011.
[5] P. Tuyls, B. Skoric, and T. Kevenaar, Security with Noisy Data: On Private
Biometrics, Secure Key Storage and Anti-Counterfeiting, 1st Edition, Springer, 2007.
[7] T.C. Clancy, N. Kiyavash and D.J. Lin, “Secure smart card-based fingerprint
authentication,” Proceedings of the 2003 ACM SIGMM Workshop on Biometrics
Methods and Application, WBMA 2003.
[8] F.Monrose, M.K. Reiter, Q. Li and S. Wetzel, “Cryptographic key generation from
voice,” Proceedings of the 2001 IEEE Symposium on Security and Privacy, May
2001.
[9] A.Goh, D.C.L. Ngo, “Computation of cryptographic keys from face biometrics,”
International Federation for Information Processing 2003, Springer-Verlag, LNCS
2828, pp. 1–13, 2003.
[10] F. Hao, C.W. Chan, “Private key generation from on-line handwritten
signatures,” Information Management & Computer Security, Issue 10, No. 2, pp. 159–
164, 2002.
[11] F. Monrose, M.K. Reiter and R. Wetzel, “Password hardening based on keystroke
dynamics,” Proceedings of sixth ACM Conference on Computer and Communications
Security, CCCS 1999.
[12] T. Ahmad, J. Hu, and S. Wang “String-based Cancelable Fingerprint Templates”,
in Proceedings of 6th IEEE Conference on Industrial Electronics and Applications
(ICIEA), pp. 1028-1033, Beijing, China, Jun. 2011.
[13] A. Juels and M. Sudan “A Fuzzy Vault Scheme”, in Proceedings of IEEE
International Symposium on Information Theory, Jun. 2002.
[14] U. Uludag, S. Pankanti, and A. K. Jain, “Fuzzy Vault for Fingerprints”, in
Proceedings of 5th International Conference on Audio and Video based Biometric
Person Authentication, vol. 3546, pp. 310-319, New York, USA, Jul. 2005.
[15] A. C. Jain, L. Hong, S. Pankanti, and R. Bolle “An Identity Authentication
System using Fingerprints”, in Proceedings of the IEEE, vol.85(9) , pp. 1365-1388,
Sep. 1997.
[16] U. Uludag and A. Jain “Securing Fingerprint Template: Fuzzy Vault with Helper
Data”, in Proceedings of IEEE Computer Vision and Pattern Recognition Workshop,
pp. 163-167, New York, USA, Jun. 2006.
[17] Y. Wang and K. N. Plataniotis “Fuzzy Vault for Face based Cryptographic Key
Generation,” in Proceedings of IEEE Biometrics Symposium, pp. 1-6, Maryland,
USA, Sep. 2007.
[18] K. Nandakumar, A. K. Jain, and S. Pankanti, “Fingerprint based Fuzzy Vault:
Implementation and Performance”, in IEEE Transactions on Information Forensics
and Security, vol. 2(4), pp. 744-757, New Jersey, USA, Dec. 2007.
[19] Y. Chen, S. C. Dass, and A. K. Jain, “Fingerprint quality indices for predicting
authentication performance,” in Proceedings of the 5th International Conference on
Audio- and Video-Based Biometric Person Authentication, pp. 160-170, Berlin,
Germany, Jul. 2005.
[20] W. Yang, J. Hu, and S. Wang “A Delaunay Triangle Group Based Fuzzy Vault
with Cancellability”, in Proceedings of 6th IEEE International Congress on Image and
Signal Processing, Hangzhou, China, Dec. 2013.