
Pattern Recognition

Volume 37, Issue 11, November 2004, Pages 2245-2255

Biohashing: two factor authentication featuring fingerprint data and tokenised random number

https://doi.org/10.1016/j.patcog.2004.04.011

Abstract

Human authentication is the security task of limiting access to physical locations or computer networks to those with authorisation. This is done by equipping authorised users with passwords or tokens, or by using their biometrics. Unfortunately, the first two suffer from a lack of security, as they are easily forgotten or stolen; even biometrics suffers from inherent limitations and specific security threats. A more practical approach is to combine two or more authentication factors to reap benefits in security, convenience, or both. This paper proposes a novel two-factor authenticator based on iterated inner products between a tokenised pseudo-random number and a user-specific fingerprint feature generated from an integrated wavelet and Fourier–Mellin transform, producing a set of user-specific compact codes coined BioHashing. BioHashing is highly tolerant of data-capture offsets, with the same user's fingerprint data resulting in highly correlated bitstrings. Moreover, there is no deterministic way to obtain the user-specific code without having both the token with its random data and the user's fingerprint feature. This protects, for instance, against biometric fabrication: changing the user-specific credential is as simple as changing the token containing the random data. BioHashing has significant functional advantages over biometrics alone, i.e. a zero equal error rate point and a clean separation of the genuine and imposter populations, thereby allowing elimination of false accepts without suffering an increased occurrence of false rejects.

Introduction

Today's human authentication factors fall into three categories: what you know, e.g. a password, secret or personal identification number (PIN); what you have, such as a token or smart card; and what you are, i.e. biometrics. However, the first two factors can easily be fooled. For instance, passwords and PINs can be shared among users of a system or resource. Moreover, passwords and PINs can be illicitly acquired by direct observation. The main advantage of biometrics is that it bases recognition on an intrinsic aspect of a human being, and its use requires the person being authenticated to be physically present at the point of authentication. These characteristics overcome the problem that passwords and tokens are unable to differentiate between a legitimate user and an attacker. In addition, biometric authentication information cannot be transferred or shared, making it a powerful weapon against repudiation. However, biometrics also suffers from some inherent, biometrics-specific threats [1]. The public's main concern with biometric usage is the privacy risk of a biometric system. If an attacker can intercept a person's biometric data, the attacker might use it to masquerade as that person, or perhaps simply to monitor that person's private activities. These concerns are aggravated by the fact that a biometric cannot be changed: when a biometric is compromised, a new one cannot be issued.

Besides that, a biometric system by its nature offers a binary (yes/no) decision scheme, which is well defined in the classical framework of statistical decision theory, thereby providing four possible outcomes, normally called the false accept rate (FAR), correct accept rate (CAR), false reject rate (FRR) and correct reject rate (CRR) [2]. By manipulating the decision criterion, the relative probabilities of these four outcomes can be adjusted in a way that reflects their associated costs and benefits. In practice, it is almost impossible to achieve both zero FAR and zero FRR, because the classes are difficult to separate completely in the measurement space. According to Bolle et al. [3], the biometrics industry places heavy emphasis on security issues relating to the FAR while relaxing the FRR requirement. However, the overall performance of a biometric system cannot be assessed on this metric alone. A high FRR, i.e. rejection of valid users, which results from a low FAR, is often largely neglected in the evaluation of biometric systems, yet it impacts all major aspects of a biometric system, as pointed out in Ref. [4]. Denial of access greatly impairs the usability of the system by failing to identify genuine users, and hence the public acceptance of this emerging technology. Both aspects may represent significant obstacles to the wide deployment of biometric systems.
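To make these trade-offs concrete, here is a minimal, illustrative sketch (not taken from the paper) of estimating the FAR, FRR and equal error rate (EER) from genuine and imposter score samples; the assumption that a higher score means a stronger match, and all function names, are ours.

import numpy as np

def far_frr(genuine, imposter, threshold):
    """FRR = fraction of genuine scores rejected; FAR = fraction of imposter scores accepted."""
    genuine, imposter = np.asarray(genuine), np.asarray(imposter)
    frr = float(np.mean(genuine < threshold))    # valid users falling below the acceptance threshold
    far = float(np.mean(imposter >= threshold))  # imposters clearing the acceptance threshold
    return far, frr

def equal_error_rate(genuine, imposter, num_steps=1000):
    """Scan thresholds and return (threshold, EER) where FAR and FRR are closest."""
    genuine, imposter = np.asarray(genuine), np.asarray(imposter)
    lo = min(genuine.min(), imposter.min())
    hi = max(genuine.max(), imposter.max())
    best_gap, best = np.inf, None
    for t in np.linspace(lo, hi, num_steps):
        far, frr = far_frr(genuine, imposter, t)
        if abs(far - frr) < best_gap:
            best_gap, best = abs(far - frr), (t, (far + frr) / 2.0)
    return best

A biometric whose genuine and imposter score distributions are completely separated admits a threshold at which both rates are zero, which is the behaviour BioHashing aims to deliver.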

Multimodal biometric fusion, i.e. systems employing more than one biometric technology to establish the identity of an individual, can improve the overall performance of a biometric system by checking multiple pieces of evidence of the same identity [5]. Multimodal biometrics can reduce the probability of denial of access without sacrificing FAR performance by increasing the discrimination between the genuine and imposter classes [6], [7]. Despite that, multimodal biometrics is not a solution to the privacy-invasion problem, though it may increase the difficulty of attacks to a certain degree. Moreover, the use of multiple biometric measurement devices imposes significant additional costs, more complex user–machine interfaces and additional management complexity [4].

The most practical way of addressing the privacy-invasion problem is to combine two or more authentication factors. A common multi-factor authenticator is an ATM card, which combines a token with a secret (PIN). Combining a password or secret with a biometric is not so favourable, since one of the attractions of biometrics is precisely to remove the task of memorising a password. Because a user has difficulty remembering a secret, a token may instead be combined with a biometric. A token is a physical device that can be thought of as portable storage for an authenticator, such as an ATM card or smart card, or an active device that yields time-changing or challenge-based passwords. A token can store human-chosen passwords, but the advantage is to use such devices to store longer codewords or pseudo-random sequences that a human cannot remember, making them much harder to attack. Presently, quite a number of works report the integration of biometrics into smart cards [8], [9], [10]. However, the only effort applied along this line is to store the user's template inside a smart card, protected with administrator keys, and to extract it from the card at the terminal to perform verification. Some schemes allow verification inside the card itself; whenever the verification is positive, the card grants access to the biometrically protected information and/or operations [11]. Obviously, these configurations are neither a remedy for the aforementioned invasion-of-privacy problem, nor do they reduce the probability of denial of access without increasing the FAR. Most recently, Ho and Armington [12] reported a dual-factor authentication system designed to counteract imposters using pre-recorded speech or text-to-speech voice-cloning technology, as well as to regulate the inconsistency of audio characteristics among different handsets. The token device generates and prompts a one-time password (OTP) to the user. The spoken OTP is then forwarded simultaneously to a speaker verification module, which verifies the user's voice, and a speech recognition module, which converts the spoken OTP to text and validates it. Even so, no attempt to address the FAR–FRR interdependence problem is reported.

In this paper, a novel two-factor authentication approach is presented that combines user-specific tokenised random data with a fingerprint feature to generate a unique compact code per person. The discretisation is carried out by iterated inner products between the pseudo-random numbers and the wavelet Fourier–Mellin transform (FMT) fingerprint feature, with each bit finally decided by the sign of the product relative to a predefined threshold. This direct mixing of pseudo-random numbers and biometric data, BioHashing, is an extremely convenient mechanism with which to incorporate physical tokens, such as a smart card or USB token, thereby yielding two-factor (token + biometrics) credentials via tokenised randomisation. Hence it protects against biometric fabrication without adversarial knowledge of the randomisation or, equivalently, possession of the corresponding token. Tokenised discretisation also enables straightforward revocation via token replacement. Furthermore, BioHashing has significant functional advantages over biometrics alone, i.e. a zero equal error rate (EER) point, eliminating false accepts without overly imperilling the FRR performance.
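As a rough illustration of this discretisation, the sketch below seeds a pseudo-random number generator from the token, projects a real-valued WFMT-style feature vector onto the drawn random vectors, and thresholds each inner product into one bit. The code length, the Gaussian random vectors and the zero threshold are illustrative assumptions, not the paper's exact settings, and the common refinement of orthonormalising the random vectors is omitted here.

import numpy as np

def biohash(feature, token_seed, code_length=80, tau=0.0):
    """Return a {0,1} bit string derived from a feature vector and a tokenised random seed."""
    feature = np.asarray(feature, dtype=float)
    rng = np.random.default_rng(token_seed)                 # the token stores or carries this seed
    r = rng.standard_normal((code_length, feature.size))    # one pseudo-random vector per output bit
    projections = r @ feature                               # iterated inner products <r_i, feature>
    return (projections > tau).astype(np.uint8)             # threshold/sign decision per bit

# Usage: the same user and the same token give highly correlated bitstrings,
# while re-issuing a token with a new seed revokes the old credential entirely.
feature  = np.random.rand(1024)                             # stand-in for a WFMT fingerprint feature
enrolled = biohash(feature, token_seed=42)
query    = biohash(feature + 0.01 * np.random.randn(1024), token_seed=42)  # small capture offset
revoked  = biohash(feature, token_seed=7)                   # new token, completely different code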

The outline of the paper is as follows: Section 2 presents the integrated wavelet transform and FMT framework for representing the invariant fingerprint feature, as well as the BioHashing procedure. Section 3 presents the experimental results and discussion, followed by concluding remarks in Section 4.

Section snippets

BioHashing overview

The BioHashing methodology can be decomposed into two components: (a) an invariant and discriminative integral-transform feature of the fingerprint data, with a moderate degree of offset tolerance. This involves the integrated wavelet and Fourier–Mellin transform framework (WFMT) reported in Ref. [13]. In this framework, the wavelet transform preserves local edges and reduces noise in the low-frequency domain (where energy is compacted) after image decomposition, and hence makes …
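Although the snippet above is truncated, the following sketch gives one interpretation of a WFMT-style feature: a wavelet decomposition keeps the energy-compacted low-frequency band, and a Fourier–Mellin step (FFT magnitude, log-polar resampling, second FFT) yields a representation tolerant of translation, rotation and scale. The wavelet choice, decomposition depth, log-polar grid size, and the PyWavelets/SciPy dependencies are illustrative assumptions rather than the authors' implementation.

import numpy as np
import pywt                                   # PyWavelets (assumed available)
from scipy.ndimage import map_coordinates    # SciPy (assumed available)

def wfmt_feature(image, wavelet="db1", levels=1, n_r=64, n_theta=64):
    """Wavelet low-pass band followed by a Fourier-Mellin-style transform, flattened to a vector."""
    approx = np.asarray(image, dtype=float)
    for _ in range(levels):                   # keep the low-frequency approximation band
        approx, _ = pywt.dwt2(approx, wavelet)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(approx)))   # translation-invariant magnitude

    # Log-polar resampling: rotation and scale of the input become shifts here.
    rows, cols = spectrum.shape
    cy, cx = rows / 2.0, cols / 2.0
    log_r = np.linspace(0.0, np.log(min(cy, cx)), n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(log_r, theta, indexing="ij")
    ys = cy + np.exp(rr) * np.sin(tt)
    xs = cx + np.exp(rr) * np.cos(tt)
    logpolar = map_coordinates(spectrum, [ys, xs], order=1, mode="nearest")

    return np.abs(np.fft.fft2(logpolar)).ravel()   # feature vector fed to BioHashing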

Experiments and discussion

In this paper, the proposed methodology is evaluated on images taken from FVC2002 (Set A), which is available on the DVD included with Ref. [19]. FVC2002 (Set A) provides four different fingerprint databases, DB1, DB2, DB3 and DB4; three of these were acquired with various sensors, low cost and high quality, optical and capacitive, whereas the fourth contains synthetically generated images. In this paper, we selected DB1 as the experimental benchmark to validate the proposed methodology. DB1 …
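As a side note on matching, a simple way to compare two BioHash codes in such an evaluation is the normalised Hamming distance; the sketch below is an assumption about the comparison step, not the protocol reported in the paper.

import numpy as np

def hamming_distance(code_a, code_b):
    """Fraction of disagreeing bits between two equal-length bit codes (0 = identical)."""
    code_a, code_b = np.asarray(code_a), np.asarray(code_b)
    return float(np.mean(code_a != code_b))

# Distances near 0 are expected for the same user with the same token, and near 0.5
# for different users or different tokens, giving the clean class separation claimed above.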

Concluding remarks

This paper described a novel error-tolerant discretisation methodology based on user-specific fingerprint images and uniquely serialised tokens. Two-factor BioHashing has significant functional advantages over biometrics or token usage alone, such as an extremely clear separation of the genuine and imposter populations and a zero EER level, thereby mitigating the increased occurrence of false rejects when false accepts are eliminated. The process of generating a token of pseudo-random vectors taking …





About the Author—ANDREW TEOH BENG JIN obtained his B.Eng. (Electronics) in 1999 and Ph.D. degree in 2003 from the National University of Malaysia. He is currently a lecturer in the Faculty of Information Science and Technology, Multimedia University. He holds the post of co-chair (Biometrics Division) in the Centre of Excellence in Biometrics and Bioinformatics at the same university. His research interests are in multimodal biometrics, pattern recognition, multimedia signal processing and Internet security.

About the Author—DAVID CHEK LING NGO is an Associate Professor and the Dean of the Faculty of Information Science & Technology at Multimedia University, Malaysia. He has worked there since 1999. Ngo was awarded a BAI in Microelectronics & Electrical Engineering and Ph.D. in Computer Science in 1990 and 1995, respectively, both from Trinity College Dublin. Ngo's research interests lie in the area of Automatic Screen Design, Aesthetic Systems, Biometric Encryption, and Knowledge Management. He is author and co-author of over 20 invited and refereed papers. He is a member of Review Committee of Displays and Multimedia Cyberscape.

About the Author—ALWYN GOH is an experienced and well-published researcher in biometrics, cryptography and information security. His work is recognised by citations from the Malaysian National Science Foundation and the European Federation of Medical Informatics. He previously lectured Computer Sciences at Universiti Sains Malaysia where he specialised in data-defined problems, client server computing and cryptographic protocols. Goh has a Masters in Theoretical Physics from the University of Texas, and a Bachelors in Electrical Engineering and Physics from the University of Miami.
