Using an enrolled biometric dataset to detect adversarial examples in biometrics-based authentication system
11496466 · 2022-11-08
Assignee
Inventors
Cpc classification
H04L63/0861
ELECTRICITY
H04L63/1466
ELECTRICITY
International classification
G06F15/16
PHYSICS
Abstract
A computer-implemented method for improving security of a biometrics-based authentication system comprises receiving, by one or more servers, enrolled biometric samples of an enrolled user during an enrollment stage of the biometrics-based authentication system. Augmented biometric samples are created by adding learned perturbations to the enrolled biometric samples of the enrolled user. During a request for authentication, submitted biometric samples are received from a second user. The submitted biometric samples of the second user are compared to the enrolled biometric samples and to the augmented biometric samples of the enrolled user based on predefined metrics. Based on the comparison, it is determined whether the submitted biometric samples of the second user have been modified to impersonate the enrolled user.
Claims
1. A computer-implemented method for improving security of a biometrics-based authentication system, comprising: receiving, by one or more servers, enrolled biometric samples x.sub.i of an enrolled user during an enrollment stage of the biometrics-based authentication system; applying a function f(⋅) to the enrolled biometric samples x.sub.i to generate a biometric template f(x.sub.i); creating, by the one or more servers, augmented biometric samples by adding perturbations to the enrolled biometric samples of the enrolled user; receiving, by the one or more servers, during a request for authentication, submitted biometric samples x′ from a second user; applying the function f(⋅) to the submitted biometric samples x′ to generate a biometric template f(x′); comparing, by the one or more servers, the submitted biometric samples of the second user to the enrolled biometric samples and to the augmented biometric samples of the enrolled user based on predefined metrics by: computing a distance between the biometric template f(x.sub.i) of the enrolled user and the biometric template f(x′), and responsive to determining that the distance is less than a first threshold, temporarily authorizing the request for authentication; and based on the comparison, determining, by the one or more servers, that the submitted biometric samples of the second user have been modified to impersonate the enrolled user.
2. The method of claim 1, further comprising: storing the enrolled biometric samples x.sub.i, the biometric template f(x.sub.i) and the augmented biometric samples as an enrolled biometric dataset.
3. The method of claim 1, wherein determining that the submitted biometric samples of the second user have been modified is activated only in response to the request for authentication being temporarily authorized.
4. The method of claim 3, further comprising: responsive to the request for authentication being temporarily authorized, receiving, by an adversarial perturbation detector, the enrolled biometric samples x.sub.i of the enrolled user and the submitted biometric samples of the second user, where the submitted biometric samples comprise perturbed adversarial samples x′+Δx′; and responsive to detecting any perturbations, rejecting the request for authentication of the second user, and otherwise, granting the request for authentication and authorizing the second user.
5. The method of claim 4, wherein determining that any perturbations are detected further comprises: applying a transformation function k(⋅) to both the enrolled biometric samples x.sub.i and the perturbed adversarial samples x′+Δx′ to generate transformed enrolled samples k(x.sub.i) and a transformed adversarial sample k(x′+Δx′), which are in a transformed subspace; computing a distance F between the transformed enrolled samples k(x.sub.i) and the transformed adversarial sample k(x′+Δx′) in the transformed subspace; and determining the transformed adversarial sample k(x′+Δx′) is adversarial when the distance F is greater than a second threshold t′ indicating that one or more perturbations have been detected.
6. The method of claim 5, wherein the enrolled biometric samples xi include augmented biometric samples xi′ such that the enrolled biometric samples xi=[xi, xi′], and wherein the x.sub.i and x.sub.i′ of the enrolled biometric samples are input to the transformation function k(⋅) using one of a parallel model and a sequential model.
7. The method of claim 5, further comprising training an adversarial perturbation detector prior to deployment by: performing the training using a transformation CNN k(⋅) having learnable parameters θ and a classifier having learnable parameters σ, and using a training set to learn the parameters θ and σ, wherein the training set comprises the enrolled biometric sample x.sub.i, the public biometric sample x of the enrolled user, and the perturbed adversarial template x′+Δx′ of the second user; inputting the training set to the transformation function k(⋅) to generate the transformed enrolled samples k(x.sub.i), a transformed publicly available sample k(x), and the transformed adversarial sample k(x′+Δx′); classifying, by a classifier with learnable parameters σ, the transformed adversarial sample k(x′+Δx′) as a success or as a fail based on the transformed enrolled samples k(x.sub.i); and based on a result of the classification, updating the learnable parameters θ and σ.
8. A non-transitory computer readable medium having stored thereon software instructions that, when executed by a processor, cause the processor to improve security of a biometrics-based authentication system, the instructions comprising: receiving, by one or more servers, enrolled biometric samples x.sub.i of an enrolled user during an enrollment stage of the biometrics-based authentication system; applying a function f(⋅) to the enrolled biometric samples x.sub.i to generate a biometric template f(x.sub.i); creating, by the one or more servers, augmented biometric samples by adding learned perturbations to the enrolled biometric samples of the enrolled user; receiving, by the one or more servers, during a request for authentication, submitted biometric samples x′ from a second user; applying the function f(⋅) to the submitted biometric samples x′ to generate a biometric template f(x′); comparing, by the one or more servers, the submitted biometric samples of the second user to the enrolled biometric samples and to the augmented biometric samples of the enrolled user based on predefined metrics by computing a distance between the biometric template f(x.sub.i) of the enrolled user and the biometric template f(x′), and responsive to determining that the distance is less than a first threshold, temporarily authorizing the request for authentication; and based on the comparison, determining, by the one or more servers, that the submitted biometric samples of the second user have been modified to impersonate the enrolled user.
9. The non-transitory computer readable medium of claim 8, further comprising: storing the enrolled biometric samples x.sub.i, the biometric template f(x.sub.i) and the augmented biometric samples as an enrolled biometric dataset.
10. The non-transitory computer readable medium of claim 8, wherein determining that the submitted biometric samples of the second user have been modified is activated only in response to the request for authentication being temporarily authorized.
11. The non-transitory computer readable medium of claim 10, further comprising: responsive to the request for authentication being temporarily authorized, receiving, by an adversarial perturbation detector, the enrolled biometric samples x.sub.i of the enrolled user and the submitted biometric samples of the second user, where the submitted biometric samples comprise perturbed adversarial samples x′+Δx′; and responsive to detecting any perturbations, rejecting the request for authentication of the second user, and otherwise, granting the request for authentication and authorizing the second user.
12. The non-transitory computer readable medium of claim 11, wherein determining that any perturbations are detected further comprises: applying a transformation function k(⋅) to both the enrolled biometric samples x.sub.i and the perturbed adversarial samples x′+Δx′ to generate transformed enrolled samples k(x.sub.i) and a transformed adversarial sample k(x′+Δx′), which are in a transformed subspace; computing a distance F between the transformed enrolled samples k(x.sub.i) and the transformed adversarial sample k(x′+Δx′) in the transformed subspace; and determining the transformed adversarial sample k(x′+Δx′) is adversarial when the distance F is greater than a second threshold t′ indicating that one or more perturbations have been detected.
13. The non-transitory computer readable medium of claim 12, further comprising training an adversarial perturbation detector prior to deployment by: performing the training using a transformation k(⋅) having learnable parameters θ and a classifier having learnable parameters σ, and using a training set to learn the parameters θ and σ, wherein the training set comprises the enrolled biometric sample x.sub.i, the public biometric sample x of the enrolled user, and the perturbed adversarial template x′+Δx′ of the second user; inputting the training set to the transformation function k(⋅) to generate the transformed enrolled samples k(x.sub.i), a transformed publicly available sample k(x), and the transformed adversarial sample k(x′+Δx′); classifying, by a classifier with learnable parameters σ, the transformed adversarial sample k(x′+Δx′) as a success or as a fail based on the transformed enrolled samples k(x.sub.i); and based on a result of the classification, updating the learnable parameters θ and σ.
14. A system, comprising: a memory; a processor coupled to the memory; and a software component executed by the processor that is configured to: receive enrolled biometric samples x.sub.i of an enrolled user during an enrollment stage of a biometrics-based authentication system; apply a function f(⋅) to the enrolled biometric samples x.sub.i to generate a biometric template f(x.sub.i); create augmented biometric samples by adding learned perturbations to the enrolled biometric samples of the enrolled user; receive, during a request for authentication, submitted biometric samples x′ from a second user; apply the function f(⋅) to the submitted biometric samples x′ to generate a biometric template f(x′); compare the submitted biometric samples of the second user to the enrolled biometric samples and to the augmented biometric samples of the enrolled user based on predefined metrics by: computing a distance between the biometric template f(x.sub.i) of the enrolled user and the biometric template f(x′), and responsive to determining that the distance is less than a first threshold, temporarily authorizing the request for authentication; and based on the comparison, determine that the submitted biometric samples of the second user have been modified to impersonate the enrolled user.
Description
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
DETAILED DESCRIPTION
(10) The exemplary embodiments relate to using an enrolled biometric dataset to detect adversarial examples in a biometrics-based authentication system. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the exemplary embodiments and the generic principles and features described herein will be readily apparent. The exemplary embodiments are mainly described in terms of particular methods and systems provided in particular implementations. However, the methods and systems will operate effectively in other implementations. Phrases such as “exemplary embodiment”, “one embodiment” and “another embodiment” may refer to the same or different embodiments. The embodiments will be described with respect to systems and/or devices having certain components. However, the systems and/or devices may include more or fewer components than those shown, and variations in the arrangement and type of the components may be made without departing from the scope of the invention. The exemplary embodiments will also be described in the context of particular methods having certain steps. However, the methods and systems operate effectively for other methods having different and/or additional steps, or steps in different orders, that are not inconsistent with the exemplary embodiments. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
(11) The disclosed embodiments relate to using an enrolled biometric dataset to detect adversarial examples in a biometrics-based authentication system. Applicants recognize that one way to design stronger defense mechanisms is to incorporate domain knowledge. According to the disclosed embodiments, domain knowledge is used in a biometric verification setting to design a strong defense mechanism against adversarial biometric samples. More specifically, security of a biometrics-based authentication system is improved by using enrolled biometric samples of a user to detect adversarial biometric samples crafted specifically to target the enrolled user's identity. It is assumed that the adversary crafting the adversarial biometric samples does not have access to the enrolled biometric samples. However, augmented biometric samples with perturbations (realistic or synthetic, and designed by a human or automatically generated by a machine learning algorithm) are added to the enrolled biometric samples with the aim of increasing the difficulty of crafting adversarial biometric samples if the enrolled biometric samples are accessed by an adversary due to, for example, an insider data breach. The enrolled biometric samples with the augmented enrolled biometric samples are compared with the adversarial biometric samples based on different pre-defined metrics. Based on the comparison, a decision is made whether the adversarial biometric samples have been modified to impersonate the enrolled user.
(13) In one embodiment, the biometric authentication system 12 may be the front end for another system, such as a payment processing network 18, to authenticate users for a transaction. The payment processing network 18 may refer to an entity that receives transaction authorization requests from merchants or other entities and provides guarantees of payment, in some cases through an agreement between the transaction service provider and an issuer institution. The payment processing network supports and delivers payment related services (e.g., authentication services, authorization services, exception file services, and clearing and settlement services, etc.). Examples of a payment processing network may include a payment network provided by Visa®, MasterCard®, American Express®, or any other entity that processes credit card transactions, debit card transactions, and other types of commercial transactions.
(14) The biometric authentication system 12 acquires biometric samples of a first user (e.g., enrolled user A) during an enrollment stage. The enrollment stage may be performed through a software authentication application 34 provided by the biometric authentication system 12 that runs on either one of the servers 20 or a user device 14 (e.g., smartphone, PC, watch or tablet). During the enrollment process the authentication application 34 prompts the enrolled user to create enrolled biometric samples 30, which in one embodiment, are images of the user's face/head. In one embodiment, the images may be taken with the user device 14, and the authentication application 34 transmits the enrolled biometric samples 30 to the biometric authentication system 12 over the network 16, where they are received by one of the servers 20. The enrolled biometric samples 30 are processed by a projection DNN 22 into enrolled biometric dataset 32 and stored in the enrollment database 24, as explained further below.
(15) Subsequently, the biometric authentication system 12 receives an authentication request from a second user (user B). The authentication request transmitted from the second user includes submitted biometric samples 36. The request may be made by the enrolled user or by another user, e.g., user B, who may be an attacker. The submitted biometric samples 36 are processed by the projection DNN 22 into submitted biometric datasets (not shown) and the similarity comparator 26 compares the enrolled biometric datasets 32 of the enrolled user to the submitted biometric datasets. If there is a match within one or more thresholds, the user is authorized and gains access to the payment processing network 18. If not, the authorization request of the second user is rejected.
(16) Before describing the adversarial defense system 28 that enhances security of the biometric authentication system 12, the authentication process performed by the biometric authentication system 12 and attacks thereon are first explained.
(18) Similarly, the biometric authentication system 12 receives the submitted biometric sample x′ 36 (one or many) from the same or a second user. The projection DNN 22 applies the same mapping function f(⋅) to the submitted biometric samples 36 to generate a biometric template f(x′) 202. The mapping function f(⋅) projects x.sub.i and x′ to a common embedding subspace.
(19) The similarity comparator 26 then compares the biometric template f(x.sub.i) 200 with the biometric template f(x′) 202 in the embedding subspace. This is done by computing a distance between the biometric template 200 and the biometric template 202. The distance may be represented as a similarity score or distance score, such that D=Σ.sub.i d(f(x.sub.i), f(x′)). If the similarity comparator 26 determines that the distance D is greater than a first threshold (t), the authorization request is rejected. Otherwise, the authentication request is authorized.
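The comparison rule above can be sketched in a few lines of Python. This is an illustrative toy, not the patented implementation: `l2_distance` stands in for the pre-defined metric d(⋅,⋅), templates are plain lists of floats, and the threshold value is arbitrary.

```python
import math

def l2_distance(a, b):
    # d(., .): Euclidean distance between two template vectors
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def verify(enrolled_templates, submitted_template, threshold_t):
    # D = sum_i d(f(x_i), f(x')) over all enrolled templates
    distance_d = sum(l2_distance(t, submitted_template)
                     for t in enrolled_templates)
    # reject when D exceeds the first threshold t; otherwise authorize
    return distance_d <= threshold_t
```

For example, `verify([[0.0, 0.0]], [3.0, 4.0], 10.0)` authorizes the request, since the summed distance is 5.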
(20) Referring again to
(22) The perturbation Δx′ 300 may be crafted based either on the public biometric samples 38 or the enrolled biometric samples 30 if obtainable (e.g. during a breach of the enrollment database 24 or the user device 14 of the enrolled user). Due to the transferability of adversarial perturbation, even though the perturbation Δx′ is crafted based on the public biometric samples 38 of the enrolled user, the perturbation Δx′ can be used to attack the enrolled biometric samples 30. The perturbation Δx′ 300 may be physical or digital. In the example shown, the perturbation Δx′ 300 comprises cardboard glasses that are worn by the second user when taking biometric sample images to generate the perturbed adversarial sample x′+Δx′ 302.
(23) After the perturbed adversarial sample x′+Δx′ 302 is captured and submitted to the biometric authentication system, the projection DNN 22 applies the same mapping function f(⋅) to the perturbed adversarial sample x′+Δx′ 302 to generate a perturbed adversarial template f(x′+Δx′) 304. The similarity comparator 26 then compares the biometric template f(x.sub.i) 200 of the enrolled user with the perturbed adversarial template f(x′+Δx′) 304 in the embedding subspace. However, in this case since the perturbed adversarial template f(x′+Δx′) 304 is designed to be close in distance to the biometric template f(x.sub.i) 200, the similarity comparator 26 misclassifies and authorizes the attacking second user because the distance D or similarity score is less than the threshold t, i.e., Σ.sub.i d(ƒ(x.sub.i), ƒ(x′+Δx′))<t. Thus, in this approach, the attacker uses a well-crafted perturbed adversarial biometric sample 302 to impersonate the enrolled user to attack the generic authentication system and generate a fraudulent transaction once authenticated.
(24) Referring again to
(25) In one embodiment, the adversarial defense system 28 may include an augmented biometric sample generator 42 that uses and modifies the enrolled biometric dataset 32 of the enrolled user to include: (i) the one or more enrolled biometric samples 30 of the enrolled user acquired during the enrollment stage, which are difficult to obtain by the attacker, and (ii) augmented biometric samples 40 (which may be realistic or synthesized) created from the enrolled biometric samples 30 or the public biometric samples 38 that increase the difficulty of crafting perturbed adversarial samples 302 should the enrolled biometric dataset 32 be leaked due to an insider data breach. In one embodiment, the augmented biometric sample generator 42 adds learned perturbations 44 to the enrolled biometric samples 30 to generate the augmented biometric samples 40.
(26) In one embodiment, the projection DNN 22, the similarity comparator 26, and the adversarial defense system 28 are implemented as software components. In another embodiment, the components could be implemented as a combination of hardware and software. Although the projection DNN 22, the similarity comparator 26, and the adversarial defense system 28 are shown as separate components, the functionality of each may be combined into a lesser or greater number of modules/components. In addition, although one or more servers 20 are described as running the projection DNN 22, the similarity comparator 26, and the adversarial defense system 28, such components may be run on one or more computers of any type that have a non-transitory memory and a processor.
(27) Both the server 20 and the user devices 14 may include hardware components of typical computing devices (not shown), including a processor, input devices (e.g., keyboard, pointing device, microphone for voice commands, buttons, touchscreen, etc.), and output devices (e.g., a display device, speakers, and the like). The server 20 and user devices 14 may include computer-readable media, e.g., memory and storage devices (e.g., flash memory, hard drive, optical disk drive, magnetic disk drive, and the like) containing computer instructions that implement the functionality disclosed when executed by the processor. The server 20 and the user devices 14 may further include wired or wireless network communication interfaces for communication. It should be understood that the functions of the software components may be implemented using a different number of software components than that shown.
(29) Thereafter, during a request for authentication by a second user, the one or more servers 20 receive submitted biometric samples 36 from the second user (block 404).
(30) In response, the submitted biometric samples 36 of the second user are compared to the enrolled biometric samples 30 and to the augmented biometric samples of the enrolled user based on predefined metrics (block 406). In one embodiment, block 406 may be implemented by the similarity comparator 26 executing on the one or more processors or servers 20.
(31) Based on the comparison, it is determined that the biometric samples of the second user have been modified with a perturbation 300 to impersonate the enrolled user (block 408). In one embodiment, block 408 may be implemented by the adversarial perturbation detector 28 executing on the one or more processors or servers 20. Referring again to
(33) First, the submitted biometric sample 36 comprising the perturbed adversarial sample x′+Δx′ 302 is received by the biometric authentication system 12, and the projection DNN 22 applies the mapping function f(⋅) to the perturbed adversarial sample x′+Δx′ 302 to generate the perturbed adversarial template f(x′+Δx′) 304. The similarity comparator 26 then compares the biometric template f(x.sub.i) 200 of the enrolled user from the enrollment database 24 with the perturbed adversarial template f(x′+Δx′) 304 in the embedding subspace by calculating the distance score as described in
(34) According to one aspect of the disclosed embodiments, the adversarial perturbation detector 28 is activated only in response to the request for authentication by the second user being temporarily authorized by the biometric authentication system 12 via the similarity comparator 26 (block 500). According to one aspect of the disclosed embodiments, the adversarial perturbation detector 28 uses the enrolled biometric samples 30 of the enrolled user, which are hidden from and inaccessible to the second user, to detect whether adversarial samples were submitted by the second user. During a training stage, the adversarial perturbation detector 28 uses the enrolled biometric samples 30 at least in part to create the augmented biometric samples 40 by adding learned perturbations 44 to the enrolled biometric samples 30. The augmented biometric samples 40 are then added to the enrolled biometric samples 30.
(35) In addition to the enrolled biometric samples x.sub.i 30, the adversarial perturbation detector 28 receives as input the submitted biometric samples 36 of the second user, which comprise perturbed adversarial samples x′+Δx′ 302, and determines if any perturbations are detected (block 502). Responsive to detecting any perturbations in the submitted biometric samples 36, the adversarial perturbation detector 28 rejects the authorization request of the second user (block 504). Otherwise, the authorization request is granted and the second user is authorized.
(37) The process may begin by the adversarial perturbation detector 28 accessing the enrolled biometric dataset 32 from the enrollment database 24 to aid in detecting adversarial samples. The enrolled biometric dataset 32 includes images with the learned perturbations 44. Thus the enrolled biometric dataset 32 comprises samples/images x.sub.i, where iϵ{1:N}. Similarly, the perturbed adversarial samples x′+Δx′ 302 may include multiple perturbations, shown as ΣΔx′, where Σ denotes a summation.
(38) According to a further aspect of the disclosed embodiments, the adversarial perturbation detector 28 further includes a transformation 48 having a learned transformation function k(⋅) configured to maximize the summation of distance Σ.sub.i d(k(x′+Δx′)−k(x.sub.i)), iϵ{1:M}. The adversarial perturbation detector 28 receives the enrolled biometric samples 30 comprising x.sub.i of the enrolled user and the perturbed adversarial samples x′+ΣΔx′ 302 of the second user for transformation into a transformed subspace. In one embodiment, this may be done by using transformation 48 to apply function k(⋅) to the enrolled biometric samples x.sub.i 30 and the perturbed adversarial samples x′+ΣΔx′ 302 to generate transformed enrolled samples k(x.sub.i) 600 and a transformed adversarial sample k(x′+ΣΔx′) 602, which are in the transformed subspace. In embodiments, the transformation function k(⋅) can be decomposed into multiple projections: k=g.sub.1∘g.sub.2∘g.sub.3 . . . , where g.sub.1 can be a layer in a convolutional neural network, g.sub.2 can be another layer in the convolutional network, and g.sub.3 can be a non-linear projection in a support vector machine (SVM).
(39) The adversarial perturbation detector 28 then computes a distance (similarity score) between the transformed enrolled samples k(x.sub.i) 600 and the transformed adversarial sample k(x′+ΣΔx′) 602 in the transformed subspace (block 604). In one embodiment, the distance F can be computed as: F=Σ.sub.i d(k(x.sub.i), k(x′+Δx′)). In one embodiment, F can be a distance metric that is mathematically computed, but F may also be a learned distance metric/classifier. The classifier 50 determines that the transformed adversarial sample k(x′+ΣΔx′) 602, and therefore the submitted biometric sample 36 of the second user, is adversarial when the computed distance F is greater than a second threshold t′, indicating that one or more perturbations have been detected in the perturbed adversarial biometric sample 302 (block 608). Otherwise, the second user is authorized (block 606).
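The detector's decision rule can be sketched as follows. This is a hedged, minimal illustration: `k` may be any callable standing in for the learned transformation, and the absolute-difference metric and threshold are illustrative stand-ins for the learned metric and t′.

```python
def detector_rejects(k, enrolled_samples, submitted_sample, threshold_t_prime):
    # transform the submitted sample into the learned subspace
    k_submitted = k(submitted_sample)
    # F = sum_i d(k(x_i), k(x' + dx')), with |.| as a stand-in metric
    f_score = sum(abs(k(xi) - k_submitted) for xi in enrolled_samples)
    # perturbations detected (sample deemed adversarial) when F > t'
    return f_score > threshold_t_prime
```

With `k = lambda x: 2 * x`, enrolled samples `[1.0, 1.2]`, and threshold 5.0, a far-away submission such as 3.0 is rejected while a nearby one such as 1.1 is not.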
(41) In one embodiment, the adversarial perturbation detector 28 may be deployed using a parallel model 700 (also shown in
(42) In another embodiment, the adversarial perturbation detector 28 may be deployed using a sequential model 702 in which the x.sub.i and learned perturbations x.sub.i′ of the enrolled biometric samples 30 are input sequentially to the transformation 48. In the sequential model 702, x.sub.i is inputted to the transformation 48 with the perturbed adversarial samples x′+Δx′ 302 to compute transformed enrolled samples k(x.sub.i) and a transformed adversarial sample k(x′+Δx′). The adversarial perturbation detector 28 then computes the distance score and uses the classifier 50 to determine whether a perturbation was detected. If no perturbation is detected, then the learned perturbations x.sub.i′ of the enrolled biometric samples 30 are input to the transformation 48 with the perturbed adversarial samples x′+Δx′ 302 to compute transformed enrolled samples k(x.sub.i′) and a transformed adversarial sample k(x′+Δx′). The adversarial perturbation detector 28 then computes the distance score and uses the classifier 50 to make a final decision of whether a perturbation was detected.
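The parallel and sequential deployments can be contrasted in a short sketch. This is an illustrative toy under assumed names: `k` is any callable transformation, and the score is the same stand-in sum of absolute differences used above rather than the learned metric.

```python
def detector_score(k, samples, submitted):
    # F = sum_i d(k(x_i), k(submitted)), with |.| as a stand-in metric
    k_sub = k(submitted)
    return sum(abs(k(x) - k_sub) for x in samples)

def parallel_detect(k, enrolled, augmented, submitted, t_prime):
    # parallel model: enrolled x_i and augmented x_i' are checked together
    return detector_score(k, enrolled + augmented, submitted) > t_prime

def sequential_detect(k, enrolled, augmented, submitted, t_prime):
    # sequential model: enrolled x_i first; augmented x_i' only if
    # no perturbation was found in the first pass
    if detector_score(k, enrolled, submitted) > t_prime:
        return True
    return detector_score(k, augmented, submitted) > t_prime
```

One design consideration this makes visible: the parallel model sums the distance over more samples, so its threshold t′ would generally need to be calibrated differently than the per-pass threshold of the sequential model.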
(44) The process learns the parameters for the transformation 48 k(⋅) and classifier 50 by inputting the training set to the transformation 48 k(⋅) (block 802). The process generates transformed enrolled samples k(x.sub.i), transformed publicly available sample k(x), and a transformed adversarial sample k(x′+Δx′). In embodiments, the transformation function k(⋅) can be decomposed into multiple projections: k=g1∘g2∘g3 . . . , where g1 can be a layer in a convolutional neural network, g2 can be another layer in the convolutional network, and g3 can be a non-linear projection in a support vector machine (SVM).
(45) Next, the classifier 50 with learnable parameters σ is used to classify the transformed adversarial sample k(x′+Δx′) as a success (e.g., a “1”) or a fail (e.g., a “0”) based on the transformed enrolled samples k(x.sub.i) (block 804). In embodiments, the classifier may be the same as or different from classifier 50. The classifier may be a deep neural network or a threshold-based classifier, e.g., Σ.sub.i d(k(x.sub.i), k(x′+ΣΔx′)) < a third threshold t″, depending on the output of k(⋅). In either case, the classifier needs to contain an aggregation function (e.g., Σ) to merge the results over the x.sub.i. Based on a result of the classification, the learnable parameters θ and σ are updated and the process may repeat as necessary.
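A drastically simplified numeric analogue of this training step can make the roles of θ and σ concrete. In this hedged sketch, assumed for illustration only, the transformation k(⋅) collapses to a single scaling parameter theta, the classifier collapses to a scalar threshold sigma, and "learning" sigma is just placing it between a genuine score and an adversarial score; a real detector would instead learn both parameter sets by gradient descent.

```python
def transform(x, theta):
    # stand-in for k(.): a single learnable scaling parameter theta
    return theta * x

def f_score(enrolled, sample, theta):
    # F = sum_i d(k(x_i), k(sample)), with |.| as the stand-in metric
    return sum(abs(transform(xi, theta) - transform(sample, theta))
               for xi in enrolled)

def fit_sigma(enrolled, genuine, adversarial, theta):
    # "learn" the classifier threshold sigma by placing it midway between
    # the genuine and adversarial scores for this training pair
    return (f_score(enrolled, genuine, theta)
            + f_score(enrolled, adversarial, theta)) / 2.0

def is_adversarial(enrolled, sample, theta, sigma):
    # classifier: label the sample an attack (fail) when F exceeds sigma
    return f_score(enrolled, sample, theta) > sigma
```

With enrolled samples `[1.0, 1.1]`, a genuine sample 1.05, and an adversarial sample 2.0, the fitted sigma separates the two: the adversarial sample is flagged and the genuine one is not.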
(46) Another operation of the training process is to create the augmented biometric samples 40 by adding learned perturbations 44 to the enrolled biometric samples 30 of the enrolled user as described in block 402 of
(48) The process may begin by retrieving the enrolled biometric data set x.sub.i 32 for the enrolled user from the enrollment database 24 and performing an operation q(⋅) on x.sub.i to generate variations of the learned perturbations Δx.sub.i (block 900). Operations in q(⋅) may increase the intra-class distances between the x.sub.i, such as by generating random noise on the enrollment images. In an alternative embodiment, an image x may be selected from the public biometric samples 38, and adversarial images generated based on x.sub.i to slightly increase the distance d(k(x), k(x.sub.i+Δx.sub.i)) to obtain Δx.sub.i. The learned perturbations Δx.sub.i are then added to the enrolled biometric dataset 32 for the enrolled user (block 902).
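The random-noise variant of q(⋅) described above can be sketched as follows. This is a hedged illustration of the augmentation step only: samples are plain feature lists, and the uniform noise, scale, and seeding are assumptions, not the patented perturbation-learning procedure.

```python
import random

def augment_enrolled_samples(enrolled_samples, noise_scale=0.05, seed=0):
    # q(.): add a small random perturbation dx_i to each enrolled sample x_i,
    # increasing intra-class distances within the enrollment set
    rng = random.Random(seed)
    augmented = []
    for x in enrolled_samples:
        dx = [rng.uniform(-noise_scale, noise_scale) for _ in x]
        augmented.append([xi + dxi for xi, dxi in zip(x, dx)])
    return augmented
```

The augmented samples would then be stored alongside the originals in the enrollment dataset, as block 902 describes.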
(49) According to the disclosed embodiments, the learned perturbations Δx.sub.i that are added to x.sub.i may increase the difficulty of creating adversarial biometric samples even in the case where the enrollment set is leaked to an adversary during an insider breach, so that:
(51) A method and system for using an enrollment set to detect adversarial examples in a biometrics-based authentication system has been disclosed. The present invention has been described in accordance with the embodiments shown, and there could be variations to the embodiments, and any variations would be within the spirit and scope of the present invention. For example, the exemplary embodiments can be implemented using hardware, software, a computer readable medium containing program instructions, or a combination thereof. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.