This document presents Semi-Adversarial Networks (SANs) for imparting demographic privacy to face images. SANs aim to confound classifiers for soft-biometric attributes such as gender, age, and race, while retaining the ability of face recognition systems to match images. The proposed method, SAN+, uses a cycle-GAN approach to modify target attributes according to a given label vector, while an auxiliary face matcher ensures that generated images still match the original. Evaluation on the MUCT dataset shows that SAN+ increases the error rates of attribute predictors on modified images while keeping matching performance close to that of the original images.
1. Semi-Adversarial Networks for Imparting
Age, Gender and Race Privacy
to Face Images
Vahid Mirjalili, Sebastian Raschka, Arun Ross
Workshop on Demographic Variations in Performance of
Biometric Algorithms
WACV 2019
2. Privacy
Privacy is defined as the right of users to determine what information
about themselves to reveal and what information to conceal.
“Privacy is the right to be let alone” [Samuel Warren and Louis Brandeis
(1890)]
“Privacy is the right of people to conceal information about themselves
that others might use to their disadvantage” [Richard Posner (1983)]
3. Privacy in Biometrics
Privacy in biometrics: any processing applied to
individuals’ biometric data is subject to the users’
consent
The primary purpose of collecting and storing biometric
data is recognition
We assume consent from users has been acquired to use their
biometric data for recognition
Extracting other information, such as age, gender,
or ethnicity, goes beyond recognition and therefore
also requires consent from users
5. Biometric (Face) Recognition
A. Identification
Determine the identity of an unknown person
1-to-n matching
B. Verification
Verify the claimed identity of a person
1-to-1 matching
(Example face images from the CelebA and MUCT datasets)
We assume consent from users to use their biometric data for
recognition has been acquired.
7. Examples of Attribute Information “Mis”-use
User profiling
Targeted advertisement
Violating users’ privacy
Ethics
Linking attack
Combining extracted information with another
dataset of known identities
For example: linking a dating website with
identities on a social media platform
Identity theft
(An example of a possible misuse of
gender information by advertisers)
8. Goal: Biometric Privacy Compliance
Example subject:
Recognition (identity: Vahid)       Retained    ✅
Gender (Male)                       Confounded  🚫
Age (29)                            Confounded  🚫
Ethnicity (White)                   Confounded  🚫
Health (Healthy, 187 lb.)           Confounded  🚫
Objective: Retain recognition (identification or verification) while
confounding automatic extraction of auxiliary attributes
9. Related work
Othman and Ross (2014), "Privacy of Facial Soft Biometrics: Suppressing
Gender But Retaining Identity": mixing a face image with a candidate from
the opposite gender
Sim and Zhang (2015), "Controllable Face Privacy": Multimodal Discriminant
Analysis (MMDA)
Upchurch et al. (2017), "Deep Feature Interpolation": interpolating face
representations in the latent space
Rozsa et al. (2016), "Are Facial Attributes Adversarially Robust?":
adversarial perturbations on deep neural networks
Mirjalili and Ross (2017), "Soft biometric privacy: Retaining biometric
utility of face images while perturbing gender": adversarial perturbations
in a black-box scenario guided by the opposite gender
Mirjalili et al. (2018), "Semi-Adversarial Networks: Convolutional
Autoencoders for Imparting Privacy to Face Images": deriving perturbations
using an autoencoder and an auxiliary gender classifier
11. Semi-Adversarial Networks for Privacy in Face Images
Goal:
o Confound soft-biometric attributes, so that attribute classifiers no longer work
o Retain recognition capability, so that face matchers still work
The Semi-Adversarial Network (SAN) is a perturbation function φ that maps
an input face image X to a perturbed output image: φ(X) = X′
14. Overall SAN Architecture
(Architecture diagram: an input image X is passed through Network 1, a
convolutional autoencoder/generator G, producing the output X′ = φ(X);
X′ is then fed to Network 2, an auxiliary face matcher M, and to
Network 3, an auxiliary gender classifier)
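The interplay of the three networks can be pictured as a composite loss: keep the face embedding of X′ close to that of X (matching term), push the auxiliary gender classifier toward the wrong label (adversarial term), and stay visually close to the input. The following is a toy numpy sketch, not the paper's implementation: the CNNs are replaced by stand-in functions and the loss weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(x):
    """Network 1 stand-in: small additive perturbation (a real SAN uses a conv. autoencoder)."""
    return x + 0.05 * rng.standard_normal(x.shape)

def face_embedding(x):
    """Network 2 stand-in: a fixed linear projection playing the frozen face matcher."""
    w = np.ones((x.size, 8)) / np.sqrt(x.size)
    return x.ravel() @ w

def gender_prob(x):
    """Network 3 stand-in: a frozen auxiliary gender classifier, P(male | x)."""
    return 1.0 / (1.0 + np.exp(-x.mean()))

def san_loss(x, lambdas=(1.0, 1.0, 1.0)):
    """Composite semi-adversarial loss for one input (weights are hypothetical)."""
    x_prime = generator(x)
    # (a) matching term: embeddings of X and X' should stay close
    l_match = np.sum((face_embedding(x) - face_embedding(x_prime)) ** 2)
    # (b) adversarial term: cross-entropy toward the OPPOSITE gender label
    p = gender_prob(x_prime)          # assume the input subject is male
    l_attr = -np.log(1e-8 + 1.0 - p)  # low when classifier says "female"
    # (c) pixel term: X' should remain visually close to X
    l_pix = np.mean((x - x_prime) ** 2)
    a, b, c = lambdas
    return a * l_match + b * l_attr + c * l_pix

x = rng.standard_normal((8, 8))
print(san_loss(x))
```

In the real system only the generator is trained; the auxiliary matcher and classifier stay frozen and merely provide gradients, which is what makes the setup "semi"-adversarial.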
15. SAN for Multiple Attribute Privacy
Considering 3 soft-biometric attributes:
o Gender: [Male, Female]
o Age: 3 ordinal labels [0:Young, 1:Middle-age, 2:Old]
o Race: [A:African-descent, B:Caucasian]
SAN+
o Cycle-GAN:
• Generator: transforms images according to a target attribute label vector
• Discriminator: distinguishes between real (input images) vs. synthesized (generated
images)
o Auxiliary Face Matcher: ensures that the generated images match with their
original version
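One way to picture the target label vector consumed by the generator is as a concatenation of one-hot codes for the three attributes. The slot layout and flip rules below are hypothetical illustrations, not the paper's exact encoding (in particular, "cycling" the ordinal age label is just one possible flip rule):

```python
# Hypothetical SAN+ label-vector layout: one-hot gender (2 slots),
# one-hot ordinal age (3 slots), one-hot race (2 slots) -> length 7.
GENDER = {"Male": 0, "Female": 1}
AGE = {"Young": 0, "Middle-age": 1, "Old": 2}   # ordinal labels
RACE = {"A": 0, "B": 1}   # A: African-descent, B: Caucasian

def label_vector(gender, age, race):
    v = [0] * 7
    v[GENDER[gender]] = 1
    v[2 + AGE[age]] = 1
    v[5 + RACE[race]] = 1
    return v

def flip(v, attrs):
    """Target vector with the chosen attributes changed (hypothetical rules)."""
    t = list(v)
    if "gender" in attrs:
        t[0:2] = [v[1], v[0]]          # swap the two gender slots
    if "age" in attrs:
        t[2:5] = [v[4], v[2], v[3]]    # cycle the ordinal age label
    if "race" in attrs:
        t[5:7] = [v[6], v[5]]          # swap the two race slots
    return t

print(flip(label_vector("Male", "Young", "B"), {"gender", "age"}))
# -> [0, 1, 0, 1, 0, 0, 1]
```

Any subset of {gender, age, race} can be flipped this way, which is what gives SAN+ its ability to confound an arbitrary combination of attributes while leaving the rest intact.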
27. Modifying Face Attributes using SAN+
(Example outputs: original images alongside versions with gender, age,
and race changed)
All outputs match with their original face image, so face recognition is
retained
(Users’ perspective) Controllable soft-biometric privacy: users can
choose which attributes to confound and which to keep
(System’s perspective) The application can arbitrarily randomize the
confounded/preserved attributes per subject
29. Performance Evaluation
SAN+ model trained on CelebA and MORPH, and evaluated on MUCT
dataset
Attribute Prediction using COTS
o G-COTS: gender
o A-COTS: age (years)
o R-COTS: race
Face matching using M-COTS (state-of-the-art)
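The evaluation in the following slides reports ROC-EER for gender and race and MAE for age. Both metrics can be sketched from raw scores; this is a simplified version (a production EER would interpolate between thresholds rather than scan them):

```python
def roc_eer(scores, labels):
    """Equal Error Rate: the operating point where the false-accept rate
    approximately equals the false-reject rate (threshold scan, no interpolation)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    best_diff, eer = float("inf"), 0.5
    for t in sorted(set(scores)):
        far = sum(s >= t for s in neg) / len(neg)   # impostors accepted
        frr = sum(s < t for s in pos) / len(pos)    # genuine rejected
        if abs(far - frr) < best_diff:
            best_diff, eer = abs(far - frr), (far + frr) / 2
    return eer

def mae(pred, true):
    """Mean absolute error, e.g. for age predictions in years."""
    return sum(abs(p - t) for p, t in zip(pred, true)) / len(pred)

print(roc_eer([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))   # -> 0.0
print(mae([25, 40], [29, 35]))                        # -> 4.5
```

A low EER means the classifier separates the two classes well; a successful SAN therefore *raises* the EER of the attribute classifier on perturbed images while keeping the matcher's EER low.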
30. Gender Prediction using G-COTS
Prediction performance on the MUCT dataset (ROC-EER):
Flip none (original labels):   1.6%
Flip Gender:                   19%
Flip Age:                      0.7%
Flip Race:                     1.1%
Flip Gender, Age:              15%
Flip Gender, Race:             14%
Flip Age, Race:                0.8%
Flip Gender, Age, Race:        12%
Blue: image-sets not intended to change gender
Orange: image-sets which have undergone gender-perturbation
The performance is preserved on all blue sets, while the EER increases on
the orange sets: gender is confounded
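As a quick sanity check of the blue/orange split, the per-condition gender EERs reported above (values copied from the slide) can be compared programmatically: every set that flips gender shows a far higher EER than every set that leaves it alone.

```python
# G-COTS gender EER (%) per flip condition, as reported on the slide.
eer = {
    frozenset(): 1.6,
    frozenset({"G"}): 19, frozenset({"A"}): 0.7, frozenset({"R"}): 1.1,
    frozenset({"G", "A"}): 15, frozenset({"G", "R"}): 14,
    frozenset({"A", "R"}): 0.8, frozenset({"G", "A", "R"}): 12,
}

flipped = [v for k, v in eer.items() if "G" in k]    # gender perturbed (orange)
kept = [v for k, v in eer.items() if "G" not in k]   # gender untouched (blue)
print(min(flipped), max(kept))   # smallest orange EER still exceeds largest blue EER
```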
39. Age Prediction using A-COTS (in years)
Prediction performance on the MUCT dataset (MAE):
Flip none (original labels):                  0 (reference)
Image-sets that do not flip age
(Flip Gender; Flip Race; Flip Gender, Race):  4.3 to 4.6 years
(4.6, 4.4, 4.3)
Image-sets that flip age (Flip Age;
Flip Gender, Age; Flip Age, Race;
Flip Gender, Age, Race):                      13.1 to 14 years
(13.5, 14, 13.1, 13.2)
Blue: image-sets not intended to change age
Orange: image-sets which have undergone age-perturbation
The MAE increases on the orange sets: age is confounded
41. Race Prediction using R-COTS
Prediction performance on the MUCT dataset (ROC-EER):
Image-sets which have undergone race-perturbation: 21% to 24%
(23%, 24%, 21%, 22%)
Image-sets not intended to change race:            0.6% to 2.3%
(2.3%, 1.2%, 0.6%, 1.2%)
Blue: image-sets not intended to change race
Orange: image-sets which have undergone race-perturbation
The EER increases on the orange sets: race is confounded
42. Face Matching using M-COTS
SAN+ is able to retain matching performance close to that of the
original images (before perturbation)
The biometric utility of the images is preserved
Baseline: face-mixing approach (Othman and Ross)
GAN: the same model trained without the auxiliary matcher
43. Summary
Semi-Adversarial Networks (SANs) for imparting demographic privacy to
face images
o Confounding soft-biometric attribute classifiers while retaining matching utility
SAN+ for multi-attribute privacy
o Age, gender and race
o Ability to modify any combination of attributes
Other applications beyond privacy
o For example, it can be used in image-manipulation software such as Photoshop
Code available on GitHub:
https://github.com/iPRoBe-lab/semi-adversarial-networks
44. Publications
1. Gender Privacy: An Ensemble of Semi Adversarial Networks for Confounding
Arbitrary Gender Classifiers, V. Mirjalili, S. Raschka, A. Ross, BTAS 2018.
2. Semi-Adversarial Networks: Convolutional Autoencoders for Imparting Privacy
to Face Images, V. Mirjalili, S. Raschka, A. Namboodiri, A. Ross, ICB 2018.
3. Soft biometric privacy: Retaining biometric utility of face images while
perturbing gender, V. Mirjalili, A. Ross, IJCB 2017.
4. Spoofing PRNU Patterns of Iris Sensors while Preserving Iris Recognition, S.
Banerjee, V. Mirjalili, A. Ross, ISBA 2019.