HEFT: Homomorphically Encrypted Fusion of Biometric Templates


Luke Sperling$\dagger$, N. Ratha$\ddagger$, A. Ross$\dagger$, V. Boddeti$\dagger$

$\dagger$Michigan State University, $\ddagger$University at Buffalo

12th October, 2022

IJCB 2022

Motivation

Fusion of Biometric Information





"A comprehensive overview of biometric fusion."Information Fusion, 2019"
"Deep learning approach for multimodal biometric recognition system based on fusion of iris, face, and finger vein traits." Sensors, 2020

Information Leakage from Representations

Attacks on Face templates
"Assessing Privacy Risks from Feature Vector Reconstruction Attacks," arXiv:2202.05760

Face reconstruction from template
"On the reconstruction of face images from deep face templates," TPAMI, 2018

Encryption: The Holy Grail?

  • Data encryption is an attractive option
    • protects user's privacy
    • enables free and open sharing
    • mitigates legal and ethical issues


  • Encryption scheme needs to allow computations directly on the encrypted data.
    • Solution: Homomorphic Encryption

Key Idea of Homomorphic Encryption

Ring Learning with Errors
  op         plaintext        ciphertext
             $x$              $(x + e_1) \mbox{ mod } t$
             $y$              $(y + e_2) \mbox{ mod } t$
  $+$        $x + y$          $(x + y + e_3') \mbox{ mod } t$
  $\times$   $x \times y$     $(x \times y + e_4'') \mbox{ mod } t$
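
A toy numeric sketch of the noisy-encoding idea in the table above (illustration only, not a real RLWE scheme; the modulus and noise scale below are placeholders): a small error is attached at encryption, and additions and multiplications act directly on the noisy values, so the result decodes to approximately the right answer while the error slowly grows.

```python
import random

t = 1_000_003        # toy modulus (placeholder, not a real FHE parameter)
NOISE = 1e-6         # toy noise scale standing in for the RLWE error terms

def toy_encrypt(x):
    """Attach a small random error, as in the (x + e) mod t rows above."""
    return (x + random.uniform(-NOISE, NOISE)) % t

x, y = 3.0, 5.0
cx, cy = toy_encrypt(x), toy_encrypt(y)

# "Homomorphic" addition and multiplication operate on the noisy encodings.
c_add = (cx + cy) % t      # ~ (x + y + e_3')
c_mul = (cx * cy) % t      # ~ (x * y + e_4''), the error grows faster here

print(c_add, x + y)        # ~8.0
print(c_mul, x * y)        # ~15.0
```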

FHE in Biometrics

  • "Secure Face Matching Using Fully Homomorphic Encryption,", BTAS 2018

    • Focussed on protecting database of templates.
    • Allows match score computation in the encrypted domain.
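
With SIMD slot packing, the encrypted match score (an inner product) reduces to one slot-wise multiplication followed by a logarithmic number of rotate-and-add steps. The numpy sketch below only mimics these slot operations in the clear (np.roll stands in for ciphertext rotation); it illustrates the pattern rather than the cited paper's implementation.

```python
import numpy as np

def simd_inner_product(a_slots: np.ndarray, b_slots: np.ndarray) -> float:
    """Inner product using only slot-wise FHE-friendly operations:
    element-wise multiply, slot rotation, and addition."""
    d = len(a_slots)                       # assumed to be a power of two
    prod = a_slots * b_slots               # one ciphertext-ciphertext multiply
    step = d // 2
    while step >= 1:
        prod = prod + np.roll(prod, -step) # rotate-and-add halves the span
        step //= 2
    return prod[0]                         # every slot now holds the full sum

a = np.random.randn(512); a /= np.linalg.norm(a)   # probe template
b = np.random.randn(512); b /= np.linalg.norm(b)   # enrolled template
print(simd_inner_product(a, b), a @ b)             # the two scores agree
```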

HEFT

Homomorphically Encrypted Fusion of Biometric Templates

HEFT: Overview

HEFT: Concatenation

Homomorphic Concatenation
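
One way to realize concatenation over packed ciphertexts, sketched in the clear (the rotate-and-add pattern is standard; the exact packing and masking in HEFT may differ): zero-pad both templates to the combined slot count, rotate the second ciphertext so its entries land after the first one's, and add.

```python
import numpy as np

def homomorphic_concat(ct_a: np.ndarray, ct_b: np.ndarray, d_a: int) -> np.ndarray:
    """Concatenate two packed templates using only FHE-friendly operations.

    ct_a holds a d_a-dim template in its first d_a slots (remaining slots zero),
    ct_b holds its template in its first slots (remaining slots zero).
    """
    shifted_b = np.roll(ct_b, d_a)   # slot rotation moves b past a's entries
    return ct_a + shifted_b          # ciphertext addition merges the two

slots = 16
a = np.zeros(slots); a[:4] = [1, 2, 3, 4]      # e.g. a 4-dim face template
b = np.zeros(slots); b[:3] = [5, 6, 7]         # e.g. a 3-dim voice template
print(homomorphic_concat(a, b, d_a=4)[:7])     # [1. 2. 3. 4. 5. 6. 7.]
```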

HEFT: Linear Projection

Linear Projection

Naive
Hybrid

Linear Projection Comparison

Computational Complexity

  • Hybrid
    • Pros: Low memory and runtime overhead
    • Cons: Scales linearly with number of samples
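
For concreteness, the naive projection can be sketched as one plaintext-times-ciphertext inner product per output dimension, reusing the rotate-and-sum pattern from the matching sketch above; the hybrid method reorganizes this computation for lower memory and runtime, and its details follow the paper. The dimensions below are illustrative only.

```python
import numpy as np

def rotate_and_sum(slots: np.ndarray) -> np.ndarray:
    """Sum all slots using only rotations and additions (power-of-two length)."""
    step = len(slots) // 2
    while step >= 1:
        slots = slots + np.roll(slots, -step)
        step //= 2
    return slots                                  # every slot holds the total

def naive_projection(P: np.ndarray, ct_x: np.ndarray) -> np.ndarray:
    """Naive encrypted projection: one plaintext-ciphertext multiply plus one
    rotate-and-sum per output dimension (d_out inner products in total)."""
    out = np.empty(P.shape[0])
    for k, row in enumerate(P):                   # row is plaintext, ct_x is "encrypted"
        out[k] = rotate_and_sum(row * ct_x)[0]
    return out

P = np.random.randn(32, 512)                      # 512-dim fused template -> 32-dim
x = np.random.randn(512)
print(np.allclose(naive_projection(P, x), P @ x)) # True
```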

HEFT: Feature Normalization

$\ell_2$-Normalization of Vector



$\hat{\mathbf{u}} = \frac{\mathbf{u}}{\|\mathbf{u}\|_2} \quad \rightarrow \quad$ division$\dagger$


where

$\|\mathbf{u}\|_2 = \sqrt{\sum_{i=1}^d u_i^2} \quad \rightarrow \quad$ square-root$\dagger$


  • $\dagger$: problematic operations for FHE

Inverse Square Root: Polynomial Approximation

$$\frac{1}{\sqrt{x}} \approx \sum_{i=1}^6 a_i x^i$$
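
A minimal sketch of how such coefficients can be obtained (the interval and the least-squares fit are assumptions for illustration; the paper fixes its own approximation): fit a low-degree polynomial to $1/\sqrt{x}$ over the range where the squared norms are expected to fall, then evaluate it with additions and multiplications only.

```python
import numpy as np

# Assumed interval for the squared norms; the paper fixes its own degree and range.
lo, hi = 0.1, 4.0
xs = np.linspace(lo, hi, 10_000)
coeffs = np.polyfit(xs, 1.0 / np.sqrt(xs), deg=6)   # returns a_6 ... a_0

def approx_inv_sqrt(x):
    # Horner evaluation: only additions and multiplications, hence FHE-friendly.
    return np.polyval(coeffs, x)

# Approximate l2-normalization: u * poly(||u||^2), valid when ||u||^2 lies in [lo, hi].
u = np.random.randn(32)
u = 1.3 * u / np.linalg.norm(u)            # place the norm inside the fitted range
u_hat = u * approx_inv_sqrt(np.dot(u, u))
print(np.linalg.norm(u_hat))               # close to 1.0
```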

FHE-Aware Learning

Account for the limitations of FHE to improve performance

    • FHE is limited to specific operations on encrypted data.

    • Normalization is not directly computable, so it must be approximated.

    • The approximation introduces error and hence a loss of matching performance.

    • We incorporate the approximate normalization into the training of the projection matrix to recover this performance.

Loss Function

Main Idea: FHE-Aware Learning
  • $$\text{Loss} = \lambda \underbrace{\frac{\sum_{M} d(\mathbf{c}_i, \mathbf{c}_j)}{|M|}}_{ \color{orange}{Pull} } + (1-\lambda)\underbrace{\frac{\sum_{V}\left[m + d(\mathbf{c}_i, \mathbf{c}_j) - d(\mathbf{c}_i, \mathbf{c}_k)\right]_{+}}{|V|}}_{ \color{orange}{Push} }$$
  • where $$d(\mathbf{c}_i, \mathbf{c}_j) = 1-P\underbrace{f(\mathbf{c}_i)}_{ \color{cyan}{approximation} } \cdot P\underbrace{f(\mathbf{c}_j)}_{ \color{cyan}{approximation} }$$ and $f(\cdot)$ denotes approximate normalization, i.e., a vector scaled by the polynomial approximation of its inverse norm.
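
A PyTorch-style sketch of this objective (the coefficient ordering, margin, and batch construction are placeholders, not the paper's settings): distances are computed through the approximate normalization $f$ and the projection $P$, so $P$ is trained against exactly the operations that will later run under FHE.

```python
import torch

def approx_normalize(c, poly_coeffs):
    """f(c): scale c by a polynomial approximation of 1/||c|| (FHE-friendly ops only)."""
    sq_norm = (c * c).sum(dim=-1, keepdim=True)
    inv_norm = torch.zeros_like(sq_norm)
    for a in poly_coeffs:                   # Horner evaluation, highest degree first
        inv_norm = inv_norm * sq_norm + a
    return c * inv_norm

def fhe_aware_loss(P, anchors, positives, negatives, poly_coeffs, lam=0.5, margin=0.2):
    """Pull matched pairs together, push impostor pairs apart, through P and f."""
    def dist(ci, cj):
        zi = approx_normalize(ci, poly_coeffs) @ P.T   # P f(c_i)
        zj = approx_normalize(cj, poly_coeffs) @ P.T   # P f(c_j)
        return 1.0 - (zi * zj).sum(dim=-1)             # d = 1 - <P f(c_i), P f(c_j)>

    pull = dist(anchors, positives).mean()
    push = torch.relu(margin + dist(anchors, positives) - dist(anchors, negatives)).mean()
    return lam * pull + (1.0 - lam) * push
```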

Numerical Evaluation

Experimental Setup

Cross-Pose Labeled Faces in the Wild (CPLFW)

    • Synthetic fusion dataset created by randomly pairing classes.
    • 10,760 samples over 188 classes.

Fusion Improves Performance, Reduces Dimensionality

  • Fusion improves matching performance:
    • over face alone by 11.07%
    • over voice alone by 9.58%
  • Dimensionality Reduction: $512D \rightarrow 32D$ (16$\times$ compression)

Comparison of Normalization Methods

Computational Complexity


    • Projection is the costliest operation.

Summary

    • Introduces the first multimodal feature-level fusion system in the encrypted domain.
    • Improves matching performance over the original templates while reducing their dimensionality.
    • Incorporates polynomial approximation for approximate normalization.
    • Incorporates FHE-Aware Learning to improve performance.
HEFT: Encrypted Biometric Fusion