Oblivious Neural Network Predictions via MiniONN Transformations
1. Oblivious Neural Network Predictions via MiniONN Transformations
Presented by: Sherif Abdelfattah
Liu, J., Juuti, M., Lu, Y., & Asokan, N. (2017, October). Oblivious neural network predictions via MiniONN transformations. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security (pp. 619–631). ACM. (121 citations)
2. Machine Learning as a Service
• The client sends its input to the server and receives predictions in return.
• This approach violates the clients’ privacy.
3. Running predictions on client-side
• A naive solution is to have clients download the model and run the prediction phase on the client side. But this has drawbacks:
• It becomes more difficult for service providers to update their models.
• For security applications (e.g., spam or malware detection services), an adversary can use
the model as an oracle to develop strategies for evading detection.
• If the training data contains sensitive information (such as patient records from a
hospital), revealing the model may compromise the privacy of the training data.
4. Oblivious Neural Networks (ONN)
The solution is to make the neural network oblivious:
• The server learns nothing about the client’s input.
• The clients learn nothing about the model.
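Both goals rest on additive secret sharing: every intermediate value 𝑦 is split into two random shares, one per party, so that neither share alone reveals anything. A minimal sketch of this building block (names and the modulus are illustrative, not from the paper's code):

```python
# Additive secret sharing: y is split into y_s (server) and y_c (client)
# with y = y_s + y_c (mod P). Each share alone is uniformly random.
import secrets

P = 2**31 - 1  # public modulus (illustrative choice)

def share(y: int) -> tuple[int, int]:
    """Split y into two additive shares modulo P."""
    y_c = secrets.randbelow(P)   # client's share: uniformly random
    y_s = (y - y_c) % P          # server's share: the remainder
    return y_s, y_c

def reconstruct(y_s: int, y_c: int) -> int:
    """Recombine the two shares."""
    return (y_s + y_c) % P
```

Because each share is uniformly random on its own, the server can hold one share of the client's input (and vice versa for model values) without learning anything.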
12. Oblivious Activation Functions 𝒇(𝒚)
Piecewise linear functions
• For example, ReLU: 𝑥 = max(𝑦, 0)
• Oblivious ReLU: 𝑥𝑠 + 𝑥𝑐 = max(𝑦𝑠 + 𝑦𝑐, 0)
• Computed obliviously by a garbled circuit2
²A garbled circuit is a two-party computation (2PC) technique that allows two parties to jointly compute a function without learning each other’s inputs.
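The functionality the garbled circuit realizes can be simulated in the clear: it takes the two shares of 𝑦, computes the ReLU, and outputs fresh shares of the result. The sketch below shows only what the circuit computes, not the 2PC protocol itself; the modulus and signed-value convention are illustrative assumptions.

```python
# Plaintext simulation of the oblivious-ReLU functionality: given shares
# y_s, y_c of y, compute x = max(y, 0) and output fresh shares x_s, x_c.
# In MiniONN this computation happens inside a garbled circuit, so
# neither party ever sees y or x in the clear.
import secrets

P = 2**31 - 1  # public modulus for additive sharing (illustrative)

def to_signed(v: int) -> int:
    """Interpret a value in [0, P) as a signed integer."""
    return v - P if v > P // 2 else v

def oblivious_relu(y_s: int, y_c: int) -> tuple[int, int]:
    y = to_signed((y_s + y_c) % P)   # recombined only inside the circuit
    x = max(y, 0)                    # ReLU
    x_c = secrets.randbelow(P)       # fresh random share for the client
    x_s = (x - x_c) % P              # server's share of the result
    return x_s, x_c
```

Re-randomizing the output shares is essential: it keeps the result secret-shared so the next (linear) layer can again be computed on shares.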
13. Oblivious Activation Functions 𝒇(𝒚)
Smooth functions
• For example, Sigmoid: 𝑥 = 1 / (1 + 𝑒^(−𝑦))
• Oblivious Sigmoid: 𝑥𝑠 + 𝑥𝑐 = 1 / (1 + 𝑒^(−(𝑦𝑠 + 𝑦𝑐)))
• Approximate by a piecewise linear function
• Computed obliviously by a garbled circuit
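One way to picture the approximation step: clamp the sigmoid to 0 and 1 outside an interval [−B, B] and use a straight line through (0, 1/2) inside it. The breakpoint B and the single-segment form below are illustrative; the paper's approximation may use more segments, each evaluated obliviously in the garbled circuit.

```python
# Illustrative piecewise-linear approximation of the sigmoid:
# 0 below -B, 1 above B, and a line through (0, 0.5) in between.
def approx_sigmoid(y: float, B: float = 4.0) -> float:
    if y < -B:
        return 0.0
    if y > B:
        return 1.0
    return 0.5 + y / (2 * B)   # linear segment through (0, 0.5)
```

Each linear piece involves only comparisons, additions, and multiplications by public constants, which is exactly what garbled circuits (for the comparisons) and secret sharing (for the linear parts) handle efficiently.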
15. Performance
1. MNIST (60,000 training images and 10,000 test images)
• Handwriting recognition
• CNN model
• ReLU activation function
2. CIFAR-10 (50,000 training images and 10,000 test images)
• Image classification
• CNN model
• ReLU activation function
3. Penn Treebank (PTB) (929,000 training words, 73,000 validation words, and 82,000 test words)
• Language modeling: predicting the next word given the previous words
• Long Short-Term Memory (LSTM): commonly used for language modeling
• Sigmoidal activation function