This project accompanies the paper "Oriole: Thwarting Privacy against Trustworthy Deep Learning Models" published in ACISP.
This code is intended only for personal privacy protection or academic research. We prohibit commercialization without permission. The copyright belongs to the master's student Liuqiao Chen and Prof. Qian of East China Normal University.
This folder contains resources relevant to the paper (Fawkes: Protecting Privacy against Unauthorized Deep Learning Models). More details are in the fawkes folder.
Install from PyPI.
pip install fawkes
This folder contains resources relevant to the paper (Oriole: Thwarting Privacy against Trustworthy Deep Learning Models). This module mainly generates m multi-cloaked images for each image in the training dataset. More details are in the oriole folder.
Install from PyPI.
pip install fawkes
Then download this repo and overwrite 'utils.py' as described in the Tips.
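The multi-cloaking step described above can be sketched as follows. This is a hypothetical illustration, not the Oriole implementation: the function name `multi_cloak` and the additive random noise are stand-ins for Oriole's actual feature-space cloak generation, and only show the idea of deriving m distinct cloaked variants from each training image.

```python
import random

def multi_cloak(image, m, epsilon=0.05, seed=0):
    """Hypothetical sketch: derive m distinct cloaked variants of one image.

    `image` is a flat list of pixel values in [0, 1]. The real pipeline
    computes bounded feature-space perturbations; this toy additive noise
    only stands in for that step.
    """
    rng = random.Random(seed)
    cloaks = []
    for _ in range(m):
        # Each variant gets its own bounded perturbation, clipped to [0, 1].
        noisy = [min(1.0, max(0.0, p + rng.uniform(-epsilon, epsilon)))
                 for p in image]
        cloaks.append(noisy)
    return cloaks

# One training image -> m = 3 cloaked variants.
variants = multi_cloak([0.2, 0.5, 0.8], m=3)
print(len(variants))  # 3
```

In the real workflow this loop runs over every image in the training dataset, so the cloaked set is m times the size of the original.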
- You'd better not change the batch-size value unless you have very powerful GPU computing resources.
- Run on GPU: neither the current Fawkes package nor the Oriole package supports GPU. To use GPU, you need to clone this repo, install the required packages in setup.py, and replace tensorflow with tensorflow-gpu. Then you can run Fawkes or Oriole like this:
python fawkes/protection.py [args] (Fawkes)
or
python oriole/F_protection.py [args] (Oriole)