# Differential Privacy with ResNet18

## Differential Privacy

Differential privacy is a way of training models that limits what an attacker can learn about any individual training example from the model's gradient updates. Recently, a paper was published comparing the performance of Opacus to a JAX-based system.

- Original differential privacy paper
- JAX-based differential privacy paper

## Opacus

Opacus is a differential privacy library built for PyTorch. It adds hooks to PyTorch's autograd that compute per-sample gradients, along with a differential privacy engine that computes differentially private weight updates.
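The core of such a differentially private update (DP-SGD) can be illustrated without any library: clip each per-sample gradient to a maximum norm, aggregate, and add calibrated Gaussian noise. This is a minimal NumPy sketch of that step, not Opacus's actual implementation; the function name and parameters are illustrative.

```python
import numpy as np

def dp_sgd_update(per_sample_grads, max_norm=1.0, noise_multiplier=1.0, rng=None):
    """Hypothetical DP-SGD aggregation step.

    per_sample_grads: array of shape (batch, dim), one gradient row per sample.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Clip: rescale each row so its L2 norm is at most max_norm.
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    clipped = per_sample_grads / np.maximum(1.0, norms / max_norm)
    # Add Gaussian noise scaled to the clipping bound, then average.
    noise = rng.normal(0.0, noise_multiplier * max_norm, size=clipped.shape[1])
    return (clipped.sum(axis=0) + noise) / len(per_sample_grads)

grads = np.array([[3.0, 4.0], [0.3, 0.4]])  # per-sample norms: 5.0 and 0.5
update = dp_sgd_update(grads, max_norm=1.0, noise_multiplier=0.0)
# With noise disabled: first row is clipped to [0.6, 0.8], second is kept,
# so the averaged update is [0.45, 0.6].
```

Because each sample's influence on the update is bounded by `max_norm`, the added noise can be calibrated to give a formal privacy guarantee.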

## Example

This example runs ResNet18 either by having Opacus compute the differentially private updates, or by obtaining the per-sample gradients with vmap and grad and computing the differentially private update from those.

As a caveat, the transforms version may not compute exactly the same values as the Opacus version; this has not yet been verified.
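The per-sample-gradient half of the comparison can be sketched with PyTorch's functional transforms. This uses the `torch.func` names available in recent PyTorch (which mirror functorch's `vmap` and `grad`), on a small linear model rather than ResNet18, purely as an illustration:

```python
import torch
from torch.func import functional_call, grad, vmap

model = torch.nn.Linear(4, 2)
params = dict(model.named_parameters())
loss_fn = torch.nn.CrossEntropyLoss()

def compute_loss(params, x, y):
    # Treat a single sample as a batch of one for the stateless call.
    logits = functional_call(model, params, (x.unsqueeze(0),))
    return loss_fn(logits, y.unsqueeze(0))

# grad differentiates w.r.t. params; vmap maps over the batch dimension
# of (x, y) while sharing params across samples (in_dims=None).
per_sample_grads = vmap(grad(compute_loss), in_dims=(None, 0, 0))

x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))
grads = per_sample_grads(params, x, y)
# grads["weight"] has shape (8, 2, 4): one full gradient per sample.
```

With the per-sample gradients in hand, the clip-and-noise aggregation can then be applied to each parameter's gradient stack.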

## Requirements

These examples use Opacus version 1.0.1 and torchvision 0.11.2.