Opacus

About
Opacus is a library that enables training PyTorch models with differential privacy. It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to track the privacy budget expended at any given moment. Opacus features scalable vectorized per-sample gradient computation, is built on PyTorch and supports most types of PyTorch models, and has an extensible, open-source, modular API for differential privacy research.
Features
• Extensible: an open-source, modular API for differential privacy research.
• Built on PyTorch: supports most types of PyTorch models and can be used with minimal modification to the original neural network.
• Scalable: vectorized per-sample gradient computation that is 10x faster than microbatching.
FAQs
What is Opacus?
Opacus is a library that enables training PyTorch models with differential privacy. It supports training with minimal code changes required on the client, has little impact on training performance, and allows the client to track the privacy budget expended.
Is Opacus open-source? What is the license?
Yes! Opacus is open-source for public use, and it is licensed under the Apache 2.0 license.
How can I report a bug or ask a question?
You can report bugs or ask questions by submitting issues on the Opacus GitHub repository.
I'd like to contribute to Opacus. How can I do that?
Thank you for your interest in contributing to Opacus! Submit your contributions as GitHub pull requests, and please take a look at the Opacus contribution guide first.
If I use Opacus in my paper, how can I cite it?
If you use Opacus in your papers, you can cite it as follows:

@article{opacus,
  title={Opacus: {U}ser-Friendly Differential Privacy Library in {PyTorch}},
  author={Ashkan Yousefpour and Igor Shilov and Alexandre Sablayrolles and Davide Testuggine and Karthik Prasad and Mani Malek and John Nguyen and Sayan Ghosh and Akash Bharadwaj and Jessica Zhao and Graham Cormode and Ilya Mironov},
  journal={arXiv preprint arXiv:2109.12298},
  year={2021}
}
What is DP-SGD?
DP-SGD is an algorithm described in this paper; Opacus is its PyTorch implementation. Please refer to this blog post to read more about DP-SGD.
How do I attach the privacy engine?
Training with Opacus is as simple as instantiating a `PrivacyEngine` and attaching it to the `optimizer`...
What is the secure_rng argument in PrivacyEngine?
Not all pseudo-random number generators (RNGs) are created equal. Opacus supports a cryptographically secure PRNG (CSPRNG) provided by the torchcsprng library; this option is enabled by setting `secure_rng` to `True`.
My model doesn’t converge with default privacy settings. What do I do?
Opacus has several settings that control the amount of noise, which affects convergence. The most important one is `noise_multiplier`, which is typically set between 0.1 and 2.
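For intuition, here is a toy, pure-PyTorch sketch of the aggregation step that `noise_multiplier` controls; this is an illustration of the DP-SGD mechanics, not Opacus's actual implementation. The noise standard deviation is `noise_multiplier * max_grad_norm`, so larger values mean noisier, harder-to-converge updates:

```python
import torch

torch.manual_seed(0)

def dp_aggregate(per_sample_grads, max_grad_norm, noise_multiplier):
    """Clip each per-sample gradient to max_grad_norm, sum, add Gaussian
    noise with std = noise_multiplier * max_grad_norm, then average."""
    clipped = []
    for g in per_sample_grads:
        scale = min(1.0, max_grad_norm / (float(g.norm()) + 1e-12))
        clipped.append(g * scale)
    total = torch.stack(clipped).sum(dim=0)
    sigma = noise_multiplier * max_grad_norm
    if sigma > 0:
        total = total + torch.normal(0.0, sigma, size=total.shape)
    return total / len(per_sample_grads)

grads = [torch.randn(5) for _ in range(8)]
print(dp_aggregate(grads, max_grad_norm=1.0, noise_multiplier=0.5))
```

Lowering `noise_multiplier` (or raising `max_grad_norm`) reduces the noise relative to the signal, at the cost of a faster-growing privacy budget.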
How to deal with out-of-memory errors?
Dealing with per-sample gradients will inevitably put more pressure on your memory. The first sanity check is to make sure that you don't run out of memory with "standard" training (without DP).
What does epsilon=1.1 really mean? How about delta?
The (epsilon, delta) pair quantifies the privacy properties of the DP-SGD algorithm (see the blog post). A model trained with (epsilon, delta)-differential privacy (DP) protects the privacy of any training example, no matter how strange, ill-fitting, or perfect this example is.
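Formally, a randomized mechanism M satisfies (epsilon, delta)-differential privacy if, for every pair of datasets D and D' differing in a single example and every set of outcomes S:

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S] + \delta
```

Smaller epsilon means stronger privacy; delta is the small probability mass on which the epsilon bound is allowed to fail, and is typically chosen well below the inverse of the dataset size.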
How does batch size affect my privacy budget?
Assuming that batches are randomly selected, an increase in the batch size increases the sampling rate, which in turn increases the privacy budget. This effect can be counterbalanced by choosing a larger learning rate (since per-batch gradients approximate the true gradient of the model better) and aborting the training earlier.
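As an illustration (toy numbers; Poisson sampling is assumed, as in Opacus), the sampling rate and the number of steps per epoch move in opposite directions as the batch size grows:

```python
def epoch_profile(batch_size, dataset_size):
    # With Poisson sampling, each example is included independently with
    # probability q = batch_size / dataset_size (the expected batch size).
    q = batch_size / dataset_size      # sampling rate per step
    steps = dataset_size // batch_size  # steps per epoch
    return q, steps

# Doubling the batch size doubles q (spending budget faster per step)
# but halves the number of steps in an epoch.
print(epoch_profile(128, 50_000))
print(epoch_profile(256, 50_000))
```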
My model throws IncompatibleModuleException. What is going wrong?
Your model most likely contains modules that are not compatible with Opacus. The most prominent example is batch normalization. Before validating your model, try to fix incompatible modules using `ModuleValidator.fix(model)` as described in the documentation.
What is virtual batch size?
Opacus computes and stores _per-sample_ gradients under the hood. What this means is that, for every regular gradient expected by the optimizer, Opacus will store `batch_size` per-sample gradients on each step.
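A back-of-envelope sketch of why this matters for memory (the parameter count and batch size are made up, and activations and optimizer state are ignored):

```python
def grad_memory_bytes(num_params, batch_size, bytes_per_param=4):
    """Rough float32 estimate: standard training stores one gradient per
    parameter; per-sample training stores one gradient per parameter
    *per example in the batch*."""
    base = num_params * bytes_per_param
    per_sample = num_params * batch_size * bytes_per_param
    return base, per_sample

base, per_sample = grad_memory_bytes(num_params=1_000_000, batch_size=64)
print(base, per_sample)
```

In other words, gradient storage scales linearly with the batch size, which is why reducing the physical batch size is the usual remedy for out-of-memory errors.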
What are `alphas`?
Although we report the expended privacy budget using the (epsilon, delta) language, internally we track it using Rényi Differential Privacy (RDP). In short, (alpha, epsilon)-RDP bounds the Rényi divergence of order alpha between the distributions of the mechanism's outputs on any two datasets that differ in a single element.
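Tracked RDP guarantees can be converted to a single (epsilon, delta) statement using the standard bound eps = eps_alpha + log(1/delta) / (alpha - 1), minimized over the tracked orders. A sketch with illustrative RDP values (not Opacus's internal code):

```python
import math

def rdp_to_dp(alphas, rdp_epsilons, delta):
    """Convert (alpha, eps_alpha) RDP guarantees into one (epsilon, delta)
    guarantee via eps = eps_alpha + log(1/delta)/(alpha - 1), taking the
    best (smallest) bound over all tracked orders."""
    return min(
        eps_a + math.log(1.0 / delta) / (alpha - 1.0)
        for alpha, eps_a in zip(alphas, rdp_epsilons)
    )

alphas = [1.5, 2, 4, 8, 16, 32]
rdp = [0.2, 0.25, 0.4, 0.9, 2.0, 4.5]  # illustrative RDP values per order
print(rdp_to_dp(alphas, rdp, delta=1e-5))
```

Tracking a range of orders matters because different alphas give the tightest bound at different points in training.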
Alternatives

vantage6
Open-source platform for privacy-preserving federated learning, enabling collaborative data analysis without centralizing sensitive information.