Differentially Private (Gradient) Expectation Maximization Algorithm with Statistical Guarantees
[article]
2022
arXiv
pre-print
(Gradient) Expectation Maximization (EM) is a widely used algorithm for estimating the maximum likelihood of mixture models or incomplete-data problems. ...
To address this issue, we propose in this paper the first DP version of the (Gradient) EM algorithm with statistical guarantees. ...
arXiv:2010.13520v3
fatcat:ngg3yg4u5jcvnbm3tq6koeexfi
Individual Privacy Accounting for Differentially Private Stochastic Gradient Descent
[article]
2023
arXiv
pre-print
Differentially private stochastic gradient descent (DP-SGD) is the workhorse algorithm for recent advances in private deep learning. ...
We also design an efficient algorithm to investigate individual privacy across a number of datasets. We find that most examples enjoy stronger privacy guarantees than the worst-case bound. ...
The authors would also like to thank Yu-Xiang Wang for enlightening discussions about ex-post differential privacy. ...
arXiv:2206.02617v6
fatcat:ngmebhqlm5dbhhfxc6jhhxvpai
Differentially Private Stochastic Gradient Descent with Low-Noise
[article]
2023
arXiv
pre-print
In the pairwise learning setting, we propose a simple differentially private SGD algorithm based on gradient perturbation. ...
In this paper, we focus on the privacy and utility (measured by excess risk bounds) performances of differentially private stochastic gradient descent (SGD) algorithms in the setting of stochastic convex ...
Under a low-noise condition, we remove the term O(1...). ...
• We propose a simple differentially private SGD algorithm for pairwise learning with utility guarantees. ...
arXiv:2209.04188v2
fatcat:chcj46c4cjeuno6a6quycuq2km
Differentially Private Variational Autoencoders with Term-wise Gradient Aggregation
[article]
2020
arXiv
pre-print
Using differentially private SGD (DP-SGD), which randomizes a stochastic gradient by injecting a dedicated noise designed according to the gradient's sensitivity, we can easily build a differentially private ...
This paper studies how to learn variational autoencoders with a variety of divergences under differential privacy constraints. ...
... to cover the increased sensitivity fails to achieve the differential privacy guarantee that we expected. ...
arXiv:2006.11204v1
fatcat:ars4wmrbnrcw3iyjehgrqy7cjq
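Several of the results above build on the DP-SGD mechanism: clip each per-example gradient to bound its sensitivity, then add Gaussian noise calibrated to that bound. As a minimal sketch (the clipping norm and noise multiplier here are illustrative hyperparameters, not values from any of the listed papers):

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """One DP-SGD update direction: clip each per-example gradient to
    L2 norm `clip_norm`, sum, add Gaussian noise scaled to the
    sensitivity (noise_multiplier * clip_norm), and average."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g / max(1.0, norm / clip_norm))  # scale down only if norm > clip_norm
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)

grads = [np.array([3.0, 4.0]), np.array([0.3, 0.4])]  # norms 5.0 and 0.5
step = dp_sgd_step(grads)
```

The privacy accounting itself (e.g. the moments accountant mentioned in several entries) is a separate analysis layered on top of this noisy update.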
DPSUR: Accelerating Differentially Private Stochastic Gradient Descent Using Selective Update and Release
[article]
2023
arXiv
pre-print
To protect against these attacks, differential privacy (DP) has become the de facto standard for privacy-preserving machine learning, particularly those popular training algorithms using stochastic gradient ...
Motivated by this, this paper proposes DPSUR, a Differentially Private training framework based on Selective Updates and Release, where the gradient from each iteration is evaluated based on a validation ...
Deep Learning with Differential Privacy: Differentially Private Stochastic Gradient Descent (DPSGD) is a widely adopted training algorithm for deep neural networks with differential privacy guarantees. ...
arXiv:2311.14056v2
fatcat:2euzsanltzc7vepvcrmg7zdeku
Differentially Private Variational Dropout
[article]
2017
arXiv
pre-print
We demonstrate the accuracy of our privacy-preserving variational dropout algorithm on benchmark datasets. ...
Deep neural networks with their large number of parameters are highly flexible learning systems. ...
lower bound (3) becomes equivalent to maximization of the expected log-likelihood with fixed parameter α. ...
arXiv:1712.02629v3
fatcat:nrgk4uxiebd6dd5dh4s4q2xnei
Differentially private Riemannian optimization
[article]
2022
arXiv
pre-print
We further show privacy guarantees of the proposed differentially private Riemannian (stochastic) gradient descent using an extension of the moments accountant technique. ...
We introduce a framework of differentially private Riemannian optimization by adding noise to the Riemannian gradient on the tangent space. ...
Algorithm 1 is (ε, δ)-differentially private. Proof. ...
arXiv:2205.09494v1
fatcat:izioq6ycw5dctlyl77xypa2sd4
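The Riemannian variant described in the entry above adds noise to the Riemannian gradient on the tangent space. A toy sketch on the unit sphere (my own illustration, not the paper's algorithm; step size and noise scale are hypothetical):

```python
import numpy as np

def dp_riemannian_sgd_step(x, euclidean_grad, step_size=0.1, sigma=0.5, rng=None):
    """One noisy Riemannian gradient step on the unit sphere S^{n-1}:
    project both the gradient and the Gaussian noise onto the tangent
    space at x, then retract back to the sphere by normalizing."""
    rng = np.random.default_rng(0) if rng is None else rng
    proj = np.eye(len(x)) - np.outer(x, x)               # tangent-space projector at x
    rgrad = proj @ euclidean_grad                        # Riemannian gradient
    noise = proj @ rng.normal(0.0, sigma, size=x.shape)  # noise lives in the tangent space
    y = x - step_size * (rgrad + noise)
    return y / np.linalg.norm(y)                         # retraction onto the sphere

x = np.array([1.0, 0.0, 0.0])
x_next = dp_riemannian_sgd_step(x, euclidean_grad=np.array([0.0, 1.0, 0.0]))
```

Projecting the noise keeps the perturbation intrinsic to the manifold, which is what lets the moments-accountant-style analysis carry over.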
Differentially Private Online Submodular Maximization
[article]
2020
arXiv
pre-print
In this work we consider the problem of online submodular maximization under a cardinality constraint with differential privacy (DP). ...
In the full-information setting, we develop an (ε, δ)-DP algorithm with an expected (1 − 1/e)-regret bound of O(k² log|U| · √(T log k/δ) / ε). ...
We provide a differentially private online learning algorithm for DR-submodular maximization that achieves low expected regret. ...
arXiv:2010.12816v1
fatcat:jbciay4wxregjf6csjkpg7ouiu
Differentially-private Federated Neural Architecture Search
[article]
2020
arXiv
pre-print
To further preserve privacy, we study differentially-private FNAS (DP-FNAS), which adds random noise to the gradients of architecture variables. ...
We provide theoretical guarantees of DP-FNAS in achieving differential privacy. ...
Algorithm 1 shows the execution workflow of one iteration of the differentially-private federated NAS (DP-FNAS) algorithm. Per-sample gradient clipping is used with hyperparameters R_G and R_H. ...
arXiv:2006.10559v2
fatcat:7xpjs4bc6rf5nj6ljle4mnzdpm
Differentially Private Fair Learning
[article]
2019
arXiv
pre-print
Our second algorithm is a differentially private version of the oracle-efficient in-processing approach of [Agarwal et al., 2018] that can be used to find the optimal fair classifier, given access to a ...
Our first algorithm is a private implementation of the equalized odds post-processing approach of [Hardt et al., 2016]. ...
(They need not be differentially private with respect to the unprotected attributes X, although they sometimes are.) • Fairness: our learning algorithms guarantee approximate notions of statistical fairness ...
arXiv:1812.02696v3
fatcat:36rhfmbrlnbwdaqerkekw7mn2q
Evaluating Differentially Private Machine Learning in Practice
[article]
2019
arXiv
pre-print
Current mechanisms for differentially private machine learning rarely offer acceptable utility-privacy trade-offs with guarantees for complex learning tasks: settings that provide limited accuracy loss ...
Differential privacy is a strong notion for privacy that can be used to prove formal guarantees, in terms of a privacy budget, ϵ, about how much information is leaked by a mechanism. ...
Motivated by relaxations of differential privacy, Abadi et al. [1] propose the moments accountant (MA) mechanism for bounding the expected privacy loss of differentially-private algorithms. ...
arXiv:1902.08874v4
fatcat:7ic6gclgfnhrrhs2wqn3mqfcwi
Differentially Private Fair Binary Classifications
[article]
2024
arXiv
pre-print
We first propose an algorithm based on the decoupling technique for learning a classifier with only fairness guarantee. ...
We then refine this algorithm to incorporate differential privacy. The performance of the final algorithm is rigorously examined in terms of privacy, fairness, and utility guarantees. ...
These classifiers can be learned by applying an existing differentially private learning method -most notably, differentially private stochastic gradient descent (DP-SGD) [32] -to each demographic subgroup ...
arXiv:2402.15603v1
fatcat:6kyefjnl5zbdfkzsoolleqv4ve
Differentially private data cubes
2011
Proceedings of the 2011 international conference on Management of data - SIGMOD '11
Given a fixed privacy guarantee, we show that it is NP-hard to choose the initial set of cuboids so that the maximal noise over all published cuboids is minimized, or so that the number of cuboids with ...
In this paper, we address this problem using differential privacy (DP), which provides provable privacy guarantees for individuals by adding noise to query answers. ...
This problem has a very efficient solution with good statistical guarantees. Program (8) can be viewed as a least L2-norm problem. ...
doi:10.1145/1989323.1989347
dblp:conf/sigmod/DingWHL11
fatcat:2y3szrycpfecrkfnicaxodmgna
Differentially private distributed logistic regression using private and public data
2014
BMC Medical Genomics
Conclusion: Logistic regression models built with our new algorithm, based on both private and public datasets, demonstrate better utility than models trained on private or public datasets alone, without sacrificing the rigorous privacy guarantee. ...
with β_new in Algorithm 1. ...
doi:10.1186/1755-8794-7-s1-s14
pmid:25079786
pmcid:PMC4101668
fatcat:kleiilbronfvjodrtbxpo3l2pm
Fast Differentially Private Matrix Factorization
[article]
2015
arXiv
pre-print
We present a simple algorithm that is provably differentially private, while offering good performance, using a novel connection of differential privacy to Bayesian posterior sampling via Stochastic Gradient ...
Differentially private collaborative filtering is a challenging task, both in terms of accuracy and speed. ...
The following theorem guarantees that our procedure is indeed differentially private. ...
arXiv:1505.01419v2
fatcat:q5e3d7gyirchffzh4qjojhb5nm
Showing results 1 — 15 out of 6,567 results