A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Differentially Private ADMM for Distributed Medical Machine Learning
[article]
2020
arXiv
pre-print
In this paper, we propose a differentially private ADMM algorithm (P-ADMM) to provide dynamic zero-concentrated differential privacy (dynamic zCDP), by inserting Gaussian noise with linearly decaying variance ...
Moreover, through our experiments performed on real-world datasets, we empirically show that P-ADMM has the best-known performance among the existing differentially private ADMM based algorithms. ...
In this paper, we propose a differentially private ADMM algorithm (P-ADMM) to remedy privacy concerns of distributed machine learning. ...
arXiv:1901.02094v3
fatcat:vo3wlarhdrdbjcmozs5ytvbgt4
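The entry above describes inserting Gaussian noise whose variance decays linearly across iterations. As a minimal sketch only (the function names, the floor at zero, and the plain gradient-style step are illustrative assumptions, not the paper's actual P-ADMM update):

```python
import numpy as np

def decaying_variance(var0, decay, t):
    # Linearly decaying noise variance, floored at zero (the floor is an assumption).
    return max(var0 - decay * t, 0.0)

def noisy_primal_step(x, step, var0, decay, t, rng):
    # Hypothetical noisy primal update: take a plain step, then add Gaussian
    # noise whose variance shrinks linearly with the iteration count t.
    sigma = np.sqrt(decaying_variance(var0, decay, t))
    return x - step + rng.normal(0.0, sigma, size=x.shape)
```

As the variance reaches zero, later iterates are perturbed less, which is the intuition behind trading early-iteration privacy noise for late-iteration accuracy.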
DP-ADMM: ADMM-based Distributed Learning with Differential Privacy
[article]
2019
arXiv
pre-print
Such an iterative process could cause privacy concerns of data owners. The goal of this paper is to provide differential privacy for ADMM-based distributed machine learning. ...
To our knowledge, this is the first paper to provide explicit convergence and utility properties for differentially private ADMM-based distributed learning algorithms. ...
Zhang and Zhu [20] propose two perturbation methods: primal perturbation and dual perturbation to guarantee dynamic differential privacy in ADMM-based distributed learning. Zhang et al. ...
arXiv:1808.10101v6
fatcat:glcbppnc55el7d6ila2ixgmq5i
Differentially Private Collaborative Intrusion Detection Systems For VANETs
[article]
2020
arXiv
pre-print
In this paper, we propose a privacy-preserving machine-learning based collaborative IDS (PML-CIDS) for VANETs. ...
We use the differential privacy to capture the privacy notation of the PML-CIDS and propose a method of dual variable perturbation to provide dynamic differential privacy. ...
We also describe the privacy concerns associated with the ADMM-based collaborative learning and define the dynamic differential privacy. ...
arXiv:2005.00703v1
fatcat:tnd6yzhj5vhjxorudvcum3o7gq
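For context on the "dual variable perturbation" mentioned above, a hedged sketch (the noise placement, the step size `eta`, and the consensus-residual form are assumptions for illustration, not the paper's exact mechanism):

```python
import numpy as np

def perturbed_dual_update(lam, x_local, x_global, eta, sigma, rng):
    # Standard ADMM dual ascent on the consensus residual, with Gaussian
    # noise added to the dual variable before it would be shared.
    residual = x_local - x_global
    return lam + eta * residual + rng.normal(0.0, sigma, size=lam.shape)
```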
Differentially Private ADMM for Convex Distributed Learning: Improved Accuracy via Multi-Step Approximation
[article]
2020
arXiv
pre-print
In this paper, we aim to propose a new differentially private distributed ADMM algorithm with improved accuracy for a wide range of convex learning problems. ...
Alternating Direction Method of Multipliers (ADMM) is a popular algorithm for distributed learning, where a network of nodes collaboratively solve a regularized empirical risk minimization by iterative ...
Differentially Private ADMM-based Distributed Learning: Recently, there have been several works on differentially private ADMM-based distributed learning algorithms. ...
arXiv:2005.07890v1
fatcat:dlylwulfljbmnglbfujynwhkoq
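The snippets above reference ADMM for distributed regularized ERM in consensus form. A minimal sketch of the vanilla (non-private) consensus ADMM these papers build on, assuming a ridge-regression loss and illustrative penalty `rho`:

```python
import numpy as np

def consensus_admm(As, bs, lam=0.1, rho=1.0, iters=50):
    # Each node i holds local data (A_i, b_i) and a local x_i, all coupled
    # to a shared consensus variable z with an L2 regularizer lam * ||z||^2.
    d = As[0].shape[1]
    n = len(As)
    xs = [np.zeros(d) for _ in range(n)]
    us = [np.zeros(d) for _ in range(n)]  # scaled dual variables
    z = np.zeros(d)
    for _ in range(iters):
        for i in range(n):
            # x-update: minimize ||A_i x - b_i||^2 + (rho/2) ||x - z + u_i||^2
            H = As[i].T @ As[i] + (rho / 2) * np.eye(d)
            g = As[i].T @ bs[i] + (rho / 2) * (z - us[i])
            xs[i] = np.linalg.solve(H, g)
        # z-update: regularized average of local iterates
        z = rho * sum(x + u for x, u in zip(xs, us)) / (2 * lam + rho * n)
        for i in range(n):
            us[i] += xs[i] - z  # dual ascent on the consensus residual
    return z
```

The private variants surveyed here perturb either the x-update, the dual variables, or the messages exchanged between nodes.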
Dynamic Privacy For Distributed Machine Learning Over Network
[article]
2016
arXiv
pre-print
dynamic differential privacy. ...
This paper focuses on a class of regularized empirical risk minimization (ERM) machine learning problems, and develops two methods to provide differential privacy to distributed learning algorithms over ...
In this work, we extend the notion of differential privacy to a dynamic setting, and define dynamic differential privacy to capture the distributed and iterative nature of the ADMM-based distributed ERM ...
arXiv:1601.03466v3
fatcat:2w2aesxmsrbolg2lch6kbg3jlq
Private Networked Federated Learning for Nonsmooth Objectives
[article]
2024
arXiv
pre-print
We provide complete theoretical proof for the privacy guarantees and the algorithm's convergence to the exact solution. ...
The proposed algorithm relies on the distributed Alternating Direction Method of Multipliers (ADMM) and uses the approximation of the augmented Lagrangian to handle nonsmooth objective functions. ...
To meet the demand for a better privacy-accuracy tradeoff, the work in [8] proposed dynamic zero-concentrated differential privacy (zCDP) as an alternative to the standard (ε, δ)-differential privacy ...
arXiv:2306.14012v2
fatcat:3v2dgxpjkjep3iy4vknlzvpcii
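For reference, the zCDP notion mentioned above comes with a simple accounting rule for the Gaussian mechanism (Bun and Steinke): releasing a query of L2 sensitivity Δ with added N(0, σ²) noise satisfies ρ-zCDP with ρ = Δ²/(2σ²), and ρ composes additively across iterations:

```python
def gaussian_zcdp_rho(sensitivity, sigma):
    # rho-zCDP parameter of the Gaussian mechanism: rho = Delta^2 / (2 sigma^2).
    return sensitivity ** 2 / (2.0 * sigma ** 2)

def composed_rho(sensitivity, sigma, iterations):
    # zCDP composes additively, so T identical releases cost T * rho.
    return iterations * gaussian_zcdp_rho(sensitivity, sigma)
```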
Privacy-preserving Distributed Machine Learning via Local Randomization and ADMM Perturbation
[article]
2019
arXiv
pre-print
With the proliferation of training data, distributed machine learning (DML) is becoming more competent for large-scale learning tasks. ...
In this paper, we propose a privacy-preserving ADMM-based DML framework with two novel features: First, we remove the assumption commonly made in the literature that the users trust the server collecting ...
CONCLUSION In this paper, we have provided a privacy-preserving ADMM-based distributed machine learning framework. ...
arXiv:1908.01059v2
fatcat:6n7ex5esh5cg3hthdpic6zoqje
Improving the Privacy and Accuracy of ADMM-Based Distributed Algorithms
[article]
2018
arXiv
pre-print
Alternating direction method of multiplier (ADMM) is a popular method used to design distributed versions of a machine learning algorithm, whereby local computations are performed on local data with the ...
A differentially private ADMM was proposed in prior work (Zhang & Zhu, 2017) where only the privacy loss of a single node during one iteration was bounded, a method that makes it difficult to balance the ...
In International Conference on Machine Learning, pp. 1701-1709, 2014. Zhang, T. and Zhu, Q. Dynamic differential privacy for ADMM-based distributed classification learning. ...
arXiv:1806.02246v1
fatcat:l22ubz5v75emdnmzknbzyqx4wm
Table of contents
2020
IEEE Transactions on Information Forensics and Security
Ren 987 DP-ADMM: ADMM-Based Distributed Learning With Differential Privacy ... Z. ...
Jain 880 Compressive Privacy for a Linear Dynamical System ... Y. Song, C. X. Wang, and W. P. ...
doi:10.1109/tifs.2019.2940362
fatcat:7ieq2gue2ndr5b3pa5suaba4ke
Differentially Private Federated Learning via Inexact ADMM
[article]
2021
arXiv
pre-print
Differential privacy (DP) techniques can be applied to the federated learning model to protect data privacy against inference attacks to communication among the learning agents. ...
We show that our algorithm provides ε̄-DP for every iteration, where ε̄ is a privacy parameter controlled by the user. ...
This material was based upon work supported by the U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research, under Contract DE-AC02-06CH11357. ...
arXiv:2106.06127v2
fatcat:wdzjjtoh5ng2tk6rr3ec3lckxq
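The per-iteration ε̄-DP guarantee above is not detailed in the snippet; as a generic textbook stand-in (not the paper's inexact-ADMM construction), a single shared update with L1 sensitivity Δ can be released with ε-DP via the Laplace mechanism:

```python
import numpy as np

def laplace_release(update, sensitivity, eps, rng):
    # Textbook Laplace mechanism: noise scale Delta / eps gives eps-DP
    # for one release of a vector with L1 sensitivity Delta.
    scale = sensitivity / eps
    return update + rng.laplace(0.0, scale, size=update.shape)
```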
Towards Federated Bayesian Network Structure Learning with Continuous Optimization
[article]
2022
arXiv
pre-print
We develop a distributed structure learning method based on continuous optimization, using the alternating direction method of multipliers (ADMM), such that only the model parameters have to be exchanged ...
to their data owing to privacy or security concerns. ...
The NIH or NSF is not responsible for the views reported in this article. ...
arXiv:2110.09356v2
fatcat:dticsusmf5h2pkjjaljs77ee74
Local Differential Privacy in Decentralized Optimization
[article]
2019
arXiv
pre-print
Privacy concerns with sensitive data are receiving increasing attention. In this paper, we study local differential privacy (LDP) in interactive decentralized optimization. ...
In an asymptotic view, we address the following question: Under LDP, is it possible to design a distributed private minimizer for arbitrary closed convex constraints with utility loss not explicitly dependent ...
To embed the notion in distributed learning where A is selected as the ADMM, the functions f i act as the input and are the privacy concern. ...
arXiv:1902.06101v2
fatcat:kz47bqisbfdb7fhon75hjxbcay
Distributed Learning Applications in Power Systems: A Review of Methods, Gaps, and Challenges
2021
Energies
Distributed learning is a collaboratively decentralized machine learning algorithm designed to handle large data sizes, solve complex learning problems, and increase privacy. ...
This paper introduces three existing distributed learning frameworks and reviews the applications that have been proposed for them in power systems so far. ...
(IPE) for data sharing [12], differential privacy (DP) technique for load data privacy [13], generative adversarial networks (GANs) for power generation data [14], decomposition algorithm-based ...
doi:10.3390/en14123654
fatcat:25fv4mw2dfan5c23lhi3l45tte
Communication-Efficient and Distributed Learning Over Wireless Networks: Principles and Applications
[article]
2020
arXiv
pre-print
To achieve this goal, it is essential to cater for high ML inference accuracy at scale under time-varying channel and network dynamics, by continuously exchanging fresh data and ML model updates in a distributed ...
Machine learning (ML) is a promising enabler for the fifth generation (5G) communication systems and beyond. ...
In this regard, leveraging the alternating direction method of multipliers (ADMM) method, group ADMM (GADMM) aims to enable distributed learning without any central entity while communicating only with ...
arXiv:2008.02608v1
fatcat:luuo5pja5zfihhpybger6tuqrq
2020 Index IEEE Transactions on Information Forensics and Security Vol. 15
2020
IEEE Transactions on Information Forensics and Security
., +, TIFS 2020 3107-3122
Convergence
DP-ADMM: ADMM-Based Distributed Learning With Differential Privacy. ...
Kwon, H., +, TIFS 2020 526-538
Distributed algorithms
Recycled ADMM: Improving the Privacy and Accuracy of Distributed Algorithms. ...
G
Gait analysis
Deep Learning-Based Gait Recognition Using Smartphones in the Wild. ...
doi:10.1109/tifs.2021.3053735
fatcat:eforexmnczeqzdj3sc2j4yoige
Showing results 1 — 15 out of 249 results