929 Hits in 5.8 sec

Accuracy, Interpretability, and Differential Privacy via Explainable Boosting [article]

Harsha Nori, Rich Caruana, Zhiqi Bu, Judy Hanwen Shen, Janardhan Kulkarni
2021 arXiv   pre-print
Our experiments on multiple classification and regression datasets show that DP-EBM models suffer surprisingly little accuracy loss even with strong differential privacy guarantees.  ...  We show that adding differential privacy to Explainable Boosting Machines (EBMs), a recent method for training interpretable ML models, yields state-of-the-art accuracy while protecting privacy.  ...  We would like to thank Paul Koch, Scott Lundberg, Samuel Jenkins, and Joshua Allen for their thoughtful discussions and copyediting.  ... 
arXiv:2106.09680v1 fatcat:hbc44bztandm3m3fydn4mxmwiq

Interpreting Intelligibility under Uncertain Data Imputation

Brian Y. Lim, Danding Wang, Tze Ping Loh, Kee Yuan Ngiam
2018 International Conference on Intelligent User Interfaces  
This work aims to improve the understanding and trust of intelligible healthcare analytics in clinical end users to help drive the adoption of AI.  ...  feature attribution with uncertainty due to missing data imputation.  ...  FUTURE USER EXPERIMENTS: DISEASE RISK PREDICTION USE CASE. We will investigate the impact of missing data on user trust in the explanations with an application use case in predictive healthcare analytics  ... 
dblp:conf/iui/LimWLN18 fatcat:4hmir4ccrfh6xfbr45uttd7n4q

Differential Privacy Techniques for Cyber Physical Systems: A Survey [article]

Muneeb Ul Hassan, Mubashir Husain Rehmani, Jinjun Chen
2019 arXiv   pre-print
In particular, we survey the application and implementation of differential privacy in four major applications of CPSs, namely energy systems, transportation systems, healthcare and medical systems, and  ...  Passive attacks are used by intruders to gain access to private information of CPSs.  ...  Similar to the Laplace mechanism, noise in the Gaussian mechanism is calculated using the normal (Gaussian) distribution [56], [86].  ... 
arXiv:1812.02282v3 fatcat:bnapnprldnaetjnjedz473lrme
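The snippet above notes that, like the Laplace mechanism, the Gaussian mechanism draws its noise from a normal distribution. As a minimal illustrative sketch (not code from the surveyed paper; the function names and the classical sigma calibration are assumptions), the two mechanisms can be written as:

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Laplace mechanism: noise with scale sensitivity/epsilon gives epsilon-DP."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(0.0, sensitivity / epsilon)

def gaussian_mechanism(value, sensitivity, epsilon, delta, rng=None):
    """Gaussian mechanism: classical calibration
    sigma = sqrt(2 ln(1.25/delta)) * sensitivity / epsilon for (epsilon, delta)-DP, epsilon < 1."""
    rng = rng or np.random.default_rng()
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * sensitivity / epsilon
    return value + rng.normal(0.0, sigma)

# Example: privatize a count query whose sensitivity is 1.
print(laplace_mechanism(42, sensitivity=1.0, epsilon=0.5))
print(gaussian_mechanism(42, sensitivity=1.0, epsilon=0.5, delta=1e-5))
```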

DIFFERENTIAL PRIVACY FOR IOT-ENABLED CRITICAL INFRASTRUCTURE: A COMPREHENSIVE SURVEY

Muhammad Akbar Husnoo, Adnan Anwar, Ripon K. Chakrabortty, Robin Doss, Mike J. Ryan
2021 IEEE Access  
(ITSs), healthcare and medical systems, and Industrial Internet of Things (IIoT).  ...  However, for various reasons, those proposed solutions are not well suited for modern IoT-enabled critical infrastructure.  ...  The randomized Exponential Algorithm, E, can be denoted as: E(B, s) = l : P[l ∈ L] ∝ exp(ε s(B, l) / (2∆s)) (4) 3) Gaussian Mechanism: The Gaussian mechanism is another well-known method used for implementation  ... 
doi:10.1109/access.2021.3124309 fatcat:vejtyjyrwffeffi7ob2o2svyja
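Equation (4) in the snippet above is the exponential mechanism: an output l is chosen with probability proportional to exp(ε s(B, l) / (2∆s)), where s is the score function and ∆s its sensitivity. A minimal sketch, assuming a finite candidate set and illustrative function names (not the survey's own code):

```python
import numpy as np

def exponential_mechanism(data, candidates, score, sensitivity, epsilon, rng=None):
    """Sample a candidate l with probability proportional to
    exp(epsilon * score(data, l) / (2 * sensitivity)), cf. Eq. (4) above."""
    rng = rng or np.random.default_rng()
    scores = np.array([score(data, l) for l in candidates], dtype=float)
    # Subtract the max score for numerical stability; it cancels after normalization.
    weights = np.exp(epsilon * (scores - scores.max()) / (2.0 * sensitivity))
    probs = weights / weights.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

# Example: privately pick the most frequent item; the counting score has sensitivity 1.
data = ["a", "b", "a", "c", "a", "b"]
print(exponential_mechanism(data, ["a", "b", "c"],
                            score=lambda d, l: d.count(l),
                            sensitivity=1.0, epsilon=1.0))
```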

Privacy-Preserving Convex Optimization: When Differential Privacy Meets Stochastic Programming [article]

Vladimir Dvorkin, Ferdinando Fioretto, Pascal Van Hentenryck, Pierre Pinson, Jalal Kazempour
2022 arXiv   pre-print
The chance-constrained optimization additionally internalizes the conditional value-at-risk measure to model the tolerance towards the worst-case realizations of the optimality loss with respect to the  ...  non-private solution.  ...  Acknowledgments: This work is supported by the Marie Skłodowska-Curie Actions COFUND Postdoctoral Program with Iberdrola Group, Grant Agreement #101034297 - project Learning ORDER.  ... 
arXiv:2209.14152v1 fatcat:33am2be7jzg37i3m42nj5rcrva

Distributed Differentially Private Computation of Functions with Correlated Noise [article]

Hafiz Imtiaz, Jafar Mohammadi, Anand D. Sarwate
2021 arXiv   pre-print
Empirical results on regression and neural network problems for both synthetic and real datasets show that differentially private methods can be competitive with non-private algorithms in many scenarios  ...  CAPE can be used in conjunction with the functional mechanism for statistical and machine learning optimization problems.  ...  This class includes optimization algorithms, such as empirical risk minimization (ERM) problems, common in machine learning (ML) applications. Related Works.  ... 
arXiv:1904.10059v3 fatcat:rvtgmnq44jgl7mcbr7egepo7dy

Differentially Private Pre-Trained Model Fusion using Decentralized Federated Graph Matching [article]

Qian Chen, Yiqiang Chen, Xinlong Jiang, Teng Zhang, Weiwei Dai, Wuliang Huang, Zhen Yan, Bo Ye
2023 arXiv   pre-print
Through extensive experiments conducted on diverse image datasets and real-world healthcare applications, we provide empirical evidence showcasing the effectiveness of PrivFusion in maintaining model performance  ...  To enhance model privacy, our approach incorporates a hybrid local differentially private mechanism and decentralized federated graph matching, effectively protecting both activation values and weights  ...  The comparative methods include the Non-Private mechanism, Laplace mechanism, Gaussian mechanism, MultiBit mechanism, and our proposed hybrid mechanism with Perturbation-Filter Adapter (PFA).  ... 
arXiv:2311.03396v1 fatcat:fgkbvzanbfgsdikkxpuv2uik3m

Privacy-Preserving In-Context Learning with Differentially Private Few-Shot Generation [article]

Xinyu Tang, Richard Shin, Huseyin A. Inan, Andre Manoel, Fatemehsadat Mireshghallah, Zinan Lin, Sivakanth Gopi, Janardhan Kulkarni, Robert Sim
2024 arXiv   pre-print
These results open up new possibilities for ICL with privacy protection for a broad range of applications.  ...  This scenario poses privacy risks, as LLMs may leak or regurgitate the private examples demonstrated in the prompt.  ...  Gaussian mechanism vs Exponential mechanism. We finally experiment with the Gaussian and Exponential mechanisms for the DP mechanism in Alg. 1 and present the results in Tab. 4.  ... 
arXiv:2309.11765v2 fatcat:al2ctocuujhkhhdrmes5z77vby

Private Graph Data Release: A Survey [article]

Yang Li, Michael Purcell, Thierry Rakotoarivelo, David Smith, Thilina Ranbaduge, Kee Siong Ng
2022 arXiv   pre-print
This paper provides a comprehensive survey of private graph data release algorithms that seek to achieve the fine balance between privacy and utility, with a specific focus on provably private mechanisms  ...  However, the increasingly widespread adoption of graph analytics comes with a commensurate increase in the need to protect private information in graph data, especially in light of the many privacy breaches  ...  Another work [137] discusses the challenges in developing practical privacy-preserving analytics in IoT-based healthcare information systems.  ... 
arXiv:2107.04245v2 fatcat:54bvnswpnbfffiqd5ee5opfope

Group privacy for personalized federated learning [article]

Filippo Galli, Sayan Biswas, Kangsoo Jung, Tommaso Cucinotta, Catuscia Palamidessi
2022 arXiv   pre-print
To cope with the issue of protecting the privacy of the clients and allowing for personalized model training to enhance the fairness and utility of the system, we propose a method to provide group privacy  ...  On the other hand, with the recent advancements in various techniques to analyze data, there is a surge of concern for the privacy violation of the participating clients.  ...  Related works: With the generalized Federated Averaging algorithm [36, 50] to solve the empirical risk minimization problem in Equation (5), an aggregated global model is optimized iteratively by a  ... 
arXiv:2206.03396v2 fatcat:ugdj54sgs5gy3nsplxlb2hmt4a
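The snippet above refers to the generalized Federated Averaging algorithm, in which an aggregated global model is optimized iteratively from client updates. A minimal sketch of one plain FedAvg aggregation step (a simplification, not the paper's personalized or group-private variant; names are illustrative):

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """One FedAvg step: average client parameters layer by layer,
    weighted by each client's number of local samples."""
    total = float(sum(client_sizes))
    return [
        sum((n / total) * params[k] for params, n in zip(client_params, client_sizes))
        for k in range(len(client_params[0]))
    ]

# Example: three clients, each holding a two-layer parameter list.
clients = [[np.ones(3) * c, np.ones(2) * c] for c in (1.0, 2.0, 3.0)]
global_params = fedavg_aggregate(clients, client_sizes=[10, 20, 30])
print(global_params)  # layer-wise weighted averages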

Recent trends towards privacy‐preservation in Internet of Things, its challenges and future directions

Mahdi Safaei Yaraziz, Ahmad Jalili, Mehdi Gheisari, Yang Liu
2022 Zenodo  
The Gaussian mechanism is the building block of a proprietary algorithm for minimising empirical risk based on stochastic gradient descent.  ...  This method applies a Gaussian mechanism to distort each client's local updates.  ...  Since the proposed method is in its early stages, the authors' main future direction for this work is to implement the system in a testable system to provide some guarantees of security and performance  ... 
doi:10.5281/zenodo.7503222 fatcat:schwtuw535h2zjfkqhhzdlbjxi
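The snippet above describes applying a Gaussian mechanism to distort each client's local updates within a stochastic-gradient-descent-based empirical risk minimizer. A minimal sketch of the standard recipe (clip the update, then add Gaussian noise; this is the generic DP-SGD-style step, not the surveyed system's exact method):

```python
import numpy as np

def privatize_update(update, clip_norm, noise_multiplier, rng=None):
    """Clip an update to L2 norm `clip_norm`, then add Gaussian noise with
    standard deviation noise_multiplier * clip_norm (DP-SGD-style)."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)

# Example: a client distorts its local gradient before sending it to the server.
local_grad = np.array([0.8, -2.5, 1.1])
print(privatize_update(local_grad, clip_norm=1.0, noise_multiplier=1.1))
```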

A Comprehensive Survey of the Internet of Things (IoT) and Edge Computing in Healthcare

Fatima Alshehri, Ghulam Muhammad
2020 IEEE Access  
Many papers in the literature deal with smart health care or health care in general.  ...  [85] realized a security mechanism for a smart healthcare system using the IoMT.  ...  The presented system was validated on a private dataset which included 36 subjects. The analysis was performed using the Infomax and entropy bound minimization (EBM) algorithms.  ... 
doi:10.1109/access.2020.3047960 fatcat:adbkd6gfg5dtnigtumglmtsecu

Do I Get the Privacy I Need? Benchmarking Utility in Differential Privacy Libraries [article]

Gonzalo Munilla Garrido, Joseph Near, Aitsam Muhammad, Warren He, Roman Matzutt, Florian Matthes
2021 arXiv   pre-print
This paper studies five libraries that offer differentially private analytics: Google DP, SmartNoise, diffprivlib, diffpriv, and Chorus.  ...  in differential privacy tools for non-experts.  ...  The libraries' goal is to report the approximate results of the queries in W on the private datasets in D while incurring minimal error.  ... 
arXiv:2109.10789v1 fatcat:j2tak3jjj5du5dbfeurpkewnna
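For context on what "reporting approximate query results with minimal error" looks like with one of the libraries named above, here is a small sketch using diffprivlib; the exact keyword arguments are an assumption and should be checked against the installed version of the library.

```python
# Hedged sketch, assuming diffprivlib's tools API (epsilon and bounds keywords).
import numpy as np
from diffprivlib.tools import mean

ages = np.random.default_rng(0).integers(18, 90, size=1000)

# Differentially private mean of a bounded numeric column.
dp_mean = mean(ages, epsilon=0.5, bounds=(18, 90))
print("non-private:", ages.mean(), "private:", dp_mean)
```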

PAPERS FROM ACTUARIAL JOURNALS WORLDWIDE

2012 Annals of Actuarial Science  
We call this risk sharing mechanism the conditional mean risk sharing.  ...  For the valuation of reverse mortgages with tenure payments, this article proposes a specific analytic valuation framework with mortality risk, interest rate risk, and housing price risk that helps determine  ...  Risk transformation can come about through changes in the operation of a business, explicit risk transfer mechanisms, financial changes, etc.  ... 
doi:10.1017/s1748499512000309 fatcat:npuvnlcspjhlngqxuwmuejsegy

Model Explanations with Differential Privacy [article]

Neel Patel, Reza Shokri, Yair Zick
2020 arXiv   pre-print
We evaluate the implications of differentially private models and our privacy mechanisms on the quality of model explanations.  ...  We design an adaptive differentially private gradient descent algorithm that finds the minimal privacy budget required to produce accurate explanations.  ...  Introduction: Machine learning models are currently applied in a variety of high-stakes domains, e.g. providing predictive healthcare analytics, assessing insurance policies, and making credit decisions  ... 
arXiv:2006.09129v1 fatcat:zauokesjpreuxj5o45sfkti5em
Showing results 1 — 15 out of 929 results