Differentially private SGD with non-smooth losses
Authors
Abstract
In this paper, we are concerned with differentially private stochastic gradient descent (SGD) algorithms in the setting of stochastic convex optimization (SCO). Most existing work requires the loss to be Lipschitz continuous and strongly smooth, and the model parameter to be uniformly bounded. However, these assumptions are restrictive, as many popular losses violate these conditions, including the hinge loss for SVM, the absolute loss in robust regression, and even the least squares loss in an unbounded domain. We significantly relax these restrictive assumptions and establish privacy and generalization (utility) guarantees for private SGD algorithms using output perturbation associated with non-smooth convex losses. Specifically, the loss function is relaxed to have an α-Hölder continuous gradient (referred to as α-Hölder smoothness), which instantiates Lipschitz continuity (α=0) and strong smoothness (α=1). We prove that noisy SGD with α-Hölder smooth losses using output perturbation can guarantee (ε,δ)-differential privacy (DP) and attain the optimal excess population risk O(√(d·log(1/δ))/(nε) + 1/√n), up to logarithmic terms, with gradient complexity O(n^((2−α)/(1+α)) + n). This shows an important trade-off between the α-Hölder smoothness of the loss and the computational complexity for private SGD with statistically optimal performance. In particular, our results indicate that α ≥ 1/2 is sufficient to guarantee (ε,δ)-DP of noisy SGD algorithms while achieving the optimal excess risk with a linear gradient complexity O(n).
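The output-perturbation template the abstract refers to can be sketched as follows: run ordinary (non-private) SGD, then add Gaussian noise to the final iterate. This is a minimal illustrative sketch, not the paper's algorithm: the function names are hypothetical, and the noise scale `sigma` is left as a free parameter, whereas achieving (ε,δ)-DP requires calibrating it to the sensitivity of the SGD output (the paper's calibration for α-Hölder smooth losses is not reproduced here).

```python
import numpy as np

def output_perturbed_sgd(X, y, grad_fn, T, eta, sigma, rng=None):
    """Plain SGD followed by Gaussian output perturbation.

    Hypothetical sketch: `sigma` must be calibrated to the
    sensitivity of the final iterate to obtain (eps, delta)-DP;
    that calibration is omitted here.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(T):
        i = rng.integers(n)                 # sample one example
        w -= eta * grad_fn(w, X[i], y[i])   # one SGD step
    return w + rng.normal(0.0, sigma, d)    # Gaussian output perturbation

# Example with the (non-smooth) absolute loss |<w, x> - y|,
# one of the losses the abstract mentions:
def abs_loss_grad(w, x, y):
    return np.sign(x @ w - y) * x
```

Gradient perturbation (noising every step instead of only the output) is the other standard DP-SGD variant; output perturbation has the advantage that the inner loop is exactly non-private SGD.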
Similar resources
Differentially Private Local Electricity Markets
Privacy-preserving electricity markets have a key role in steering customers towards participation in local electricity markets by guaranteeing to protect their sensitive information. Moreover, these markets make it possible to statistically release and share the market outputs for social good. This paper aims to design a market for local energy communities by implementing Differential Privacy (DP)...
Differentially Private Data Releasing for Smooth Queries
In the past few years, differential privacy has become a standard concept in the area of privacy. One of the most important problems in this field is to answer queries while preserving differential privacy. In spite of extensive studies, most existing work on differentially private query answering assumes the data are discrete (i.e., in {0, 1}) and focuses on queries induced by Boolean function...
Differentially- and non-differentially-private random decision trees
We consider supervised learning with random decision trees, where the tree construction is completely random. The method was used as a heuristic working well in practice despite the simplicity of the setting, but with almost no theoretical guarantees. The goal of this paper is to shed new light on the entire paradigm. We provide strong theoretical guarantees regarding learning with random decis...
Smooth Sensitivity Based Approach for Differentially Private Principal Component Analysis
We consider the challenge of differentially private PCA. Currently known methods for this task either employ the computationally intensive exponential mechanism or require an access to the covariance matrix, and therefore fail to utilize potential sparsity of the data. The problem of designing simpler and more efficient methods for this task has been raised as an open problem in [19]. In this p...
Differentially Private Random Decision Forests using Smooth Sensitivity
We propose a new differentially-private decision forest algorithm that minimizes both the number of queries required, and the sensitivity of those queries. To do so, we build an ensemble of random decision trees that avoids querying the private data except to find the majority class label in the leaf nodes. Rather than using a count query to return the class counts like the current state-of-the-...
Journal
Journal title: Applied and Computational Harmonic Analysis
Year: 2022
ISSN: 1063-5203, 1096-603X
DOI: https://doi.org/10.1016/j.acha.2021.09.001