Differentially Private Linear Models for Gossip Learning through Data Perturbation
Authors
Abstract
Privacy is a key concern in many distributed systems that are rich in personal data, such as networks of smart meters or smartphones. Decentralizing the processing of personal data in such systems is a promising first step towards achieving privacy by avoiding the collection of data altogether. However, decentralization in itself is not enough: additional guarantees such as differential privacy are highly desirable. Here, we focus on stochastic gradient descent (SGD), a popular approach to implementing distributed learning. Our goal is to design differentially private variants of SGD to be applied in gossip learning, a decentralized learning framework. Known approaches that are suitable for our scenario focus on protecting the gradient computed in each iteration of SGD. This has the drawback that each data point can be accessed only a small number of times. We propose a solution in which we effectively publish the entire database in a differentially private way, so that linear learners can access any (perturbed) data point any number of times. This flexibility is very useful when combining the method with distributed learning environments. We show empirically that the performance of the obtained model is comparable to that of previous gradient-based approaches, and is even superior in certain scenarios.
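The abstract does not spell out the exact perturbation mechanism, so the following is only a minimal sketch of the general idea of input perturbation: each record is clipped and noised once with the Laplace mechanism, and the resulting private copy can then be reused by an ordinary linear SGD learner arbitrarily often. The helper names, clipping bound, and sensitivity accounting below are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def perturb_dataset(X, y, epsilon, bound=1.0, rng=None):
    """Release a differentially private copy of the data set by clipping
    every record and adding Laplace noise to it exactly once (hypothetical
    input-perturbation sketch, not the paper's exact mechanism)."""
    rng = np.random.default_rng() if rng is None else rng
    X_clip = np.clip(X, -bound, bound)
    y_clip = np.clip(y, -bound, bound)
    d = X.shape[1]
    # L1 distance between two possible (clipped) records bounds the sensitivity.
    sensitivity = 2.0 * bound * (d + 1)
    scale = sensitivity / epsilon
    X_priv = X_clip + rng.laplace(scale=scale, size=X_clip.shape)
    y_priv = y_clip + rng.laplace(scale=scale, size=y_clip.shape)
    return X_priv, y_priv

def sgd_linear_regression(X, y, lr=0.01, epochs=50, rng=None):
    """Plain SGD on a linear model; once the data are perturbed, every
    (noisy) record can be revisited any number of times at no extra
    privacy cost, unlike gradient-perturbation approaches."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            w -= lr * (X[i] @ w - y[i]) * X[i]
    return w

# Example usage on synthetic data (illustrative only).
X = np.random.default_rng(0).uniform(-1, 1, size=(1000, 5))
y = X @ np.array([0.5, -0.2, 0.1, 0.3, -0.4])
X_priv, y_priv = perturb_dataset(X, y, epsilon=1.0)
w = sgd_linear_regression(X_priv, y_priv)
```

In a gossip-learning setting, each node would perturb and publish its own record once, and the models circulating in the network could then revisit those perturbed records freely.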
Similar articles
Revisiting Differentially Private Regression: Lessons From Learning Theory and their Consequences
Private regression has received attention from both database and security communities. Recent work by Fredrikson et al. (USENIX Security 2014) analyzed the functional mechanism (Zhang et al. VLDB 2012) for training linear regression models over medical data. Unfortunately, they found that model accuracy is already unacceptable with differential privacy when ε = 5. We address this issue, present...
(Near) Dimension Independent Risk Bounds for Differentially Private Learning
In this paper, we study the problem of differentially private risk minimization where the goal is to provide differentially private algorithms that have small excess risk. In particular we address the following open problem: Is it possible to design computationally efficient differentially private risk minimizers with excess risk bounds that do not explicitly depend on dimensionality (p) and do...
Differentially Private Model Selection via Stability Arguments and the Robustness of the Lasso
We design differentially private algorithms for statistical model selection. Given a data set and a large, discrete collection of “models”, each of which is a family of probability distributions, the goal is to determine the model that best “fits” the data. This is a basic problem in many areas of statistics and machine learning. We consider settings in which there is a well-defined answer, in ...
Differentially Private Feature Selection via Stability Arguments, and the Robustness of the Lasso
We design differentially private algorithms for statistical model selection. Given a data set and a large, discrete collection of “models”, each of which is a family of probability distributions, the goal is to determine the model that best “fits” the data. This is a basic problem in many areas of statistics and machine learning. We consider settings in which there is a well-defined...
Privacy Preserving Machine Learning: Related Work
A practical scenario of PPML is where only one central party has the entire data on which the ML algorithm has to be learned. Agrawal and Ramakrishnan [1] proposed the first method to learn a Decision Tree classifier on a database without revealing any information about individual records. They consider public model private data setting where the algorithm and its parameters are public whereas ...
Journal: OJIOT
Volume 3, Issue -
Pages: -
Publication date: 2017