Learning Differentially Private Recurrent Language Models
Authors
Abstract
We demonstrate that it is possible to train large recurrent language models with user-level differential privacy guarantees at only a negligible cost in predictive accuracy. Our work builds on recent advances in the training of deep networks on user-partitioned data and privacy accounting for stochastic gradient descent. In particular, we add user-level privacy protection to the federated averaging algorithm, which makes “large step” updates from user-level data. Our work demonstrates that given a dataset with a sufficiently large number of users (a requirement easily met by even small internet-scale datasets), achieving differential privacy comes at the cost of increased computation, rather than decreased utility as in most prior work. We find that our private LSTM language models are quantitatively and qualitatively similar to un-noised models when trained on a large dataset.
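The core mechanism behind user-level privacy in federated averaging is to bound each user's influence on the model update and then add calibrated Gaussian noise. The following is a minimal sketch of one such round; `dp_fedavg_round` is a hypothetical helper written for illustration, not the paper's implementation, which additionally samples users per round, weights their contributions, and tracks the cumulative privacy loss with a moments accountant.

```python
import numpy as np

def dp_fedavg_round(user_deltas, clip_norm, noise_multiplier, rng=None):
    """One round of federated averaging with user-level DP:
    clip each user's model delta to an L2 bound, average the
    clipped deltas, then add Gaussian noise to the average."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for delta in user_deltas:
        norm = np.linalg.norm(delta)
        # Scale down any update whose L2 norm exceeds clip_norm.
        scale = min(1.0, clip_norm / (norm + 1e-12))
        clipped.append(delta * scale)
    avg = np.mean(clipped, axis=0)
    # After clipping, one user can move the average by at most
    # clip_norm / n in L2, so noise is calibrated to that sensitivity.
    sigma = noise_multiplier * clip_norm / len(user_deltas)
    return avg + rng.normal(0.0, sigma, size=avg.shape)
```

With `noise_multiplier = 0` this reduces to plain federated averaging with clipping; the abstract's observation that privacy costs computation rather than utility corresponds to needing enough users per round that `sigma` stays small relative to the signal in the averaged update.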
Similar Resources
Learning Differentially Private Language Models Without Losing Accuracy
We demonstrate that it is possible to train large recurrent language models with user-level differential privacy guarantees without sacrificing predictive accuracy. Our work builds on recent advances in the training of deep networks on userpartitioned data and privacy accounting for stochastic gradient descent. In particular, we add user-level privacy protection to the federated averaging algor...
Public Schools and Private Language Institutes: Any Differences in Students’ L2 Motivational Self System?
To enrich our understanding of the attitudinal/motivational basis of foreign language learning at junior high school level, this study investigated the students’ status of L2 motivation, the relationship between motivational factors, and the possibility of predicting their motivated learning behavior in light of Dörnyei’s (2005, 2009) theory of L2 Motivational Self System. To this end, 1462 jun...
The Impact of Language Learning Activities on the Spoken Language Development of 5-6-Year-Old Children in Private Preschool Centers of Langroud
N. Bagheri, E. Abbasi, M. GeramiPour
The present study was conducted to investigate the impact of language learning activities on development of spoken language in 5-6-year-old children at private preschool center...
Differentially Private Distributed Learning for Language Modeling Tasks
One of the big challenges in machine learning applications is that training data can be different from the real-world data faced by the algorithm. In language modeling, users’ language (e.g. in private messaging) could change in a year and be completely different from what we observe in publicly available data. At the same time, public data can be used for obtaining general knowledge (i.e. gene...