Search results for: distilling

Number of results: 796

Journal: Physical Review Letters, 2002
J. Eisert, S. Scheel, M. B. Plenio

We show that no distillation protocol for Gaussian quantum states exists that relies on (i) arbitrary local unitary operations that preserve the Gaussian character of the state and (ii) homodyne detection together with classical communication and postprocessing by means of local Gaussian unitary operations on two symmetric identically prepared copies. This is in contrast to the finite-dimension...

Journal: CoRR, 2017
Nicholas Frosst, Geoffrey E. Hinton

Deep neural networks have proved to be a very effective way to perform classification tasks. They excel when the input data is high dimensional, the relationship between the input and the output is complicated, and the number of labeled training examples is large [Szegedy et al., 2015, Wu et al., 2016, Jozefowicz et al., 2016, Graves et al., 2013]. But it is hard to explain why a learned networ...

Journal: Science, 2009
Michael Schmidt, Hod Lipson

For centuries, scientists have attempted to identify and document analytical laws that underlie physical phenomena in nature. Despite the prevalence of computing power, the process of finding natural laws and their corresponding equations has resisted automation. A key challenge to finding analytic relations automatically is defining algorithmically what makes a correlation in observed data imp...

2017
Jiangyan Yi, Jianhua Tao, Zhengqi Wen, Ya Li

This paper proposes an approach to distill knowledge from an ensemble of models to a single deep neural network (DNN) student model for punctuation prediction. This approach makes the DNN student model mimic the behavior of the ensemble. The ensemble consists of three single models. Kullback-Leibler (KL) divergence is used to minimize the difference between the output distribution of the DNN st...
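The objective this abstract describes, minimizing the KL divergence between the ensemble's output distribution and the student's, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the logits, the temperature value, and the equal-weight averaging of the three ensemble members are all assumptions for the example.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two categorical distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical logits for one token, one from each of three ensemble members.
ensemble_logits = [np.array([2.0, 0.5, 0.1]),
                   np.array([1.8, 0.7, 0.2]),
                   np.array([2.2, 0.4, 0.0])]

# Average the ensemble's logits to form the teacher distribution,
# then measure how far the student's distribution is from it.
teacher = softmax(np.mean(ensemble_logits, axis=0), temperature=2.0)
student = softmax(np.array([1.5, 0.9, 0.3]), temperature=2.0)

loss = kl_divergence(teacher, student)  # quantity minimized during training
```

Training the student then amounts to backpropagating this loss through the student's logits; the temperature softens both distributions so the student also learns the teacher's relative confidence across wrong classes.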

Journal: CoRR, 2018
Jiangyan Yi, Jianhua Tao, Zhengqi Wen, Bin Liu

In order to improve the performance for far-field speech recognition, this paper proposes to distill knowledge from the close-talking model to the far-field model using parallel data. The close-talking model is called the teacher model. The far-field model is called the student model. The student model is trained to imitate the output distributions of the teacher model. This constraint can be re...

Journal: CoRR, 2016
Bharat Bhusan Sau, Vineeth N. Balasubramanian

The remarkable successes of deep learning models across various applications have resulted in the design of deeper networks that can solve complex problems. However, the increasing depth of such models also results in a higher storage and runtime complexity, which restricts the deployability of such very deep models on mobile and portable devices, which have limited storage and battery capacity...

Journal: PLoS ONE, 2008
Qing-Peng Kong, Antonio Salas, Chang Sun, Noriyuki Fuku, Masashi Tanaka, Li Zhong, Cheng-Ye Wang, Yong-Gang Yao, Hans-Jürgen Bandelt

BACKGROUND Large-scale genome sequencing poses enormous problems to the logistics of laboratory work and data handling. When numerous fragments of different genomes are PCR amplified and sequenced in a laboratory, there is a high imminent risk of sample confusion. For genetic markers, such as mitochondrial DNA (mtDNA), which are free of natural recombination, single instances of sample mix-up i...

2008
E. A. Meirom, N. H. Lindner, Y. Berlatzky, E. Poem, N. Akopian, J. E. Avron

We develop a framework to calculate the density matrix of a pair of photons emitted in a decay cascade with partial "which path" ambiguity. We describe an appropriate entanglement distillation scheme which works also for certain random cascades. The qualitative features of the distilled entanglement are presented in a two-dimensional "phase diagram". The theory is applied to the quantum tom...

Chart of the number of search results per year

Click on the chart to filter results by publication year