Similar resources
Dropout distillation
Dropout is a popular stochastic regularization technique for deep neural networks that works by randomly dropping (i.e. zeroing) units from the network during training. This randomization process makes it possible to implicitly train an ensemble of exponentially many networks sharing the same parametrization, which should be averaged at test time to deliver the final prediction. A typical workaround for t...
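The mechanism summarized above (random zeroing during training, with the implicit ensemble averaged at prediction time) can be sketched with the standard "inverted dropout" variant, where survivors are rescaled during training so that the test-time forward pass approximates the ensemble average with no extra work. This is a minimal illustrative sketch, not code from the paper; the helper names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_train(x, p=0.5):
    # Training: zero each unit independently with probability p, and
    # scale survivors by 1/(1-p) so the expected activation equals x.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def dropout_test(x):
    # Test time: use the full network unchanged; with inverted dropout
    # this is the standard weight-scaling approximation to averaging
    # the exponentially many thinned sub-networks.
    return x
```

Averaging `dropout_train(x)` over many random masks converges to `dropout_test(x)`, which is the ensemble-average view the abstract refers to.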
Policy Distillation
Policies for complex visual tasks have been successfully learned with deep reinforcement learning, using an approach called deep Q-networks (DQN), but relatively large (task-specific) networks and extensive training are needed to achieve good performance. In this work, we present a novel method called policy distillation that can be used to extract the policy of a reinforcement learning agent a...
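Extracting a DQN agent's policy into a (possibly smaller) student network is commonly formulated as matching the student's action distribution to a temperature-sharpened softmax of the teacher's Q-values under a KL divergence. The sketch below follows that general policy-distillation recipe; it is not necessarily this paper's exact loss, and the function names and temperature value are assumptions:

```python
import numpy as np

def softmax(logits, tau=1.0):
    # Numerically stable softmax with temperature tau.
    z = logits / tau
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_kl(teacher_q, student_logits, tau=0.01):
    # KL(teacher || student): a small tau sharpens the teacher's
    # Q-values toward its greedy action before matching.
    p = softmax(teacher_q, tau)
    q = softmax(student_logits)
    return np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()
```

Minimizing this loss over states drawn from the teacher's experience drives the student toward the teacher's action choices.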
Distillation Startup of Fully Thermally Coupled Distillation Columns: Theoretical Examinations
The fully thermally coupled distillation column offers an alternative to conventional distillation towers, with the possibility of savings in both energy and capital costs. This innovative and promising alternative makes it possible to separate a multicomponent mixture into high-purity fractions in just a single column. A lack of knowledge still exists when dealing with the startup o...
Energy efficient distillation
Distillation is responsible for a significant share of the energy consumption of the world's process industry, as well as of natural gas processing. There is a significant energy saving potential that can be realized by applying the new energy-saving distillation technology that has appeared in the last two decades. The fully thermally coupled dividing wall columns have the attractive feature of ...
Journal
Journal title: SCIENTIA SINICA Informationis
Year: 2021
ISSN: 1674-7267
DOI: 10.1360/ssi-2020-0165