On Weight-Noise-Injection Training
Authors
Abstract
While injecting weight noise during training has been proposed for more than a decade as a way to improve the convergence, generalization, and fault tolerance of a neural network, little theoretical work has been done on its convergence proof or on the objective function it minimizes. By applying the Gladyshev theorem, it is shown that injecting weight noise while training a radial basis function (RBF) network converges almost surely. Moreover, the corresponding objective function is essentially the mean square error (MSE). This objective function indicates that injecting weight noise during the training of an RBF network is not able to improve fault tolerance. Although this technique has been applied effectively to the multilayer perceptron (MLP), further analysis of the expected update equation for training an MLP with weight noise injection is presented. The performance difference between these two models under weight noise injection is discussed.
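The training scheme under discussion can be illustrated with a minimal sketch (not the paper's implementation): on each online update, fresh additive Gaussian noise is injected into the weights, the per-sample gradient is evaluated at the noisy weights, and the update is applied to the clean weights. The network size, learning rate, and noise level below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X) + 0.05 * rng.standard_normal(X.shape)

# Single-hidden-layer MLP with tanh units (sizes are arbitrary choices)
W1 = rng.standard_normal((1, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)

lr = 0.05      # learning rate (illustrative value)
sigma = 0.01   # std. dev. of injected weight noise (illustrative value)

def forward(x, W1, b1, W2, b2):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

for epoch in range(200):
    for i in rng.permutation(len(X)):
        x, t = X[i:i+1], y[i:i+1]
        # Inject additive Gaussian noise into the weights for this update only;
        # the gradient is evaluated at the noisy weights.
        n1 = sigma * rng.standard_normal(W1.shape)
        n2 = sigma * rng.standard_normal(W2.shape)
        h, out = forward(x, W1 + n1, b1, W2 + n2, b2)
        err = out - t                          # d(MSE/2)/d(out)
        gW2 = h.T @ err
        gb2 = err.sum(axis=0)
        dh = (err @ (W2 + n2).T) * (1 - h**2)  # backprop through tanh
        gW1 = x.T @ dh
        gb1 = dh.sum(axis=0)
        # The update is applied to the clean (noise-free) weights.
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X, W1, b1, W2, b2)
mse = float(np.mean((pred - y) ** 2))
```

Consistent with the abstract's conclusion, the expected objective being driven down here is essentially the MSE itself; the noise does not add a fault-tolerance-promoting regularizer for this kind of update.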
Similar References
Note on Weight Noise Injection During Training a MLP
Although many analytical works have investigated the change in prediction error of a trained NN when noise is injected into its weights, few of them have truly investigated the dynamical properties (such as objective functions and convergence behavior) of injecting weight noise during training. In this paper, four different online weight noise injection training algorithms for mul...
Empirical studies on weight noise injection based online learning algorithms
While weight noise injection during training has been adopted for attaining fault tolerant neural networks (NNs), theoretical and empirical studies of the online algorithms developed from these strategies are not yet complete. In this paper, we present results on two important aspects of online learning algorithms based on combining weight noise injection and weight decay. Through intensi...
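The combination this abstract studies can be written as a single update rule. The sketch below is an assumed form, not the authors' algorithm: the per-sample gradient is evaluated at additively noised weights and a standard weight-decay term is added; all coefficient values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def noise_plus_decay_step(w, grad_fn, lr=0.05, sigma=0.01, lam=1e-3):
    """One online update combining weight noise injection and weight decay.

    grad_fn: gradient of the per-sample loss, evaluated at the noisy weights.
    lam:     weight-decay coefficient (illustrative value).
    """
    noisy_w = w + sigma * rng.standard_normal(w.shape)
    return w - lr * (grad_fn(noisy_w) + lam * w)

# Usage on a toy quadratic loss 0.5 * ||w||^2 with minimum at the origin
w = np.ones(4)
for _ in range(500):
    w = noise_plus_decay_step(w, grad_fn=lambda v: v)
```

The decay term pulls the weights toward zero regardless of the injected noise, which is the interaction these online-learning studies examine.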
SNIWD: Simultaneous Weight Noise Injection with Weight Decay for MLP Training
Although noise injection during training has been demonstrated to successfully enhance the fault tolerance of neural networks, theoretical analysis of the dynamics of these noise injection-based online learning algorithms is far from complete. In particular, the convergence proofs for those algorithms have not been shown. In this regard, this paper presents an empirical study on the non-converge...
Enhanced MLP performance and fault tolerance resulting from synaptic weight noise during training
We analyze the effects of analog noise on the synaptic arithmetic during multilayer perceptron training by expanding the cost function to include noise-mediated terms. Predictions are made in the light of these calculations suggesting that fault tolerance, training quality, and training trajectory should be improved by such noise injection. Extensive simulation experiments on two distinct cla...
Convergence analysis of on-line weight noise injection training algorithms for MLP networks
Injecting weight noise during training has been proposed for almost two decades as a simple technique to improve the fault tolerance and generalization of a multilayer perceptron (MLP). However, little has been done regarding the convergence behavior of these algorithms. Therefore, we present in this paper the convergence proofs of two of these algorithms for MLPs. One is based on combining injecting multiplicativ...
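One of the noise models named here is multiplicative weight noise, where each weight is perturbed in proportion to its own magnitude. A minimal sketch of that variant (assumed form, not the paper's algorithm; coefficients are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def multiplicative_noise_step(w, grad_fn, lr=0.05, sigma=0.01):
    """One online update with multiplicative weight noise injection.

    Each weight is scaled by (1 + Gaussian noise) before the per-sample
    gradient is evaluated; the update is applied to the clean weights.
    """
    noisy_w = w * (1.0 + sigma * rng.standard_normal(w.shape))
    return w - lr * grad_fn(noisy_w)

# Usage on a toy quadratic loss 0.5 * ||w - 1||^2 with minimum at w = 1
w = np.zeros(3)
for _ in range(500):
    w = multiplicative_noise_step(w, grad_fn=lambda v: v - 1.0)
```

Unlike additive noise, the perturbation here vanishes for zero-valued weights, which is why the multiplicative and additive cases require separate convergence arguments.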