Search results for: training iteration
Number of results: 358,779
Big Data problems in machine learning involve a large number of data points, a large number of features, or both, which makes training models difficult because of the high computational complexity of a single iteration of the learning algorithms. To solve such learning problems, Stochastic Approximation offers an optimization approach that makes the complexity of each iteration independent of the number of data poin...
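As an illustration of the stochastic-approximation idea described above, here is a minimal sketch in which the cost of one iteration depends only on the mini-batch size, not on the total number of data points. The least-squares objective, the function names, and the step size are illustrative assumptions, not taken from the paper.

import numpy as np

def sgd_step(w, X, y, lr=0.01, batch_size=1, rng=np.random.default_rng()):
    """One stochastic-gradient step for least squares: the cost is O(batch_size * d),
    independent of the total number of data points n."""
    n = X.shape[0]
    idx = rng.integers(0, n, size=batch_size)           # sample a tiny mini-batch
    Xb, yb = X[idx], y[idx]
    grad = 2.0 * Xb.T @ (Xb @ w - yb) / batch_size      # gradient of the mini-batch loss
    return w - lr * grad

# usage: a few iterations on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true + 0.1 * rng.normal(size=10_000)
w = np.zeros(5)
for _ in range(2000):
    w = sgd_step(w, X, y, lr=0.01, batch_size=8, rng=rng)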
Recently, semi-supervised learning algorithms such as co-training have been used in many application domains. In co-training, two classifiers based on different views of the data, or on different learning algorithms, are trained in parallel; unlabeled examples that the classifiers classify differently, but for which one classifier has high confidence, are then labeled and used as training data for th...
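A simplified sketch of a generic co-training loop is given below. The paper's exact disagreement-based selection rule is only partially visible in the snippet; the scikit-learn classifiers and the confidence threshold are assumptions for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

def co_training(X1, X2, y, labeled, unlabeled, rounds=5, conf=0.9):
    """Simplified co-training: two classifiers are trained on different feature views;
    each round, unlabeled examples that one view classifies with high confidence are
    pseudo-labeled and added to the shared training set."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    labels = y.astype(float).copy()                 # pseudo-labels filled in as we go
    clf1, clf2 = LogisticRegression(), LogisticRegression()
    for _ in range(rounds):
        clf1.fit(X1[labeled], labels[labeled])
        clf2.fit(X2[labeled], labels[labeled])
        if not unlabeled:
            break
        p1 = clf1.predict_proba(X1[unlabeled])
        p2 = clf2.predict_proba(X2[unlabeled])
        newly = []
        for j, u in enumerate(unlabeled):
            if p1[j].max() >= conf:                 # view 1 is confident
                labels[u] = clf1.classes_[p1[j].argmax()]; newly.append(u)
            elif p2[j].max() >= conf:               # view 2 is confident
                labels[u] = clf2.classes_[p2[j].argmax()]; newly.append(u)
        labeled += newly
        unlabeled = [u for u in unlabeled if u not in newly]
    return clf1, clf2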
Self-training is a semi-supervised learning algorithm in which a learner keeps on labeling unlabeled examples and retraining itself on an enlarged labeled training set. Since the self-training process may erroneously label some unlabeled examples, sometimes the learned hypothesis does not perform well. In this paper, a new algorithm named Setred is proposed, which utilizes a specific data editi...
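The basic self-training loop that the abstract builds on can be sketched as follows. Setred's data-editing filter is the paper's contribution and is not reproduced here; the classifier choice and confidence threshold are assumptions.

from sklearn.linear_model import LogisticRegression

def self_training(X, y, labeled, unlabeled, rounds=5, conf=0.95):
    """Plain self-training: the learner labels the unlabeled examples it is most
    confident about and retrains on the enlarged labeled set. Setred additionally
    applies a data-editing step to reject probably mislabeled examples before
    retraining; that filter is not shown here."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    labels = y.astype(float).copy()
    clf = LogisticRegression()
    for _ in range(rounds):
        clf.fit(X[labeled], labels[labeled])
        if not unlabeled:
            break
        proba = clf.predict_proba(X[unlabeled])
        picked = []
        for j, u in enumerate(unlabeled):
            if proba[j].max() >= conf:
                labels[u] = clf.classes_[proba[j].argmax()]; picked.append(u)
        labeled += picked
        unlabeled = [u for u in unlabeled if u not in picked]
    return clf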
Injecting carefully chosen noise can speed convergence in the backpropagation training of a convolutional neural network (CNN). The Noisy CNN algorithm speeds training on average because the backpropagation algorithm is a special case of the generalized expectation-maximization (EM) algorithm and because such carefully chosen noise always speeds up the EM algorithm on average. The CNN framework...
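The snippet does not reproduce the paper's EM-based condition for choosing beneficial noise, so the following is only a generic illustration of injecting zero-mean noise into the training target inside one backpropagation step (two-layer network, squared-error loss; all names and the noise level are illustrative assumptions).

import numpy as np

rng = np.random.default_rng(0)

def noisy_backprop_step(W1, W2, x, t, lr=0.1, noise_std=0.05):
    """One backprop step on a tiny two-layer network where zero-mean Gaussian noise
    is injected into the training target. This is a generic noise-injection sketch,
    not the paper's noise-selection rule."""
    h = np.tanh(W1 @ x)                                   # hidden layer
    y = W2 @ h                                            # linear output
    t_noisy = t + noise_std * rng.normal(size=t.shape)    # injected noise
    err = y - t_noisy                                     # output-layer error
    grad_W2 = np.outer(err, h)
    grad_W1 = np.outer((W2.T @ err) * (1 - h**2), x)
    return W1 - lr * grad_W1, W2 - lr * grad_W2

# usage on one random training pair
W1 = rng.normal(size=(4, 3)) * 0.1
W2 = rng.normal(size=(2, 4)) * 0.1
x, t = rng.normal(size=3), rng.normal(size=2)
W1, W2 = noisy_backprop_step(W1, W2, x, t)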
Generative adversarial networks (GANs) have attracted wide research interest in the field of deep learning. Variations of the GAN have achieved competitive results on specific tasks. However, the stability of training and the diversity of generated instances are still worth studying further. Training a GAN can be thought of as a greedy procedure, in which the generative net tries to make the locally optima...
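For context, the standard alternating GAN training loop looks like the PyTorch sketch below (toy 1-D data; this is the generic setup, not the specific stabilization scheme studied in the paper).

import torch
import torch.nn as nn

# minimal generator and discriminator for 1-D toy data
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(64, 1) * 0.5 + 2.0        # toy "real" samples
    z = torch.randn(64, 8)
    fake = G(z)

    # discriminator update: push real toward 1, fake toward 0
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # generator update: try to make D label fakes as real
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()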
In this paper, we propose a new iterative method for finding the solution of first-order ordinary differential equations. In this method, we extend the idea of the variational iteration method by changing the general Lagrange multiplier that is defined in the context of the variational iteration method. This increases the convergence rate of the method compared with the var...
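For reference, the standard correction functional of the variational iteration method for a first-order ODE u'(t) = f(t, u(t)) is

u_{n+1}(t) = u_n(t) + \int_0^t \lambda(s)\,\bigl(u_n'(s) - f(s, u_n(s))\bigr)\,ds,

where the classical choice of the general Lagrange multiplier for this problem is \lambda(s) = -1; the modified multiplier that the paper proposes is not visible in this snippet.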
This paper investigates a split-complex backpropagation algorithm with momentum (SCBPM) for complex-valued neural networks. Some convergence results for SCBPM are proved under relaxed conditions compared with existing results. The monotonicity of the error function during the training iteration process is also guaranteed. Two numerical examples are given to support the theoretical findings.
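The momentum update analyzed here has the familiar form (written for a generic weight vector; roughly speaking, in the split-complex setting the rule is applied to the real and imaginary parts of each complex weight separately):

\Delta w^{(k+1)} = -\eta\,\nabla E\!\left(w^{(k)}\right) + \mu\,\Delta w^{(k)}, \qquad w^{(k+1)} = w^{(k)} + \Delta w^{(k+1)},

with learning rate \eta and momentum coefficient \mu. The relaxed conditions under which SCBPM converges are the subject of the paper and are not reproduced here.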
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feed-forward neural networks. The analysis is conducted on 250 different three-letter lowercase words from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...
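The snippet does not define the "second momentum term", so the sketch below only illustrates one possible formulation, assumed for illustration, in which the update also reuses the weight change from two steps back; both momentum coefficients and the toy quadratic objective are assumptions.

import numpy as np

def momentum_step(w, grad, prev_dw, prev_dw2, lr=0.1, mu1=0.9, mu2=0.05):
    """Weight update with a first and an (assumed) second momentum term:
    dw = -lr*grad + mu1*prev_dw + mu2*prev_dw2."""
    dw = -lr * grad + mu1 * prev_dw + mu2 * prev_dw2
    return w + dw, dw

# usage on a toy quadratic objective 0.5*||w||^2 (its gradient is w)
w = np.array([1.0, -2.0])
dw1 = dw2 = np.zeros_like(w)
for _ in range(50):
    w, dw_new = momentum_step(w, grad=w, prev_dw=dw1, prev_dw2=dw2)
    dw2, dw1 = dw1, dw_new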
In this paper, we apply Newton's and He's iteration formulas in order to solve nonlinear algebraic equations. In this case, we use stochastic arithmetic and the CESTAC method to validate the results. We show that He's iteration formula is more reliable than Newton's iteration formula by using the CADNA library.
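Newton's iteration for a scalar nonlinear equation is sketched below; He's iteration formula and the CESTAC/CADNA validation used in the paper are not reproduced here, and the stopping tolerance is an illustrative choice.

def newton(f, df, x0, tol=1e-10, max_iter=100):
    """Classical Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n) for f(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# usage: solve x**3 - 2 = 0
root = newton(lambda x: x**3 - 2, lambda x: 3 * x**2, x0=1.0)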
This paper presents an approach to estimating the parameters of continuous-density HMMs for visual speech recognition. One of the key issues in image-based visual speech recognition is normalization of lip location and lighting conditions prior to estimating the parameters of the HMMs. We present a normalized training method in which the normalization process is integrated into the model training. T...