Search results for: gradient descent algorithm
Number of results: 869,527
2 (First order) Descent methods, rates
  2.1 Gradient descent
  2.2 What can we achieve?
  2.3 Second order methods: Newton's method
  2.4 Multistep first order methods
    2.4.1 Heavy ball method ...
Stochastic convex optimization is a basic and well-studied primitive in machine learning. It is well known that convex Lipschitz functions can be minimized efficiently using Stochastic Gradient Descent (SGD). The Normalized Gradient Descent (NGD) algorithm is an adaptation of Gradient Descent that updates according to the direction of the gradients rather than the gradients themselves. ...
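For readers unfamiliar with the distinction, NGD differs from SGD only in that the update uses the gradient's direction rather than the gradient itself. A minimal sketch of that update rule (the objective, step size, and iteration count below are illustrative assumptions, not from the paper):

```python
import numpy as np

def ngd(grad_fn, x0, eta=0.1, steps=100, eps=1e-12):
    """Normalized Gradient Descent: step along the gradient's direction only."""
    x = x0.astype(float)
    for _ in range(steps):
        g = grad_fn(x)
        x = x - eta * g / (np.linalg.norm(g) + eps)  # magnitude discarded
    return x

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x.
x_min = ngd(lambda x: 2 * x, np.array([3.0, -4.0]))
```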
Stochastic gradient descent remains popular in large-scale machine learning, on account of its very low computational cost and robustness to noise. However, gradient descent is only linearly efficient and not transformation invariant. Scaling by a local measure can substantially improve its performance. One natural choice of such a scale is the Hessian of the objective function: Were it availab...
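The Hessian scaling this abstract alludes to is the Newton step, which is transformation invariant. A toy sketch with an explicit Hessian (the paper's point is precisely that in large-scale settings one must approximate it rather than form and invert it):

```python
import numpy as np

def newton_step(grad, hess, x):
    """Gradient descent rescaled by the local Hessian: x <- x - H(x)^{-1} g(x)."""
    return x - np.linalg.solve(hess(x), grad(x))

# On the ill-conditioned quadratic f(x) = 0.5 * x.T @ A @ x, a single Newton
# step reaches the minimum where plain gradient descent would zigzag.
A = np.diag([1.0, 100.0])
x = newton_step(lambda x: A @ x, lambda x: A, np.array([1.0, 1.0]))  # -> [0, 0]
```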
Backpropagation is the standard training procedure for Multilayer Perceptron (MLP) networks. It uses gradient descent to minimize the network error. However, the gradient descent algorithm leads to problems with the convergence of training and imposes restrictions on the applicable transfer functions. This paper describes a complete substitution of the grad...
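For context, the gradient descent step that backpropagation performs, sketched for a one-hidden-layer MLP with squared error (dimensions, activation, and learning rate are illustrative, not the paper's):

```python
import numpy as np

def backprop_step(W1, W2, x, y, lr=0.01):
    """One gradient descent step for the MLP y_hat = W2 @ tanh(W1 @ x)."""
    h = np.tanh(W1 @ x)                         # forward pass, hidden activations
    e = W2 @ h - y                              # output error
    dW2 = np.outer(e, h)                        # gradient w.r.t. output weights
    dW1 = np.outer((W2.T @ e) * (1 - h**2), x)  # chain rule through tanh
    return W1 - lr * dW1, W2 - lr * dW2
```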
To improve the efficiency and classification ability of support vector machines (SVMs) based on the stochastic gradient descent algorithm, three improved stochastic gradient descent (SGD) algorithms, namely Momentum, Nesterov accelerated gradient (NAG), and RMSprop, are used to solve the support vector machine. The experimental results show that the algorithm based on RMSprop for solving the l...
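The three variants differ only in their update rules; their standard forms are sketched below (hyperparameter values are conventional defaults, not the paper's settings):

```python
import numpy as np

def momentum_step(x, v, grad_fn, lr=0.01, beta=0.9):
    """Momentum: accumulate a velocity vector and move along it."""
    v = beta * v + grad_fn(x)
    return x - lr * v, v

def nag_step(x, v, grad_fn, lr=0.01, beta=0.9):
    """NAG: same as momentum, but the gradient is taken at a look-ahead point."""
    v = beta * v + grad_fn(x - lr * beta * v)
    return x - lr * v, v

def rmsprop_step(x, s, grad_fn, lr=0.001, rho=0.9, eps=1e-8):
    """RMSprop: scale each coordinate by a running RMS of its own gradients."""
    g = grad_fn(x)
    s = rho * s + (1 - rho) * g**2
    return x - lr * g / (np.sqrt(s) + eps), s
```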
The stability of the learning rate in neural network identifiers and controllers is one of the challenging issues that attract great interest from neural network researchers. This paper proposes an adaptive gradient descent algorithm with stable learning laws for a modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, a stable learning algorithm for parameters of ...
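The paper's stable learning laws are specific to the MDNN and are not given in this snippet; as a generic illustration of enforcing stability of a gradient descent step, here is the standard Armijo backtracking rule, which shrinks the learning rate until the step provably decreases the loss:

```python
import numpy as np

def armijo_step(f, grad_fn, x, lr0=1.0, c=1e-4, shrink=0.5):
    """Backtracking line search: halve the rate until sufficient decrease holds."""
    g = grad_fn(x)
    if not g @ g > 0:                  # already at a stationary point
        return x
    lr = lr0
    while f(x - lr * g) > f(x) - c * lr * (g @ g):
        lr *= shrink                   # step too aggressive; damp it
    return x - lr * g
```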
Vector Quantization (VQ) has its origins in signal processing, where it is used for compact, accurate representation of input signals. However, since VQ induces a partitioning of the input space, it can also be used for statistical pattern recognition. In this paper we present a novel gradient descent VQ classification algorithm that minimizes the Bayes Risk, which we refer to as the Generalized...
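The snippet is cut off before the algorithm is defined; as a rough illustration of gradient descent on a VQ codebook for classification, here is an LVQ-style prototype update (a stand-in for, not a reproduction of, the paper's Bayes-risk objective):

```python
import numpy as np

def lvq_step(protos, proto_labels, x, y, lr=0.05):
    """Pull the winning prototype toward x if labels match, push it away otherwise."""
    i = np.argmin(np.linalg.norm(protos - x, axis=1))   # nearest codebook vector
    sign = 1.0 if proto_labels[i] == y else -1.0
    protos[i] += sign * lr * (x - protos[i])
    return protos
```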
Parallel and distributed algorithms have become a necessity in modern machine learning tasks. In this work, we focus on parallel asynchronous gradient descent [1, 2, 3] and propose a zealous variant that minimizes the idle time of processors to achieve a substantial speedup. We then experimentally study this algorithm in the context of training a restricted Boltzmann machine on a large collabor...
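Asynchronous gradient descent in the style of [1, 2, 3] lets workers read and update shared parameters without locking; a minimal single-machine sketch with threads (the paper's "zealous" idle-time scheduling is not shown):

```python
import threading
import numpy as np

def worker(w, data, lr=0.01, steps=1000):
    """Apply SGD updates to the shared vector w without any locking."""
    rng = np.random.default_rng()
    for _ in range(steps):
        x, y = data[rng.integers(len(data))]
        g = (w @ x - y) * x    # stochastic gradient of 0.5 * (w.x - y)^2
        w -= lr * g            # in-place update, visible to all workers

rng = np.random.default_rng(0)
w = np.zeros(10)                                        # shared parameters
data = [(rng.standard_normal(10), 0.0) for _ in range(100)]
threads = [threading.Thread(target=worker, args=(w, data)) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
```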
The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multi-objective optimization by considering the concurrent minimization of n smooth criteria {J_i} (i = 1, ..., n). The novel algorithm is based on the following observation: consider a...
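A common way to realize such a generalization is to take as descent direction the negative of the minimum-norm element of the convex hull of the individual gradients; for n = 2 that element has a closed form. A sketch of this standard construction (not necessarily the authors' exact variant):

```python
import numpy as np

def common_descent_direction(g1, g2):
    """Minimum-norm convex combination of g1 and g2 (two-objective case);
    its negative decreases both criteria whenever it is nonzero."""
    diff = g1 - g2
    denom = diff @ diff
    alpha = 1.0 if denom == 0 else np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0)
    return -(alpha * g1 + (1 - alpha) * g2)
```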
A complex-valued nonlinear gradient descent (CNGD) learning algorithm is proposed for a simple finite impulse response (FIR) nonlinear neural adaptive filter with an adaptive amplitude of the complex activation function. In this way, the amplitude of the complex-valued analytic nonlinear activation function of a neuron in the learning algorithm is made gradient adaptive, giving the complex-valued a...
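A heavily simplified sketch of the idea: a complex FIR filter with a tanh nonlinearity whose amplitude is itself adapted by gradient descent (the filter structure and the exact CR-calculus gradients in the paper may differ):

```python
import numpy as np

def cngd_step(w, lam, x, d, mu=0.01, rho=0.01):
    """One update of the filter y = lam * tanh(w^H x): adapt the weights w
    and, separately, the real activation amplitude lam by gradient descent."""
    net = np.conj(w) @ x                     # filter output before nonlinearity
    y = lam * np.tanh(net)
    e = d - y                                # complex error
    dphi = lam * (1 - np.tanh(net) ** 2)     # derivative of lam * tanh at net
    w = w + mu * e * np.conj(dphi) * np.conj(x)            # weight update
    lam = lam + rho * np.real(e * np.conj(np.tanh(net)))   # amplitude update
    return w, lam
```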