Search results for: gradient descent algorithm
Number of results: 869,527
This paper proposes a global learning of neural networks by a hybrid optimization algorithm. The hybrid algorithm combines a stochastic approximation with gradient descent. The stochastic approximation is first applied to estimate an approximation point inclined toward the global minimum, escaping from a local minimum, and then the backpropagation (BP) algorithm is applied for high-speed convergence as ...
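A minimal sketch of the two-phase idea described above, assuming a toy multi-well objective and hypothetical step sizes and phase lengths (none of these choices are from the paper):

```python
import numpy as np

# Toy non-convex objective with several local minima (hypothetical example):
# a quadratic bowl plus a sinusoidal ripple and a small linear tilt.
def loss(w):
    return 0.5 * np.dot(w, w) + 0.5 * np.sum(np.cos(3.0 * w)) + 0.1 * np.sum(w)

def grad(w):
    return w - 1.5 * np.sin(3.0 * w) + 0.1

def hybrid_optimize(w0, sa_steps=500, gd_steps=500, lr=0.05, seed=0):
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)

    # Phase 1: stochastic approximation with a decreasing gain and injected
    # noise, intended to move the iterate toward a globally better region.
    for k in range(1, sa_steps + 1):
        a_k = 1.0 / k                                   # Robbins-Monro style gain
        noise = rng.normal(scale=1.0 / np.sqrt(k), size=w.shape)
        w = w - a_k * (grad(w) + noise)

    # Phase 2: deterministic gradient descent (the "BP" phase in the paper's
    # neural-network setting) for fast local convergence.
    for _ in range(gd_steps):
        w = w - lr * grad(w)
    return w

w_star = hybrid_optimize(np.array([2.5, -3.0]))
print("final point:", w_star, "loss:", loss(w_star))
```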
This paper proposes a line search technique to satisfy a relaxed form of the strong Wolfe conditions in order to guarantee the descent condition at each iteration of the Polak-Ribière-Polyak conjugate gradient algorithm. It is proved that this line search algorithm preserves the usual convergence properties of any descent algorithm. In particular, it is shown that the Zoutendijk condition holds...
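For context, a generic Polak-Ribière-Polyak conjugate gradient loop with a standard strong-Wolfe line search is sketched below; it uses SciPy's stock Wolfe conditions and a common non-negativity truncation of the PRP parameter, not the paper's relaxed condition:

```python
import numpy as np
from scipy.optimize import line_search   # standard (strong) Wolfe line search

def prp_cg(f, grad, x0, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                              # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, *_ = line_search(f, grad, x, d, gfk=g)   # gfk reuses the known gradient
        if alpha is None:                               # line search failed: restart
            d, alpha = -g, 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP formula beta = g_new^T (g_new - g) / ||g||^2, truncated at 0 (PRP+).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: the Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(prp_cg(f, grad, np.array([-1.2, 1.0])))
```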
Recently several novel gradient descent approaches, such as natural or relative gradient methods, have been proposed to derive rigorously various powerful ICA algorithms. In this paper we propose some extensions of Amari’s Natural Gradient and Atick-Redlich formulas. They allow us to rigorously derive some already known algorithms, such as the robust ICA algorithm and the local algorithm for blind...
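For reference, the baseline natural-gradient ICA rule that such work builds on is W ← W + η(I − g(y)yᵀ)W; a minimal batch sketch follows (the tanh nonlinearity, learning rate, and toy sources are assumptions, and the paper's extensions are not reproduced):

```python
import numpy as np

# Standard natural-gradient ICA update (Amari's rule) in batch form.
# tanh is a common score nonlinearity for super-Gaussian sources.
def natural_gradient_ica(X, eta=0.05, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    n, T = X.shape
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))
    for _ in range(epochs):
        Y = W @ X                                        # current source estimates
        G = np.tanh(Y)                                   # score-function nonlinearity
        # Natural-gradient update, averaged over the T samples.
        W += eta * (np.eye(n) - (G @ Y.T) / T) @ W
    return W

# Toy demo: mix two independent Laplacian (super-Gaussian) signals and unmix them.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 2000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
W = natural_gradient_ica(A @ S)
print("W @ A (ideally close to a scaled permutation):\n", W @ A)
```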
We propose a generic framework based on a new stochastic variance-reduced gradient descent algorithm for accelerating nonconvex low-rank matrix recovery. Starting from an appropriate initial estimator, our proposed algorithm performs projected gradient descent based on a novel semi-stochastic gradient specifically designed for low-rank matrix recovery. Based upon the mild restricted strong conv...
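As an illustration of the general recipe (variance-reduced stochastic gradients plus a rank projection), here is a sketch for a simple matrix-completion instance; the step size, epoch length, spectral initialization, and the completion model itself are assumptions, not the paper's algorithm:

```python
import numpy as np

def svd_project(X, r):
    """Project X onto the set of rank-<=r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def svrg_matrix_completion(M_obs, mask, r, eta=0.5, epochs=60, seed=0):
    rng = np.random.default_rng(seed)
    obs = np.argwhere(mask)                               # indices of observed entries
    n_obs = len(obs)
    X = svd_project(np.where(mask, M_obs, 0.0), r)        # simple spectral init
    for _ in range(epochs):
        X_snap = X.copy()
        full_grad = (X_snap - M_obs) * mask               # full gradient at the snapshot
        for idx in rng.permutation(n_obs):
            i, j = obs[idx]
            # Semi-stochastic gradient: stochastic component at X minus the same
            # component at the snapshot, plus the snapshot's (averaged) full gradient.
            G = np.zeros_like(X)
            G[i, j] = (X[i, j] - M_obs[i, j]) - (X_snap[i, j] - M_obs[i, j])
            G += full_grad / n_obs
            X = svd_project(X - eta * G, r)               # projected update
    return X

# Toy demo: recover a 20x20 rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = rng.random((20, 20)) < 0.6
X_hat = svrg_matrix_completion(M * mask, mask, r=2)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```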
A nonlinear conjugate gradient algorithm which is a modification of the Dai and Yuan [Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim., 10 (1999), pp. 177-182.] conjugate gradient algorithm satisfying a parametrized sufficient descent condition with a parameter δ_k is proposed. The parameter δ_k is computed by means of the conj...
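For reference, the original Dai-Yuan update that the abstract modifies is, in standard notation (the parametrized δ_k variant itself is not reproduced here):

$$d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DY}} d_k, \qquad \beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{d_k^{\top}\,(g_{k+1} - g_k)},$$

and a sufficient descent condition of the kind the parameter δ_k governs has the standard form $g_k^{\top} d_k \le -c\,\|g_k\|^2$ for some $c > 0$.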
The descent auxiliary problem method allows one to find the solution of minimization problems by solving a sequence of auxiliary problems which incorporate a line search strategy. We derive the basic algorithm and study its convergence properties within the framework of infinite dimensional pseudoconvex minimization. We also introduce a partial descent type auxiliary problem method which partially ...
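A minimal finite-dimensional sketch of an auxiliary-problem descent scheme with a quadratic kernel: the auxiliary problem at x_k reduces to a projected-gradient candidate, and an Armijo backtracking line search along that direction keeps each step a descent step. The kernel, the parameter tau, and the box-constrained example are assumptions for illustration, not the paper's infinite-dimensional setting:

```python
import numpy as np

def aux_problem_descent(f, grad, project, x0, tau=1.0, beta=0.5, sigma=1e-4,
                        max_iter=200, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        # Auxiliary problem with quadratic kernel: closed-form candidate direction.
        d = project(x - tau * g) - x
        if np.linalg.norm(d) < tol:
            break
        # Armijo line search along the auxiliary-problem direction (g @ d < 0 here).
        t = 1.0
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Example: minimize a convex quadratic over the box [0, 1]^2.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([-4.0, 1.0])
f = lambda x: 0.5 * x @ Q @ x + b @ x
grad = lambda x: Q @ x + b
project = lambda x: np.clip(x, 0.0, 1.0)
print(aux_problem_descent(f, grad, project, np.array([0.5, 0.5])))
```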
Restricted Boltzmann Machines (RBMs) are widely used as building blocks for deep learning models. Learning typically proceeds by using stochastic gradient descent, and the gradients are estimated with sampling methods. However, the gradient estimation is a computational bottleneck, so better use of the gradients will speed up the descent algorithm. To this end, we first derive upper bounds on t...
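The sampling-based gradient estimation the abstract refers to is typically contrastive divergence; a minimal CD-1 update for a Bernoulli-Bernoulli RBM is sketched below (shapes, learning rate, and the toy batch are illustrative assumptions, and the paper's gradient upper bounds are not reproduced):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, b_v, b_h, v0, rng, lr=0.05):
    # Positive phase: hidden activations given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step (reconstruct v, then h).
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # CD-1 estimate of the log-likelihood gradient, averaged over the minibatch.
    n = v0.shape[0]
    dW = (v0.T @ p_h0 - v1.T @ p_h1) / n
    db_v = (v0 - v1).mean(axis=0)
    db_h = (p_h0 - p_h1).mean(axis=0)
    # Gradient *ascent* on the (approximate) log-likelihood.
    return W + lr * dW, b_v + lr * db_v, b_h + lr * db_h

# Toy usage: 6 visible units, 3 hidden units, a batch of 8 binary vectors.
rng = np.random.default_rng(1)
W = 0.01 * rng.standard_normal((6, 3))
b_v, b_h = np.zeros(6), np.zeros(3)
batch = (rng.random((8, 6)) < 0.5).astype(float)
for _ in range(100):
    W, b_v, b_h = cd1_step(W, b_v, b_h, batch, rng)
```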
In this paper, the artificial neural network (ANN) approach is applied for forecasting groundwater level fluctuation in the Aghili plain, southwest Iran. An optimal design is completed for the two hidden layers with four different algorithms: gradient descent with momentum (GDM), Levenberg-Marquardt (LM), resilient back propagation (RP), and scaled conjugate gradient (SCG). Rain, evaporation, relative...
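Of the four training rules named above, the GDM update is the simplest; a generic sketch follows (the learning rate, momentum value, and toy least-squares problem are assumptions, not the hydrological model from the paper):

```python
import numpy as np

# Gradient descent with momentum: the velocity accumulates past gradients.
def gdm(grad, w0, lr=0.05, momentum=0.9, iters=1000):
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(iters):
        v = momentum * v - lr * grad(w)
        w = w + v
    return w

# Toy usage: fit y = X @ w by least squares.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(100)
grad = lambda w: X.T @ (X @ w - y) / len(y)
print(gdm(grad, np.zeros(3)))
```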