EA-CG: An Approximate Second-Order Method for Training Fully-Connected Neural Networks


Similar Articles

General Backpropagation Algorithm for Training Second-order Neural Networks

The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neurons could be upgraded to second-order counterparts, in which the linear operation between the inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has a strong non...

Full Text
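The quadratic neuron described in the abstract above can be sketched in a few lines: the usual inner product between inputs and weights is augmented with a quadratic form over the inputs. This is only an illustrative parameterization (the names `second_order_neuron`, `W`, `w`, `b` are assumptions here; the paper's exact formulation may differ).

```python
import numpy as np

def second_order_neuron(x, W, w, b):
    """Hypothetical second-order neuron: the linear map w.x is replaced
    by a quadratic operation x^T W x + w.x + b before the activation."""
    quadratic = x @ W @ x   # pairwise interactions between inputs
    linear = w @ x          # ordinary first-order term
    return np.tanh(quadratic + linear + b)

rng = np.random.default_rng(0)
x = rng.standard_normal(3)
W = rng.standard_normal((3, 3))  # quadratic-term weights
w = rng.standard_normal(3)       # linear-term weights
out = second_order_neuron(x, W, w, 0.1)
```

A first-order neuron is recovered by setting `W` to zero, which makes the claim of "upgrading" existing neurons concrete.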

Universality of Fully-Connected Recurrent Neural Networks

It is shown from the universality of multi-layer neural networks that any discrete-time or continuous-time dynamical system can be approximated by discrete-time or continuous-time recurrent neural networks, respectively.

Full Text

A Piecewise Approximate Method for Solving Second Order Fuzzy Differential Equations Under Generalized Differentiability

In this paper, a numerical method for solving second order fuzzy differential equations under generalized differentiability is proposed. The method is based on interpolating a solution by a piecewise polynomial of degree 4 over the range of the solution. Moreover, we investigate the existence, uniqueness, and convergence of the approximate solutions. Finally, the accuracy of the piecewise approximate method b...

Full Text

On Descent Spectral CG Algorithm for Training Recurrent Neural Networks

In this paper, we evaluate the performance of a new class of conjugate gradient methods for training recurrent neural networks, which ensure the sufficient descent property. The presented methods preserve the advantages of classical conjugate gradient methods while avoiding the usually inefficient restarts. Simulation results are also presented using three different recurrent neural ne...

Full Text
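The sufficient-descent safeguard mentioned in the abstract above can be illustrated with a minimal nonlinear CG loop: whenever the conjugate direction fails the test g·d ≤ -c‖g‖², it falls back to steepest descent. This is a generic sketch, not the paper's spectral CG update; the PR+ beta, the Armijo line search, and all names here are illustrative assumptions.

```python
import numpy as np

def cg_train(f, grad_fn, theta, iters=100, c=1e-4):
    """Minimal nonlinear CG with a sufficient-descent safeguard (a sketch;
    the spectral CG methods in the paper use a more refined update)."""
    g = grad_fn(theta)
    d = -g
    for _ in range(iters):
        if g @ d > -c * (g @ g):     # sufficient-descent check
            d = -g                   # fall back to steepest descent
        # backtracking (Armijo) line search along d
        step, f0, gd = 1.0, f(theta), g @ d
        while f(theta + step * d) > f0 + 1e-4 * step * gd:
            step *= 0.5
        theta = theta + step * d
        g_new = grad_fn(theta)
        beta = max(g_new @ (g_new - g) / (g @ g + 1e-12), 0.0)  # PR+
        d = -g_new + beta * d
        g = g_new
    return theta

# toy objective f(theta) = 0.5 * ||A theta - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda th: 0.5 * np.sum((A @ th - b) ** 2)
grad = lambda th: A.T @ (A @ th - b)
theta = cg_train(f, grad, np.zeros(2), iters=200)
```

The safeguard is what makes the direction provably a descent direction at every iteration without periodic restarts.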

Two-Stage Second Order Training in Feedforward Neural Networks

In this paper, we develop and demonstrate a new second-order two-stage algorithm called OWO-Newton. First, two-stage algorithms are motivated and the Gauss-Newton input-weight Hessian matrix is developed. Block coordinate descent is used to apply Newton's algorithm alternately to the input and output weights. Its performance is comparable to Levenberg-Marquardt, and it has the advantage of reduced co...

Full Text
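The two-stage block coordinate idea in the abstract above can be sketched for a one-hidden-layer network: with the hidden activations fixed, the output weights are an exact linear least-squares solve (the "OWO" stage); the input weights are then updated separately. In this sketch a plain gradient step stands in for the paper's Gauss-Newton/Newton step on the input weights, and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))       # inputs
t = np.sin(X @ rng.standard_normal(4))  # scalar targets

W_in = rng.standard_normal((4, 8)) * 0.5  # input weights (4 -> 8 hidden)
w_out = np.zeros(8)                       # output weights

for _ in range(50):
    # Stage 1 (OWO): hidden activations fixed, so the output weights
    # solve a linear least-squares problem exactly.
    H = np.tanh(X @ W_in)
    w_out = np.linalg.lstsq(H, t, rcond=None)[0]

    # Stage 2: update the input weights; a gradient step stands in
    # here for the Gauss-Newton step developed in the paper.
    err = H @ w_out - t                      # residuals, shape (200,)
    dH = (1 - H**2) * np.outer(err, w_out)   # backprop through tanh
    W_in -= 0.01 * (X.T @ dH) / len(X)

mse = np.mean((np.tanh(X @ W_in) @ w_out - t) ** 2)
```

Alternating an exact solve for one block with an iterative step for the other is the essence of the two-stage design; the exact solve is what distinguishes it from plain backpropagation on all weights at once.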


Journal

Journal title: Proceedings of the AAAI Conference on Artificial Intelligence

Year: 2019

ISSN: 2374-3468, 2159-5399

DOI: 10.1609/aaai.v33i01.33013337