Search results for: error back propagation algorithm
Number of results: 1172470
A comparative study of an artificial neural network (ANN) and multiple regression was made to predict the fat tail weight of Balouchi sheep from birth, weaning and finishing weights. A multilayer feed-forward network with an error back-propagation learning mechanism was used to predict the sheep body weight. The data (69 records) were randomly divided into two subsets. The first subset is the train...
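A minimal sketch of the kind of comparison described above: a one-hidden-layer feed-forward network trained by error back-propagation versus multiple linear regression, predicting one target from three body-weight features. The data here are synthetic placeholders, not the 69 records from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(69, 3))                     # 69 records, 3 body-weight features
y = X @ np.array([0.5, 1.2, 2.0]) + 0.1 * rng.normal(size=69)

# Multiple linear regression (ordinary least squares)
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

# Feed-forward net 3 -> 8 -> 1, sigmoid hidden layer, trained by back-propagation
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))     # forward pass
    pred = (h @ W2 + b2).ravel()
    err = pred - y                               # output error
    dW2 = h.T @ err[:, None] / len(X)            # gradient w.r.t. hidden-output weights
    dh = err[:, None] @ W2.T * h * (1 - h)       # error back-propagated through the sigmoid
    dW1 = X.T @ dh / len(X)
    W2 -= lr * dW2; b2 -= lr * err.mean()
    W1 -= lr * dW1; b1 -= lr * dh.mean(axis=0)

print("regression MSE:", np.mean((np.c_[X, np.ones(len(X))] @ coef - y) ** 2))
print("ANN MSE:       ", np.mean((pred - y) ** 2))
```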
In this paper we develop a new method for computing the learning rate of the back-propagation algorithm used to train feed-forward neural networks. Our idea is based on approximating the inverse Hessian matrix of the error function, as originally suggested by Andrie. Experimental results show that the proposed method considerably improves convergence on the chosen test problem.
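A hedged sketch of the general idea only, not the authors' exact formula: pick the back-propagation step size from curvature information via the quadratic-model step eta = (g.g)/(g.H.g), with the Hessian-vector product H.g approximated by a finite difference of gradients. The function `loss_grad` is a hypothetical gradient routine, not taken from the paper.

```python
import numpy as np

def adaptive_step(w, loss_grad, eps=1e-5):
    g = loss_grad(w)
    Hg = (loss_grad(w + eps * g) - g) / eps      # finite-difference approximation of H @ g
    denom = g @ Hg
    if denom <= 0:                               # fall back when curvature is unusable
        return 1e-2
    return float(g @ g / denom)

# Toy usage on a quadratic error surface E(w) = 0.5 * w.A.w
A = np.diag([1.0, 10.0])
grad = lambda w: A @ w
w = np.array([1.0, 1.0])
for _ in range(20):
    w = w - adaptive_step(w, grad) * grad(w)
print("final w:", w)
```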
Cellular neural networks (CNN) were introduced by Chua and Yang in 1988 [1]. The idea of the CNN was inspired by the architectures of cellular automata and neural networks. Unlike conventional neural networks, the CNN has a local connectivity property. Since the structure of the CNN resembles that of the animal retina, the CNN can be used for various image processing applicatio...
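A minimal sketch of a cellular neural network state update, illustrating the local connectivity mentioned above: each cell's state depends only on its 3x3 neighbourhood through a feedback template A and a control template B. The templates and parameters below are generic illustrative choices, not taken from the abstract.

```python
import numpy as np

def cnn_step(x, u, A, B, z, dt=0.1):
    y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))     # piecewise-linear cell output
    yp, up = np.pad(y, 1), np.pad(u, 1)
    dx = -x + z
    for i in range(3):                            # sum contributions from the 3x3 neighbourhood
        for j in range(3):
            dx = dx + A[i, j] * yp[i:i + x.shape[0], j:j + x.shape[1]] \
                    + B[i, j] * up[i:i + x.shape[0], j:j + x.shape[1]]
    return x + dt * dx                            # forward-Euler state update

u = np.random.default_rng(1).choice([-1.0, 1.0], size=(32, 32))   # binary input image
x = np.zeros_like(u)                                               # initial cell states
A = np.array([[0, 0, 0], [0, 2.0, 0], [0, 0, 0]])
B = np.array([[-1, -1, -1], [-1, 8.0, -1], [-1, -1, -1]])
for _ in range(50):
    x = cnn_step(x, u, A, B, z=-0.5)
```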
In this study, the group method of data handling (GMDH) was used to estimate the scour depth around hydraulic structures; the structures studied include bridge abutments, fluid pipelines and bridge piers. The GMDH network was combined with the back propagation, Levenberg-Marquardt, genetic programming, particle swarm optimization and gravitational search algo...
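A rough sketch of the GMDH idea referenced above: each candidate neuron fits a quadratic polynomial of two inputs by least squares, and the best candidates by validation error survive to the next layer. The features and targets below are placeholders, not the scour-depth variables of the study.

```python
import numpy as np
from itertools import combinations

def gmdh_layer(Xtr, ytr, Xva, yva, keep=3):
    def design(a, b):                             # quadratic polynomial of two inputs
        return np.c_[np.ones_like(a), a, b, a * b, a**2, b**2]
    candidates = []
    for i, j in combinations(range(Xtr.shape[1]), 2):
        coef, *_ = np.linalg.lstsq(design(Xtr[:, i], Xtr[:, j]), ytr, rcond=None)
        err = np.mean((design(Xva[:, i], Xva[:, j]) @ coef - yva) ** 2)
        candidates.append((err, i, j, coef))
    candidates.sort(key=lambda c: c[0])           # select neurons by validation error
    best = candidates[:keep]
    new_tr = np.column_stack([design(Xtr[:, i], Xtr[:, j]) @ c for _, i, j, c in best])
    new_va = np.column_stack([design(Xva[:, i], Xva[:, j]) @ c for _, i, j, c in best])
    return new_tr, new_va, best

# Example: one GMDH layer on synthetic data split into training / validation halves
X = np.random.default_rng(4).normal(size=(200, 5))
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2
Ztr, Zva, survivors = gmdh_layer(X[:100], y[:100], X[100:], y[100:])
```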
The original back-propagation methods were plagued by variable parameters which affected both the convergence properties of the training and the generalisation abilities of the resulting network. These parameters presented many difficulties when attempting to use such networks to solve particular mapping problems. A combination of established numerical minimisation methods (Polak-Ribiere Con...
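A sketch of the parameter-free ingredient hinted at above: Polak-Ribiere conjugate-gradient minimisation of an error function, which removes the hand-tuned learning rate. The backtracking line search here is an assumption for illustration, not necessarily the scheme used in the paper.

```python
import numpy as np

def polak_ribiere(w, f, grad, iters=100):
    g = grad(w); d = -g
    for _ in range(iters):
        t = 1.0
        while f(w + t * d) > f(w) + 1e-4 * t * (g @ d):   # backtracking (Armijo) line search
            t *= 0.5
            if t < 1e-12:
                return w
        w = w + t * d
        g_new = grad(w)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))    # Polak-Ribiere+ coefficient
        d = -g_new + beta * d                             # new conjugate search direction
        g = g_new
    return w

# Toy usage: minimise the Rosenbrock function
f = lambda w: (1 - w[0])**2 + 100 * (w[1] - w[0]**2)**2
grad = lambda w: np.array([-2 * (1 - w[0]) - 400 * w[0] * (w[1] - w[0]**2),
                           200 * (w[1] - w[0]**2)])
print(polak_ribiere(np.array([-1.2, 1.0]), f, grad, iters=500))
```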
This paper describes several algorithms mapping the back propagation learning algorithm onto a large 2-D torus architecture. To obtain high speedup, we have suggested an approach that combines the possible parallel aspects (training set parallelism, node parallelism and pipelining of training patterns) of the algorithm. Several algorithms were implemented on a 512-processor Fujitsu AP1000 and c...
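An illustration of just one of the three forms of parallelism named above, training-set parallelism: each "processor" computes back-propagation gradients on its own slice of the training set, and the partial gradients are averaged before a single weight update. A linear model stands in for the network, and no torus communication is modelled.

```python
import numpy as np

def local_gradient(W, X, y):
    pred = X @ W                                  # linear model stands in for the network
    return X.T @ (pred - y) / len(X)              # mean-squared-error gradient on this shard

rng = np.random.default_rng(2)
X, y = rng.normal(size=(512, 10)), rng.normal(size=512)
W = np.zeros(10)
num_procs = 8
shards = np.array_split(np.arange(len(X)), num_procs)        # training-set parallelism
for _ in range(100):
    grads = [local_gradient(W, X[s], y[s]) for s in shards]  # done in parallel on a real machine
    W -= 0.1 * np.mean(grads, axis=0)             # all-reduce of partial gradients + update
```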
This paper concerns dynamic neural networks for signal processing: architectural issues are considered, but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLPs with IIR synapses and a generalization of Local Feedback MultiLayered Networks (LF MLN), are compared with more traditional neural networks, i.e. static MLPs with input and/or output buffer...
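A quick sketch of the "IIR synapse" building block mentioned above: instead of a single weight, each connection filters its input signal through a small infinite-impulse-response filter, giving the MLP local per-synapse memory. Filter orders and coefficients below are illustrative assumptions.

```python
import numpy as np

def iir_synapse(x, b, a):
    """Direct-form I filter: y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k]."""
    y = np.zeros_like(x, dtype=float)
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y[n] = acc
    return y

x = np.sin(np.linspace(0, 6 * np.pi, 200))        # input signal on one connection
y = iir_synapse(x, b=[0.2, 0.2], a=[1.0, -0.6])   # low-order IIR memory on the synapse
h = np.tanh(y)                                    # neuron non-linearity applied afterwards
```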
With many successful applications of neural network-based classification, it has been recognized that such classification can produce more accurate results than conventional approaches for remotely sensed data. Although its training procedure is sensitive to the choice of initial network parameters and to over-fitting, the multilayer feed-forward network trained by the back-propagation algorithm...
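A minimal sketch of the setup described above: a multilayer feed-forward network trained by the back-propagation algorithm to classify pixels of remotely sensed data into land-cover classes. The band values and class labels below are synthetic, not from any real scene.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 4))                     # 4 spectral bands per pixel
y = rng.integers(0, 3, size=500)                  # 3 land-cover classes
Y = np.eye(3)[y]                                  # one-hot targets

W1 = rng.normal(scale=0.3, size=(4, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.3, size=(16, 3)); b2 = np.zeros(3)
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                      # forward pass through the hidden layer
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)             # softmax class probabilities
    d2 = (p - Y) / len(X)                         # cross-entropy error at the output
    d1 = d2 @ W2.T * (1 - h**2)                   # error back-propagated through tanh
    W2 -= 0.5 * h.T @ d2;  b2 -= 0.5 * d2.sum(axis=0)
    W1 -= 0.5 * X.T @ d1;  b1 -= 0.5 * d1.sum(axis=0)
print("training accuracy:", (p.argmax(axis=1) == y).mean())
```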