Search results for: feedforward neural networks

Number of results: 638547

1996
H. Gemmeke, W. Eppler, T. Fischer, A. Menchikov, S. Neusser

Two novel neural chips SAND (Simple Applicable Neural Device) and SIOP (Serial Input Operating Parallel) are described. Both are highly usable for hardware triggers in particle physics. The chips are optimized for a high input data rate at a very low cost basis. The performance of a single SAND chip is 200 MOPS due to four parallel 16 bit multipliers and 40 bit adders working in one clock cycle...

1999
John J. Soraghan, Amir Hussain, Ivy Shim

A general class of Computationally Efficient locally Recurrent Networks (CERN) is described for real-time adaptive signal processing. The structure of the CERN is based on linear-in-the-parameters single-hidden-layer feedforward neural networks such as the Radial Basis Function (RBF) network, the Volterra Neural Network (VNN) and the recently developed Functionally Expanded Neural Network (FEN...
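The CERN structure above builds on networks that are linear in their parameters, such as the RBF network. A minimal sketch of why a fixed-centre RBF network is linear in its weights (the centres and width below are illustrative choices, not values from the paper):

```python
import numpy as np

# Sketch of a linear-in-the-parameters RBF network: with fixed Gaussian
# centres and width, the output is a linear combination phi @ w, so only
# the weights w need adapting (e.g. by least squares or LMS).

def rbf_forward(x, centres, width, w):
    # phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2)) — Gaussian basis matrix
    phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))
    return phi @ w

centres = np.linspace(-1, 1, 5)      # fixed, illustrative centres
w = np.zeros(5)
w[2] = 1.0                           # weight only the basis centred at 0
x = np.array([0.0])
print(rbf_forward(x, centres, 0.5, w))  # a Gaussian at its own centre gives 1
```

Because the model is linear in `w`, adapting it is a convex problem, which is what makes such structures attractive for real-time signal processing.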

2010
Rahul P. Deshmukh

Artificial neural networks (ANNs) have recently been applied to various hydrologic problems. This research demonstrates a static neural approach by applying a modular feedforward neural network to rainfall-runoff modeling for the upper area of the Wardha River in India. The model is developed by processing online data over time using static modular neural network modeling. Methodologies and techniq...

Journal: CoRR, 2003
Artur Rataj

This paper studies how the generalization ability of neurons can be affected by mutual processing of different signals. The study is based on a feedforward artificial neural network, used here as a model of the very basic processes in a network of biological neurons. The mutual processing of signals, called here an interference of signals, can possibly be a good model of pa...

2005
Qin-Yu Zhu, A. K. Qin, P. N. Suganthan, Guang-Bin Huang

Extreme learning machine (ELM) [G.-B. Huang, Q.-Y. Zhu, C.-K. Siew, Extreme learning machine: a new learning scheme of feedforward neural networks, in: Proceedings of the International Joint Conference on Neural Networks (IJCNN2004), Budapest, Hungary, 25–29 July 2004], a novel learning algorithm much faster than the traditional gradient-based learning algorithms, was proposed recently for sing...

2003
Pascal Hitzler, Anthony Karel Seda

One approach to integrating first-order logic programming and neural network systems employs the approximation of semantic operators by feedforward networks. For this purpose, it is necessary to view these semantic operators as continuous functions on the reals. This can be accomplished by endowing the space of all interpretations of a logic program with topologies obtained from suitable embedd...

Journal: Pattern Recognition Letters, 1998
Manish Sarkar, Bayya Yegnanarayana, Deepak Khemani

Most real-life classification problems have ill-defined, imprecise or fuzzy class boundaries. Feedforward neural networks with the conventional backpropagation learning algorithm are not tailored to this kind of classification problem. Hence, in this paper, feedforward neural networks that use a backpropagation learning algorithm with fuzzy objective functions are investigated. A learning al...

2004
Bao-Liang Lu, Yan Bai, Yoshikazu Nishikawa

Abstract: We propose an architecture of a multilayer quadratic perceptron (MLQP) that combines the advantages of multilayer perceptrons (MLPs) and higher-order feedforward neural networks. The features of MLQP are its simple structure, practical number of adjustable connection weights and powerful learning ability. In this paper, the architecture of MLQP is described, a backpropagation lear...
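The quadratic unit that MLQP stacks into layers can be illustrated in a few lines. This is an illustrative sketch, not the authors' code: each unit weights both the inputs and their squares before the sigmoid, which is what gives the network its higher-order capacity with few extra weights.

```python
import numpy as np

def quadratic_neuron(x, u, v, b):
    """One MLQP-style unit: y = sigmoid(u . x^2 + v . x + b)."""
    z = u @ (x ** 2) + v @ x + b
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0])
u = np.array([1.0, 0.5])    # second-order (quadratic) weights
v = np.array([-0.3, 0.2])   # first-order (linear) weights
print(quadratic_neuron(x, u, v, b=0.1))  # z = 0.75 - 0.35 + 0.1 = 0.5, sigmoid ≈ 0.6225
```

Compared with a general higher-order network, which needs weights for every cross-product of inputs, this per-input quadratic term keeps the weight count linear in the input dimension.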

1998
Gérard Dreyfus

The fundamental property of feedforward neural networks, parsimonious approximation, makes them excellent candidates for modeling static nonlinear processes from measured data. Similarly, feedback (or recurrent) neural networks have very attractive properties for the dynamic nonlinear modeling of artificial or natural processes; however, the design of such networks is more complex than that of fe...

Journal: Neurocomputing, 2006
Guang-Bin Huang, Qin-Yu Zhu, Chee-Kheong Siew

It is clear that the learning speed of feedforward neural networks is in general far slower than required, and this has been a major bottleneck in their applications for the past decades. Two key reasons behind this may be: (1) slow gradient-based learning algorithms are extensively used to train the networks, and (2) all the parameters of the networks are tuned iteratively by using such learning al...
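The extreme learning machine this abstract introduces addresses both points by fixing the hidden-layer parameters at random and solving only the output weights in a single least-squares step. A minimal sketch under that reading (function names, sizes and the toy regression task are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Random hidden layer; output weights solved in one shot (no iteration)."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random, fixed input weights
    b = rng.normal(size=n_hidden)                # random, fixed biases
    H = np.tanh(X @ W + b)                       # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                 # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy task: regress y = x^2 on [-1, 1]
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = X[:, 0] ** 2
W, b, beta = elm_fit(X, y)
pred = elm_predict(X, W, b, beta)
print(np.mean((pred - y) ** 2))  # small residual; no gradient descent involved
```

Since no parameter is tuned iteratively, training cost reduces to one matrix pseudoinverse, which is the source of the speedup the abstract describes.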

[Chart: number of search results per year]