Search results for: layer perceptron mlp
Number of results: 290,043
A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a Multilayer Perceptron (MLP) trained with Backpropagation (BP), this means finding the right hidden layer size, appropriate initial weights and learning parameters. This paper proposes a method (G-Prop-II) that attempts to solve that problem by combining a genetic algorithm (GA) and ...
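Below is a minimal sketch of the general idea behind this kind of hybrid (not the authors' G-Prop-II code): a genetic algorithm searches over hidden-layer size and initial weights, and each candidate's fitness is the error reached after a short backpropagation run. The XOR data, the operators, and all hyperparameters are illustrative assumptions.

```python
# Sketch: GA over hidden-layer size + initial weights, fitness = MSE after short BP run.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)  # bias column appended
y = np.array([[0], [1], [1], [0]], float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fitness(ind, epochs=300, lr=0.8):
    """Run backpropagation from the individual's encoded weights; return the MSE reached."""
    W1, W2 = ind["W1"].copy(), ind["W2"].copy()
    for _ in range(epochs):
        h = np.hstack([sigmoid(X @ W1), np.ones((4, 1))])            # hidden layer + bias unit
        out = sigmoid(h @ W2)
        d_out = (out - y) * out * (1 - out)                          # output delta
        d_h = (d_out @ W2[:-1].T) * h[:, :-1] * (1 - h[:, :-1])      # hidden delta
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_h
    h = np.hstack([sigmoid(X @ W1), np.ones((4, 1))])
    return float(np.mean((sigmoid(h @ W2) - y) ** 2))

def random_individual():
    hidden = int(rng.integers(2, 9))                                 # hidden-layer size is a gene too
    return {"W1": rng.normal(0, 1, (3, hidden)),
            "W2": rng.normal(0, 1, (hidden + 1, 1))}

def mutate(ind):
    return {k: v + rng.normal(0, 0.2, v.shape) for k, v in ind.items()}

# Simple (mu + lambda) loop: keep the best half, refill the population by mutation.
pop = [random_individual() for _ in range(10)]
for _ in range(15):
    pop.sort(key=fitness)
    pop = pop[:5] + [mutate(pop[int(rng.integers(0, 5))]) for _ in range(5)]
best = min(pop, key=fitness)
print("best MSE:", round(fitness(best), 4), "hidden units:", best["W1"].shape[1])
```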
In this paper we propose a synergistic melding of neural networks and decision trees (DT) that we call neural decision trees (NDT). An NDT is a decision-tree-like architecture in which each splitting node is an independent multilayer perceptron, allowing oblique decision functions, or arbitrary nonlinear decision functions if more than one layer is used. This way, each MLP can be seen as a node of the tree...
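A minimal sketch of such a node follows; it is not the authors' NDT implementation, and the class name, tree depth, and hand-set leaf labels are illustrative. It only shows how a small MLP can act as the split function at each internal node.

```python
# Sketch: a decision-tree node whose split is computed by a small MLP (oblique/nonlinear split).
import numpy as np

class MLPNode:
    def __init__(self, in_dim, hidden=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 1, (in_dim, hidden))   # input -> hidden
        self.W2 = rng.normal(0, 1, (hidden, 1))        # hidden -> split score
        self.left = None      # child reached when score < 0.5
        self.right = None     # child reached when score >= 0.5
        self.label = None     # set on leaves only

    def score(self, x):
        h = np.tanh(x @ self.W1)                       # nonlinear hidden layer
        return 1.0 / (1.0 + np.exp(-(h @ self.W2)))    # sigmoid split score

    def route(self, x):
        """Follow MLP splits down to a leaf and return its label."""
        if self.label is not None:
            return self.label
        child = self.left if float(self.score(x)) < 0.5 else self.right
        return child.route(x)

# Usage: a two-level tree with hand-set leaf labels (training of the node MLPs is omitted).
root = MLPNode(in_dim=2, seed=0)
root.left, root.right = MLPNode(2, seed=1), MLPNode(2, seed=2)
root.left.label, root.right.label = "class A", "class B"
print(root.route(np.array([0.3, -1.2])))
```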
A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a multilayer perceptron (MLP) trained with back-propagation (BP), this means finding the appropriate layer size and initial weights. This paper proposes a method (G-Prop, genetic backpropagation) that attempts to solve that problem by combining a genetic algorithm (GA) and BP to train MLPs w...
This work concerns testing the number of parameters in a one-hidden-layer multilayer perceptron (MLP). For this purpose we assume that we have identifiable models, up to a finite group of transformations on the weights; this is, for example, the case when the number of hidden units is known. In this framework, we show that we get a simple asymptotic distribution if we use the logarithm of the deter...
A globally convergent homotopy method is defined that is capable of sequentially producing large numbers of stationary points of the multi-layer perceptron mean-squared error surface. Using this algorithm, large subsets of the stationary points of two test problems are found. It is shown empirically that the MLP neural network appears to have an extreme ratio of saddle points compared to local mi...
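The sketch below is not the homotopy method itself; it only illustrates the classification step such a study relies on: at a stationary point of an MLP's mean-squared-error surface, inspect the Hessian's eigenvalues to decide saddle versus local minimum. The tiny 1-2-1 network, the all-zero stationary point, and the finite-difference Hessian are illustrative assumptions.

```python
# Sketch: classify a stationary point of an MLP MSE surface via Hessian eigenvalues.
import numpy as np

X = np.linspace(-1, 1, 20).reshape(-1, 1)
y = np.sin(np.pi * X)

def mse(w):
    """MSE of a 1-2-1 tanh network with its 6 parameters packed into w."""
    W1, b1, W2 = w[:2], w[2:4], w[4:6]
    h = np.tanh(X * W1 + b1)          # hidden activations, shape (20, 2)
    out = h @ W2.reshape(2, 1)        # linear output unit
    return float(np.mean((out - y) ** 2))

def hessian(f, w, eps=1e-3):
    """Central-difference Hessian of f at w."""
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            w_pp = w.copy(); w_pp[i] += eps; w_pp[j] += eps
            w_pm = w.copy(); w_pm[i] += eps; w_pm[j] -= eps
            w_mp = w.copy(); w_mp[i] -= eps; w_mp[j] += eps
            w_mm = w.copy(); w_mm[i] -= eps; w_mm[j] -= eps
            H[i, j] = (f(w_pp) - f(w_pm) - f(w_mp) + f(w_mm)) / (4 * eps ** 2)
    return H

w = np.zeros(6)                        # the all-zero weight vector is a stationary point here
eigs = np.linalg.eigvalsh(hessian(mse, w))
kind = "local minimum" if eigs.min() > 0 else "saddle or degenerate"
print(kind, np.round(eigs, 3))
```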
The Fake News Challenge (FNC-1) is a public competition that aims to find automatic methods for detecting fake news. The dataset for the challenge consists of headline-body pairs, with the objective being to classify the pairs as unrelated, agreeing, disagreeing, or discussing. We developed four neural network models for FNC-1, two using a feed-forward architecture and two using a recurrent arc...
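A minimal sketch of one plausible feed-forward stance classifier for this task follows; it is not the paper's architecture. Headline and body are mapped to fixed-length feature vectors, concatenated, passed through one ReLU hidden layer, and classified into the four stance labels with a softmax. The hashed bag-of-words features, layer sizes, and untrained weights are illustrative assumptions.

```python
# Sketch: feed-forward stance classifier over headline/body feature vectors (forward pass only).
import numpy as np

LABELS = ["unrelated", "agree", "disagree", "discuss"]
rng = np.random.default_rng(0)

def hashed_bow(text, dim=64):
    """Toy hashed bag-of-words feature vector (illustrative only)."""
    v = np.zeros(dim)
    for tok in text.lower().split():
        v[hash(tok) % dim] += 1.0
    return v

def forward(headline, body, params):
    x = np.concatenate([hashed_bow(headline), hashed_bow(body)])    # (128,)
    h = np.maximum(0.0, x @ params["W1"] + params["b1"])            # ReLU hidden layer
    logits = h @ params["W2"] + params["b2"]                        # (4,) class scores
    p = np.exp(logits - logits.max())
    return p / p.sum()                                              # softmax probabilities

params = {"W1": rng.normal(0, 0.1, (128, 32)), "b1": np.zeros(32),
          "W2": rng.normal(0, 0.1, (32, 4)),   "b2": np.zeros(4)}
probs = forward("Banks to cut rates", "The central bank announced ...", params)
print(dict(zip(LABELS, np.round(probs, 3))))
```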
Here we present an alternative ANN structure, the functional link ANN (FLANN), for image denoising. In contrast to a feed-forward ANN structure, i.e. a multilayer perceptron (MLP), the FLANN is basically a single-layer structure in which nonlinearity is introduced by enhancing the input pattern with a nonlinear functional expansion. In this work three different expansions are applied. With the...
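The following sketch illustrates the functional-link idea on a toy 1-D regression rather than the paper's denoising setup: each input is expanded with fixed nonlinear (here trigonometric) functions, and a single layer of weights is trained with a gradient/LMS rule, with no hidden layer. The target function, expansion, and learning rate are illustrative assumptions.

```python
# Sketch: functional-link ANN = fixed nonlinear expansion + single trainable weight layer.
import numpy as np

rng = np.random.default_rng(0)

def expand(x):
    """Trigonometric functional expansion of a scalar input x in [-1, 1]."""
    return np.array([1.0, x, np.sin(np.pi * x), np.cos(np.pi * x),
                     np.sin(2 * np.pi * x), np.cos(2 * np.pi * x)])

# Toy task: learn y = x^2 from noisy samples with a single weight layer.
w = np.zeros(6)
lr = 0.05
for _ in range(2000):
    x = rng.uniform(-1, 1)
    target = x ** 2 + rng.normal(0, 0.05)
    phi = expand(x)
    y_hat = w @ phi                      # single-layer output (no hidden units)
    w += lr * (target - y_hat) * phi     # LMS update on the expanded pattern

xs = np.array([-0.8, 0.0, 0.5])
print([round(float(w @ expand(x)), 3) for x in xs], "targets:", (xs ** 2).round(3))
```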
A glia is a nervous-system cell in the brain. The glia is currently regarded as an important cell for human cognition, because it transmits signals to neurons and to other glia. We take note of these features of the glia and consider applying them to an artificial neural network. In this paper, we propose a multilayer perceptron (MLP) with a pulse glial chain. The pulse glial chain is inspired fro...
In this paper, we present a new method for epilepsy seizure detection based on autoregressive modelling. The method, termed linear prediction coding (LPC), is used to model ictal and seizure-free EEG signals. It is found that the modeling error energy is substantially higher for ictal EEG signals compared to seizure-free EEG signals. Moreover, it is known that ictal EEG signals have higher ener...
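A minimal sketch of the autoregressive step described above follows: fit an order-p AR model to a signal by least squares and report the prediction error energy of the residual. The two synthetic signals stand in for seizure-free and ictal EEG only for illustration; they are not real data, and the model order and sampling setup are assumptions.

```python
# Sketch: order-p autoregressive (LPC-style) fit by least squares + residual error energy.
import numpy as np

rng = np.random.default_rng(0)

def ar_error_energy(signal, p=6):
    """Fit x[n] ~ sum_k a_k * x[n-k] by least squares; return the residual energy."""
    X = np.column_stack([signal[p - k - 1: len(signal) - k - 1] for k in range(p)])
    target = signal[p:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    residual = target - X @ coeffs
    return float(np.sum(residual ** 2))

n = np.arange(2000)
background = np.sin(2 * np.pi * 10 * n / 256) + 0.1 * rng.normal(size=n.size)   # stand-in for seizure-free EEG
ictal_like = 3.0 * np.sin(2 * np.pi * 4 * n / 256) + 1.0 * rng.normal(size=n.size)  # stand-in for ictal EEG

print("background error energy:", round(ar_error_energy(background), 1))
print("ictal-like error energy:", round(ar_error_energy(ictal_like), 1))
```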