Search results for: فراشبیه mlp
Number of results: 5,049
We apply multilayer perceptron (MLP) based hierarchical Tandem features to large vocabulary continuous speech recognition in Mandarin. Hierarchical Tandem features are estimated using a cascade of two MLP classifiers which are trained independently. The first classifier is trained on perceptual linear predictive coefficients with a 90 ms temporal context. The second classifier is trained using ...
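A minimal sketch of the two-stage cascade described above, using synthetic frame features in place of real PLP coefficients; the MLP sizes, the frame counts, and the 9-frame context window standing in for the 90 ms context are assumptions, not the paper's configuration.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_frames, n_plp, n_phones = 2000, 13, 10
plp = rng.normal(size=(n_frames, n_plp))           # stand-in for PLP frames
labels = rng.integers(0, n_phones, size=n_frames)  # stand-in phone labels

# Stack +/-4 neighbouring frames to mimic a wide temporal context.
ctx = 4
padded = np.pad(plp, ((ctx, ctx), (0, 0)), mode="edge")
context_feats = np.hstack([padded[i:i + n_frames] for i in range(2 * ctx + 1)])

# First classifier: maps contextual PLP-like features to phone posteriors.
mlp1 = MLPClassifier(hidden_layer_sizes=(100,), max_iter=300, random_state=0)
mlp1.fit(context_feats, labels)
posteriors = mlp1.predict_proba(context_feats)

# Second classifier: trained independently on the first stage's posteriors;
# its (log) outputs would serve as the hierarchical Tandem features.
mlp2 = MLPClassifier(hidden_layer_sizes=(100,), max_iter=300, random_state=0)
mlp2.fit(posteriors, labels)
tandem_features = np.log(mlp2.predict_proba(posteriors) + 1e-10)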
We consider simple cube-curves in the orthogonal 3D grid. The union of all cells contained in such a curve (also called the tube of this curve) is a polyhedrally bounded set. The curve's length is defined to be that of the minimum-length polygonal curve (MLP) fully contained and complete in the tube of the curve. So far, no provable general algorithm is known for the approximate calculation of...
A pruning schema is applied to a Multi-Layer Perceptron (MLP) gender classifier. The MLP uses eigenvector coefficients of the face space created by Principal Component Analysis (PCA). We show that pruning improves the initial MLP performance by preserving the most effective inputs while eliminating most of the units and connections. Pruning is also used as a tool to monitor which eigenvectors contribute...
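A minimal sketch of the PCA-plus-MLP pipeline, with magnitude-based weight pruning standing in for the paper's pruning schema; the synthetic face data, the 20-component PCA, and the pruning threshold are assumptions for illustration only.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
faces = rng.normal(size=(400, 32 * 32))   # stand-in for face images
gender = rng.integers(0, 2, size=400)     # stand-in gender labels

coeffs = PCA(n_components=20).fit_transform(faces)  # eigenvector coefficients
X_tr, X_te, y_tr, y_te = train_test_split(coeffs, gender, random_state=1)

mlp = MLPClassifier(hidden_layer_sizes=(15,), max_iter=500, random_state=1)
mlp.fit(X_tr, y_tr)
print("accuracy before pruning:", mlp.score(X_te, y_te))

# Prune: zero out the smallest-magnitude input-to-hidden weights, which also
# shows which PCA coefficients the network can do without.
W = mlp.coefs_[0]
W[np.abs(W) < np.quantile(np.abs(W), 0.5)] = 0.0
print("accuracy after pruning: ", mlp.score(X_te, y_te))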
In tandem systems, the outputs of multi-layer perceptron (MLP) classifiers have been successfully used as features for HMM-based automatic speech recognition. In this paper, we propose a data-driven clustered hierarchical tandem system that yields improved performance on a large-vocabulary broadcast news transcription task. The complicated global learning for a large monolithic MLP classifier i...
In this paper, the application of neural networks to the design of short-term load forecasting (STLF) systems for Illam province, located in the west of Iran, was explored. One important neural network architecture, the Multi-Layer Perceptron (MLP), was used to model the STLF systems. The MLP was trained and tested using three years (2004-2006) of data. The results show that MLP networ...
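A minimal sketch of an MLP-based short-term load forecaster: hourly loads are turned into lagged input windows with a next-hour target, trained on the first two years and tested on the third. The synthetic load series and the 24-hour window are assumptions; the paper's actual inputs (e.g. weather or calendar variables) are not modelled.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
hours = np.arange(3 * 365 * 24)   # three years of hourly data, 2004-2006
load = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

window = 24
X = np.array([load[i:i + window] for i in range(load.size - window)])
y = load[window:]

split = 2 * 365 * 24   # train on the first two years, test on the third
model = MLPRegressor(hidden_layer_sizes=(50,), max_iter=500, random_state=2)
model.fit(X[:split], y[:split])
print("test R^2:", model.score(X[split:], y[split:]))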
Multi-layer perceptrons (MLP) make powerful classifiers that may provide superior performance compared with other classifiers, but are often criticized for the number of free parameters. Most commonly, parameters are set with the help of either a validation set or cross-validation techniques, but there is no guarantee that a pseudo-test set is representative. Further difficulties with MLPs inclu...
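A minimal sketch of setting the free parameters of an MLP with cross-validation, as the snippet discusses; the synthetic data and the candidate hidden-layer sizes and regularisation strengths are illustrative assumptions.

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=3)

# Each candidate setting is scored by 5-fold cross-validation rather than a
# single pseudo-test set.
grid = {"hidden_layer_sizes": [(10,), (30,), (50,)], "alpha": [1e-4, 1e-2]}
search = GridSearchCV(MLPClassifier(max_iter=500, random_state=3), grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)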
This article presents some results obtained in face recognition using Multi-Layer Perceptron (MLP) neural networks for classification. Two designs are studied: a single-network model and a multi-network model. The input images are resized and converted to a vector of pixels before they are applied to the input of the MLP network. The backpropagation algorithm is used to train the MLP netwo...
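A minimal sketch of the single-network design: images are resized, flattened to pixel vectors, and fed to one MLP trained by backpropagation. The synthetic images, the 24x24 size, and the class count stand in for the face database used in the article.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
n_people, per_person, side = 10, 20, 24
images = rng.random(size=(n_people * per_person, side, side))  # resized faces
labels = np.repeat(np.arange(n_people), per_person)

pixels = images.reshape(len(images), -1)   # convert each image to a pixel vector
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=400, random_state=4)
mlp.fit(pixels, labels)                    # gradient/backpropagation training
print("training accuracy:", mlp.score(pixels, labels))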
This paper proposes the use of new target vectors for MLP learning in EEG signal classification. The large Euclidean distance provided by orthogonal bipolar vectors, used as the new targets, is exploited to improve the learning and generalization abilities of MLPs. The data set consisted of EEG signals captured from normal individuals and from individuals under the brain-death protocol. Experimental results are r...
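A minimal sketch of training an MLP against orthogonal bipolar target vectors (rows of a Hadamard matrix) instead of conventional one-hot targets, classifying by the nearest target. The synthetic "EEG" features and the target length of 8 are assumptions, not the paper's setup.

import numpy as np
from scipy.linalg import hadamard
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 16))     # stand-in for EEG feature vectors
y = rng.integers(0, 2, size=300)   # e.g. normal vs. brain-death protocol

targets = hadamard(8)[:2].astype(float)   # two orthogonal +/-1 target vectors
T = targets[y]                            # bipolar target for each sample

net = MLPRegressor(hidden_layer_sizes=(32,), activation="tanh",
                   max_iter=500, random_state=5)
net.fit(X, T)

pred = net.predict(X) @ targets.T   # similarity of the output to each target
print("training accuracy:", np.mean(pred.argmax(axis=1) == y))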
It is difficult to train a multi-layer perceptron (MLP) when only a few labeled samples are available. However, by pretraining an MLP with the vast amount of unlabeled samples available, we may achieve better generalization performance. Schulz et al. (2012) showed that it is possible to pretrain an MLP in a less greedy way by utilizing the two-layer contractive encodings; however, with a cost...
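A minimal sketch of the pretraining idea: an autoencoder fitted on plentiful unlabeled samples provides a hidden representation on which a small MLP is then trained with the few available labels. A plain reconstruction objective stands in here for the two-layer contractive encodings of Schulz et al.; the data sizes are assumptions.

import numpy as np
from sklearn.neural_network import MLPClassifier, MLPRegressor

rng = np.random.default_rng(6)
X_unlab = rng.normal(size=(5000, 30))   # many unlabeled samples
X_lab = rng.normal(size=(50, 30))       # only a few labeled samples
y_lab = rng.integers(0, 2, size=50)

# Unsupervised stage: train an autoencoder to reconstruct its own input.
ae = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                  max_iter=200, random_state=6)
ae.fit(X_unlab, X_unlab)

def encode(X):
    # Project data through the learned input-to-hidden layer.
    return np.tanh(X @ ae.coefs_[0] + ae.intercepts_[0])

# Supervised stage: fit a small MLP on the pretrained representation.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=6)
clf.fit(encode(X_lab), y_lab)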
A multilayer perceptron (MLP) network, a FIR neural network, and an Elman neural network were compared on four different time-series prediction tasks. The time series include the load in an electric network, fluctuations in a far-infrared laser, a numerically generated series, and the behaviour of sunspots. The FIR neural network was trained with the temporal backpropagation learning algorithm. Results s...
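A minimal sketch of the comparison protocol on a single task: one-step-ahead prediction of a numerically generated series from a sliding input window. Only the MLP is reproduced here, alongside a linear autoregressive baseline; the FIR and Elman networks mentioned in the snippet are not modelled, and the surrogate series is an assumption.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor

# A simple noisy nonlinear recursion stands in for the benchmark series.
rng = np.random.default_rng(7)
s = np.zeros(2000)
s[0] = 0.5
for t in range(1, 2000):
    s[t] = 0.9 * s[t - 1] + 0.2 * np.sin(3 * s[t - 1]) + rng.normal(0, 0.01)

window = 10
X = np.array([s[i:i + window] for i in range(s.size - window)])
y = s[window:]
split = 1500

for name, model in [("MLP", MLPRegressor(hidden_layer_sizes=(20,),
                                          max_iter=500, random_state=7)),
                    ("linear AR", Ridge())]:
    model.fit(X[:split], y[:split])
    print(name, "test R^2:", round(model.score(X[split:], y[split:]), 3))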