Search results for: hidden layer
Number of results: 345,063
Feature Extraction: Features are extracted using a two-layer stacked auto-encoder (SAE) with 1440 input neurons, 500 first-hidden-layer neurons, and 100 second-hidden-layer neurons. Each CQT input patch consists of 8 frames with 180 frequency bins, or one measure of the song, and each chroma feature input consists of 144 frequency bins and 10 frames with 50% overlap. The SAE is then trained on a set of c...
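A minimal sketch of the two-layer stacked auto-encoder described above, with greedy layer-wise pretraining. The layer sizes (1440 → 500 → 100) come from the snippet; the sigmoid activations, MSE reconstruction loss, optimizer, and random stand-in data are illustrative assumptions.

```python
import torch
import torch.nn as nn

def pretrain_layer(data, in_dim, hid_dim, epochs=10, lr=1e-3):
    """Train one autoencoder layer; return the encoder and the encoded data."""
    enc = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Sigmoid())
    dec = nn.Linear(hid_dim, in_dim)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(dec(enc(data)), data)   # reconstruction error
        loss.backward()
        opt.step()
    with torch.no_grad():
        return enc, enc(data)

x = torch.rand(256, 1440)                  # stand-in for flattened CQT patches
enc1, h1 = pretrain_layer(x, 1440, 500)    # first hidden layer: 500 neurons
enc2, h2 = pretrain_layer(h1, 500, 100)    # second hidden layer: 100 neurons
features = h2                              # 100-dim feature per input patch
```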
The aim of this research is to assess the relation between rainfall and large-scale synoptic patterns in Khorasan Razavi province. In this study, rainfall estimation from April to June in the study area has been carried out using an adaptive neuro-fuzzy inference system. Spring rainfall data, comprising information from 38 synoptic, climatological, and rain-gauge stations from 1970 to...
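For readers unfamiliar with ANFIS, a minimal sketch of the forward pass of a first-order Sugeno fuzzy system, the inference scheme ANFIS is built on. The two-input, two-rule setup, Gaussian membership functions, and all parameter values are illustrative assumptions, not the study's actual model.

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership degree of x in a fuzzy set centered at c."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno_forward(x1, x2):
    # Layer 1: membership degrees ("low"/"high" fuzzy sets per input, assumed)
    a_low, a_high = gauss(x1, 0.2, 0.3), gauss(x1, 0.8, 0.3)
    b_low, b_high = gauss(x2, 0.2, 0.3), gauss(x2, 0.8, 0.3)
    # Layer 2: rule firing strengths via the product T-norm (two rules)
    w = np.array([a_low * b_low, a_high * b_high])
    # Layer 3: normalized firing strengths
    wn = w / w.sum()
    # Layer 4: first-order consequents f_i = p_i*x1 + q_i*x2 + r_i
    f = np.array([0.5 * x1 + 0.1 * x2 + 0.2,
                  -0.3 * x1 + 0.9 * x2 + 0.1])
    # Layer 5: weighted average gives the crisp output (e.g., a rainfall estimate)
    return float(wn @ f)

print(sugeno_forward(0.3, 0.7))
```

In ANFIS proper, the membership and consequent parameters above are what training adjusts, typically by a hybrid of least squares and gradient descent.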
An artificial neural network has been used to determine the volume flux and the rejections of Ca2+, Na+, and Cl− as a function of transmembrane pressure and the concentrations of Ca2+, polyethyleneimine, and polyacrylic acid in water softening by a nanofiltration process in the presence of polyelectrolytes. The feed-forward multi-layer perceptron artificial neural network, including an eight-neuron hidde...
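A minimal sketch of the kind of model described above: a feed-forward MLP with an eight-neuron hidden layer mapping operating conditions to flux and rejections. The synthetic data, feature ordering, and hyperparameters are assumptions; only the 4-input / 4-output shape and the 8-neuron hidden layer follow the snippet.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Inputs: transmembrane pressure, [Ca2+], [PEI], [PAA]   (synthetic stand-ins)
X = rng.uniform(0.0, 1.0, size=(200, 4))
# Targets: volume flux and rejections of Ca2+, Na+, Cl-  (synthetic stand-ins)
Y = np.column_stack([X @ [0.6, -0.2, 0.1, 0.1],
                     X @ [0.1, 0.5, 0.3, 0.2],
                     X @ [0.2, 0.1, 0.4, 0.1],
                     X @ [0.3, 0.2, 0.1, 0.4]]) + 0.05 * rng.normal(size=(200, 4))

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, Y)
print(model.predict(X[:3]))    # predicted flux and three rejections
```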
We propose a novel architecture of recurrent neural networks (RNNs) for causal prediction, which we call Entangled RNN (E-RNN). To issue causal predictions, E-RNN propagates the backward hidden states of a Bi-RNN through an additional forward hidden layer. Unlike a 2-layer RNN, all the hidden states of E-RNN depend on all the inputs seen so far. Furthermore, unlike a Bi-RNN, for causal predict...
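Since the snippet builds on a bidirectional RNN, here is a minimal numpy sketch of a plain Bi-RNN's forward and backward hidden states with vanilla tanh cells. It illustrates the states the E-RNN entangles, not the E-RNN itself, whose exact wiring is not given in the truncated snippet; all sizes and weights are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 6, 3, 4                   # sequence length, input dim, hidden dim
x = rng.normal(size=(T, d_in))
Wf, Uf = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h))
Wb, Ub = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h))

h_fwd = np.zeros((T, d_h))               # forward states: depend on x[0..t]
h = np.zeros(d_h)
for t in range(T):
    h = np.tanh(Wf @ x[t] + Uf @ h)
    h_fwd[t] = h

h_bwd = np.zeros((T, d_h))               # backward states: depend on x[t..T-1]
h = np.zeros(d_h)
for t in reversed(range(T)):
    h = np.tanh(Wb @ x[t] + Ub @ h)
    h_bwd[t] = h

# A Bi-RNN concatenates the two streams; per the snippet, E-RNN instead feeds
# the backward states through an extra forward layer so predictions stay causal.
bi_states = np.concatenate([h_fwd, h_bwd], axis=1)
print(bi_states.shape)                   # (6, 8)
```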
In this paper, a novel single-hidden-layer feed-forward quantum neural network model is proposed based on concepts and principles of quantum theory. By combining the quantum mechanism with the feed-forward neural network, we define quantum hidden neurons and connected quantum weights and use them as the fundamental information-processing units in a single-hidden-layer feed-forward ne...
Numerous neural network architectures have been developed over the past several years. One of the most popular and most powerful is the multilayer perceptron. This architecture is described in detail, and recent advances in training the multilayer perceptron are presented. Multilayer perceptrons are trained using various techniques; for years the most widely used training metho...
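A minimal sketch of the classic training method the passage alludes to: gradient-descent backpropagation on a small multilayer perceptron. The XOR task, sigmoid activations, and hyperparameters are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)          # 2 -> 4 hidden units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)          # 4 -> 1 output unit
lr = 1.0

for _ in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # backward pass (squared-error loss, chain rule through the sigmoids)
    dp = (p - y) * p * (1 - p)
    dh = (dp @ W2.T) * h * (1 - h)
    # gradient-descent updates
    W2 -= lr * h.T @ dp;  b2 -= lr * dp.sum(0)
    W1 -= lr * X.T @ dh;  b1 -= lr * dh.sum(0)

print(np.round(p.ravel(), 2))   # should approach [0, 1, 1, 0]
```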
This work explores neural features that are trained by minimizing a discriminative energy. It directly resolves the unfaithful-representation and ambiguous-internal-representation problems of various backpropagation training algorithms for MLPs, and it indirectly overcomes the premature-saturation problem. Keywords: multilayer perceptron; deep learning; Boltzmann machine; ambiguou...
Extreme Learning Machine (ELM) is a neural network architecture in which the hidden-layer weights are chosen randomly and the output-layer weights are determined analytically. We interpret ELM as an approximation to a network with an infinite number of hidden units. The operation of the infinite network is captured by the neural network kernel (NNK). We compare ELM and NNK both as part of a kernel method and in n...
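A minimal numpy sketch of the ELM recipe stated above: random, untrained hidden weights, with only the output weights solved analytically by least squares. The toy regression task, tanh activation, and hidden-layer size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=300)   # toy target

n_hidden = 50
W = rng.normal(size=(1, n_hidden))        # random hidden weights, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                    # hidden-layer activations
beta = np.linalg.pinv(H) @ y              # analytic output weights (least squares)

y_hat = H @ beta
print("train MSE:", np.mean((y_hat - y) ** 2))
```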
It is well known that standard single-hidden layer feedforward networks (SLFNs) with at most N hidden neurons (including biases) can learn N distinct samples (x(i),t(i)) with zero error, and the weights connecting the input neurons and the hidden neurons can be chosen "almost" arbitrarily. However, these results have been obtained for the case when the activation function for the hidden neurons...
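A small numerical check of the interpolation property stated above: with N hidden neurons, (almost) arbitrary random input weights, and an invertible hidden-layer matrix H, the output weights solving H β = t fit N distinct samples with zero error. The sizes and the tanh activation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, d = 20, 3
X = rng.normal(size=(N, d))               # N distinct samples x_i
t = rng.normal(size=N)                    # arbitrary targets t_i

W = rng.normal(size=(d, N))               # random input-to-hidden weights
b = rng.normal(size=N)
H = np.tanh(X @ W + b)                    # N x N hidden-output matrix
beta = np.linalg.solve(H, t)              # exact output weights

print("max |error|:", np.max(np.abs(H @ beta - t)))   # ~ machine precision
```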
We present a mathematical construction for the restricted Boltzmann machine (RBM) that does not require specifying the number of hidden units. In fact, the hidden layer size is adaptive and can grow during training. This is obtained by first extending the RBM to be sensitive to the ordering of its hidden units. Then, with a carefully chosen definition of the energy function, we show that the li...
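For reference, a minimal numpy sketch of the standard Bernoulli RBM trained with one step of contrastive divergence (CD-1), i.e. the base model the passage extends; the adaptive, order-sensitive hidden-layer construction itself is not reproduced here. All sizes, data, and hyperparameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

n_vis, n_hid, lr = 6, 4, 0.1
W = 0.01 * rng.normal(size=(n_vis, n_hid))
a, b = np.zeros(n_vis), np.zeros(n_hid)               # visible / hidden biases

V = rng.integers(0, 2, size=(64, n_vis)).astype(float)  # toy binary data

for _ in range(200):
    # positive phase: sample hidden units given the data
    ph = sigmoid(V @ W + b)
    h = (rng.random(ph.shape) < ph).astype(float)
    # negative phase: one Gibbs step back to visibles and hiddens
    pv = sigmoid(h @ W.T + a)
    v = (rng.random(pv.shape) < pv).astype(float)
    ph2 = sigmoid(v @ W + b)
    # CD-1 parameter updates
    W += lr * (V.T @ ph - v.T @ ph2) / len(V)
    a += lr * (V - v).mean(0)
    b += lr * (ph - ph2).mean(0)

print("weight norm after training:", np.linalg.norm(W))
```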