Linear programming, recurrent associative memories, and feed-forward neural networks
Authors
Abstract
Similar resources
Solving Linear Semi-Infinite Programming Problems Using Recurrent Neural Networks
The linear semi-infinite programming problem is an important class of optimization problems that deals with infinitely many constraints. In this paper, to solve this problem, we combine a discretization method and a neural network method. By a simple discretization of the infinite constraints, we convert the linear semi-infinite programming problem into a linear programming problem. Then, we use...
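As a rough illustration of the discretization step this snippet describes (not the paper's own algorithm), the sketch below samples the infinite index set of a toy linear semi-infinite program on a uniform grid and solves the resulting ordinary LP; the problem data, grid size, and the use of scipy.optimize.linprog as the finite-LP solver are assumptions made here for illustration.

# Toy semi-infinite LP: minimize x1 + x2
# subject to x1 + t*x2 >= sin(t) for all t in [0, pi]  (infinitely many constraints).
# Discretize t on a grid, then solve the resulting finite LP.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 1.0])               # objective: minimize x1 + x2
t_grid = np.linspace(0.0, np.pi, 200)  # discretization of the infinite index set

# x1 + t*x2 >= sin(t)  <=>  -x1 - t*x2 <= -sin(t)   (linprog expects A_ub @ x <= b_ub)
A_ub = np.column_stack([-np.ones_like(t_grid), -t_grid])
b_ub = -np.sin(t_grid)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None), (None, None)])
print(res.x, res.fun)

Refining the grid tightens the approximation of the infinite constraint set; the paper's contribution, per the snippet, is to hand the discretized problem to a recurrent neural network rather than a classical LP solver.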
A greenhouse control with feed-forward and recurrent neural networks
Greenhouses are classified as complex systems, so it is difficult to implement classical control methods for this kind of process. In our case we have chosen neural network techniques to drive the internal climate of a greenhouse. An Elman neural network has been used to emulate the direct dynamics of the greenhouse. Based on this model, a multilayer feedforward neural network has been trained ...
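For readers unfamiliar with the Elman architecture this snippet relies on, here is a minimal numpy sketch of a forward pass through an Elman (simple recurrent) network; the layer sizes, tanh nonlinearity, and random weights are placeholder assumptions, and the training and greenhouse-emulation steps described above are not shown.

# Minimal Elman (simple recurrent) network forward pass -- illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 8, 1                        # assumed layer sizes

W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))    # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))   # context (previous hidden) -> hidden
W_hy = rng.normal(scale=0.1, size=(n_out, n_hid))   # hidden -> output

def elman_forward(inputs):
    """Run a sequence of input vectors through the network and return the outputs."""
    h = np.zeros(n_hid)                             # context units start at zero
    outputs = []
    for x in inputs:
        h = np.tanh(W_xh @ x + W_hh @ h)            # hidden state fed back via context
        outputs.append(W_hy @ h)
    return np.array(outputs)

seq = rng.normal(size=(10, n_in))                   # dummy 10-step input sequence
print(elman_forward(seq).shape)                     # -> (10, 1)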
Feed-Forward and Recurrent Neural Networks in Signal Prediction
The paper is devoted to time series prediction using linear, perceptron, and Elman neural networks with the proposed pattern structure. Signal wavelet de-noising in the initial stage is discussed as well. The main part of the paper is devoted to a comparison of different models of time series prediction. The proposed algorithm is applied to a real signal representing gas consumption.
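To make the two stages mentioned in this snippet concrete, the following sketch applies soft-threshold wavelet de-noising and then fits a simple linear (autoregressive) one-step predictor; the wavelet family, threshold rule, AR order, and synthetic signal are assumptions chosen for illustration, not taken from the paper.

# Sketch: wavelet de-noising followed by a linear AR one-step predictor.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 512)
signal = np.sin(t) + 0.3 * rng.normal(size=t.size)    # synthetic noisy series

# Stage 1: wavelet de-noising via soft thresholding of the detail coefficients.
coeffs = pywt.wavedec(signal, 'db4', level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(signal.size))        # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, 'db4')[:signal.size]

# Stage 2: fit a linear AR(p) model by least squares and predict one step ahead.
p = 8
X = np.column_stack([denoised[i:len(denoised) - p + i] for i in range(p)])
y = denoised[p:]
w, *_ = np.linalg.lstsq(X, y, rcond=None)
next_value = denoised[-p:] @ w
print(next_value)

The perceptron and Elman predictors compared in the paper would replace the least-squares AR fit in the second stage.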
Feed-Forward Chains of Recurrent Attractor Neural Networks Near Saturation
We perform a stationary-state replica analysis for a layered network of Ising spin neurons, with recurrent Hebbian interactions within each layer, in combination with strictly feed-forward Hebbian interactions between successive layers. This model interpolates between the fully recurrent and symmetric attractor network studied by Amit et al., and the strictly feed-forward attractor network studied ...
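As a rough guide to the couplings this snippet refers to (a generic Hebbian prescription assumed here, not necessarily the exact normalisation used in that paper), storing p patterns \xi^{\mu,\ell} in each layer \ell of N neurons gives

  J_{ij}^{(\ell)} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu,\ell} \, \xi_j^{\mu,\ell}        (recurrent, within layer \ell),
  W_{ij}^{(\ell)} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu,\ell+1} \, \xi_j^{\mu,\ell}      (feed-forward, layer \ell to \ell+1),

and "near saturation" refers to the regime p = \alpha N with \alpha > 0 fixed as N \to \infty.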
Journal
Journal title: Computers & Mathematics with Applications
Year: 1991
ISSN: 0898-1221
DOI: 10.1016/0898-1221(91)90036-4