Implementing Gaussian process inference with neural networks.

Authors

  • Marcus Frean
  • Matt Lilley
  • Phillip Boyle
Abstract

Gaussian processes compare favourably with backpropagation neural networks as a tool for regression, and Bayesian neural networks have Gaussian process behaviour when the number of hidden neurons tends to infinity. We describe a simple recurrent neural network with connection weights trained by one-shot Hebbian learning. This network amounts to a dynamical system which relaxes to a stable state in which it generates predictions identical to those of Gaussian process regression. In effect an infinite number of hidden units in a feed-forward architecture can be replaced by a merely finite number, together with recurrent connections.
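The abstract's central claim — that a recurrent network with one-shot weights can relax to the Gaussian process prediction — can be illustrated with a small numerical sketch. The idea, under stated assumptions: fix the recurrent weight matrix once from the training data (the kernel Gram matrix plus a noise term), then let the network state evolve under linear relaxation dynamics whose fixed point is the GP regression coefficient vector. The function names and the specific RBF kernel below are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    """Squared-exponential covariance between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def relax_to_gp(K, y, noise=1e-2, lr=0.1, steps=20000):
    """Relax the dynamics d(alpha)/dt = y - (K + noise*I) @ alpha.

    The weight matrix (K + noise*I) is set once from the training data
    (a one-shot, Hebbian-style assignment); the fixed point of the
    dynamics is alpha = (K + noise*I)^{-1} y, the GP regression solution.
    """
    A = K + noise * np.eye(len(y))
    alpha = np.zeros_like(y)
    for _ in range(steps):
        alpha += lr * (y - A @ alpha)  # discretized relaxation step
    return alpha

# Toy training set.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2.0 * np.pi * x)

K = rbf_kernel(x, x)
alpha = relax_to_gp(K, y)

# GP posterior mean at a test input, via the relaxed network state
# and via the exact matrix solve -- the two agree at convergence.
x_star = np.array([0.37])
k_star = rbf_kernel(x_star, x)
mean_relax = float(k_star @ alpha)
mean_exact = float(k_star @ np.linalg.solve(K + 1e-2 * np.eye(8), y))
print(abs(mean_relax - mean_exact))
```

The relaxation converges because the weight matrix is positive definite, so the linear dynamics contract toward the unique fixed point; this mirrors the paper's point that a finite recurrent system can stand in for the infinite-width feed-forward limit.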


Related articles

Efficient Implementation of Gaussian Processes

Neural networks and Bayesian inference provide a useful framework within which to solve regression problems. However their parameterization means that the Bayesian analysis of neural networks can be difficult. In this paper, we investigate a method for regression using Gaussian process priors which allows exact Bayesian analysis using matrix manipulations. We discuss the workings of the method in...


Forecasting Industrial Production in Iran: A Comparative Study of Artificial Neural Networks and Adaptive Neuro-Fuzzy Inference System

Forecasting industrial production is essential for efficient planning by managers. Although there are many statistical and mathematical methods for prediction, the use of intelligent algorithms with desirable features has made significant progress in recent years. The current study compared the accuracy of the Artificial Neural Networks (ANN) and Adaptive Neuro-Fuzzy Inference System (ANFIS) app...


Accelerating Deep Gaussian Processes Inference with Arc-Cosine Kernels

Deep Gaussian Processes (DGPs) are probabilistic deep models obtained by stacking multiple layers implemented through Gaussian Processes (GPs). Although attractive from a theoretical point of view, learning DGPs poses some significant computational challenges that arguably hinder their application to a wider variety of problems for which Deep Neural Networks (DNNs) are the preferred choice. We ...


Bayesian Convolutional Neural Networks with Bernoulli Approximate Variational Inference

We present an efficient Bayesian convolutional neural network (convnet). The model offers better robustness to over-fitting on small data than traditional approaches. This is achieved by placing a probability distribution over the convnet's kernels (also known as filters). We approximate the model's intractable posterior with Bernoulli variational distributions. This requires no additional model paramet...


Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection

In this study, two novel learning algorithms have been applied to the Radial Basis Function Neural Network (RBFNN) to approximate functions of high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns various strategies to optimize the procedure of Gradient ...




Journal:
  • International journal of neural systems

Volume 16, Issue 5

Pages: -

Publication date: 2006