Training and making calculations with mixed order hyper-networks

Authors

  • Kevin Swingler
  • Leslie S. Smith
Abstract

A neural network with mixed order weights, n neurons and a modified Hebbian learning rule can learn any function f : {−1, 1}n → R and reproduce its output as the network’s energy function. The network weights are equal to Walsh coefficients, the fixed point attractors are local maxima in the function, and partial sums across the weights of the network calculate averages for hyperplanes through the function. If the network is trained on data sampled from a distribution, then marginal and conditional probability calculations may be made and samples from the distribution generated from the network. These qualities make the network ideal for optimisation fitness function modelling and make the relationships amongst variables explicit in a way that architectures such as the MLP do not.
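The correspondence between the trained weights and Walsh coefficients can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the target function `f`, the size `n = 3`, and the exhaustive batch average over all inputs are assumptions chosen so that each weight comes out exactly equal to the corresponding Walsh coefficient and the energy reproduces `f`.

```python
from itertools import product, combinations
from math import prod

n = 3

def f(x):
    # Example target f : {-1,1}^n -> R, chosen for illustration only.
    return x[0] + 2 * x[0] * x[1] - x[1] * x[2]

inputs = list(product([-1, 1], repeat=n))
subsets = [s for r in range(n + 1) for s in combinations(range(n), r)]

# One weight per subset S of neurons. Averaging f(x) times the product of the
# inputs indexed by S over all inputs (a batch form of the Hebbian rule)
# yields exactly the Walsh coefficient w_S.
weights = {s: sum(f(x) * prod(x[i] for i in s) for x in inputs) / len(inputs)
           for s in subsets}

def energy(x):
    # The network's energy: the weighted sum of all subset products.
    return sum(w * prod(x[i] for i in s) for s, w in weights.items())

# By orthogonality of the Walsh basis, the energy reproduces f exactly.
assert all(abs(energy(x) - f(x)) < 1e-9 for x in inputs)
```

For this `f`, only the weights on the subsets {0}, {0,1}, and {1,2} are nonzero (1, 2, and -1 respectively), which is what makes the variable interactions readable directly from the network, as the abstract notes.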




Journal:
  • Neurocomputing

Volume: 141  Issue: 

Pages: -

Publication year: 2014