Search results for: mutual information theory mi

Number of results: 1,876,105

2015
Øivind Strand, Loet Leydesdorff

Article history: Received 16 September 2011 Received in revised form 3 August 2012 Accepted 19 August 2012 Available online 5 September 2012 Using information theory and data for all (0.5 million) Norwegian firms, the national and regional innovation systems are decomposed into three subdynamics: (i) economic wealth generation, (ii) technological novelty production, and (iii) government interve...

Jamshid Salehi Sadaghiani, Ramin Jabbari, Maghsoud Amiri

Abstract The present study aims at determining a proper decision-making model for investment. In this regard, the effective criteria for evaluating the performance of mutual funds are extracted through a review of the research literature. Afterwards, the importance of each criterion (Sharpe, Treynor, Jensen, Sortino) is assessed using the Shannon entropy. The study sample includes eight ...
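The entropy-weighting step described above can be sketched as follows. The fund scores below are purely illustrative (the abstract does not report the actual data); the method itself is the standard Shannon-entropy weighting used in multi-criteria decision making: criteria whose scores vary more across alternatives (lower entropy) receive higher weight.

```python
import numpy as np

# Hypothetical scores of three funds (rows) on four performance criteria
# (columns: Sharpe, Treynor, Jensen, Sortino) -- illustrative numbers only.
scores = np.array([
    [0.8, 0.6, 0.4, 0.7],
    [0.5, 0.9, 0.3, 0.6],
    [0.7, 0.4, 0.8, 0.5],
])

# Normalize each column so it can be read as a probability distribution.
p = scores / scores.sum(axis=0)

# Shannon entropy of each criterion, scaled by 1/ln(m) so it lies in [0, 1],
# where m is the number of alternatives (funds).
m = scores.shape[0]
entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)

# Criteria with lower entropy discriminate more between funds,
# so each weight is proportional to (1 - entropy).
weights = (1 - entropy) / (1 - entropy).sum()
print(weights.round(3))
```

The weights sum to one and can then be combined with the per-criterion scores to rank the funds.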

Journal: :Journal of Machine Learning Research 2014
Zoltán Szabó

Since the pioneering work of Shannon, entropy, mutual information, association, divergence measures and kernels on distributions have found a broad range of applications in many areas of machine learning. Entropies provide a natural notion to quantify the uncertainty of random variables, mutual information and association indices measure the dependence among their arguments, divergences and kerne...
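For discrete random variables, the mutual information mentioned above has a direct plug-in form, MI(X;Y) = Σ p(x,y) log[p(x,y)/(p(x)p(y))]. A minimal sketch (not taken from the toolbox this abstract describes):

```python
import numpy as np

def mutual_information(joint):
    """MI (in nats) of a discrete joint probability table."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)  # marginal of X (rows)
    py = joint.sum(axis=0, keepdims=True)  # marginal of Y (columns)
    nz = joint > 0                          # skip zero cells (0 log 0 = 0)
    return float((joint[nz] * np.log(joint[nz] / (px * py)[nz])).sum())

# Independent variables: the joint factorizes, so MI is zero.
indep = np.outer([0.5, 0.5], [0.3, 0.7])
# Perfectly dependent variables: MI equals the marginal entropy ln 2.
dep = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(indep))
print(mutual_information(dep))
```

The first call returns 0 and the second returns ln 2, the two extremes the dependence measures in the abstract interpolate between.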

Journal: :Inf. Sci. 2014
Miguel Ángel Gómez-Villegas, Paloma Main, Paola Viviani

We introduce a methodology for sensitivity analysis of evidence variables in Gaussian Bayesian networks. Knowledge of the posterior probability distribution of the target variable in a Bayesian network, given a set of evidence, is desirable. However, this evidence is not always determined; in fact, additional information might be requested to improve the solution in terms of reducing uncertaint...

2005
Manu Bansal, Indranil Sarkar

In this work, we propose a wavelet-based hierarchical approach using mutual information (MI) to solve the correspondence problem in stereo vision. The correspondence problem involves identifying corresponding pixels between images of a given stereo pair. This results in a disparity map which is required to extract depth information of the relevant scene. Until recently, mostly correlation-based...
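Using MI as a stereo-matching score, as the abstract describes, amounts to comparing the joint intensity histogram of two candidate patches: corresponding patches have a concentrated joint histogram and thus high MI. A minimal sketch with synthetic patches (the wavelet hierarchy of the actual method is omitted):

```python
import numpy as np

def patch_mi(a, b, bins=8):
    """MI between intensity values of two equally sized patches,
    estimated from their joint histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    joint = hist / hist.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px * py)[nz])).sum())

rng = np.random.default_rng(0)
left = rng.random((16, 16))
right = left + 0.01 * rng.random((16, 16))  # near-identical patch (a match)
other = rng.random((16, 16))                # unrelated patch (a mismatch)
print(patch_mi(left, right))  # high: concentrated joint histogram
print(patch_mi(left, other))  # low: nearly independent intensities
```

Scanning candidate offsets and keeping the one with the highest MI yields the disparity at each pixel, from which depth is recovered.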

2004
Kostadin Koroutchev, David R. C. Dominguez, Eduardo Serrano, Francisco de Borja Rodríguez Ortiz

Following Gardner [1], we calculate the information capacity and other phase-transition-related parameters for a symmetric Hebb network with small-world topology in the mean-field approximation. It was found that the topology dependence can be described by a very small number of parameters, namely the probability of existence of loops of a given length. In the case of small-world topology, closed alge...

Journal: :Neural computation 2005
Qing Song

We focus on the scenario of robust information clustering (RIC) based on the minimax optimization of mutual information (MI). The minimization of MI leads to the standard mass-constrained deterministic annealing clustering, which is an empirical risk-minimization algorithm. The maximization of MI works out an upper bound of the empirical risk via the identification of outliers (noisy data point...

1997
Antonio Turiel, Elka Korutcheva, Néstor Parga

We calculate the mutual information (MI) of a two-layered neural network with noiseless, continuous inputs and binary, stochastic outputs under several assumptions on the synaptic efficiencies. The interesting regime corresponds to the limit where the number of both input and output units is large but their ratio is kept fixed at a value α. We first present a solution for the MI using the repli...

Journal: :Entropy 2017
Tarald O. Kvålseth

Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and t...
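One of the least upper bounds discussed in such normalizations is min(H(X), H(Y)), since MI(X;Y) can never exceed either marginal entropy; dividing by it gives an NMI in [0, 1]. A minimal sketch of this particular normalization (not the paper's full family of measures):

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def normalized_mi(joint):
    """MI normalized by min(H(X), H(Y)), one common least upper bound."""
    joint = joint / joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    mi = float((joint[nz] * np.log(joint[nz] / np.outer(px, py)[nz])).sum())
    return mi / min(entropy(px), entropy(py))

# Perfectly dependent variables reach the upper bound, so NMI = 1.
dep = np.array([[0.5, 0.0], [0.0, 0.5]])
print(normalized_mi(dep))
```

Choosing the *least* upper bound matters because looser bounds such as H(X) + H(Y) would keep the normalized measure from ever reaching 1, even under perfect dependence.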
