Search results for: maximization of entropy

Number of results: 21174033

2008
Qiuping A. Wang

We propose an extension of the principle of virtual work of mechanics to random dynamics of mechanical systems. The total virtual work of the interacting forces and inertial forces on every particle of the system is calculated by considering the motion of each particle. Then according to the principle of Lagrange-d’Alembert for dynamical equilibrium, the vanishing ensemble average of the virtua...
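For orientation, a minimal sketch of the variational statement the abstract builds on; the notation (forces F_i, masses m_i, virtual displacements δr_i, microstate probabilities p_j) is introduced here for illustration and is not taken from the paper.

```latex
% Lagrange--d'Alembert principle for a single (deterministic) trajectory:
\delta W = \sum_i \left( \mathbf{F}_i - m_i \ddot{\mathbf{r}}_i \right) \cdot \delta\mathbf{r}_i = 0 .

% For random dynamics, the condition becomes the vanishing of the ensemble
% average of the virtual work over microstates $j$ with probabilities $p_j$:
\langle \delta W \rangle = \sum_j p_j \, \delta W_j = 0 .
```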

2001
A. Kehagias

A Simulated Annealing method is presented for the solution of nonlinear time series estimation problems, by maximization of the a Posteriori Likelihood function. Homogeneous temperature annealing is proposed for smoothing problems and inhomogeneous temperature annealing for filtering problems. Both methods of annealing guarantee convergence to the Maximum A Posteriori Likelihood (MAP) estimate....
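A minimal, self-contained sketch of simulated annealing applied to maximizing a log-posterior. The quadratic toy posterior, the geometric cooling schedule, and all parameter values are illustrative assumptions, not the annealing schedules proposed in the paper.

```python
import numpy as np

def log_posterior(x):
    # Toy log a posteriori likelihood (assumed for illustration):
    # a smooth unimodal target with its maximum at x = 2.
    return -0.5 * (x - 2.0) ** 2

def simulated_annealing_map(log_post, x0, n_iter=5000, t0=1.0, cooling=0.999, step=0.5, rng=None):
    """Maximize log_post by simulated annealing with a geometric cooling schedule."""
    rng = np.random.default_rng() if rng is None else rng
    x, best = x0, x0
    t = t0
    for _ in range(n_iter):
        cand = x + step * rng.standard_normal()            # random proposal
        delta = log_post(cand) - log_post(x)                # change in the objective
        if delta > 0 or rng.random() < np.exp(delta / t):   # Metropolis acceptance rule
            x = cand
        if log_post(x) > log_post(best):
            best = x
        t *= cooling                                        # lower the temperature
    return best

print(simulated_annealing_map(log_posterior, x0=-5.0))      # should approach 2.0
```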

2008
A. Y. Abul-Magd

We consider a possible generalization of random matrix theory, which involves the maximization of Tsallis’ q-parametrized entropy. We discuss the dependence of the spacing distribution on q using a nonextensive generalization of Wigner’s surmises for ensembles belonging to the orthogonal, unitary and symplectic symmetry universality classes. PACS numbers: 03.65.-w, 05.45.Mt, 05.30.Ch
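For reference, the standard definitions involved; the q-generalized surmises derived in the paper are not reproduced here.

```latex
% Tsallis' q-parametrized entropy (reduces to the Boltzmann-Gibbs form as q -> 1):
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1} .

% Wigner's surmise for the nearest-neighbour spacing distribution of the
% Gaussian orthogonal ensemble (the q -> 1 limit of the orthogonal class):
P(s) = \frac{\pi}{2}\, s\, e^{-\pi s^2 / 4} .
```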

2002
Gustavo Castellano

A strong discrepancy of criteria appears in texts on Statistical Mechanics when associating the partition function of simple magnetic systems with thermodynamic potentials. The aim of this work is to provide an adequate description, starting from the maximization of entropy. Finally, a discussion of some thermodynamic properties is given.
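As a one-line reminder of the construction the abstract starts from: maximizing the Gibbs entropy under normalization and mean-energy constraints yields the canonical distribution, the partition function, and the associated free energy.

```latex
% Maximize  S = -k_B \sum_i p_i \ln p_i
% subject to  \sum_i p_i = 1  and  \sum_i p_i E_i = \langle E \rangle .
% Lagrange multipliers give the canonical (Boltzmann) distribution:
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}, \qquad F = -k_B T \ln Z .
```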

Journal: Iranian Journal of Astronomy and Astrophysics, 2014
Nematollah Riazi, Saeid Davatolhagh, Hooman Moradpour

Using the usual definitions of information and entropy in quantum gravity and statistical mechanics, and the existing views about the relation between information and complexity, we examine the evolution of complexity in an ever-expanding universe.

2000
Jose C. Principe Deniz Erdogmus

Adaptive signal processing theory was born and has lived by exploiting the mean-square-error criterion almost exclusively. When one considers the goal of least squares without the restriction of Gaussianity, it is natural to ask why an information-theoretic error criterion is not used instead. After all, the goal of adaptive filtering should be to find the linear projection that best captures the informat...
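As a rough illustration of swapping the mean-square-error criterion for an entropy-based one, the sketch below trains a linear filter by gradient descent on a kernel (Parzen) estimate of Rényi's quadratic entropy of the error. The kernel width, learning rate, toy data, and the use of a numerical rather than analytic gradient are all assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def error_entropy(e, sigma=1.0):
    """Parzen (kernel) estimate of Renyi's quadratic entropy of the error samples.

    H2 = -log V, where the 'information potential' V is the average Gaussian
    kernel evaluated at all pairwise differences of the error samples."""
    diffs = e[:, None] - e[None, :]
    kernel = np.exp(-diffs**2 / (4.0 * sigma**2)) / (2.0 * sigma * np.sqrt(np.pi))
    return -np.log(kernel.mean())

def train_linear_filter(x, d, criterion, n_epochs=200, lr=0.1, eps=1e-4):
    """Fit the weights of a linear filter y = x @ w by numerical gradient
    descent on an arbitrary scalar error criterion (MSE or error entropy)."""
    w = np.zeros(x.shape[1])
    for _ in range(n_epochs):
        grad = np.zeros_like(w)
        for k in range(w.size):                        # finite-difference gradient
            wp, wm = w.copy(), w.copy()
            wp[k] += eps
            wm[k] -= eps
            grad[k] = (criterion(d - x @ wp) - criterion(d - x @ wm)) / (2 * eps)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
x = rng.standard_normal((200, 3))
d = x @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)   # desired signal

w_mse = train_linear_filter(x, d, lambda e: np.mean(e**2))            # mean-square error
w_mee = train_linear_filter(x, d, error_entropy)                      # minimum error entropy
print(w_mse, w_mee)
```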

2008
Vladislav Kargin

If pricing kernels are assumed non-negative, then the inverse problem of finding the pricing kernel is well-posed. The constrained least-squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent.
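The "relaxed" estimator itself is not spelled out in the abstract. As background, the sketch below computes the classical minimum-relative-entropy state-price vector consistent with a set of observed prices via the convex dual, using scipy. The toy payoffs, prices, uniform reference distribution, and the omission of discounting are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy setup (all values assumed for illustration): J states, K assets with
# known payoffs, observed prices, and a uniform reference distribution p.
rng = np.random.default_rng(1)
J, K = 50, 3
payoffs = rng.uniform(0.5, 1.5, size=(J, K))   # payoff of asset k in state j
p = np.full(J, 1.0 / J)                        # reference state probabilities
q_true = rng.dirichlet(np.ones(J))             # hidden state-price vector
prices = payoffs.T @ q_true                    # synthetic observed prices

def dual(lam):
    """Convex dual of: minimize KL(q || p) subject to payoffs.T @ q = prices."""
    return np.log(np.sum(p * np.exp(payoffs @ lam))) - lam @ prices

res = minimize(dual, np.zeros(K), method="BFGS")
q = p * np.exp(payoffs @ res.x)
q /= q.sum()                                   # minimum-relative-entropy state prices
print("max pricing error:", np.abs(payoffs.T @ q - prices).max())
```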
