Beyond Gibbs-Boltzmann-Shannon: general entropies—the Gibbs-Lorentzian example

Authors

  • Rudolf A. Treumann
  • Wolfgang Baumjohann
Abstract

*Correspondence: Rudolf A. Treumann, International Space Science Institute, Hallerstrasse 6, CH-3012 Bern, Switzerland; e-mail: [email protected]

We propose a generalization of Gibbs' statistical mechanics into the domain of non-negligible phase-space correlations. The probability distribution and the entropy are derived as generalized ensemble averages, replacing the Gibbs-Boltzmann-Shannon definition of entropy and enabling the construction of new forms of statistical mechanics. The general entropy may also be of importance in information theory and data analysis. Application to generalized Lorentzian phase-space elements yields the Gibbs-Lorentzian power-law probability distribution and the corresponding statistical mechanics. The associated Boltzmann, Fermi, and Bose-Einstein distributions are found; they apply only to finite-temperature states that include correlations. As a by-product, negative absolute temperatures are categorically excluded, supporting a recent "no-negative T" claim.
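For orientation, the classical Gibbs-Boltzmann-Shannon entropy is the ensemble average of -ln p taken with the exponential Gibbs weight, while the Gibbs-Lorentzian case replaces that weight with a kappa-type power law. The sketch below is only a representative form assumed for illustration; the symbols \epsilon_i (state energy), T (temperature), \kappa (correlation parameter), and the constant r in the exponent are not specified in this abstract, and the paper's exact normalization and exponent may differ.

    S = -\langle \ln p \rangle = -\sum_i p_i \ln p_i , \qquad p_i \propto e^{-\epsilon_i/T} \quad \text{(Gibbs-Boltzmann-Shannon)}

    p_i \propto \left[ 1 + \frac{\epsilon_i}{\kappa T} \right]^{-(\kappa + r)} \;\longrightarrow\; e^{-\epsilon_i/T} \quad (\kappa \to \infty) \quad \text{(assumed Gibbs-Lorentzian form)}

In the limit \kappa \to \infty the power law reverts to the exponential Gibbs weight, so the correlation-free case is recovered.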

Related articles

Unique Additive Information Measures: Boltzmann-Gibbs-Shannon, Fisher and Beyond

It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Further possibilities are investigated, too.
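For concreteness, the two measures named in this result take the familiar forms below, written for a one-dimensional probability density p(x) with constants omitted; the continuous notation is an assumption of this sketch.

    S_{\mathrm{BGS}}[p] = -\int p(x) \ln p(x)\, \mathrm{d}x , \qquad I_{\mathrm{F}}[p] = \int \frac{\bigl(p'(x)\bigr)^2}{p(x)}\, \mathrm{d}x

The stated uniqueness result then says that any additive, isotropic measure depending on p and its first derivative must reduce to a linear combination a\,S_{\mathrm{BGS}}[p] + b\,I_{\mathrm{F}}[p] with constant coefficients a and b.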

Additive Entropies of degree-q and the Tsallis Entropy

The Tsallis entropy is shown to be an additive entropy of degree-q that information scientists have been using for almost forty years. Nor is it a unique solution to the nonadditive functional equation from which random entropies are derived. Notions of additivity, extensivity, and homogeneity are clarified. The relation between mean code lengths in coding theory and various expressions for ...
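For reference, the Tsallis (equivalently Havrda-Charvát) entropy of degree q, with Boltzmann's constant set to one, and its composition rule for statistically independent subsystems A and B are

    S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1} \;\longrightarrow\; -\sum_i p_i \ln p_i \quad (q \to 1) , \qquad S_q(A+B) = S_q(A) + S_q(B) + (1 - q)\, S_q(A)\, S_q(B) ,

which makes explicit the sense in which the measure is additive "of degree q" rather than additive in the ordinary Shannon sense.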

Thermodynamics and Time-Averages

For a dynamical system far from equilibrium, one has to deal with empirical probabilities defined through time-averages, and the main problem is then how to formulate an appropriate statistical thermodynamics. The common answer is that the standard functional expression of Boltzmann-Gibbs for the entropy should be used, the empirical probabilities being substituted for the Gibbs measure. Other ...
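A minimal sketch of the construction referred to here, assuming phase space is coarse-grained into cells i with indicator functions \chi_i and that x(t) denotes the trajectory: the empirical probabilities are time averages, and the standard Boltzmann-Gibbs functional is then evaluated on them,

    \bar{p}_i = \lim_{T \to \infty} \frac{1}{T} \int_0^T \chi_i\bigl(x(t)\bigr)\, \mathrm{d}t , \qquad S[\bar{p}] = -\sum_i \bar{p}_i \ln \bar{p}_i ,

with the empirical probabilities \bar{p}_i substituted for the Gibbs measure.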

Beyond the Shannon-Khinchin Formulation: The Composability Axiom and the Universal Group Entropy

The notion of entropy is ubiquitous both in natural and social sciences. In the last two decades, a considerable effort has been devoted to the study of new entropic forms, which generalize the standard Boltzmann-Gibbs (BG) entropy and are widely applicable in thermodynamics, quantum mechanics and information theory. In [25], by extending previous ideas of Shannon [40, 41], Khinchin proposed a ...
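As background, the composability requirement studied in this line of work asks that the entropy of a system composed of two statistically independent subsystems A and B depend only on the entropies of the parts; schematically, and only as a hedged sketch of the axiom rather than its precise statement in the cited works,

    S(A \cup B) = \Phi\bigl(S(A), S(B)\bigr) ,

with \Phi(x, y) = x + y recovering the additive Boltzmann-Gibbs case and \Phi(x, y) = x + y + (1 - q)\,x\,y the Tsallis case.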

Boltzmann-Gibbs Entropy: Axiomatic Characterization and Application

We present axiomatic characterizations of both the Boltzmann and Gibbs entropies, together with an application.
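For orientation, the two entropies being characterized are commonly written as

    S_{\mathrm{B}} = k_{\mathrm{B}} \ln W , \qquad S_{\mathrm{G}} = -k_{\mathrm{B}} \sum_i p_i \ln p_i ,

where W is the number of microstates compatible with the macrostate and p_i are the microstate probabilities; the Gibbs form reduces to the Boltzmann form for the uniform distribution p_i = 1/W. Which axioms single out each form is the subject of the cited paper and is not reproduced here.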


Journal title:

Volume   Issue

Pages  -

Publication date: 2014