Measurement Invariance, Entropy, and Probability
Authors
Abstract
We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken on that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes and grades into logarithmic at large magnitudes, leading to observations that often follow Student’s probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution, which has a power-law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform, and this inversion links the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way of changing the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information, and probability.
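As a rough numerical sketch of this idea: a maximum-entropy constraint on the expected value of a measurement scale T(x) gives the familiar form p(x) ∝ exp(−λT(x)). The scale functions below, log(1 + x²/k) for the linear-to-logarithmic case and log(x) + x/k for the logarithmic-to-linear case, together with the parameters lam_t, lam_g, and k, are illustrative assumptions made for this sketch rather than expressions taken from the paper.

```python
import numpy as np

# Sketch only: a maximum-entropy constraint on the mean of a measurement
# scale T(x) gives p(x) proportional to exp(-lam * T(x)).  The scale
# functions and parameters below are illustrative choices, not formulas
# taken from the paper; normalization constants are ignored.

x = np.linspace(0.01, 50.0, 5000)
k = 1.0

# Linear-to-logarithmic scale, approximated by T(x) = log(1 + x**2 / k):
# exp(-lam_t * T) = (1 + x**2 / k) ** (-lam_t), a Student-like shape with
# a Gaussian core and a power-law tail.
lam_t = 2.0
T_lin_log = np.log(1.0 + x**2 / k)
p_student_like = np.exp(-lam_t * T_lin_log)

# Logarithmic-to-linear scale, approximated by T(x) = log(x) + x / k:
# exp(-lam_g * T) = x ** (-lam_g) * exp(-lam_g * x / k), a gamma-like shape
# with a power law at small x and an exponential tail at large x.
lam_g = 0.5
T_log_lin = np.log(x) + x / k
p_gamma_like = np.exp(-lam_g * T_log_lin)

# Small-fluctuation check: the Student-like shape matches a Gaussian to
# leading order when x**2 / k is small.
small = x < 0.1
assert np.allclose(p_student_like[small],
                   np.exp(-lam_t * x[small] ** 2 / k), atol=1e-3)
```

The closing assertion only checks the small-fluctuation limit; the arrays describe shapes rather than properly normalized densities.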
Similar Resources
Risk measurement and Implied volatility under Minimal Entropy Martingale Measure for Levy process
This paper focuses on two main issues that are based on two important concepts: the exponential Levy process and the minimal entropy martingale measure. First, we intend to obtain risk measures such as value-at-risk (VaR) and conditional value-at-risk (CVaR) using the Monte-Carlo method under the minimal entropy martingale measure (MEMM) for the exponential Levy process. This martingale measure is used for the...
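For readers unfamiliar with these terms, the fragment below sketches how Monte-Carlo VaR and CVaR are typically estimated from simulated losses. The Gaussian one-day return model, the sample size, and the 99% confidence level are placeholder assumptions; the exponential Levy dynamics and the minimal entropy martingale measure used in that paper are not reproduced here.

```python
import numpy as np

# Sketch only: generic Monte-Carlo estimation of VaR and CVaR from simulated
# losses.  The Gaussian return model, sample size, and 99% level are
# placeholder assumptions, not the paper's MEMM/Levy setup.

rng = np.random.default_rng(0)
returns = rng.normal(loc=0.0005, scale=0.02, size=100_000)  # simulated one-day returns
losses = -returns                                           # losses are negative returns

alpha = 0.99
var = np.quantile(losses, alpha)        # value-at-risk: the alpha-quantile of losses
cvar = losses[losses >= var].mean()     # conditional VaR: mean loss beyond VaR

print(f"VaR({alpha:.0%}) = {var:.4f}, CVaR({alpha:.0%}) = {cvar:.4f}")
```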
How to Read Probability Distributions as Statements about Process
Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into fou...
Analysis on Behaviors of Controlled Quantum Systems via Quantum Entropy
In this paper, we investigate the essential properties of finite-dimensional measurement-based quantum feedback control systems using a kind of quantum entropy, the so-called linear entropy. We show how the terms appearing in the stochastic master equation affect the purity of the conditional density matrix of the system, and clarify a limitation of control action via the Hamiltonian. Moreover, applying...
A simple derivation and classification of common probability distributions based on information symmetry and measurement scale.
Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated common distributions without any underlying conceptual basis for why common distributions ha...
The concept of logic entropy on D-posets
In this paper, a new invariant called logic entropy for dynamical systems on a D-poset is introduced. Also, the conditional logical entropy is defined and then some of its properties are studied. The invariance of the logic entropy of a system under isomorphism is proved. At the end, the notion of an $m$-generator of a dynamical system is introduced and a version of the Kolm...
Journal: Entropy
Volume: 12, Issue: -
Pages: -
Publication date: 2010