Microeconomic Theory and the Kullback-Leibler Discrepancy: Some Remarkable Connections
Authors
Abstract
Using a well-known utility of wealth function, the classic contingent claims model is shown to produce measures of expected wealth, certainty equivalence, gains from trade, and risk premia with relative entropy interpretations, as measured by the Kullback-Leibler discrepancy.
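The paper's wealth and risk-premium expressions are not reproduced in this abstract, but the Kullback-Leibler discrepancy itself is simple to state for discrete distributions. The sketch below (the function name `kl_divergence` is ours, not the paper's) illustrates the quantity and its defining properties: it is zero when the two distributions coincide and positive otherwise.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler discrepancy D(p || q) for discrete distributions.

    p and q are arrays of probabilities on the same support; terms with
    p[i] == 0 contribute zero by the usual convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# D(p || p) = 0, and D(p || q) > 0 whenever p != q.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.25, 0.25, 0.5])
d_self = kl_divergence(p, p)   # exactly 0.0
d_pq = kl_divergence(p, q)     # strictly positive
```

Note that D(p || q) is asymmetric, which is why the paper (following common usage) calls it a discrepancy rather than a distance.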
Similar resources
A Class of Improved Parametrically Guided Nonparametric Regression Estimators
In this paper we define a class of estimators for a nonparametric regression model with the aim of reducing bias. The estimators in the class are obtained via a simple two stage procedure. In the first stage, a potentially misspecified parametric model is estimated and in the second stage the parametric estimate is used to guide the derivation of a final semiparametric estimator. Mathematically...
An Efficient Image Similarity Measure Based on Approximations of KL-Divergence Between Two Gaussian Mixtures
In this work we present two new methods for approximating the Kullback-Leibler (KL) divergence between two mixtures of Gaussians. The first method is based on matching between the Gaussian elements of the two Gaussian mixture densities. The second method is based on the unscented transform. The proposed methods are utilized for image retrieval tasks. Continuous probabilistic image modeling base...
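The matching idea can be sketched for univariate mixtures using the closed-form KL divergence between two Gaussians; the component representation and the greedy matching rule below are our simplification for illustration, not necessarily the authors' exact method.

```python
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL divergence between two univariate Gaussians."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5

def kl_mixture_matched(f, g):
    """Matching-based approximation of KL(f || g) for two univariate
    Gaussian mixtures, each given as a list of (weight, mean, std)
    components. Each component of f is matched to the component of g
    that minimises its contribution to the total.
    """
    total = 0.0
    for wf, mf, sf in f:
        best = min(np.log(wf / wg) + kl_gauss(mf, sf, mg, sg)
                   for wg, mg, sg in g)
        total += wf * best
    return total
```

When the two mixtures are identical, each component matches itself and the approximation is exactly zero, mirroring the behaviour of the true KL divergence.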
Automatic concept identification in goal-oriented conversations
We address the problem of identifying key domain concepts automatically from an unannotated corpus of goal-oriented human-human conversations. We examine two clustering algorithms, one based on mutual information and another one based on Kullback-Leibler distance. In order to compare the results from both techniques quantitatively, we evaluate the outcome clusters against reference concept labe...
Derivation of Lorentz transformation from principles of statistical information theory
The Lorentz transformation is derived from invariance of an information quantity related to statistical hypothesis testing on single particle system identification parameters. Invariance results from recognition of an equivalent observer as one who reaches the same conclusions as another when the same statistical methods are used. System identity is maintained by parameter values which minimize...
Generalised Pinsker Inequalities
We generalise the classical Pinsker inequality, which relates variational divergence to Kullback-Leibler divergence, in two ways: we consider arbitrary f-divergences in place of KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best possible inequality for this doubly generalised situation. Specialising our result to the clas...
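The classical inequality being generalised states that the variational divergence V(p, q) is bounded by sqrt(2 D(p || q)) (in nats). A minimal numerical check of the classical form, with helper names of our choosing:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence for discrete distributions (in nats)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def variational(p, q):
    """Variational divergence: sum of |p_i - q_i| over the support."""
    return float(np.sum(np.abs(np.asarray(p) - np.asarray(q))))

# Classical Pinsker inequality: V(p, q) <= sqrt(2 * KL(p || q)).
p, q = [0.5, 0.5], [0.9, 0.1]
lhs = variational(p, q)
rhs = np.sqrt(2.0 * kl(p, q))
assert lhs <= rhs
```

The paper's contribution is to replace KL by arbitrary f-divergences and to tighten the bound given several variational-divergence values, which this sketch does not attempt.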