Characterizations and Kullback-Leibler Divergence of Gompertz Distributions
Abstract
In this note, we characterize the Gompertz distribution in terms of extreme value distributions and point out that it implicitly models the interplay of two antagonistic growth processes. In addition, we derive a closed-form expression for the Kullback-Leibler divergence between two Gompertz distributions. Although the latter is rather easy to obtain, it seems not to have been widely reported before.

1 The Gompertz Distribution

The Gompertz distribution provides a statistical formulation of the Gompertz law of mortality [1]. Its probability density function (pdf) is defined for x ∈ [0, ∞) and given by

    f(x \mid b, q) = b q \, e^{q} \, e^{bx} \, e^{-q e^{bx}}    (1)

where the parameter b > 0 determines scale and q > 0 is a shape parameter. The corresponding cumulative distribution function (cdf) amounts to

    F(x \mid b, q) = 1 - e^{q} \, e^{-q e^{bx}}    (2)

and will be of interest in our discussion below.

Regarding the density in (1), we note that it is unimodal and rather flexible. Depending on the choice of b and q, it may be skewed to the left or to the right; however, for q ≥ 1, its mode will always be at 0.

Due to its origins as a model of mortality, the Gompertz distribution is a staple in statistical biology and the demographic and actuarial sciences [2,3]. It has been observed to model income distributions [4] and has been used in economics and marketing as a model of the diffusion of novel products as well as of customer lifetime values [5,6,7]. Finally, in the context of social media analysis, the Gompertz distribution was found to account well for the temporal evolution of collective attention to viral Web content or social media services [8,9].

Our goal with this note is to provide an accessible account of some of the properties of the Gompertz distribution. Furthermore, we derive a closed-form expression for the Kullback-Leibler divergence between Gompertz distributions, which is useful for purposes of model selection or statistical inference.

2 Interpretation in Terms of Extreme Value Distributions

Interestingly, the Gompertz distribution is rather closely related to extreme value theory. Here, we briefly demonstrate that it can be expressed in terms of the three extreme value distributions.

First of all, the Gompertz distribution corresponds to a zero-truncated Gumbel minimum distribution. The Gumbel distribution is the type I extreme value distribution. When used to model the distribution of sample minima, its pdf is defined for x ∈ (−∞, ∞) and is usually expressed as

    f_G(x \mid m, s) = \frac{1}{s} \, e^{(x-m)/s} \, e^{-e^{(x-m)/s}} = \frac{1}{s} \, e^{x/s} e^{-m/s} \, e^{-e^{x/s} e^{-m/s}}    (3)

where m is a location parameter and s > 0 determines scale. Hence, defining b = 1/s and q = e^{-m/s} allows us to re-parameterize (3) and to write it as

    f_G(x \mid b, q) = b q \, e^{bx} \, e^{-q e^{bx}}    (4)

such that the corresponding cumulative distribution function amounts to

    F_G(x \mid b, q) = 1 - e^{-q e^{bx}}.    (5)

Looking at the cumulative distribution in (5), we note that \lim_{x \to \infty} F_G(x) = 1 as well as F_G(0) = 1 - e^{-q}. Accordingly, by left-truncating the density in (4) at 0, we obtain a distribution whose pdf is given by

    \frac{f_G(x)}{\int_0^\infty f_G(x) \, dx} = \frac{f_G(x)}{1 - F_G(0)} = \frac{f_G(x)}{1 - (1 - e^{-q})} = e^{q} f_G(x) = b q \, e^{q} \, e^{bx} \, e^{-q e^{bx}}.    (6)

This, however, is indeed the probability density of the Gompertz distribution as introduced in (1).
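To make the truncation argument concrete, here is a minimal numerical sketch, assuming NumPy and SciPy are available: SciPy's gompertz distribution follows the parameterization in (1) with shape c = q and scale 1/b, and gumbel_l implements the Gumbel minimum distribution.

```python
import numpy as np
from scipy.stats import gompertz, gumbel_l

b, q = 1.5, 0.4                 # Gompertz scale and shape parameters, b, q > 0
s, m = 1.0 / b, -np.log(q) / b  # Gumbel parameters such that b = 1/s, q = e^{-m/s}

x = np.linspace(0.0, 5.0, 200)

# Gompertz pdf (1); SciPy uses shape c = q and scale = 1/b
f_gompertz = gompertz.pdf(x, q, scale=1.0 / b)

# Gumbel minimum pdf (4), left-truncated at 0 by dividing by
# P(X >= 0) = 1 - F_G(0) = e^{-q}, cf. equation (6)
f_truncated = gumbel_l.pdf(x, loc=m, scale=s) / gumbel_l.sf(0.0, loc=m, scale=s)

assert np.allclose(f_gompertz, f_truncated)  # agree up to floating point error
```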
Second of all, the Gompertz distribution is indirectly related to the Fréchet and to the Weibull distribution. The Fréchet distribution is the type II extreme value distribution. It is usually defined for x ∈ (0, ∞), in which case its pdf is given by

    f_F(x \mid a, r) = \frac{a}{r} \left( \frac{x}{r} \right)^{-1-a} e^{-(x/r)^{-a}}    (7)

where a > 0 and r > 0 are shape and scale parameters, respectively. The Weibull distribution is the type III extreme value distribution. It is commonly defined for x ∈ [0, ∞) and its pdf amounts to

    f_W(x \mid k, l) = \frac{k}{l} \left( \frac{x}{l} \right)^{k-1} e^{-(x/l)^{k}}    (8)

where k > 0 and l > 0 are shape and scale parameters, respectively.

In order to expose the connections between the densities in (7) and (8) and the Gompertz density in (1), we recall that if a random variable X is distributed according to f_X(x), the monotonically transformed random variable Y = h(X) has a pdf given by

    f_Y(y) = f_X(h^{-1}(y)) \left| \frac{d}{dy} h^{-1}(y) \right|.    (9)

Using this identity, it is straightforward to see that the Gompertz distribution also results from transforming Fréchet or Weibull distributions. In particular, if f_X(x) is a Fréchet density and y = -\ln x, then

    x = e^{-y} \quad \text{and} \quad \frac{dx}{dy} = -e^{-y}    (10)

so that (9) yields f_Y(y) = f_F(e^{-y} \mid a, r) \, e^{-y} = a r^a \, e^{ay} \, e^{-r^a e^{ay}}, which is the Gumbel minimum density in (4) with b = a and q = r^a; left truncation at 0 then again leads to the Gompertz density in (1).
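Along the same lines, the transformation in (9) and (10) can be checked numerically. The sketch below, again assuming NumPy and SciPy (where the Fréchet distribution goes by the name invweibull), compares the transformed Fréchet density with the Gumbel minimum density for b = a and q = r^a.

```python
import numpy as np
from scipy.stats import invweibull, gumbel_l

a, r = 2.0, 0.8                  # Frechet shape and scale parameters, a, r > 0
y = np.linspace(-3.0, 3.0, 200)

# right-hand side of (9) with h(x) = -ln(x), so that h^{-1}(y) = e^{-y}
# and |d/dy h^{-1}(y)| = e^{-y}; invweibull is SciPy's Frechet distribution
f_transformed = invweibull.pdf(np.exp(-y), a, scale=r) * np.exp(-y)

# Gumbel minimum density (4) with b = a and q = r^a,
# i.e. scale s = 1/a and location m = -ln(r)
f_gumbel_min = gumbel_l.pdf(y, loc=-np.log(r), scale=1.0 / a)

assert np.allclose(f_transformed, f_gumbel_min)  # agree up to floating point error
```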
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
برای دانلود متن کامل این مقاله و بیش از 32 میلیون مقاله دیگر ابتدا ثبت نام کنید
ثبت ناماگر عضو سایت هستید لطفا وارد حساب کاربری خود شوید
Journal: CoRR
Volume: abs/1402.3193
Publication date: 2014