Extensions of the parametric families of divergences used in statistical inference
Authors
Abstract
We propose a simple method for constructing new families of φ-divergences. This method, called convex standardization, is applicable to convex and concave functions ψ(t) that are twice continuously differentiable in a neighborhood of t = 1 with a nonzero second derivative at t = 1. Using this method we introduce several extensions of the LeCam, power, χ and Matusita divergences. The extended families are shown to connect these divergences smoothly with the Kullback divergence, or to connect various pairs of these particular divergences with each other. We also investigate the metric properties of divergences from these extended families.
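To make the construction concrete, here is a minimal Python sketch of one plausible reading of convex standardization: the tangent line of ψ at t = 1 is subtracted and the result is rescaled by |ψ''(1)|, with the sign flipped for concave ψ, so that the resulting φ satisfies φ(1) = 0, φ'(1) = 0 and φ''(1) = 1. The normalization and the function names are illustrative assumptions, not taken from the paper; only the formula D_φ(P, Q) = Σ_i q_i φ(p_i/q_i) is the standard definition of a discrete φ-divergence.

```python
import numpy as np

def convex_standardize(psi, d_psi, dd_psi):
    """Return a standardized convex function phi built from psi.

    Assumed reading of 'convex standardization' (not taken verbatim from
    the paper): subtract the tangent of psi at t = 1 and rescale by
    |psi''(1)|, flipping the sign when psi is concave, so that
    phi(1) = 0, phi'(1) = 0 and phi''(1) = 1.
    """
    c = dd_psi(1.0)
    if c == 0.0:
        raise ValueError("psi''(1) must be nonzero")
    sign = 1.0 if c > 0 else -1.0  # flip the sign for concave psi
    return lambda t: sign * (psi(t) - psi(1.0) - d_psi(1.0) * (t - 1.0)) / abs(c)

def phi_divergence(phi, p, q):
    """Standard discrete phi-divergence D_phi(P, Q) = sum_i q_i * phi(p_i / q_i),
    assuming q_i > 0 for all i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * phi(p / q)))

# Example: psi(t) = t*log(t) is convex with psi''(1) = 1.
phi = convex_standardize(lambda t: t * np.log(t),
                         lambda t: np.log(t) + 1.0,
                         lambda t: 1.0 / t)
p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
print(phi_divergence(phi, p, q))
```

With ψ(t) = t log t the sketch recovers the generator t log t - t + 1, whose φ-divergence is the Kullback divergence, which is consistent with the abstract's remark that the extended families connect with the Kullback divergence.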
Similar Articles
Exact Statistical Inference for Some Parametric Nonhomogeneous Poisson Processes
Nonhomogeneous Poisson processes (NHPPs) are often used to model recurrent events, and there is thus a need to check model fit for such models. We study the problem of obtaining exact goodness-of-fit tests for certain parametric NHPPs, using a method based on Monte Carlo simulation conditional on sufficient statistics. A closely related way of obtaining exact confidence intervals in parametri...
Optimal fuzzy nonparametric predictive inference for a single-stage acceptance sampling plan
Acceptance sampling is one of the main parts of statistical quality control. It is primarily used for the inspection of incoming or outgoing lots. Acceptance sampling procedures can be used in an acceptance control program to achieve better quality at lower cost, improved control, and increased efficiency. The aim of this paper is to study acceptance sampling based on non-...
Testing statistical hypotheses based on the density power divergence
The family of density power divergences is a useful class which generates robust parameter estimates with high efficiency. None of these divergences require any non-parametric density estimate to carry out the inference procedure. However, these divergences have so far not been used effectively in robust testing of hypotheses. In this paper, we develop tests of hypotheses based on this family ...
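For readers unfamiliar with this family, the following sketch writes out the density power divergence for discrete distributions in the usual Basu et al. parametrization with tuning parameter α > 0; the discrete specialization and the function names are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def density_power_divergence(g, f, alpha):
    """Discrete density power divergence d_alpha(g, f) with alpha > 0,
    in the usual Basu et al. parametrization:
        sum_i [ f_i^(1+a) - (1 + 1/a) * g_i * f_i^a + (1/a) * g_i^(1+a) ].
    """
    g, f = np.asarray(g, float), np.asarray(f, float)
    a = float(alpha)
    if a <= 0:
        raise ValueError("alpha must be positive")
    return float(np.sum(f**(1 + a) - (1 + 1/a) * g * f**a + (1/a) * g**(1 + a)))

# Small illustration: the divergence shrinks as the model f approaches g.
g = [0.2, 0.5, 0.3]
print(density_power_divergence(g, [0.3, 0.4, 0.3], alpha=0.5))
print(density_power_divergence(g, [0.2, 0.5, 0.3], alpha=0.5))  # exactly 0
```

Larger values of α trade some efficiency for more robustness, while the divergence approaches the Kullback-Leibler divergence as α tends to 0.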
Pattern Learning and Recognition on Statistical Manifolds: An Information-Geometric Review
We review the information-geometric framework for statistical pattern recognition: First, we explain the role of statistical similarity measures and distances in fundamental statistical pattern recognition problems. We then concisely review the main statistical distances and report a novel versatile family of divergences. Depending on their intrinsic complexity, the statistical patterns are lea...
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...
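Since the abstract only names these measures, a short sketch of their discrete forms may help; the 1/sqrt(2) normalization of the Hellinger distance is one common convention and is an assumption here, as are the function names.

```python
import numpy as np

def kullback_leibler(p, q):
    """KL information sum_i p_i * log(p_i / q_i) for strictly positive p, q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def j_divergence(p, q):
    """Symmetrized KL: J(P, Q) = KL(P||Q) + KL(Q||P)."""
    return kullback_leibler(p, q) + kullback_leibler(q, p)

def hellinger_distance(p, q):
    """Hellinger distance with the 1/sqrt(2) normalization (one common
    convention), so that 0 <= H <= 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2.0))

p, q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
print(kullback_leibler(p, q), j_divergence(p, q), hellinger_distance(p, q))
```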
Journal: Kybernetika
Volume 44, Issue
Pages: -
Publication date: 2008