Characterizing the Functional Density Power Divergence Class

Authors

Abstract

Divergence measures have a long association with statistical inference, machine learning and information theory. The density power divergence and related measures have produced many useful (and popular) statistical procedures, which provide a good balance between model efficiency on the one hand and outlier stability or robustness on the other. The logarithmic density power divergence, a particular transform of the density power divergence, has also been very successful in producing efficient and stable inference procedures; in addition, it has led to significant demonstrated applications. The success of the minimum divergence procedures based on the density power divergence and the logarithmic density power divergence (which go by the names $\beta$-divergence and $\gamma$-divergence, respectively) makes it imperative and meaningful to look for other, similar divergences which may be obtained as transforms of the density power divergence in the same spirit. With this motivation we search for such transforms, referred to herein as the functional density power divergence class. The present article characterizes this class, and thus identifies the divergence measures available within this construct that may be explored further for possible applications.
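For orientation (this display is not part of the abstract above), a common way of writing the two divergences named there, following the standard form of Basu et al. (1998) for the density power divergence and its logarithmic transform, with $g$ the data density, $f$ the model density and $\alpha > 0$ a tuning constant, is

\[
d_{\alpha}(g,f) \;=\; \int \Big\{ f^{1+\alpha} \;-\; \Big(1+\tfrac{1}{\alpha}\Big)\, g\, f^{\alpha} \;+\; \tfrac{1}{\alpha}\, g^{1+\alpha} \Big\}\, dx ,
\]

\[
d^{\,\log}_{\alpha}(g,f) \;=\; \log\!\int f^{1+\alpha} \;-\; \Big(1+\tfrac{1}{\alpha}\Big) \log\!\int g\, f^{\alpha} \;+\; \tfrac{1}{\alpha} \log\!\int g^{1+\alpha} ,
\]

where the second expression replaces each integral of the first by its logarithm. As $\alpha \to 0$ the density power divergence tends to the Kullback–Leibler divergence, which is one way the efficiency/robustness trade-off controlled by $\alpha$ can be seen.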


Related articles

Robust Estimation in Linear Regression Model: the Density Power Divergence Approach

The minimum density power divergence method provides a robust estimate in situations where the dataset includes a number of outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...

Full text
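As a hedged illustration of the approach summarized in the entry above, and not code from that paper, the following minimal Python sketch fits a normal linear regression by minimizing the empirical density power divergence objective of Basu et al. (1998); all names here (dpd_objective, alpha, X, y) are our own, and the closed-form integral used is the one available for the normal density.

import numpy as np
from scipy.optimize import minimize

def dpd_objective(params, X, y, alpha=0.5):
    # Empirical DPD objective for y ~ N(X @ beta, sigma^2).
    # Small alpha favours efficiency; larger alpha favours robustness.
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)  # parameterize sigma on the log scale to keep it positive
    resid = y - X @ beta
    dens = np.exp(-0.5 * (resid / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)
    # Closed form of the integral of f^(1 + alpha) for a normal model density f
    int_term = 1.0 / ((2.0 * np.pi * sigma ** 2) ** (alpha / 2.0) * np.sqrt(1.0 + alpha))
    return int_term - (1.0 + 1.0 / alpha) * np.mean(dens ** alpha)

# Hypothetical data with a handful of gross outliers in the response
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=200)
y[:10] += 10.0

fit = minimize(dpd_objective, x0=np.zeros(X.shape[1] + 1), args=(X, y, 0.5))
print("beta estimate:", fit.x[:-1], "sigma estimate:", np.exp(fit.x[-1]))

The tuning constant alpha trades efficiency against robustness: values near 0 approach maximum likelihood, while larger values downweight observations with small model density, such as the outliers injected above.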

The power divergence and the density power divergence families: the mathematical connection

The power divergence family of Cressie and Read (1984) is a highly popular family of density-based divergences which is widely used in robust parametric estimation and multinomial goodness-of-fit testing. This family forms a subclass of the family of φ-divergences (Csiszár, 1963; Pardo, 2006) or disparities (Lindsay, 1994). The more recently described family of density power divergences (Basu e...

Full text
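To make the connection discussed in the entry above concrete (the notation here is our own choice, not quoted from that paper), the Cressie–Read power divergence family is usually written, for densities $g$ and $f$ and a real parameter $\lambda$, as

\[
\mathrm{PD}_{\lambda}(g,f) \;=\; \frac{1}{\lambda(\lambda+1)} \int g \left\{ \left( \frac{g}{f} \right)^{\lambda} - 1 \right\} dx , \qquad \lambda \neq 0,\, -1 ,
\]

with the cases $\lambda = 0$ and $\lambda = -1$ defined by continuity. Contrasted with the density power divergence displayed after the abstract, the power divergence integrates powers of the ratio $g/f$, whereas the density power divergence integrates powers of the densities themselves; the mathematical connection between the two families is the subject of the cited article.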

Testing statistical hypotheses based on the density power divergence

The family of density power divergences is a useful class which generates robust parameter estimates with high efficiency. None of these divergences require any non-parametric density estimate to carry out the inference procedure. However, these divergences have so far not been used effectively in robust testing of hypotheses. In this paper, we develop tests of hypotheses based on this family ...

Full text

Minimum density power divergence estimator for diffusion processes

In this paper, we consider the robust estimation for a certain class of diffusion processes including the Ornstein–Uhlenbeck process based on discrete observations. As a robust estimator, we consider the minimum density power divergence estimator (MDPDE) proposed by Basu et al. (Biometrika 85:549–559, 1998). It is shown that the MDPDE is consistent and asymptotically normal. A simulation study ...

Full text

Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator

In this paper a robust version of the Wald test statistic for composite likelihood is considered by using the composite minimum density power divergence estimator instead of the composite maximum likelihood estimator. This new family of test statistics will be called Wald-type test statistics. The problem of testing a simple and a composite null hypothesis is considered and the robu...

Full text
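As a rough, generic sketch of the construction described in the entry above (the symbols below are our own and are not taken from that paper): for a simple null hypothesis $H_0: \theta = \theta_0$, a Wald-type statistic built on the composite minimum density power divergence estimator $\hat{\theta}_{\alpha}$ has the familiar quadratic form

\[
W_n \;=\; n\, \big(\hat{\theta}_{\alpha} - \theta_0\big)^{\top}\, \Sigma_{\alpha}(\theta_0)^{-1}\, \big(\hat{\theta}_{\alpha} - \theta_0\big),
\]

where $\Sigma_{\alpha}(\theta)$ denotes the asymptotic covariance matrix of $\sqrt{n}\,\big(\hat{\theta}_{\alpha} - \theta\big)$; under the null the statistic is asymptotically chi-squared with degrees of freedom equal to the dimension of $\theta$.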


Journal

Journal title: IEEE Transactions on Information Theory

Year: 2023

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2022.3210436