Empirical phi-divergence test statistics for testing simple and composite null hypotheses
Authors
Abstract
The main purpose of this paper is first to introduce a new family of empirical test statistics for testing a simple null hypothesis when the vector of parameters of interest is defined through a specific set of unbiased estimating functions. This family of test statistics is based on a distance between two probability vectors, the first obtained by maximizing the empirical likelihood over the vector of parameters, and the second defined by the fixed vector of parameters under the simple null hypothesis. The distance considered for this purpose is the phi-divergence measure. The asymptotic distribution is then derived for this family of test statistics. The proposed methodology is illustrated through the well-known data of Newcomb’s measurements on the passage time for light. A simulation study is carried out to compare its performance with that of the empirical likelihood ratio test when confidence intervals are constructed from the respective statistics for small sample sizes. The results suggest that the “empirical modified likelihood ratio test statistic” provides a competitive alternative to the empirical likelihood ratio test statistic, and is also more robust in the presence of contamination in the data. Finally, we propose empirical phi-divergence test statistics for testing a composite null hypothesis and present some asymptotic as well as simulation results to study the performance of these test procedures.
AMS 2001 Subject Classification: 62E20
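To make the construction concrete, the following is a minimal sketch (not the paper's own implementation) for the simplest case of a single unbiased estimating function g(x, mu) = x - mu, i.e. a simple null hypothesis H0: mu = mu0 on a mean. The weight computation, the generic form (2n / phi''(1)) D_phi(u, p(mu0)) with u the uniform vector, the Kullback-Leibler choice of phi (which recovers the empirical likelihood ratio statistic), and the chi-square reference with one degree of freedom are standard conventions assumed here; the function names are illustrative.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_weights(x, mu0):
    """Empirical-likelihood weights p_i(mu0): maximize prod(p_i) subject to
    sum(p_i) = 1 and sum(p_i * (x_i - mu0)) = 0.  Needs min(x) < mu0 < max(x)."""
    x = np.asarray(x, dtype=float)
    n, g = len(x), x - mu0
    eps = 1e-10
    lo = (-1.0 + eps) / g.max()              # keeps 1 + lam*g_i > 0 for all i
    hi = (-1.0 + eps) / g.min()
    lam = brentq(lambda l: np.sum(g / (1.0 + l * g)), lo, hi)
    return 1.0 / (n * (1.0 + lam * g))

def phi_kl(t):
    # Kullback-Leibler choice phi(t) = t*log(t) - t + 1, with phi''(1) = 1;
    # this member reproduces the empirical likelihood ratio statistic.
    return t * np.log(t) - t + 1.0

def empirical_phi_stat(x, mu0, phi=phi_kl, phi_dd_at_1=1.0):
    """(2n / phi''(1)) * D_phi(u, p(mu0)) with u = (1/n, ..., 1/n) and
    D_phi(u, p) = sum_i p_i * phi(u_i / p_i); compared to a chi-square(1)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    p = el_weights(x, mu0)
    u = np.full(n, 1.0 / n)
    stat = 2.0 * n * np.sum(p * phi(u / p)) / phi_dd_at_1
    return stat, chi2.sf(stat, df=1)

# Illustrative use on synthetic data (not Newcomb's measurements):
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=50)
stat, pval = empirical_phi_stat(x, mu0=0.0)
print(stat, pval)
```

Other convex functions phi with phi(1) = 0 and phi''(1) > 0 give the remaining members of the family.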
Similar articles
Extreme-value Moment Goodness-of-fit Tests
A general goodness-of-fit test for scale-parameter families of distributions is introduced, which is based on quotients of expected sample minima. The test is independent of the mean of the distribution and, in applications to testing for exponentiality of data, compares favorably to other goodness-of-fit tests for exponentiality based on the empirical distribution function, regression meth...
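The snippet is truncated, so the paper's actual statistic is not reproduced here; the sketch below only illustrates the moment property such a test can exploit: for exponential data the expected minimum of a subsample of size k equals 1/(k * rate), so quotients of expected subsample minima are free of the scale (and hence of the mean). The function name, block sizes, and number of resampled blocks are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_block_minimum(sample, block_size, rng, n_blocks=2000):
    """Plug-in estimate of the expected minimum of a subsample of the given
    size, averaging minima over randomly drawn blocks."""
    blocks = rng.choice(sample, size=(n_blocks, block_size), replace=True)
    return blocks.min(axis=1).mean()

# Under exponentiality, E[min of size 2] / E[min of size 4] = 4 / 2 = 2,
# whatever the scale; this quotient is what a parameter-free test can target.
x = rng.exponential(scale=3.7, size=500)
print(mean_block_minimum(x, 2, rng) / mean_block_minimum(x, 4, rng))
```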
Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator
In this paper a robust version of the Wald test statistic for composite likelihood is considered by using the composite minimum density power divergence estimator instead of the composite maximum likelihood estimator. This new family of test statistics will be called Wald-type test statistics. The problem of testing a simple and a composite null hypothesis is considered and the robu...
Influence analysis of robust Wald-type tests
We consider a robust version of the classical Wald test statistic for testing simple and composite null hypotheses in general parametric models. These test statistics are based on the minimum density power divergence estimators instead of the maximum likelihood estimators. An extensive study of their robustness properties is given through the influence functions as well as the chi-square infla...
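As a rough illustration of the estimator these tests are built on, here is a minimal sketch of the minimum density power divergence estimator for a simple N(mu, sigma^2) model (the paper treats general parametric models). The objective follows the standard density power divergence form with tuning parameter alpha; the function names, the default alpha = 0.5, and the Nelder-Mead optimizer are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Density power divergence objective for N(mu, sigma^2), dropping terms
    that do not depend on the parameters:
        integral of f^(1+alpha) - (1 + 1/alpha) * mean(f(x_i)^alpha)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                       # keeps sigma > 0
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    return integral - (1.0 + 1.0 / alpha) * np.mean(norm.pdf(x, mu, sigma) ** alpha)

def mdpde_normal(x, alpha=0.5):
    """Minimum density power divergence estimate of (mu, sigma); alpha > 0
    trades efficiency for robustness, and alpha -> 0 approaches the MLE."""
    x = np.asarray(x, dtype=float)
    start = np.array([np.median(x), np.log(x.std() + 1e-12)])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    return res.x[0], float(np.exp(res.x[1]))
```

A Wald-type statistic then replaces the maximum likelihood estimator by this robust estimator in the usual quadratic form, with the estimator's asymptotic covariance in place of the inverse Fisher information.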
Parametric Estimation and Tests through Divergences and Duality Technique
We introduce estimation and test procedures through divergence optimization for discrete or continuous parametric models. This approach is based on a new dual representation for divergences. We treat point estimation and tests for simple and composite hypotheses, extending the maximum likelihood technique. Another view of the maximum likelihood approach, for estimation and testing, is given. We prove...
AR-order estimation by testing sets using the Modified Information Criterion
The Modified Information Criterion (MIC) is an Akaike-like criterion which allows performance control by means of a simple a priori defined parameter, the upper bound on the error of the first kind (false alarm probability). The MIC is used, for example, to estimate the order of Auto-Regressive (AR) processes. The criterion can only be used to test pairs of composite hypotheses; in an A...
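The MIC's exact penalty term is not given in this snippet, so the sketch below substitutes a plain sequence of nested F-tests between AR(k) and AR(k+1) least-squares fits. It only mirrors the structure described above, with alpha playing the role of the a priori false-alarm bound; the function names and the shared-sample convention are illustrative choices, not the MIC itself.

```python
import numpy as np
from scipy.stats import f as f_dist

def ar_rss(x, order, burn):
    """Residual sum of squares of an AR(order) least-squares fit; x[burn:] is
    used as the response so that nested orders share the same sample."""
    x = np.asarray(x, dtype=float)
    y = x[burn:]
    cols = [np.ones(len(y))] + [x[burn - j: len(x) - j] for j in range(1, order + 1)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid), len(y), X.shape[1]

def select_ar_order(x, max_order, alpha=0.05):
    """Raise the order while the added lag is significant at level alpha
    (an F-test between the nested composite hypotheses AR(k) vs AR(k+1))."""
    order = 0
    for k in range(max_order):
        rss0, n, p0 = ar_rss(x, k, burn=max_order)
        rss1, _, p1 = ar_rss(x, k + 1, burn=max_order)
        F = ((rss0 - rss1) / (p1 - p0)) / (rss1 / (n - p1))
        if f_dist.sf(F, p1 - p0, n - p1) > alpha:
            break
        order = k + 1
    return order
```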