Loglinear Models: An approach based on φ-Divergences
Abstract
In this paper we present a review of some results on inference based on φ-divergence measures, under assumptions of multinomial sampling and loglinear models. The minimum φ-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator, is considered. This estimator is used in a φ-divergence measure which is the basis of new statistics for solving three important testing problems regarding loglinear models: goodness of fit, nested sequences of loglinear models, and nonadditivity in loglinear models.
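For concreteness, the following is a minimal sketch of the quantities the abstract refers to, written in the standard Csiszár form (the notation here is illustrative and not necessarily the paper's own). For a multinomial model with M cells, observed relative frequencies \hat{p} = (\hat{p}_1, \dots, \hat{p}_M) and model probabilities p(\theta), a φ-divergence and the associated minimum φ-divergence estimator are

\[ D_{\varphi}\big(\hat{p}, p(\theta)\big) = \sum_{j=1}^{M} p_j(\theta)\, \varphi\!\left(\frac{\hat{p}_j}{p_j(\theta)}\right), \qquad \hat{\theta}_{\varphi} = \arg\min_{\theta} D_{\varphi}\big(\hat{p}, p(\theta)\big), \]

where \varphi is convex with \varphi(1) = 0. Choosing \varphi(x) = x\log x - x + 1 gives the Kullback-Leibler divergence, and \hat{\theta}_{\varphi} then reduces to the maximum likelihood estimator, which is the sense in which the estimator generalizes maximum likelihood.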
Similar Articles
Minimum Φ-divergence Estimator and Hierarchical Testing in Loglinear Models
In this paper we consider inference based on very general divergence measures, under assumptions of multinomial sampling and loglinear models. We define the minimum φ-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator. This estimator is then used in a φ-divergence goodness-of-fit statistic, which is the basis of two new statistics for solving the prob...
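As a hedged illustration of the kind of statistic this abstract describes (a standard form in the φ-divergence literature, not necessarily the paper's exact definition), a goodness-of-fit statistic built from the minimum φ-divergence estimator typically takes the form

\[ T_{\varphi} = \frac{2n}{\varphi''(1)}\, D_{\varphi}\!\big(\hat{p},\, p(\hat{\theta}_{\varphi})\big), \]

which, for a loglinear model with k free parameters over M cells, is asymptotically chi-square with M - k - 1 degrees of freedom under the model. The choice \varphi(x) = x\log x - x + 1 recovers the likelihood-ratio statistic G^2, and \varphi(x) = (x-1)^2/2 recovers Pearson's X^2.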
Phi-Divergence Constrained Ambiguous Stochastic Programs for Data-Driven Optimization
This paper investigates the use of φ-divergences in ambiguous (or distributionally robust) two-stage stochastic programs. Classical stochastic programming assumes that the distribution of the uncertain parameters is known. However, the true distribution is unknown in many applications. Especially in cases where there is little data or not much trust in the data, an ambiguity set of distributions can be...
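For orientation, a hedged sketch of the kind of ambiguous two-stage program this describes (the notation is illustrative): with nominal probabilities \hat{p} estimated from data over scenarios \xi_1, \dots, \xi_m, the distributionally robust formulation replaces the expectation by a worst case over a φ-divergence ball,

\[ \min_{x \in X} \; c^{\top}x + \max_{q \in U_{\rho}} \sum_{\omega=1}^{m} q_{\omega}\, Q(x, \xi_{\omega}), \qquad U_{\rho} = \Big\{ q \ge 0 : \textstyle\sum_{\omega} q_{\omega} = 1,\; D_{\varphi}(q, \hat{p}) \le \rho \Big\}, \]

where Q(x, \xi) is the second-stage recourse value and the radius \rho controls the size of the ambiguity set.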
Phi-divergence Test Statistics in Multinomial Sampling for Hierarchical Sequences of Loglinear Models with Linear Constraints
We consider nested sequences of hierarchical loglinear models when expected frequencies are subject to linear constraints, and we study the problem of finding the model in the nested sequence that explains the given data most clearly. It will be necessary to give a method to estimate the parameters of the loglinear models and also a procedure to choose the best model among the mode...
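A hedged sketch of the hierarchical-testing idea (the exact statistic may differ in the paper): given nested loglinear models M_{m+1} \subset M_{m} with minimum φ-divergence fits \hat{\theta}^{(m)} and \hat{\theta}^{(m+1)}, one can test whether the smaller model suffices using a difference-type statistic analogous to G^2(M_{m+1}) - G^2(M_m),

\[ S_{\varphi} = \frac{2n}{\varphi''(1)} \Big[ D_{\varphi}\big(\hat{p},\, p(\hat{\theta}^{(m+1)})\big) - D_{\varphi}\big(\hat{p},\, p(\hat{\theta}^{(m)})\big) \Big], \]

referred to a chi-square distribution whose degrees of freedom equal the difference in the number of free parameters between the two models.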
A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models
Estimators derived from a divergence criterion such as φ-divergences are generally more robust than maximum likelihood estimators. We are particularly interested in the so-called minimum dual φ-divergence estimator (MDφDE), an estimator built using a dual representation of φ-divergences. We present in this paper an iterative proximal point algorithm that permits the calculation of such an estima...
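As an illustrative, generic form of such an algorithm (not necessarily the authors' exact scheme), a proximal point iteration for a divergence-based objective \hat{D}(\theta) augments each step with a proximal term that keeps successive iterates close:

\[ \theta^{(k+1)} = \arg\min_{\theta} \Big\{ \hat{D}(\theta) + \beta_k\, \Delta\big(\theta, \theta^{(k)}\big) \Big\}, \]

where \Delta(\cdot,\cdot) \ge 0 vanishes only when its arguments coincide and \beta_k > 0 weights the proximal penalty; the cited paper specializes this template to the dual φ-divergence criterion, with mixture models as the application.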
Robust Solutions of Optimization Problems Affected by Uncertain Probabilities
In this paper we focus on robust linear optimization problems with uncertainty regions defined by φ-divergences (for example, chi-squared, Hellinger, Kullback-Leibler). We show how uncertainty regions based on φ-divergences arise in a natural way as confidence sets if the uncertain parameters contain elements of a probability vector. Such problems frequently occur in, for example, optimization ...
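A hedged sketch of why φ-divergence balls arise as confidence sets (standard asymptotics, illustrative notation): if \hat{p} is the empirical frequency vector from n observations over m outcomes, then 2n\, D_{\varphi}(q, \hat{p}) / \varphi''(1) is approximately chi-square with m - 1 degrees of freedom at the true q, so

\[ U = \Big\{ q \ge 0 : \textstyle\sum_{\omega} q_{\omega} = 1,\; D_{\varphi}(q, \hat{p}) \le \frac{\varphi''(1)}{2n}\, \chi^2_{m-1,\,1-\alpha} \Big\} \]

is an approximate (1-\alpha)-confidence set for the true probability vector, and hence a natural uncertainty region for robust optimization.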