Search results for: s information criteria

Number of results: 2008737

2007
Andrew R. Liddle

Model selection is the problem of distinguishing competing models, perhaps featuring different numbers of parameters. The statistics literature contains two distinct sets of tools: those based on information theory, such as the Akaike Information Criterion (AIC), and those based on Bayesian inference, such as the Bayesian evidence and the Bayesian Information Criterion (BIC). The Deviance Information Crite...
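For reference, the two criteria named above have the standard forms below, where ln L_max is the maximised log-likelihood, k the number of free parameters, and N the number of data points:

$$\mathrm{AIC} = -2\ln\mathcal{L}_{\max} + 2k, \qquad \mathrm{BIC} = -2\ln\mathcal{L}_{\max} + k\ln N.$$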

2009
Nan Lin Tianqing Liu Baoxue Zhang

A general information criterion, with a general penalty that depends on the sample size, is developed for nested and non-nested models in the context of inequality constraints. The true parameters may be defined by a specified parametric model or by a set of specified estimating functions. When the true parameters are defined by estimating functions, we use the empirical likelihood approach to...
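The abstract does not state the criterion explicitly; a generic penalised criterion of this kind (illustrative notation only: $\hat{\ell}_n$ is the maximised, possibly empirical, log-likelihood, k the number of free parameters, and $c_n$ a penalty weight that may grow with the sample size n) takes the form

$$\mathrm{IC}_n = -2\,\hat{\ell}_n + c_n\,k, \qquad c_n = 2 \text{ recovers AIC}, \quad c_n = \ln n \text{ recovers BIC}.$$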

2009
Igor Vajda Domingo Morales

This report constitutes an unrefereed manuscript which is intended to be submitted for publication. Any opinions and conclusions expressed in this report are those of the author(s) and do not necessarily represent the views of the Institute. This paper deals with Bayesian models given by statistical experiments and common types of loss functions. Probability of error of the Bayes identificator ...

Journal: Iranian Journal of Public Health
Jafar-Sadegh Tabrizi Mostafa Farahbakhsh Javad Shahgoli Mohammad Reza Rahbar Mohammad Naghavi-Behzad Hamid-Reza Ahadi

Background: Excellence and quality models are comprehensive methods for improving the quality of healthcare. The aim of this study was to design an excellence and quality model for primary health care training centers using the Delphi method. Methods: In this study, the Delphi method was used. First, comprehensive information was collected through a literature review. In the extracted references, 39 models we...

Journal: VLSI Signal Processing 2000
José Carlos Príncipe Dongxin Xu Qun Zhao John W. Fisher

This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Renyi’s quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between the computation and an information potential measuring the interactions among the data samples. We als...
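As a rough illustration of the quantities mentioned (not the authors' training algorithm), the sketch below estimates Renyi's quadratic entropy from a sample with a Gaussian Parzen window; the double sum over sample pairs is the "information potential" the abstract refers to. The function names and the kernel width sigma are assumptions made for this example.

import numpy as np

def information_potential(samples, sigma=1.0):
    """Parzen-window estimate of the 'information potential'
    V = (1/N^2) * sum_i sum_j G_{sigma*sqrt(2)}(x_i - x_j),
    i.e. the pairwise-interaction sum underlying Renyi's quadratic entropy.
    `samples` is an (N, d) array; sigma is an illustrative kernel width."""
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    diffs = x[:, None, :] - x[None, :, :]          # all pairwise differences
    sq_dist = np.sum(diffs ** 2, axis=-1)          # squared Euclidean distances
    s2 = 2.0 * sigma ** 2                          # variance of G_{sigma*sqrt(2)}
    kernel = np.exp(-sq_dist / (2.0 * s2)) / ((2.0 * np.pi * s2) ** (d / 2))
    return kernel.mean()                           # average over all N^2 pairs

def renyi_quadratic_entropy(samples, sigma=1.0):
    """H_2 estimate: minus the log of the information potential."""
    return -np.log(information_potential(samples, sigma))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(500, 1))
    print(renyi_quadratic_entropy(data, sigma=0.5))

Maximising or minimising such an entropy estimate with respect to the parameters of a mapper is the kind of training objective the framework describes.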

Journal: IEEE Trans. Acoustics, Speech, and Signal Processing 1985
Mati Wax Thomas Kailath

2005
Kirill Chernyshov

The aim of the paper is to present a conceptual approach to the identification of nonlinear stochastic systems based on information measures of dependence. An identification problem statement using an information criterion is proposed under rather general conditions. It is based on a parameterized description of the system model under study, combined with a corresponding method of est...
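A generic example of such a dependence measure (not the specific criterion proposed in the paper) is the mutual information between a system's input and output; the sketch below gives a simple histogram plug-in estimate. All names and parameter choices here are illustrative assumptions.

import numpy as np

def mutual_information_hist(x, y, bins=16):
    """Plug-in estimate of I(X; Y) in nats from paired scalar samples,
    using a 2-D histogram. This is only a generic dependence measure
    for illustration, not the paper's criterion."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()                 # joint probabilities
    p_x = p_xy.sum(axis=1, keepdims=True)      # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)      # marginal of Y
    nz = p_xy > 0                              # avoid log(0)
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    u = rng.normal(size=5000)                       # system input
    v = np.tanh(u) + 0.1 * rng.normal(size=5000)    # nonlinear system output
    print(mutual_information_hist(u, v))            # clearly positive
    print(mutual_information_hist(u, rng.permutation(v)))  # near zero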

1995
J. H. van Schuppen

Attention is focused in this paper on the approximation problem of system identification with information theoretic criteria. For a class of problems it is shown that the criterion of mutual information rate is identical to the criterion of exponential-of-quadratic cost and to H∞ entropy. In addition, the relation between the likelihood function and divergence is explored. As a consequence of t...
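A standard form of the likelihood/divergence relation alluded to here: for a true density p and a model density q, the expected log-likelihood ratio is the Kullback-Leibler divergence,

$$D(p\,\|\,q) = \mathbb{E}_p\!\left[\ln\frac{p(X)}{q(X)}\right] = \mathbb{E}_p[\ln p(X)] - \mathbb{E}_p[\ln q(X)],$$

so maximising the expected model log-likelihood $\mathbb{E}_p[\ln q(X)]$ is equivalent to minimising $D(p\,\|\,q)$.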

2005
G. Celeux F. Forbes

The deviance information criterion (DIC) introduced by Spiegelhalter et al. (2002) for model assessment and model comparison is directly inspired by linear and generalised linear models, but it is open to different possible variations in the setting of missing data models, depending in particular on whether or not the missing variables are treated as parameters. In this paper, we reassess the c...
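For context, the DIC of Spiegelhalter et al. (2002) is built from the deviance $D(\theta) = -2\ln p(y\mid\theta)$ (up to a standardising constant), with posterior expectations taken over $\theta$:

$$p_D = \overline{D(\theta)} - D(\bar\theta), \qquad \mathrm{DIC} = \overline{D(\theta)} + p_D = D(\bar\theta) + 2p_D,$$

where $\overline{D(\theta)}$ is the posterior mean deviance and $\bar\theta$ the posterior mean of the parameters. The ambiguity the abstract points to is which quantities play the role of $\theta$ when the model contains missing variables.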
