Search results for: divergence time estimation

Number of results: 2136009

Journal: Mathematics 2021

Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for estimates of the Kullback–Leibler divergence between two probability measures in Rd that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. The estimates are based on certain k-nearest-neighbor statistics of a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results also lies in treating ...
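
The abstract above concerns k-nearest-neighbor estimation of the Kullback–Leibler divergence from two i.i.d. samples. Below is a minimal sketch of one well-known estimator of this kind (in the spirit of Wang–Kulkarni–Verdú), not the paper's exact construction; the sample sizes, dimension, and distributions are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=5):
    """Estimate D(P||Q) from x ~ P (shape (n, d)) and y ~ Q (shape (m, d))."""
    n, d = x.shape
    m, _ = y.shape
    # Distance from each x_i to its k-th nearest neighbor within x (index 0 is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # Distance from each x_i to its k-th nearest neighbor within y.
    nu = cKDTree(y).query(x, k=k)[0][:, k - 1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, size=(2000, 2))    # P = N(0, I)
q = rng.normal(0.5, 1.0, size=(2000, 2))    # Q = N((0.5, 0.5), I)
print(knn_kl_divergence(p, q, k=5))          # true divergence is 0.25 here
```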

2012
Gholamhossein Yari, Alireza Mirhabibi, Abolfazl Saghafi

Recently, a new entropy-based divergence measure has been introduced which is much like the Kullback-Leibler divergence. It measures the distance between an empirical and a prescribed survival function and is much easier to compute for continuous distributions than the K-L divergence. In this paper we show that this distance converges to zero with increasing sample size, and we apply it to...
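
For illustration, here is a minimal numerical sketch of a survival-function analogue of the K-L divergence, assuming the cumulative form ∫ [F̄ log(F̄/Ḡ) − (F̄ − Ḡ)] dx between an empirical survival function F̄ and a prescribed one Ḡ; the measure studied in the paper may differ in details, and the grid, sample, and model below are illustrative.

```python
import numpy as np

def cumulative_kl(sample, survival_g, grid):
    """Divergence between the empirical survival function of `sample` and
    a prescribed survival function `survival_g`, evaluated on a uniform grid."""
    sample_sorted = np.sort(sample)
    fbar = 1.0 - np.searchsorted(sample_sorted, grid, side="right") / sample.size
    gbar = survival_g(grid)
    safe = np.where(fbar > 0, fbar, 1.0)        # avoid log(0); those terms are masked below
    integrand = np.where(fbar > 0, fbar * np.log(safe / gbar), 0.0) - (fbar - gbar)
    return integrand.sum() * (grid[1] - grid[0])

rng = np.random.default_rng(1)
data = rng.exponential(scale=1.0, size=5000)                  # sample from Exp(1)
grid = np.linspace(0.0, 10.0, 2001)
print(cumulative_kl(data, lambda x: np.exp(-x), grid))         # near 0 for the correct model
```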

2012
Alessandra Martins Coelho, Vania V. Estrela, Felipe P. do Carmo, Sandro R. Fernandes

This work addresses the problem of error concealment in video transmission systems over noisy channels, employing Bregman divergences along with regularization. Error concealment aims to mitigate the effects of disturbances at the receiver due to bit errors or cell loss in packet networks. Bregman regularization gives accurate answers after just a few iterations, with fast convergence, better a...
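
As a point of reference for the divergence family named above, here is a minimal sketch of the Bregman divergence D_phi(p, q) = phi(p) − phi(q) − ⟨∇phi(q), p − q⟩ for two classic generators; it illustrates the divergence itself only, not the error-concealment or regularization scheme of the paper.

```python
import numpy as np

def bregman(p, q, phi, grad_phi):
    """Bregman divergence induced by the convex generator `phi`."""
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

# phi(x) = ||x||^2 yields the squared Euclidean distance.
print(bregman(p, q, lambda x: np.dot(x, x), lambda x: 2 * x))
# phi(x) = sum x log x yields the (generalized) Kullback–Leibler divergence.
print(bregman(p, q, lambda x: np.sum(x * np.log(x)), lambda x: np.log(x) + 1))
```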

2013
Gerard Pons-Moll, Jonathan Taylor, Jamie Shotton, Aaron Hertzmann, Andrew W. Fitzgibbon

We present a new method for inferring dense data-to-model correspondences, focusing on the application of human pose estimation from depth images. Recent work proposed the use of regression forests to quickly predict correspondences between depth pixels and points on a 3D human mesh model. That work, however, used a proxy forest training objective based on the classification of depth pixels to ...
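
For orientation only, here is a toy sketch of the baseline idea the abstract starts from: a regression forest that maps per-pixel depth features to 3D coordinates on a body model. The synthetic features, linear ground truth, and hyperparameters below are placeholders, not the paper's data or training objective.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
features = rng.normal(size=(4000, 16))                           # stand-in depth features
mixing = rng.normal(size=(16, 3))
targets = features @ mixing + 0.1 * rng.normal(size=(4000, 3))   # stand-in 3D correspondences

forest = RandomForestRegressor(n_estimators=50, max_depth=12, random_state=0)
forest.fit(features, targets)                                    # multi-output regression
print(forest.predict(features[:3]))                              # predicted model coordinates
```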

Journal: Entropy 2014
Shinto Eguchi, Osamu Komori, Atsumi Ohara

We discuss a special class of generalized divergence measures defined by the use of generator functions. Any divergence measure in the class separates into the difference between a cross entropy and a diagonal entropy. The diagonal entropy measure in the class is associated with a model of maximum-entropy distributions; the divergence measure leads to statistical estimation via minimization, for arbitrarily giving...
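
The "cross minus diagonal entropy" decomposition can be illustrated numerically for the simplest member of such a class, the Kullback–Leibler divergence, where D(f, g) = C(f, g) − C(f, f) with cross entropy C(f, g) = −∫ f log g; the densities and grid below are illustrative, and the paper's generator-function class is of course broader.

```python
import numpy as np
from scipy.stats import norm

grid = np.linspace(-8, 8, 4001)
dx = grid[1] - grid[0]
f = norm.pdf(grid, loc=0.0, scale=1.0)
g = norm.pdf(grid, loc=1.0, scale=1.5)

cross = -np.sum(f * np.log(g)) * dx      # cross entropy C(f, g)
diag = -np.sum(f * np.log(f)) * dx       # diagonal (Shannon) entropy C(f, f)
print(cross - diag)                       # KL(f || g), about 0.35 here
```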

2006
Young Kyung Lee, Byeong U. Park

Motivated by the bandwidth selection problem in local likelihood density estimation and by the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...
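
A minimal sketch of the connection the abstract relies on, assuming a known true density: kernel density estimates at several bandwidths are scored by their Kullback–Leibler divergence from the truth, the quantity whose estimation the paper studies. The Gaussian kernel, sample size, and bandwidth grid below are illustrative, not the paper's local-likelihood setting.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(3)
sample = rng.normal(size=400)
grid = np.linspace(-5, 5, 2001)
dx = grid[1] - grid[0]
truth = norm.pdf(grid)

for bw in (0.1, 0.3, 0.5, 1.0):
    est = np.maximum(gaussian_kde(sample, bw_method=bw)(grid), 1e-300)  # guard against log(0)
    kl = np.sum(truth * np.log(truth / est)) * dx                       # D(true || estimate)
    print(f"bandwidth factor {bw}: KL = {kl:.4f}")
```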

Journal: Entropy 2014
Adom Giffin, Renaldas Urniezius

In this paper, we continue our efforts to show how maximum relative entropy (MrE) can be used as a universal updating algorithm. Here, our purpose is to tackle a joint state and parameter estimation problem where our system is nonlinear and in a non-equilibrium state, i.e., perturbed by varying external forces. Traditional parameter estimation can be performed by using filters, such as the exte...
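
Maximum relative entropy updating is known to recover Bayes' rule as a special case, so the flavor of a joint state-and-parameter update can be shown with a toy grid-based posterior. The linear-Gaussian measurement model and the grids below are placeholders, not the nonlinear, non-equilibrium system treated in the paper.

```python
import numpy as np
from scipy.stats import norm

states = np.linspace(-3.0, 3.0, 121)           # candidate state values
scales = np.linspace(0.2, 2.0, 60)             # candidate noise-scale parameters
prior = np.full((states.size, scales.size), 1.0 / (states.size * scales.size))

y_observed = 1.3                               # a single noisy measurement of the state
likelihood = norm.pdf(y_observed, loc=states[:, None], scale=scales[None, :])
posterior = prior * likelihood
posterior /= posterior.sum()

# Posterior means of the state and of the noise parameter after the joint update.
print((posterior.sum(axis=1) * states).sum(), (posterior.sum(axis=0) * scales).sum())
```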

Journal: IEEE Trans. Information Theory 1999
Paul P. B. Eggermont, Vincent N. LaRiccia

In the random sampling setting we estimate the entropy of a probability density by the entropy of a kernel density estimator using the double exponential kernel. Under mild smoothness and moment conditions we show that the entropy of the kernel density estimator equals a sum of independent and identically distributed (i.i.d.) random variables plus a perturbation which is asymptotic...
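
A minimal sketch of the plug-in idea the abstract describes: build a kernel density estimate with the double exponential (Laplace) kernel and take the entropy of that estimate. The bandwidth, sample, and numerical integration below are illustrative, not the paper's construction or asymptotic analysis.

```python
import numpy as np

def laplace_kde(points, sample, h):
    """Kernel density estimate with the double exponential kernel (1/(2h)) exp(-|u|/h)."""
    u = np.abs(points[:, None] - sample[None, :]) / h
    return np.exp(-u).mean(axis=1) / (2 * h)

rng = np.random.default_rng(5)
x = rng.normal(size=2000)                       # sample from N(0, 1)
grid = np.linspace(-8, 8, 2001)
dx = grid[1] - grid[0]

fhat = laplace_kde(grid, x, h=0.3)
entropy_of_kde = -np.sum(fhat * np.log(fhat)) * dx   # entropy of the estimator itself
print(entropy_of_kde)                                 # N(0,1) entropy is about 1.419
```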

2010
Yusuf Kenan Yilmaz, Ali Taylan Cemgil

We develop a probabilistic framework for multiway analysis of high dimensional datasets. By exploiting a link between graphical models and tensor factorization models we can realize any arbitrary tensor factorization structure, and many popular models such as CP or TUCKER models with Euclidean error and their non-negative variants with KL error appear as special cases. Due to the duality betwee...
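
One of the special cases mentioned above, non-negative matrix factorization with the (generalized) KL error, can be sketched with the classic multiplicative updates; the framework's tensor and graphical-model machinery is not reproduced here, and the shapes, rank, and iteration count below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
V = rng.random((30, 20))                        # non-negative data matrix
r = 5                                           # factorization rank
W = rng.random((30, r))
H = rng.random((r, 20))

# Multiplicative updates that decrease the generalized KL error D(V || WH).
for _ in range(200):
    H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]
    W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)[None, :]

WH = W @ H
print(np.sum(V * np.log(V / WH) - V + WH))      # generalized KL divergence D(V || WH)
```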
