Search results for: divergence analysis

Number of results: 2,858,177

2010
Jigang Sun, Colin Fyfe, Malcolm K. Crowe

Curvilinear Component Analysis (CCA) is an interesting flavour of multidimensional scaling. In this paper, one version of CCA is proved to be related to the mapping found by a specific Bregman divergence; its stress function is redefined in light of this insight, and its parameter (the neighbourhood radius) is given an interpretation.
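
For context (a standard definition, not quoted from the paper): the Bregman divergence generated by a strictly convex, differentiable function φ is

```latex
D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y), \, x - y \rangle
% Example: phi(x) = ||x||^2 recovers the squared Euclidean distance ||x - y||^2.
```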

2016
Chaoyue Liu, Mikhail Belkin

Clustering, in particular k-means clustering, is a central topic in data analysis. Clustering with Bregman divergences is a recently proposed generalization of k-means clustering which has already been widely used in applications. In this paper we analyze theoretical properties of Bregman clustering when the number of clusters k is large. We establish quantization rates and describe the lim...
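
A minimal sketch of what Lloyd-style clustering with a Bregman divergence looks like, assuming the generalized KL divergence as the generator; the function names and defaults below are illustrative, not from the paper:

```python
import numpy as np

def kl_divergence(x, c):
    """Generalized KL divergence between strictly positive vectors x and c."""
    return np.sum(x * np.log(x / c) - x + c, axis=-1)

def bregman_kmeans(X, k, n_iter=50, seed=None):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: nearest center under the chosen divergence.
        dists = np.stack([kl_divergence(X, c) for c in centers], axis=1)
        labels = dists.argmin(axis=1)
        # Update step: for ANY Bregman divergence, the optimal center of a
        # cluster is its plain arithmetic mean (Banerjee et al., 2005).
        centers = np.stack([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

# Toy usage on strictly positive data (required by the KL generator).
X = np.random.default_rng(0).random((200, 5)) + 0.1
labels, centers = bregman_kmeans(X, k=3, seed=0)
```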

2008
Alessandro De Gregorio, Stefano M. Iacus

In this paper we propose the use of φ-divergences as test statistics to verify simple hypotheses about a one-dimensional parametric diffusion process dX_t = b(X_t, θ)dt + σ(X_t, θ)dW_t, from discrete observations {X_{t_i}, i = 0, 1, . . . , n} with t_i = iΔ_n, under the asymptotic scheme Δ_n → 0, nΔ_n → ∞ and nΔ_n² → 0. The class of φ-divergences is wide and includes several special member...
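
For reference (the standard definition of the class, not quoted from the paper): for a convex function φ with φ(1) = 0, the φ-divergence between measures P and Q is

```latex
D_\phi(P \,\|\, Q) = \int \phi\!\left(\frac{dP}{dQ}\right) dQ
% phi(u) = u*log(u)        -> Kullback-Leibler divergence
% phi(u) = (sqrt(u) - 1)^2 -> squared Hellinger distance
% phi(u) = |u - 1|         -> twice the total variation distance
```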

2012
Mary Gardiner, Mark Dras

This paper looks at the problem of valence shifting: rewriting a text to preserve much of its meaning while altering its sentiment characteristics. There has been only a small amount of previous work on the task, which appears to be more difficult than researchers anticipated, not least in securing agreement between human judges on whether a text had indeed had its valence shifted in the intended dire...

2012
Matus Telgarsky, Sanjoy Dasgupta

This manuscript develops the theory of agglomerative clustering with Bregman divergences. Geometric smoothing techniques are developed to deal with degenerate clusters. To allow for cluster models based on exponential families with overcomplete representations, Bregman divergences are developed for nondifferentiable convex functions.
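
Not the paper's algorithm, but a sketch of the agglomerative primitive it builds on: under a Bregman divergence, the cost of merging two clusters (the increase in total within-cluster divergence) depends only on their sizes and means. Names below are illustrative:

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) for a differentiable convex phi."""
    return phi(x) - phi(y) - grad_phi(y) @ (x - y)

def merge_cost(phi, grad_phi, n_a, mu_a, n_b, mu_b):
    # Mean of the merged cluster; by the Bregman bias-variance decomposition,
    # the increase in total divergence is the size-weighted divergence of
    # each old mean to the new one.
    mu = (n_a * mu_a + n_b * mu_b) / (n_a + n_b)
    return (n_a * bregman(phi, grad_phi, mu_a, mu)
            + n_b * bregman(phi, grad_phi, mu_b, mu))

# phi(x) = ||x||^2 recovers Ward's linkage (squared Euclidean case).
phi = lambda x: x @ x
grad_phi = lambda x: 2 * x
cost = merge_cost(phi, grad_phi, 3, np.array([0., 0.]), 5, np.array([1., 1.]))
```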

Journal: Entropy, 2014
Takafumi Kanamori, Masashi Sugiyama

Estimating a discrepancy between two probability distributions from samples is an important task in statistics and machine learning. There are two main classes of discrepancy measures: distance measures based on the density difference, such as the Lp-distances, and divergence measures based on the density ratio, such as the φ-divergences. The intersection of these two classes is the L1-distan...
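
The intersection claim can be made concrete (a standard identity, added here for orientation): writing r(x) = p(x)/q(x) for the density ratio,

```latex
\int \lvert p(x) - q(x) \rvert \, dx \;=\; \int \lvert r(x) - 1 \rvert \, q(x) \, dx
% The left side is a density-difference (L1) distance; the right side is the
% phi-divergence with phi(u) = |u - 1|. The L1-distance thus lies in both classes.
```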

Journal: :CoRR 2014
Mark D. Reid, Rafael M. Frongillo, Robert C. Williamson

Mixability of a loss is known to characterise when constant regret bounds are achievable in games of prediction with expert advice through the use of the aggregating algorithm [Vovk, 2001]. We provide a new interpretation of mixability via convex analysis that highlights the role of the Kullback-Leibler divergence in its definition. This naturally generalises to what we call Φ-mixability where ...
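
For orientation, the usual form of Vovk's definition (paraphrased, not quoted from this paper): a loss ℓ is η-mixable if for every distribution π over expert predictions p_i there exists a single prediction p such that, for all outcomes y,

```latex
\ell(y, p) \;\le\; -\tfrac{1}{\eta} \log \sum_i \pi_i \, e^{-\eta \, \ell(y, p_i)}
% The right-hand side is the aggregating algorithm's mixed loss; the
% log-sum-exp structure is where the Kullback-Leibler divergence enters.
```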

Journal: SIAM J. Matrix Analysis Applications, 2007
Inderjit S. Dhillon, Joel A. Tropp

This paper discusses a new class of matrix nearness problems that measure approximation error using a directed distance measure called a Bregman divergence. Bregman divergences offer an important generalization of the squared Frobenius norm and relative entropy, and they all share fundamental geometric properties. In addition, these divergences are intimately connected with exponential families...
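
For concreteness (standard examples of the class, not quoted from the paper): a matrix Bregman divergence generated by a strictly convex φ is

```latex
D_\phi(X, Y) = \phi(X) - \phi(Y) - \langle \nabla\phi(Y), \, X - Y \rangle
% phi(X) = ||X||_F^2        ->  D(X, Y) = ||X - Y||_F^2
% phi(X) = tr(X log X - X)  ->  D(X, Y) = tr(X log X - X log Y - X + Y)
%                               (the von Neumann, or quantum relative, entropy)
```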

Laser diode beam divergence is the main parameter for beam shaping and fiber-optic coupling. Increasing the waveguide layer thickness is the conventional method of decreasing the beam divergence. In this paper, a broadened asymmetric waveguide is introduced to decrease the divergence without increasing the optical power. The asymmetric waveguide was used to shift the vertical optical field to n...
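
The underlying optics can be summarized by the Gaussian-beam relation (a general fact, not taken from this paper): the far-field half-angle divergence θ scales inversely with the near-field mode radius w_0,

```latex
\theta \;\approx\; \frac{\lambda}{\pi w_0}
% lambda: emission wavelength; w_0: vertical mode radius. A thicker or
% broadened waveguide enlarges w_0, which is why it reduces the divergence.
```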
