Search results for: kullback criterion
Number of results: 34836
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two probability distributions. Although difficult to understand by examining the equation, an intuition and understanding of the KL divergence arises from its intimate relationship with likelihood theory. We discuss how KL divergence arises from likelihood theory in an attempt t...
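For a concrete sense of what the abstract means by "quantifies the proximity of two probability distributions," here is a minimal sketch of the discrete KL divergence; the function name and example distributions are illustrative, not from the paper:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    p and q are probability vectors over the same support (each sums to 1).
    Terms where p[i] == 0 contribute 0 by the usual 0 * log 0 convention.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, p))  # → 0.0 (a distribution is at zero divergence from itself)
```

Note that D(p || q) and D(q || p) generally differ: KL divergence is not a metric, which is part of why the likelihood-based intuition the abstract mentions is helpful.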
In this paper we study distributionally robust optimization (DRO) problems where the ambiguity set of the probability distribution is defined by the Kullback-Leibler (KL) divergence. We consider DRO problems where the ambiguity is in the objective function, which takes the form of an expectation, and show that the resulting minimax DRO problems can be formulated as a one-layer convex minimization ...
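One well-known route to such a one-dimensional convex reformulation (a standard duality result for KL ambiguity sets, not necessarily the exact formulation in this paper) is the dual min over λ > 0 of λ·log E_P[exp(f/λ)] + λρ for the worst case over {Q : KL(Q‖P) ≤ ρ}. A sketch over an empirical distribution, using a simple grid search in place of a proper convex solver:

```python
import numpy as np

def kl_dro_worst_case(losses, rho):
    """Approximate worst-case expected loss over {Q : KL(Q || P_emp) <= rho}
    via the one-dimensional dual  min_{lam > 0} lam*log E_P[exp(loss/lam)] + lam*rho.
    The grid search is a crude illustration, not a production solver.
    """
    losses = np.asarray(losses, dtype=float)
    shift = losses.max()  # stabilize the log-sum-exp

    def dual(lam):
        z = (losses - shift) / lam
        return lam * np.log(np.mean(np.exp(z))) + shift + lam * rho

    lams = np.logspace(-6, 6, 2000)
    return min(dual(lam) for lam in lams)
```

The dual value always sits between the empirical mean (at ρ = 0) and the maximum loss (as ρ grows), which is a useful sanity check on any implementation.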
Let X | μ ∼ N_p(μ, v_x I) and Y | μ ∼ N_p(μ, v_y I) be independent p-dimensional multivariate normal vectors with common unknown mean μ, and let p(x|μ) and p(y|μ) denote the conditional densities of X and Y. Based on only observing X = x, we consider the problem of obtaining a predictive distribution p̂(y|x) for Y that is close to p(y|μ) as measured by Kullback-Leibler loss. The natural straw man ...
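For isotropic normals like those above, the KL loss between two densities has a simple closed form, which is what makes this measure tractable in such problems. A sketch (the function name is illustrative; the formula is the standard KL divergence between N_p(μ₀, v₀I) and N_p(μ₁, v₁I)):

```python
import numpy as np

def kl_gaussian_isotropic(mu0, v0, mu1, v1):
    """KL( N_p(mu0, v0*I) || N_p(mu1, v1*I) ), closed form:
        (p/2) * (v0/v1 - 1 - log(v0/v1)) + ||mu0 - mu1||^2 / (2*v1)
    """
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    p = mu0.size
    r = v0 / v1
    return 0.5 * p * (r - 1.0 - np.log(r)) + np.sum((mu0 - mu1) ** 2) / (2.0 * v1)
```

The divergence vanishes only when both the means and the variances agree, and grows quadratically in the mean separation, so it directly penalizes a predictive density centered at the wrong place.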
Accelerated algorithms for maximum likelihood image reconstruction are essential for emerging applications such as 3D tomography, dynamic tomographic imaging, and other high-dimensional inverse problems. In this paper, we introduce and analyze a class of fast and stable sequential optimization methods for computing maximum likelihood estimates and study their convergence properties. These methods...
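The classical baseline that accelerated methods of this kind are typically measured against is the MLEM (maximum-likelihood expectation-maximization) iteration for the Poisson model y ~ Poisson(Ax). A minimal sketch, assuming a dense system matrix for illustration (the paper's own methods are not shown here):

```python
import numpy as np

def mlem(A, y, n_iters=50, x0=None):
    """Plain MLEM iterations for Poisson maximum-likelihood reconstruction:
        x <- (x / A^T 1) * A^T( y / (A x) )
    Multiplicative updates keep x nonnegative; convergence is notoriously
    slow, which motivates the accelerated methods discussed above.
    """
    m, n = A.shape
    x = np.ones(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    sens = A.T @ np.ones(m)  # sensitivity image, the column sums A^T 1
    for _ in range(n_iters):
        ratio = y / np.maximum(A @ x, 1e-12)  # guard against division by zero
        x = x / sens * (A.T @ ratio)
    return x
```

On noiseless data the iterates converge to an exact solution of y = Ax, but only linearly, so each order of magnitude of extra accuracy costs a fixed number of further passes over the data.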