Search results for: log convex function

Number of results: 1,314,863

Journal: Cambridge Journal of Mathematics, 2021

We prove that the Robin ground state and the torsion function are, respectively, log-concave and $\frac{1}{2}$-concave on a uniformly convex domain $\Omega\subset \mathbb{R}^N$ of class $\mathcal{C}^m$, with $[m -\frac{N}{2}]\geq 4$, provided the Robin parameter exceeds a critical threshold. This threshold depends on $N$, $m$, and the geometry of $\Omega$, more precisely on the diameter and on the boundary curvatures up to order $m$.
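
As a reminder of the standard terminology used in this abstract, for a positive function $u$ on $\Omega$:
$$
u \text{ is log-concave} \iff \log u \text{ is concave},
\qquad
u \text{ is } \tfrac{1}{2}\text{-concave} \iff \sqrt{u} \text{ is concave},
$$
and $\tfrac{1}{2}$-concavity implies log-concavity, since $\log u = 2\log\sqrt{u}$ and the logarithm of a positive concave function is concave.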

2014
Xiaoqin Zhang, Zhengyuan Zhou, Di Wang, Yi Ma

In this paper, we study the low-rank tensor completion problem, where a high-order tensor with missing entries is given and the goal is to complete the tensor. We propose to minimize a new convex objective function, based on the log-sum of exponentials of nuclear norms, that promotes low-rankness of the unfolding matrices of the completed tensor. We show for the first time that the proximal operato...
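
The paper's exact objective is not reproduced in this truncated abstract; purely as an illustrative sketch, a log-sum-exp penalty over the nuclear norms of the mode unfoldings of a tensor can be computed as below (function and variable names are hypothetical, not the authors').

```python
import numpy as np

def mode_unfold(tensor, mode):
    """Mode-k unfolding: bring axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def log_sum_exp_nuclear(tensor):
    """log(sum_k exp(||X_(k)||_*)) over all mode unfoldings of `tensor`.

    Illustrative only: the paper's actual objective and its proximal
    operator are not given in the truncated abstract above.
    """
    nuclear_norms = [np.linalg.norm(mode_unfold(tensor, k), ord="nuc")
                     for k in range(tensor.ndim)]
    m = max(nuclear_norms)  # shift for numerical stability
    return m + np.log(sum(np.exp(v - m) for v in nuclear_norms))

# Example: a random 3rd-order tensor
X = np.random.rand(4, 5, 6)
print(log_sum_exp_nuclear(X))
```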

2014
Daniel Khashabi

Variational principle for probabilistic learning; Yet another justification; More simplification of updates for mean-field family; Examples; Dirichlet Process Mixture; On minimization of divergence measures; Energy minimization justifications; Variational learning with exponential family; Mean parametrization and marginal polytopes; Convex dualities; The log-partition function and conjugate duality; Belie...
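
The "log-partition function and conjugate duality" topic listed above refers to the standard convex-analytic fact for exponential families: with sufficient statistics $\phi(x)$ and natural parameter $\theta$,
$$
A(\theta) = \log \int \exp\big(\langle \theta, \phi(x)\rangle\big)\, \nu(dx),
\qquad
A^{*}(\mu) = \sup_{\theta}\ \big\{\langle \theta, \mu\rangle - A(\theta)\big\},
$$
where $A$ is convex, $\nabla A(\theta) = \mathbb{E}_{\theta}[\phi(x)]$ is the mean parameter, and the conjugate $A^{*}$ equals the negative entropy on the marginal polytope. This duality underlies the mean-parametrization view of variational inference sketched in these notes.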

2008
Gábor Fejes Tóth

A classical theorem of Rogers states that for any convex body K in n-dimensional Euclidean space there exists a covering of the space by translates of K with density not exceeding n log n + n log log n + 5n. Rogers’ theorem does not say anything about the structure of such a covering. We show that for sufficiently large values of n the same bound can be attained by a covering which is the union of ...
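
For context, one standard way to define the density of a covering of $\mathbb{R}^n$ by translates $K + x_i$ is the asymptotic volume ratio
$$
d \;=\; \limsup_{r\to\infty}\ \frac{\sum_{i} \mathrm{vol}\big((K+x_i)\cap B_r\big)}{\mathrm{vol}(B_r)} \;\ge\; 1,
$$
so Rogers' bound says the translates can be chosen to overlap, on average, by a factor of at most about $n\log n$.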

Journal: Journal of Machine Learning Research, 2008
Michael Collins, Amir Globerson, Terry Koo, Xavier Carreras, Peter L. Bartlett

Log-linear and maximum-margin models are two commonly-used methods in supervised machine learning, and are frequently used in structured prediction problems. Efficient learning of parameters in these models is therefore an important problem, and becomes a key factor when learning from very large data sets. This paper describes exponentiated gradient (EG) algorithms for training such models, whe...
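
The paper's specific EG updates for structured log-linear and max-margin models are not included in this truncated abstract; the sketch below only shows the generic exponentiated-gradient step on the probability simplex that such algorithms build on (names and the toy objective are illustrative).

```python
import numpy as np

def eg_step(alpha, grad, eta):
    """One exponentiated-gradient (EG) step on the probability simplex.

    alpha : current point, a probability vector (non-negative, sums to 1)
    grad  : gradient of the objective at alpha
    eta   : step size
    """
    unnormalized = alpha * np.exp(-eta * grad)   # multiplicative update
    return unnormalized / unnormalized.sum()     # renormalize onto the simplex

# Example: minimize <c, alpha> over the simplex
c = np.array([3.0, 1.0, 2.0])
alpha = np.full(3, 1.0 / 3.0)
for _ in range(200):
    alpha = eg_step(alpha, c, eta=0.5)
print(alpha)   # mass concentrates on the coordinate with the smallest cost
```

Because the update is multiplicative followed by renormalization, the iterates stay in the simplex without an explicit projection, which is what makes EG attractive for dual formulations of these models.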

Journal: CoRR, 2015
Jérémy Barbay, Carlos Ochoa, Pablo Pérez-Lantero

Divide-and-conquer is a central paradigm for the design of algorithms, through which some fundamental computational problems, such as sorting arrays and computing convex hulls, are solved in optimal time within Θ(n log n) in the worst case over instances of size n. A finer analysis of those problems yields complexities within O(n(1 + H(n_1, ..., n_k))) ⊆ O(n(1 + log k)) ⊆ O(n log n) in the worst ...
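
In this style of adaptive analysis, $H(n_1,\dots,n_k)$ usually denotes the entropy of the partition of the $n$ input elements into $k$ "easy" pieces of sizes $n_1,\dots,n_k$ (e.g. sorted runs), which explains the chain of inclusions:
$$
H(n_1,\dots,n_k) \;=\; \sum_{i=1}^{k} \frac{n_i}{n}\log\frac{n}{n_i} \;\le\; \log k \;\le\; \log n,
\qquad n = \sum_{i=1}^{k} n_i .
$$
The entropy is maximized, at $\log k$, when the pieces have equal size, and $k \le n$, so $O(n(1+H)) \subseteq O(n(1+\log k)) \subseteq O(n\log n)$.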

2006
Imre Csiszár

The notion of generalized maximum likelihood estimate for finite dimensional canonically convex exponential families, studied in detail in previous works of the authors, is extended to an infinite dimensional setting. Existence of the estimate when a generalized log-likelihood function is bounded above, and a continuity property are established. Related literature and examples are discussed. MS...
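
In the familiar finite-dimensional setting the paper starts from, the objects in question are (sketching only the standard definitions, not the paper's infinite-dimensional extension):
$$
p_{\theta}(x) = \exp\big(\langle \theta, T(x)\rangle - A(\theta)\big),
\qquad
A(\theta) = \log \int \exp\big(\langle \theta, T(x)\rangle\big)\, \mu(dx),
$$
and the log-likelihood $\ell(\theta) = \langle \theta, T(x)\rangle - A(\theta)$ is concave in $\theta$ because $A$ is convex. Roughly speaking, the generalized maximum likelihood estimate is designed for the case where the supremum of $\ell$ is finite but not attained within the family.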

Journal: CoRR, 2013
Tianbao Yang, Lijun Zhang

We motivate this study from a recent work on a stochastic gradient descent (SGD) method with only one projection (Mahdavi et al., 2012), which aims at alleviating the computational bottleneck of the standard SGD method in performing the projection at each iteration, and enjoys an O(log T/T) convergence rate for strongly convex optimization. In this paper, we make further contributions along th...
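
The algorithmic details of Mahdavi et al. (2012) and of this paper are not in the truncated abstract. The sketch below only conveys the "project once" idea for strongly convex SGD with the 1/(λt) step size behind O(log T/T)-type rates; the real method additionally penalizes constraint violation to keep the unprojected iterates near the feasible set, and all names and the toy example here are illustrative.

```python
import numpy as np

def sgd_one_projection(grad_fn, project, w0, lam, T):
    """Strongly convex SGD that defers the (expensive) projection to the end.

    grad_fn(w, t) -> a stochastic gradient at w on round t
    project(w)    -> Euclidean projection onto the feasible set
    lam           -> strong-convexity modulus (sets the 1/(lam*t) step size)
    """
    w = np.array(w0, dtype=float)
    avg = np.zeros_like(w)
    for t in range(1, T + 1):
        w = w - grad_fn(w, t) / (lam * t)  # unconstrained SGD step
        avg += (w - avg) / t               # running average of the iterates
    return project(avg)                    # the single projection

# Example: minimize E[0.5*||w - z||^2] with noisy z, over the unit ball
rng = np.random.default_rng(0)
grad = lambda w, t: w - (np.array([2.0, 0.0]) + 0.1 * rng.standard_normal(2))
proj = lambda w: w / max(1.0, np.linalg.norm(w))
print(sgd_one_projection(grad, proj, np.zeros(2), lam=1.0, T=2000))
```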

2015
Sébastien Bubeck, Ronen Eldan

We prove that the Fenchel dual of the log-Laplace transform of the uniform measure on a convex body in Rn is a (1 + o(1))n-self-concordant barrier. This gives the first construction of a universal barrier for convex bodies with optimal self-concordance parameter. The proof is based on basic geometry of log-concave distributions, and elementary duality in exponential families.
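
Concretely, for a convex body $K \subset \mathbb{R}^n$, the functions involved are
$$
f(\theta) = \log \int_{K} e^{\langle \theta, x\rangle}\, dx,
\qquad
f^{*}(x) = \sup_{\theta \in \mathbb{R}^n}\ \big\{\langle \theta, x\rangle - f(\theta)\big\},
$$
and the result is that $f^{*}$, which is finite on the interior of $K$ and blows up at the boundary, is a $(1+o(1))\,n$-self-concordant barrier for $K$.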
