Search results for: log error loss function

Number of results: 1829298

Journal: CoRR 2015
Alexandre de Brébisson Pascal Vincent

In a multi-class classification problem, it is standard to model the output of a neural network as a categorical distribution conditioned on the inputs. The output must therefore be positive and sum to one, which is traditionally enforced by a softmax. This probabilistic mapping allows the use of the maximum likelihood principle, which leads to the well-known log-softmax loss. However, the choice of...
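
As an illustration of the standard setup described above (and not of the alternative loss the authors go on to propose), here is a minimal NumPy sketch of the softmax mapping and the resulting log-softmax (negative log-likelihood) loss; the names logits and target are placeholders:

import numpy as np

def log_softmax(logits):
    # Subtract the maximum for numerical stability before exponentiating.
    shifted = logits - np.max(logits)
    return shifted - np.log(np.sum(np.exp(shifted)))

def nll_loss(logits, target):
    # Log-softmax loss: negative log-probability assigned to the true class.
    return -log_softmax(logits)[target]

logits = np.array([2.0, -1.0, 0.5])  # raw network outputs for 3 classes
print(nll_loss(logits, target=0))    # loss when class 0 is the true label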

2016
Pascal Germain Francis R. Bach Alexandre Lacoste Simon Lacoste-Julien

We exhibit a strong link between frequentist PAC-Bayesian risk bounds and the Bayesian marginal likelihood. That is, for the negative log-likelihood loss function, we show that the minimization of PAC-Bayesian generalization risk bounds maximizes the Bayesian marginal likelihood. This provides an alternative explanation to the Bayesian Occam’s razor criteria, under the assumption that the data ...
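
For reference, the two quantities the abstract relates can be written in standard notation (these are the usual textbook definitions, not formulas taken from the paper): the negative log-likelihood loss of a hypothesis h on an example (x, y), and the Bayesian marginal likelihood of a dataset D under a prior π over parameters θ:

\ell_{\mathrm{nll}}\bigl(h,(x,y)\bigr) = -\log p(y \mid x, h),
\qquad
p(D) = \int p(D \mid \theta)\,\pi(\theta)\,\mathrm{d}\theta .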

Journal: J. Inf. Sci. Eng. 2004
Chien-Ming Lee Jin-Jang Leou

For entropy-coded MPEG-4 images, a transmission error in a codeword may cause the underlying codeword and its subsequent codewords within a video packet to be misinterpreted, resulting in great degradation of the received MPEG-4 images. Here a transmission error may be a single-bit error or a burst error. In this study, a postprocessing approach to detection and concealment of transmission erro...
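
To make the propagation effect concrete (this is a generic variable-length-code toy example, not the paper's MPEG-4 codec or its concealment method), the following Python sketch flips a single bit in a prefix-coded bitstream and shows that codewords after the error decode differently; the code table is hypothetical:

# Toy prefix (variable-length) code table, similar in spirit to entropy coding.
CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}
DECODE = {v: k for k, v in CODE.items()}

def encode(symbols):
    return "".join(CODE[s] for s in symbols)

def decode(bits):
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in DECODE:          # a complete codeword has been read
            out.append(DECODE[buf])
            buf = ""
    return out

clean = encode("abcdabcd")
corrupt = clean[:3] + ("1" if clean[3] == "0" else "0") + clean[4:]  # flip one bit
print(decode(clean))    # ['a', 'b', 'c', 'd', 'a', 'b', 'c', 'd']
print(decode(corrupt))  # the symbols following the flipped bit are misread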

Journal: CoRR 2014
Quentin F. Stout

We give an algorithm for determining an optimal step function approximation of weighted data, where the error is measured with respect to the L∞ norm. The algorithm takes Θ(n + log n · b(1 + log n/b)) time and Θ(n) space, where b is the number of steps. Thus the time is Θ(n log n) in the worst case and Θ(n) when b = O(n/(log n log log n)). A minor change determines the optimal reduced isotonic re...
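
For orientation only (this is not the paper's Θ(n log n)-worst-case algorithm, just a straightforward O(b·n²) dynamic program for the unweighted case), a Python sketch of optimal b-step approximation under the L∞ norm, where the best constant on a segment is its midrange and the error is half the range:

def best_linf_step_fit(y, b):
    """Minimal L-infinity error of approximating y by a step function with b steps."""
    n = len(y)
    INF = float("inf")

    # seg_err[i][j]: L-inf error of covering y[i..j] with one constant (its midrange).
    seg_err = [[0.0] * n for _ in range(n)]
    for i in range(n):
        lo = hi = y[i]
        for j in range(i, n):
            lo, hi = min(lo, y[j]), max(hi, y[j])
            seg_err[i][j] = (hi - lo) / 2.0

    # dp[k][j]: best error for the prefix y[0..j] using k steps.
    dp = [[INF] * n for _ in range(b + 1)]
    for j in range(n):
        dp[1][j] = seg_err[0][j]
    for k in range(2, b + 1):
        for j in range(n):
            for i in range(1, j + 1):
                dp[k][j] = min(dp[k][j], max(dp[k - 1][i - 1], seg_err[i][j]))
    return dp[b][n - 1]

print(best_linf_step_fit([1.0, 1.2, 5.0, 5.1, 9.0], b=2))  # 2.0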

2011
Michael Auli Adam Lopez

Log-linear parsing models are often trained by optimising likelihood, but we would prefer to optimise for a task-specific metric like F-measure. Softmax-margin is a convex objective for such models that minimises a bound on expected risk for a given loss function, but its naïve application requires the loss to decompose over the predicted structure, which is not true of F-measure. We use softmax...
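
A minimal sketch of the softmax-margin objective over an explicitly enumerated candidate set (real parsers sum over exponentially many structures with dynamic programming; the candidate scores and cost values below are made up for illustration):

import math

def softmax_margin_loss(scores, costs, gold_index):
    # loss = log sum_y exp(score(y) + cost(y, gold)) - score(gold),
    # i.e. a cost-augmented log-partition minus the gold score.
    augmented = [s + c for s, c in zip(scores, costs)]
    m = max(augmented)
    log_z = m + math.log(sum(math.exp(a - m) for a in augmented))
    return log_z - scores[gold_index]

scores = [2.0, 1.5, -0.3]   # model scores for three candidate parses
costs = [0.0, 0.8, 1.0]     # task loss of each candidate w.r.t. the gold parse
print(softmax_margin_loss(scores, costs, gold_index=0))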

2008
Felix Salfner Steffen Tschirpke

Error logs are a fruitful source of information both for diagnosis and for proactive fault handling; however, elaborate data preparation is necessary to filter out valuable pieces of information. In addition to using well-known techniques, we propose three algorithms: (a) assignment of error IDs to error messages based on Levenshtein's edit distance, (b) a clustering approach to g...
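
A rough sketch of idea (a), assigning error IDs by Levenshtein edit distance (the threshold, the greedy grouping strategy, and the sample log lines are assumptions for illustration, not the authors' actual parameters):

def levenshtein(a, b):
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def assign_error_ids(messages, threshold=10):
    # Greedily reuse the ID of the first known message within the threshold.
    known, ids = [], []
    for msg in messages:
        for eid, ref in enumerate(known):
            if levenshtein(msg, ref) <= threshold:
                ids.append(eid)
                break
        else:
            ids.append(len(known))
            known.append(msg)
    return ids

logs = ["disk sda1 read error at sector 42",
        "disk sda1 read error at sector 97",
        "network link eth0 down"]
print(assign_error_ids(logs))   # [0, 0, 1]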

Support vector machine (SVM) is a popular classification technique which classifies data using a max-margin separating hyperplane. The normal vector and bias of this hyperplane are determined by solving a quadratic model, so SVM training amounts to an optimization problem. Among the extensions of SVM, the cost-sensitive scheme refers to a model with multiple costs which conside...
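
As a sketch of the kind of objective a cost-sensitive scheme modifies (this is the generic soft-margin hinge loss with class-dependent costs written directly as a penalized loss; it is not the particular model of this abstract, and C_pos/C_neg are illustrative values):

import numpy as np

def cost_sensitive_svm_objective(w, b, X, y, C_pos=1.0, C_neg=5.0):
    # Regularized hinge loss where the two classes pay different misclassification costs.
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins)
    costs = np.where(y > 0, C_pos, C_neg)
    return 0.5 * np.dot(w, w) + np.sum(costs * hinge)

X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -1.5]])
y = np.array([1, 1, -1])
print(cost_sensitive_svm_objective(np.array([0.5, 0.5]), b=0.0, X=X, y=y))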

2014
Thomas L. Toulias Christos P. Kitsos Mohammad Fraiwan Al-Saleh

This paper introduces, investigates, and discusses the γ-order generalized lognormal distribution (γ-GLD). Under certain values of the extra shape parameter γ, the usual lognormal, log-Laplace, and log-uniform distributions are obtained, as well as the degenerate Dirac distribution. The shape of all the members of the γ-GLD family is extensively discussed. The cumulative distribution function is e...
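
One generic step worth spelling out (a standard change-of-variables fact, not a formula reproduced from the paper): each "log-" member of such a family arises as Y = e^X for an underlying X, so its density follows from

f_Y(y) = \frac{1}{y}\, f_X(\log y), \qquad y > 0,

which is how the lognormal, log-Laplace, and log-uniform special cases correspond to their underlying normal, Laplace, and uniform counterparts.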

Chart: number of search results per year