Similar resources
Alphabetic coding with exponential costs
An alphabetic binary tree formulation applies to problems in which an outcome needs to be determined via alphabetically ordered search prior to the termination of some window of opportunity. Rather than finding a decision tree minimizing ∑_{i=1}^{n} w(i)·l(i), this variant involves minimizing log_a ∑_{i=1}^{n} w(i)·a^{l(i)} for a given a ∈ (0, 1). This note introduces a dynamic programming algorithm that finds...
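As a small sketch of the objective above (the weights and codeword lengths here are illustrative placeholders, not taken from the paper), the exponential cost log_a ∑ w(i)·a^{l(i)} can be evaluated directly:

```python
import math

def exponential_cost(weights, lengths, a):
    """Exponential cost log_a( sum_i w(i) * a**l(i) ) of a code whose
    i-th codeword has length l(i) and weight w(i), for a given base a."""
    total = sum(w * a ** l for w, l in zip(weights, lengths))
    return math.log(total, a)

# Illustrative example: three outcomes with an alphabetic tree of depths 1, 2, 2.
weights = [0.5, 0.3, 0.2]
lengths = [1, 2, 2]
cost = exponential_cost(weights, lengths, a=0.5)
```

The linear cost ∑ w(i)·l(i) is recovered in the limit a → 1, which is why this objective is viewed as a generalization of the usual expected-length criterion.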
Integer Coding with Nonlinear Costs
Let P = {p(i)} be a measure of strictly positive probabilities on the set of nonnegative integers. Although the countable number of inputs prevents usage of the Huffman algorithm, there are nontrivial P for which known methods find a source code that is optimal in the sense of minimizing expected codeword length. For some applications, however, a source code should instead minimize one of a fam...
Query Learning with Exponential Query Costs
In query learning, the goal is to identify an unknown object while minimizing the number of “yes” or “no” questions (queries) posed about that object. A well-studied algorithm for query learning is known as generalized binary search (GBS). We show that GBS is a greedy algorithm to optimize the expected number of queries needed to identify the unknown object. We also generalize GBS in two ways. ...
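A minimal sketch of the greedy step underlying generalized binary search, under the common formulation in which GBS picks, at each round, the query whose yes/no answer splits the remaining probability mass most evenly (all names and the bit-test queries below are illustrative, not from the paper):

```python
def gbs_select(hypotheses, priors, queries):
    """Greedy GBS step: return the query whose 'yes'/'no' partition of the
    remaining hypotheses' probability mass is most balanced."""
    best_query, best_gap = None, float("inf")
    total = sum(priors)
    for q in queries:
        mass_yes = sum(p for h, p in zip(hypotheses, priors) if q(h))
        gap = abs(2 * mass_yes - total)  # |mass_yes - mass_no|
        if gap < best_gap:
            best_query, best_gap = q, gap
    return best_query

# Illustrative example: 8 equally likely objects, queries test one bit each.
hypotheses = list(range(8))
priors = [1 / 8] * 8
queries = [lambda h, b=b: ((h >> b) & 1) == 1 for b in range(3)]
q = gbs_select(hypotheses, priors, queries)
```

Each bit-test query here splits the mass exactly in half, so any of them is a valid greedy choice; with non-uniform priors the selection becomes informative.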
Exponential Family Sparse Coding with Applications to Self-taught Learning
Sparse coding is an unsupervised learning algorithm for finding concise, slightly higher-level representations of inputs, and has been successfully applied to self-taught learning, where the goal is to use unlabeled data to help on a supervised learning task, even if the unlabeled data cannot be associated with the labels of the supervised task [Raina et al., 2007]. However, sparse coding uses ...
Journal
Journal title: Information Processing Letters
Year: 2010
ISSN: 0020-0190
DOI: 10.1016/j.ipl.2009.11.008