Hard or Soft Classification? Large-Margin Unified Machines
Authors
Yufeng Liu, Hao Helen Zhang, Yichao Wu
Abstract
Margin-based classifiers have been popular in both machine learning and statistics for classification problems. Among numerous classifiers, some are hard classifiers while some are soft ones. Soft classifiers explicitly estimate the class conditional probabilities and then perform classification based on estimated probabilities. In contrast, hard classifiers directly target on the classificatio...
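The soft/hard distinction described in the abstract can be illustrated with off-the-shelf tools. The sketch below is a minimal illustration, not the LUM estimator proposed in this paper; the synthetic dataset and parameter settings are assumptions for demonstration only. A soft classifier such as logistic regression exposes class conditional probability estimates, while a hard classifier such as a linear SVM exposes only a signed decision value.

```python
# Minimal sketch: soft vs. hard classification with standard scikit-learn tools.
# This is NOT the LUM method from the paper; dataset and settings are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Soft classifier: logistic regression estimates P(Y = 1 | X = x) explicitly.
soft = LogisticRegression().fit(X, y)
print("P(y=1|x) for 3 points:", soft.predict_proba(X[:3])[:, 1])

# Hard classifier: a linear SVM targets the decision boundary directly; it
# returns a signed score f(x), and the label is sign(f(x)), with no
# probability estimate produced along the way.
hard = LinearSVC(C=1.0).fit(X, y)
print("f(x) for 3 points:    ", hard.decision_function(X[:3]))
```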
Similar Resources
Multicategory large-margin unified machines
Hard and soft classifiers are two important groups of techniques for classification problems. Logistic regression and Support Vector Machines are typical examples of soft and hard classifiers, respectively. The essential difference between these two groups is whether one needs to estimate the class conditional probability for the classification task or not. In particular, soft classifiers predic...
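In symbols (standard notation assumed here, not quoted from the paper), the difference for a K-class problem can be written as follows: a soft classifier first estimates the class conditional probabilities and classifies by their maximizer, whereas a hard classifier estimates discriminant functions and classifies directly.

```latex
% Soft rule: estimate probabilities, then take the most probable class.
\[
  \hat{p}_k(x) \approx P(Y = k \mid X = x), \qquad
  \hat{y}_{\mathrm{soft}}(x) = \arg\max_{k = 1,\dots,K} \hat{p}_k(x).
\]
% Hard rule: estimate discriminant functions f_k and classify directly,
% without producing probability estimates.
\[
  \hat{y}_{\mathrm{hard}}(x) = \arg\max_{k = 1,\dots,K} \hat{f}_k(x).
\]
```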
Large Margin Boltzmann Machines
Boltzmann Machines are a powerful class of undirected graphical models. Originally proposed as artificial neural networks, they can be regarded as a type of Markov Random Field in which the connection weights between nodes are symmetric and learned from data. They are also closely related to recent models such as Markov logic networks and Conditional Random Fields. A major challenge for Boltzmann...
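For concreteness, the sketch below writes out the standard energy function whose symmetric weights W the abstract refers to, in the restricted (bipartite) form; the shapes and random values are assumptions for illustration and this is not the paper's large-margin training procedure.

```python
# Minimal sketch: energy of a restricted Boltzmann machine with symmetric
# weights W between visible units v and hidden units h. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 4

W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # pairwise connection weights
b = np.zeros(n_visible)                                # visible biases
c = np.zeros(n_hidden)                                 # hidden biases

def energy(v, h):
    """E(v, h) = -b.v - c.h - v^T W h for binary configurations v, h."""
    return -(b @ v) - (c @ h) - (v @ W @ h)

v = rng.integers(0, 2, size=n_visible)  # a random binary visible configuration
h = rng.integers(0, 2, size=n_hidden)   # a random binary hidden configuration
print("E(v, h) =", energy(v, h))
```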
Large Margin Boltzmann Machines and Large Margin Sigmoid Belief Networks
Current statistical models for structured prediction make simplifying assumptions about the underlying output graph structure, such as assuming a low-order Markov chain, because exact inference becomes intractable as the tree-width of the underlying graph increases. Approximate inference algorithms, on the other hand, force one to trade off representational power with computational efficiency. ...
Soft-Margin Softmax for Deep Classification
In deep classification, the softmax loss (Softmax) is arguably one of the most commonly used components to train deep convolutional neural networks (CNNs). However, this widely used loss is limited in that it does not explicitly encourage discriminative features. Recently, the large-margin softmax loss (L-Softmax [14]) was proposed to explicitly enhance feature discrimination, with hard mar...
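To make the loss being discussed concrete, the sketch below computes the standard softmax cross-entropy for a single sample and optionally subtracts a margin from the true-class logit. The margin form is an illustrative assumption of the general "margin on the target score" idea, not claimed to be the exact L-Softmax or soft-margin formulation of the cited papers.

```python
# Minimal sketch: softmax cross-entropy for one sample, with an optional
# margin subtracted from the target-class logit. margin=0 recovers the plain
# Softmax loss named in the abstract; the margin variant is illustrative only.
import numpy as np

def softmax_cross_entropy(logits, target, margin=0.0):
    """logits: shape (K,) class scores; target: index of the true class."""
    z = logits.astype(float).copy()
    z[target] -= margin                      # penalize the true-class score
    z -= z.max()                             # shift for numerical stability
    log_probs = z - np.log(np.exp(z).sum())  # log-softmax
    return -log_probs[target]

logits = np.array([2.0, 1.0, 0.1])
print(softmax_cross_entropy(logits, target=0))              # ~0.417 (plain softmax)
print(softmax_cross_entropy(logits, target=0, margin=0.5))  # ~0.617 (margin raises the loss)
```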
Journal
Journal Title: Journal of the American Statistical Association
Year: 2011
ISSN: 0162-1459, 1537-274X
DOI: 10.1198/jasa.2011.tm10319