Search results for: separation hyperplanes

Number of results: 124957

Journal: Discrete Optimization, 2017
Pietro Belotti, Julio C. Góez, Imre Pólik, Ted K. Ralphs, Tamás Terlaky

We study the convex hull of the intersection of a disjunctive set defined by parallel hyperplanes and the feasible set of a mixed integer second order cone optimization (MISOCO) problem. We extend our prior work on disjunctive conic cuts (DCCs), which has thus far been restricted to the case in which the intersection of the hyperplanes and the feasible set is bounded. Using a similar technique,...
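For orientation, the kind of two-sided (parallel-hyperplane) disjunction studied in this setting can be written, in generic notation not taken from the paper, as

\[
\big(\{x : a^{\top}x \le \alpha\} \,\cup\, \{x : a^{\top}x \ge \beta\}\big) \;\cap\; \{x : Ax = b,\ x \in \mathcal{K}\}, \qquad \alpha < \beta,
\]

where \(\mathcal{K}\) is a product of second-order cones; a disjunctive conic cut is then a convex conic inequality that is valid for the convex hull of this set.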

2011
Michael F. Barnsley, Andrew Vince

This paper contains four main results associated with an attractor of a projective iterated function system (IFS). The first theorem characterizes when a projective IFS has an attractor which avoids a hyperplane. The second theorem establishes that a projective IFS has at most one attractor. In the third theorem the classical duality between points and hyperplanes in projective space leads to c...
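For background, and in standard notation not quoted from the abstract, a projective iterated function system and its attractor can be written as

\[
\mathcal{F} = \{\mathbb{P}^{n};\ f_{1},\dots,f_{M}\}, \qquad F(S) = \bigcup_{m=1}^{M} f_{m}(S), \qquad A = F(A),
\]

where each \(f_{m}\) is a projective transformation of real projective space \(\mathbb{P}^{n}\) and the attractor \(A\) is a nonempty compact set fixed by \(F\) that attracts all compact sets in some neighbourhood under iteration of \(F\).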

2007
Georg Pölzlbauer, Thomas Lidy, Andreas Rauber

We present a classifier algorithm that approximates the decision surface of labeled data by a patchwork of separating hyperplanes. The hyperplanes are arranged in a way inspired by how Self-Organizing Maps are trained. We take advantage of the fact that the boundaries can often be approximated by linear ones connected by a low-dimensional nonlinear manifold. The resulting classifier allows for ...
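The SOM-inspired arrangement of the hyperplanes is specific to this paper; as a loose, minimal analogue only, the sketch below shows how a nearest-prototype rule already yields a patchwork-of-hyperplanes decision surface, since the boundary between any two prototypes of different classes is the hyperplane bisecting the segment joining them. All function names and parameters are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_prototypes(X, y, per_class=8, seed=0):
    """Fit a few prototypes per class with k-means (illustrative stand-in for
    the SOM-style training described in the abstract)."""
    centers, labels = [], []
    for c in np.unique(y):
        k = min(per_class, int(np.sum(y == c)))
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X[y == c])
        centers.append(km.cluster_centers_)
        labels.append(np.full(k, c))
    return np.vstack(centers), np.concatenate(labels)

def predict_nearest_prototype(X_test, centers, center_labels):
    """Nearest-prototype classification: the implied decision surface is
    piecewise linear, i.e. a patchwork of bisecting hyperplanes."""
    d = np.linalg.norm(X_test[:, None, :] - centers[None, :, :], axis=2)
    return center_labels[np.argmin(d, axis=1)]
```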

2010
DAVID C. WILSON

2006
Georgi I. Nalbantov, Jan C. Bioch, Patrick J. F. Groenen

A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using the semi-consistent hyperplane that is farthest away from it. In this way, a good balance between goodness-of-fit and ...
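A minimal sketch of the decision rule described above, assuming linearly separable training data with labels in {-1, +1}; the way candidate hyperplanes are generated here (linear SVMs on random subsamples, kept only if they make no mistakes on the full training set) is an assumption added for illustration, not the authors' procedure.

```python
import numpy as np
from sklearn.svm import LinearSVC

def fit_support_hyperplanes(X, y, n_candidates=200, seed=0):
    """Collect semi-consistent hyperplanes: linear separators fitted on random
    subsamples that make no classification mistakes on the full training set."""
    rng = np.random.default_rng(seed)
    kept = []
    for _ in range(n_candidates):
        idx = rng.choice(len(X), size=max(2, len(X) // 2), replace=False)
        if len(np.unique(y[idx])) < 2:
            continue                              # need both classes to fit a separator
        clf = LinearSVC(C=1e3).fit(X[idx], y[idx])
        if (clf.predict(X) == y).all():           # zero training errors => semi-consistent
            kept.append((clf.coef_.ravel(), clf.intercept_[0]))
    return kept

def predict_support_hyperplanes(X_test, hyperplanes):
    """Label each test point with the semi-consistent hyperplane farthest from it."""
    preds = []
    for x in X_test:
        dists = [abs(w @ x + b) / np.linalg.norm(w) for w, b in hyperplanes]
        w, b = hyperplanes[int(np.argmax(dists))]
        preds.append(int(np.sign(w @ x + b)))     # labels assumed to be -1/+1
    return np.array(preds)
```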

Journal: Journal of Machine Learning Research, 2016
Nicos G. Pavlidis, David P. Hofmeyr, Sotiris K. Tasoulis

Associating distinct groups of objects (clusters) with contiguous regions of high probability density (high-density clusters) is a central assumption in statistical and machine learning approaches to the classification of unlabelled data. In unsupervised classification this cluster definition underlies a nonparametric approach known as density clustering. In semi-supervised classification, cl...
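The abstract is truncated here; as a rough sketch of the low-density-separation idea this line of work builds on, the helper below fixes a projection direction (the direction search and constraints used in practice are omitted) and places a splitting hyperplane at the minimum of a kernel density estimate along that direction. Names and parameters are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

def min_density_split(X, v, interior=0.1, grid_size=500):
    """Project the data onto unit direction v and place the hyperplane v.x = b
    at the lowest-density point of the projection, away from the tails."""
    v = v / np.linalg.norm(v)
    p = X @ v                                    # one-dimensional projections
    kde = gaussian_kde(p)                        # kernel density estimate of the projection
    lo, hi = np.quantile(p, [interior, 1 - interior])
    grid = np.linspace(lo, hi, grid_size)
    b = grid[np.argmin(kde(grid))]               # minimum-density crossing point
    return v, b                                  # separating hyperplane {x : v.x = b}
```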

1995
Geoffrey E. Hinton, Michael Revow

Conventional binary classification trees such as CART either split the data using axis-aligned hyperplanes or they perform a computationally expensive search in the continuous space of hyperplanes with unrestricted orientations. We show that the limitations of the former can be overcome without resorting to the latter. For every pair of training data-points, there is one hyperplane that is orth...
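A minimal sketch of the candidate-split construction suggested above, assuming the standard reading that, for each pair of training points, the candidate hyperplane is orthogonal to the segment joining them and passes through its midpoint; scoring candidates by Gini impurity is an assumption added for illustration.

```python
import numpy as np
from itertools import combinations

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - float(np.sum(p ** 2))

def best_pairwise_split(X, y):
    """For every pair of training points, consider the hyperplane orthogonal to
    the line joining them that passes through their midpoint, and return the
    candidate with the lowest weighted impurity of the two resulting sides."""
    best = (None, None, np.inf)
    for i, j in combinations(range(len(X)), 2):
        w = X[j] - X[i]                          # normal vector along the joining line
        if not np.any(w):
            continue                             # skip coincident points
        b = w @ (X[i] + X[j]) / 2.0              # hyperplane w.x = b through the midpoint
        side = (X @ w) > b
        if side.all() or not side.any():
            continue                             # split must separate something
        score = (side.sum() * gini(y[side]) + (~side).sum() * gini(y[~side])) / len(y)
        if score < best[2]:
            best = (w, b, score)
    return best                                  # (normal, offset, weighted impurity)
```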

Journal: Discrete Applied Mathematics, 1995

Journal: Journal of Combinatorial Theory, Series A, 1980
