Search results for: Defining hyperplane

Number of results: 74519

2015
Kristoffer Stensbo-Smidt

The perceptron algorithm is an algorithm for supervised linear classification. Restricting ourselves to binary classification, the most basic linear classifier is a hyperplane separating the two classes of our dataset. More formally, assume that we have a normal vector w ∈ R^D defining a hyperplane. Binary classification is typically performed by defining a function f, such that
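To make this setup concrete, here is a minimal perceptron sketch in Python (my own illustration, not code from the cited work); it assumes labels in {-1, +1}, adds a bias term b next to the normal vector w, and uses the classical update rule on misclassified points.

```python
import numpy as np

def perceptron(X, y, epochs=100, lr=1.0):
    """Minimal perceptron sketch: learn a separating hyperplane w·x + b = 0.

    X: (n_samples, D) data matrix, y: labels in {-1, +1}.
    Returns the normal vector w and bias b (assumes the data are separable).
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            # Classifier f(x) = sign(w·x + b); update only on misclassified points
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
                updated = True
        if not updated:  # converged: every point is classified correctly
            break
    return w, b

# Tiny usage example on linearly separable 2-D data
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
print("hyperplane normal:", w, "bias:", b)
```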

2011
Izzet Coskun

A Schubert class in the Grassmannian is rigid if the only proper subvarieties representing that class are Schubert varieties. The hyperplane class σ1 is not rigid because a codimension one Schubert cycle can be deformed to a smooth hyperplane section. In this paper, we show that this phenomenon accounts for the failure of rigidity in Schubert classes. More precisely, we prove that a Schubert cl...

2013
Arnaud Bodin

Our aim is to generalize the result that two generic complex line arrangements are equivalent. In fact, to a line arrangement A we associate its defining polynomial f = ∏_i (a_i x + b_i y + c_i), so that A = (f = 0). We prove that the defining polynomials of two generic line arrangements are, up to a small deformation, topologically equivalent. In higher dimension the related result is that within a ...
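As a small illustration of the defining polynomial construction (my own sketch, using sympy, which the abstract does not mention), the snippet below builds f for three arbitrarily chosen lines:

```python
import sympy as sp

# Coordinates of the plane and coefficients (a_i, b_i, c_i) of three sample lines
x, y = sp.symbols("x y")
lines = [(1, 0, 0),    # x = 0
         (0, 1, -1),   # y - 1 = 0
         (1, -1, 2)]   # x - y + 2 = 0

# Defining polynomial f = prod_i (a_i*x + b_i*y + c_i); the arrangement is A = {f = 0}
factors = [a * x + b * y + c for a, b, c in lines]
f = sp.Mul(*factors)

print(sp.factor(f))   # product form
print(sp.expand(f))   # expanded degree-3 polynomial
```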

Journal: CoRR 2012
Wilhelm Plesken Thomas Bachler

We establish a connection between linear codes and hyperplane arrangements using the Thomas decomposition of polynomial systems and the resulting counting polynomial. This yields both a generalization and a refinement of the weight enumerator of a linear code. In particular, one can deal with infinitely many finite fields simultaneously by defining a weight enumerator for codes over infinite fi...
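For orientation, the classical weight enumerator that this work generalizes can be computed by brute force for a tiny code; the [4, 2] binary code below is an assumption of mine, not an example from the paper:

```python
import itertools
from collections import Counter

# Generator matrix of a small binary [4, 2] linear code (illustrative choice)
G = [[1, 0, 1, 1],
     [0, 1, 0, 1]]
k = 2  # message length

def encode(msg):
    """Encode a length-k message over F_2 with generator matrix G."""
    return tuple(sum(m * g for m, g in zip(msg, col)) % 2
                 for col in zip(*G))

# Classical weight enumerator: count codewords by Hamming weight
weights = Counter(sum(encode(m)) for m in itertools.product((0, 1), repeat=k))
print(dict(weights))   # {0: 1, 2: 1, 3: 2}
```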

2001
Jonathan Wiens Sergey Yuzvinsky

Let V be a linear space of dimension ℓ over a field K. By an arrangement we shall mean a finite collection of affine subspaces of V. If all of the subspaces in an arrangement A have codimension k then we say that A is an (ℓ, k)-arrangement. If k = 1, so that A is a hyperplane arrangement, then we shall say that A is an ℓ-arrangement. Let A be an arrangement and S the coordinate ring for V. For each ...
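A concrete instance of this terminology, written as a small example of my own (assuming, as reconstructed above, that ℓ denotes the dimension of V):

```latex
% Example: arrangements in V = K^3 (so \ell = 3).
% The three coordinate hyperplanes have codimension 1, giving a
% (3,1)-arrangement, i.e. a 3-arrangement:
\mathcal{A}_1 = \{\, x_1 = 0,\; x_2 = 0,\; x_3 = 0 \,\}
% The three coordinate axes have codimension 2, giving a (3,2)-arrangement:
\mathcal{A}_2 = \{\, x_1 = x_2 = 0,\; x_1 = x_3 = 0,\; x_2 = x_3 = 0 \,\}
```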

2004
G. Bande P. Ghiggini D. Kotschick

We prove Gray–Moser stability theorems for complementary pairs of forms of constant class defining symplectic pairs, contact-symplectic pairs and contact pairs. We also consider the case of contact-symplectic and contact-contact structures, in which the constant class condition on a one-form is replaced by the condition that its kernel hyperplane distribution have constant class in the sense of ...

2016
Anton Kolosnitcyn

We consider a new interpretation of the modified simplex imbeddings method. The main construction of this method is a simplex that contains a solution of a convex non-differentiable problem. A cutting plane drawn through the center of the simplex is used to cut off the part of the simplex that does not contain the solution. The most interesting feature of this method is the convergence estimate, which depends only on ...
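The generic cutting-plane step through the center of a simplex can be sketched as follows; this is a simplified illustration under my own assumptions, not the authors' modified simplex imbeddings method itself.

```python
import numpy as np

def center_cut(vertices, subgrad):
    """One generic cutting-plane step through the center of a simplex.

    vertices: (n+1, n) array of simplex vertices.
    subgrad:  callable returning a subgradient of the convex objective at a point.

    Returns the cut center x_c and the normal g of the half-space
    {x : g·(x - x_c) <= 0}, which still contains every minimizer.
    """
    x_c = vertices.mean(axis=0)   # centroid of the simplex
    g = subgrad(x_c)              # any subgradient at the centroid
    # Convexity gives f(x) >= f(x_c) + g·(x - x_c), so points with
    # g·(x - x_c) > 0 cannot improve on x_c and can be discarded.
    return x_c, g

# Usage example: f(x) = |x_1| + |x_2| on a simplex in R^2
subgrad = lambda x: np.sign(x)    # a valid subgradient of the l1-norm
simplex = np.array([[2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
x_c, g = center_cut(simplex, subgrad)
print("cut through", x_c, "with normal", g)
```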

2002
Dianne Cook Doina Caragea Vasant Honavar

In their simplest form, support vector machines (SVMs) define a separating hyperplane between classes, generated from a subset of cases called support vectors. The support vectors “mark” the boundary between the two classes. The result is an interpretable classifier, where the importance of the variables to the classification is identified by the coefficients of the variables defining the hyperplane. ...
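A minimal sketch of reading the hyperplane coefficients and support vectors off a linear SVM, using scikit-learn as an assumed implementation (the abstract does not tie itself to any particular library):

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data in two classes
X = np.array([[2.0, 2.0], [2.5, 1.5], [3.0, 2.5],
              [-2.0, -1.0], [-1.5, -2.0], [-2.5, -2.5]])
y = np.array([1, 1, 1, -1, -1, -1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# Coefficients of the separating hyperplane w·x + b = 0:
# a larger |w_j| means variable j matters more to the classification.
print("w =", clf.coef_[0], "b =", clf.intercept_[0])
print("support vectors:\n", clf.support_vectors_)
```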

Journal: SIAM Journal on Applied Algebra and Geometry 2023

We describe a framework for estimating Hilbert–Samuel multiplicities of pairs of projective varieties from finite point samples rather than from defining equations. The first step involves proving that this multiplicity remains invariant under certain hyperplane sections which reduce to and curve . Next, we establish that it equals the Euler characteristic (and hence cardinality of) the complex link in . Finally, provi...

2009
Marek Grochowski Wlodzislaw Duch

Neural networks and other sophisticated machine learning algorithms frequently miss simple solutions that can be discovered by more constrained learning methods. The transition from a single neuron solving linearly separable problems, to a multi-threshold neuron solving k-separable problems, to neurons implementing prototypes solving q-separable problems is investigated. Using Learning Vector Quant...
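To make k-separability concrete (my own toy illustration, not an example from the paper): the 2-bit XOR problem is not separable by any single hyperplane, but its projection onto one direction splits into three class-homogeneous intervals, i.e. it is 3-separable.

```python
import numpy as np

# 2-bit XOR (parity): not linearly separable by any single hyperplane
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# Project onto the diagonal direction w = (1, 1)
w = np.array([1.0, 1.0])
proj = X @ w
print(proj)   # [0. 1. 1. 2.]

# Sorting the projections groups the classes into 3 consecutive intervals:
# {0} -> class 0, {1, 1} -> class 1, {2} -> class 0, so XOR is 3-separable;
# two thresholds on this single projection separate the classes.
order = np.argsort(proj)
print(list(zip(proj[order], y[order])))
```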

Chart: number of search results per year