Search results for: normalized steepest descent
Number of results: 22332
Let $X$ be a reflexive Banach space, $T:X\to X$ a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ a $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive mapping with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences t...
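The snippet does not spell out the iteration; purely as an orientation point (an assumption, not taken from the abstract), a Yamada-type hybrid steepest-descent scheme with error terms $e_n$ reads

$$x_{n+1} = T x_n - \alpha_{n+1}\,\mu\,F(T x_n) + e_n, \qquad n \ge 0,$$

with a fixed $\mu>0$ and step sizes $\alpha_n \to 0$, $\sum_n \alpha_n = \infty$ (written $\alpha_n$ here to avoid a clash with the constant $\lambda$ above); the modified methods referred to in the abstract build on this kind of template.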
The move from hand-designed features to learned features in machine learning has been wildly successful. In spite of this, optimization algorithms are still designed by hand. In this paper we show how the design of an optimization algorithm can be cast as a learning problem, allowing the algorithm to learn to exploit structure in the problems of interest in an automatic way. Our learned algorit...
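As a toy, hedged illustration of casting optimizer design as a learning problem (this is not the paper's learned optimizer; the quadratic tasks, the finite-difference meta-gradient, and all function names below are assumptions made here), one can treat the step size of gradient descent itself as the learnable quantity and tune it across sampled problems:

```python
import numpy as np

def inner_loss(w, A, b):
    # Quadratic task: 0.5 * ||A w - b||^2.
    r = A @ w - b
    return 0.5 * float(r @ r)

def unrolled_loss(lr, A, b, steps=5):
    # Final loss after `steps` plain gradient-descent updates with step size `lr`.
    w = np.zeros(A.shape[1])
    for _ in range(steps):
        w = w - lr * (A.T @ (A @ w - b))
    return inner_loss(w, A, b)

def meta_learn_step_size(n_tasks=200, meta_lr=1e-5, eps=1e-4, seed=0):
    # Treat the step size itself as the "learned optimizer" parameter and
    # update it with a finite-difference meta-gradient across sampled tasks.
    rng = np.random.default_rng(seed)
    lr = 0.01
    for _ in range(n_tasks):
        A = rng.normal(size=(8, 4))
        b = rng.normal(size=8)
        g = (unrolled_loss(lr + eps, A, b) - unrolled_loss(lr - eps, A, b)) / (2 * eps)
        lr = float(np.clip(lr - meta_lr * g, 1e-4, 0.05))  # keep the step size in a stable range
    return lr

if __name__ == "__main__":
    print("meta-learned step size:", meta_learn_step_size())
```

A full learned optimizer replaces the single scalar with a parameterized update rule, but the meta-objective, the loss reached after a few unrolled steps, has the same shape as in this sketch.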
In this paper, we propose to train the RBF neural network using a global descent method. Essentially, the method imposes a monotonic transformation on the training objective to improve numerical sensitivity without altering the relative orders of all local extrema. A gradient descent search which inherits the global descent property is derived to locate the global solution of an error objective...
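The snippet does not name the transformation used, so the sketch below substitutes a simple logarithm as the monotonic transform; the toy objective, step size, and function names are illustrative assumptions, not the paper's method:

```python
import numpy as np

def error(w):
    # Toy non-convex error surface (kept strictly positive) with several minima.
    return float(np.sum(w**4 - 3.0 * w**2 + 0.5 * w) + 10.0)

def grad_error(w):
    return 4.0 * w**3 - 6.0 * w + 0.5

def transformed_descent(w0, lr=1e-2, steps=2000):
    # Gradient descent on g(E(w)) with the monotonic transform g(E) = log E.
    # By the chain rule, d/dw g(E(w)) = E'(w) / E(w): stationary points and
    # their relative order are unchanged, only the effective step is rescaled.
    w = np.array(w0, dtype=float)
    for _ in range(steps):
        w = w - lr * grad_error(w) / error(w)
    return w

if __name__ == "__main__":
    w_star = transformed_descent([2.0, -2.0])
    print("minimizer:", np.round(w_star, 3), "error:", round(error(w_star), 3))
```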
We introduce the notion of the descent set polynomial as an alternative way of encoding the sizes of descent classes of permutations. Descent set polynomials exhibit interesting factorization patterns. We explore the question of when particular cyclotomic factors divide these polynomials. As an instance we deduce that the proportion of odd entries in the descent set statistics in the symmetric ...
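For concreteness, the descent set statistics tallied in such results are the sizes of the descent classes of the symmetric group; a brute-force sketch (my own illustration, feasible only for small $n$):

```python
from itertools import permutations
from collections import Counter

def descent_set(pi):
    # Descent set of a permutation: positions i with pi(i) > pi(i+1), 1-indexed.
    return frozenset(i + 1 for i in range(len(pi) - 1) if pi[i] > pi[i + 1])

def descent_class_sizes(n):
    # For each S subset of {1, ..., n-1}, count the permutations of {1, ..., n}
    # whose descent set is exactly S (brute force, so only for small n).
    return Counter(descent_set(p) for p in permutations(range(1, n + 1)))

if __name__ == "__main__":
    sizes = descent_class_sizes(4)
    odd = sum(1 for v in sizes.values() if v % 2 == 1)
    print(f"n = 4: {odd} of {len(sizes)} descent-class sizes are odd")
```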
In the category Top of topological spaces and continuous functions, we prove that descent morphisms with respect to the class IE of continuous bijections are exactly the descent morphisms, providing a new characterization of the latter in terms of subfibrations IE(X) of the basic fibration given by Top/X which are essentially complete lattices. Also, effective descent morphisms are characterized in ter...
In this paper we investigate effective descent morphisms in categories of reflexive and transitive lax algebras. We show in particular that open and proper maps are effective descent, a result that extends the corresponding results for the category of topological spaces and continuous maps. Introduction. A morphism p : E → B in a category C with pullbacks is called effective descent if it allows a...
Example 1. Imagine that we are solving a non-convex optimization problem on some (multivariate) function f using gradient descent. Recall that gradient descent converges to local minima. Because non-convex functions may have multiple minima, we cannot guarantee that gradient descent will converge to the global minimum. To resolve this issue, we will use random restarts, the process of starting ...
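A minimal sketch of the random-restart strategy described here, on a hand-picked double-well objective (the objective, step size, and restart count are illustrative choices, not part of the example above):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.02, steps=500):
    # Plain gradient descent from one starting point; it converges to a
    # stationary point, usually a local minimum, of a smooth objective.
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def random_restarts(f, grad, n_restarts=20, dim=2, scale=2.0, seed=0):
    # Run gradient descent from several random initial points and keep the
    # best local minimum found; this raises the chance of hitting the global
    # minimum but still gives no guarantee for a general non-convex f.
    rng = np.random.default_rng(seed)
    best_x, best_val = None, np.inf
    for _ in range(n_restarts):
        x = gradient_descent(grad, rng.uniform(-scale, scale, size=dim))
        if f(x) < best_val:
            best_x, best_val = x, f(x)
    return best_x, best_val

if __name__ == "__main__":
    # Double-well objective: two minima per coordinate, one lower than the other.
    f = lambda x: float(np.sum((x**2 - 1.0)**2 + 0.3 * x))
    grad = lambda x: 4.0 * x * (x**2 - 1.0) + 0.3
    print(random_restarts(f, grad))
```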
The coordinate descent (CD) method is a classical optimization algorithm that has seen a revival of interest because of its competitive performance in machine learning applications. A number of recent papers provided convergence rate estimates for its deterministic (cyclic) and randomized variants, which differ in the selection of update coordinates. These estimates suggest randomized coordinate de...
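A minimal sketch of a randomized coordinate descent variant of the kind these estimates cover, applied to a least-squares objective (the objective and all parameter choices are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def randomized_coordinate_descent(A, b, steps=5000, seed=0):
    # Minimize f(x) = 0.5 * ||A x - b||^2 by updating one randomly chosen
    # coordinate at a time with its exact coordinate-wise minimizer.
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = np.sum(A**2, axis=0)       # per-coordinate curvatures ||a_i||^2
    r = A @ x - b                        # residual, maintained incrementally
    for _ in range(steps):
        i = rng.integers(n)              # uniformly random update coordinate
        g_i = A[:, i] @ r                # partial derivative w.r.t. x_i
        delta = -g_i / col_sq[i]         # exact minimization along e_i
        x[i] += delta
        r += delta * A[:, i]             # keep the residual in sync
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A, b = rng.normal(size=(50, 10)), rng.normal(size=50)
    x = randomized_coordinate_descent(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
```

A cyclic variant differs only in replacing the random index draw with a fixed sweep over the coordinates.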
The glmnet package by [1] is an extremely fast implementation of the standard coordinate descent algorithm for solving l1 penalized learning problems. In this paper, we consider a family of coordinate majorization descent algorithms for solving the l1 penalized learning problems by replacing each coordinate descent step with a coordinate-wise majorization descent operation. Numerical experiment...
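A hedged sketch of a coordinate-wise majorization descent pass for an l1-penalized least-squares problem (the lasso); the curvature bounds d_j playing the role of the majorizer, the objective scaling, and the function names are assumptions made here, not glmnet's or the paper's implementation:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cmd_lasso(X, y, lam, sweeps=200):
    # Coordinate-wise majorization descent for 0.5/n * ||y - X b||^2 + lam * ||b||_1:
    # for each coordinate, majorize the smooth part by a quadratic with
    # curvature d_j >= x_j' x_j / n, then minimize the surrogate in closed
    # form via soft-thresholding. With d_j = x_j' x_j / n this reduces to an
    # ordinary coordinate descent step; a looser d_j gives a genuine MM step.
    n, p = X.shape
    d = np.maximum(np.sum(X**2, axis=0) / n, 1e-12)   # majorizer curvatures
    b = np.zeros(p)
    r = y - X @ b                                      # residual
    for _ in range(sweeps):
        for j in range(p):
            g_j = -(X[:, j] @ r) / n                   # partial derivative of the smooth part
            b_new = soft_threshold(b[j] - g_j / d[j], lam / d[j])
            r -= (b_new - b[j]) * X[:, j]
            b[j] = b_new
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))
    beta = np.zeros(20); beta[:3] = [2.0, -1.5, 1.0]
    y = X @ beta + 0.1 * rng.normal(size=100)
    print(np.round(cmd_lasso(X, y, lam=0.1), 2))
```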