Search results for: superlinear
Number of results: 1776
We give a theoretical explanation for superlinear convergence behavior observed while solving large symmetric systems of equations using the conjugate gradient method or other Krylov subspace methods. We present a new bound on the relative error after n iterations. This bound is valid in an asymptotic sense when the size N of the system grows together with the number of iterations. The bound de...
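As an illustration of the behavior this abstract describes (a minimal sketch, not the paper's analysis; the matrix size, the spectrum with isolated outliers, and the tolerance below are arbitrary choices), a plain conjugate gradient iteration on a symmetric positive definite system typically shows the residual dropping faster than any single linear rate once the extreme eigenvalues have been resolved:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-12, max_iter=500):
    """Plain CG for symmetric positive definite A; returns the iterate
    and the history of relative residual norms."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    history = [np.sqrt(rs_old) / np.linalg.norm(b)]
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        history.append(np.sqrt(rs_new) / np.linalg.norm(b))
        if history[-1] < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x, history

# SPD test matrix with a few isolated large eigenvalues; sizes and
# spectrum are illustration choices only.
rng = np.random.default_rng(0)
n = 200
eigs = np.concatenate([np.linspace(1.0, 2.0, n - 5), [50, 60, 70, 80, 90]])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(n)

x, hist = conjugate_gradient(A, b)
# Once the outlying eigenvalues are "found", the residual typically decays
# faster than the rate predicted by the condition number alone.
print([f"{r:.1e}" for r in hist[:20]])
```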
We consider superlinearly convergent analogues of Newton methods for nondifferentiable operator equations in function spaces. The superlinear convergence analysis of semismooth methods for nondifferentiable equations described by a locally Lipschitzian operator in Rⁿ is based on Rademacher's theorem, which does not hold in function spaces. We introduce a concept of slant differentiability and us...
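The finite-dimensional semismooth Newton idea that this line of work generalizes can be sketched as follows. This is a toy example under our own assumptions: the piecewise-linear operator F(x) = Ax + max(x, 0) − b and the particular selection from the generalized Jacobian are illustrative, not taken from the paper.

```python
import numpy as np

def semismooth_newton(A, b, x0, tol=1e-12, max_iter=50):
    """Semismooth Newton for F(x) = A x + max(x, 0) - b = 0.
    At each iterate, one element of the generalized Jacobian,
    A + diag(1_{x_i > 0}), is used in the Newton step."""
    x = x0.copy()
    for k in range(max_iter):
        F = A @ x + np.maximum(x, 0.0) - b
        if np.linalg.norm(F) < tol:
            break
        # Derivative of max(x, 0) is 1 where x > 0 and 0 where x < 0;
        # at x = 0 any value in [0, 1] is admissible, and we pick 0.
        D = np.diag((x > 0).astype(float))
        x = x - np.linalg.solve(A + D, F)
    return x, k

rng = np.random.default_rng(1)
n = 10
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
x, iters = semismooth_newton(A, b, np.zeros(n))
print(iters, np.linalg.norm(A @ x + np.maximum(x, 0) - b))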
We study the nonlinear elliptic boundary value problem Au = f(x, u) in Ω, Bu = g(x, u) on ∂Ω, where A is an operator of p-Laplacian type, Ω is an unbounded domain in R with non-compact boundary, and f and g are subcritical nonlinearities. We show existence of a nontrivial nonnegative weak solution when both f and g are superlinear. Also we show existence of at least two nonnegative solutions ...
In this note we show how the implicit filtering algorithm can be coupled with the BFGS quasi-Newton update to obtain a superlinearly convergent iteration if the noise in the objective function decays sufficiently rapidly as the optimal point is approached. We show how known theory for the noise-free case can be extended and thereby provide a partial explanation for the good performance of quasi...
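A hedged sketch of the combination this abstract studies: finite-difference gradients of a noisy objective (in the spirit of implicit filtering) driving a BFGS update, with the difference stencil shrunk as the iterates settle. The model problem, noise law, stencil schedule, and step-size rule below are our own illustrative choices, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
Q = np.diag([1.0, 4.0, 9.0])

def noisy_f(x):
    # Quadratic model problem; the noise decays like ||x||^2 near the
    # minimizer x* = 0, mimicking the decaying-noise assumption.
    return 0.5 * x @ Q @ x + 1e-4 * (x @ x) * rng.uniform(-1, 1)

def fd_grad(f, x, h):
    # Central-difference gradient with stencil size h.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def bfgs_implicit_filtering(f, x0, iters=40):
    x, H = x0.copy(), np.eye(x0.size)        # H approximates the inverse Hessian
    h = 0.1                                   # finite-difference stencil, shrunk over time
    g = fd_grad(f, x, h)
    for _ in range(iters):
        p = -H @ g                            # quasi-Newton search direction
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx and t > 1e-8: # crude backtracking on noisy values
            t *= 0.5
        s = t * p
        x_new = x + s
        h = max(0.5 * h, 1e-8)                # shrink the stencil as iterates settle
        g_new = fd_grad(f, x_new, h)
        y = g_new - g
        if s @ y > 1e-12:                     # standard BFGS update of the inverse Hessian
            rho = 1.0 / (s @ y)
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

print(bfgs_implicit_filtering(noisy_f, np.array([3.0, -2.0, 1.0])))
```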
We obtain new oscillation and gradient bounds for the viscosity solutions of fully nonlinear degenerate elliptic equations where the Hamiltonian is a sum of a sublinear and a superlinear part in the sense of Barles and Souganidis (2001). We use these bounds to study the asymptotic behavior of weakly coupled systems of fully nonlinear parabolic equations. Our results apply to some “asymmetric sy...
Convergence of the conjugate gradient method proceeds in essentially three phases, with respectively a sublinear, a linear, and a superlinear rate. The paper examines when the superlinear phase is reached. To do this, two methods are used. One is based on the K-condition number, thereby separating the eigenvalues into three sets: small and large outliers and intermediate eigenvalues...
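For readers unfamiliar with the K-condition number mentioned above, the quantity usually meant by that name (stated here from general background, not quoted from this paper) is the ratio of the arithmetic to the geometric mean of the eigenvalues of a symmetric positive definite matrix, raised to the power n:

```latex
% Standard K-condition number of an SPD matrix A \in \mathbb{R}^{n\times n}
% with eigenvalues \lambda_1,\dots,\lambda_n (background definition):
K(A) \;=\; \frac{\bigl(\tfrac{1}{n}\operatorname{tr}A\bigr)^{n}}{\det A}
      \;=\; \left(\frac{\tfrac{1}{n}\sum_{i=1}^{n}\lambda_i}
                   {\bigl(\prod_{i=1}^{n}\lambda_i\bigr)^{1/n}}\right)^{\!n}
      \;\ge\; 1 .
```

Equality holds exactly when all eigenvalues coincide; a few small or large outliers inflate the arithmetic-to-geometric-mean ratio, which is what makes this quantity a natural tool for separating outlying eigenvalues from the intermediate cluster.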
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
Program speedup is an important measure of the performance of an algorithm on a parallel machine. Of particular importance is the near linear or superlinear speedup exhibited by the most performance-efficient algorithms for a given system. We describe network and program models for heterogeneous networks, define notions of speedup and superlinear speedup, and observe that speedup consists of both ...
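The speedup notions referenced here, in their conventional textbook form (the paper's heterogeneous-network definitions may differ), reduce to S(p) = T(1)/T(p), with "superlinear" meaning S(p) > p; a minimal sketch:

```python
def speedup(t_serial: float, t_parallel: float) -> float:
    """Conventional speedup S(p) = T(1) / T(p); these are the textbook
    definitions, not necessarily the heterogeneous-network variants
    defined in the paper."""
    return t_serial / t_parallel

def is_superlinear(t_serial: float, t_parallel: float, p: int) -> bool:
    """Speedup exceeding the processor count p, e.g. when per-node caches
    hold a larger fraction of the working set."""
    return speedup(t_serial, t_parallel) > p

# Hypothetical timings (seconds), for illustration only.
t1, t8 = 64.0, 7.2
print(speedup(t1, t8), is_superlinear(t1, t8, p=8))   # 8.88..., True
```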
(f2) lim_{u→∞} f(u)/u = f+ > 0; lim_{u→−∞} f(u)/u = f− > 0. For definiteness, we assume f+ ≥ f−, and when f+ = f−, we use f± to represent it. We will consider f being either superlinear or sublinear: f is said to be superlinear if f(u)/u is decreasing in (0, ∞) and increasing in (−∞, 0), and f is said to be sublinear if f(u)/u is increasing in (0, ∞) and decreasing in (−∞, 0). The semilinear equ...