Search results for: ε weakly chebyshev subspace
Number of results: 79738
A Chebyshev polynomial of a square matrix A is a monic polynomial p of specified degree that minimizes ‖p(A)‖₂. The study of such polynomials is motivated by the analysis of Krylov subspace iterations in numerical linear algebra. An algorithm is presented for computing these polynomials based on reduction to a semidefinite program, which is then solved by a primal-dual interior point method. Exam...
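For illustration, here is a minimal sketch of this reduction using CVXPY; this reformulation and all names in it are my own assumptions, not the paper's algorithm. CVXPY compiles the `sigma_max` objective to a semidefinite program internally.

```python
# A hedged sketch: minimize the spectral norm of a monic matrix polynomial.
import numpy as np
import cvxpy as cp

def matrix_chebyshev(A, m):
    """Monic degree-m polynomial p minimizing ||p(A)||_2 (function name is mine)."""
    powers = [np.linalg.matrix_power(A, k) for k in range(m)]   # A^0 .. A^{m-1}
    c = cp.Variable(m)                                          # free coefficients
    pA = sum(c[k] * powers[k] for k in range(m)) + np.linalg.matrix_power(A, m)
    prob = cp.Problem(cp.Minimize(cp.sigma_max(pA)))            # SDP under the hood
    prob.solve()
    return np.append(c.value, 1.0), prob.value                  # coeffs low -> high

A = np.array([[2.0, 1.0], [0.0, 1.0]])      # small non-normal test matrix
coeffs, opt = matrix_chebyshev(A, 2)
```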
We consider the weakly dissipative and weakly dispersive Burgers-Hopf-Korteweg-de Vries equation with diffusion coefficient ε and dispersion rate δ in the regime δ/ε → 0. We study the travelling wave connecting u(−∞) = 1 to u(+∞) = 0 and show that it converges strongly to the entropic shock profile as ε, δ → 0. Keywords: travelling waves, moderate dispersion, Korteweg-de Vries equation, ...
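A hedged reconstruction of this setting in LaTeX; the sign conventions and the shock speed are assumed from the standard KdV-Burgers form, not quoted from the paper:

```latex
% Sign conventions assumed; the paper may differ.
\[
  \partial_t u + u\,\partial_x u
    = \varepsilon\,\partial_{xx} u - \delta\,\partial_{xxx} u,
  \qquad \varepsilon,\ \delta > 0 .
\]
% Travelling wave ansatz u(x,t) = U(\xi), \xi = x - st,
% with U(-\infty) = 1 and U(+\infty) = 0; the Rankine--Hugoniot
% condition for the limiting shock gives s = 1/2, and one integration
% of the profile equation (constants fixed at +\infty) yields
\[
  -\,s\,U + \tfrac12 U^2 = \varepsilon\,U' - \delta\,U'' .
\]
```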
This paper deals with approximate solutions of general (that is, without any convexity assumption) multi-objective optimization problems (MOPs). In this text, we review some standard scalarization techniques and investigate the relationships between ε-(weakly, properly) efficient points of an MOP and ε-optimal solutions of the related scalarized problem. For this purpose, the re...
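As a hedged illustration of the kind of relationship surveyed, with the standard definitions assumed rather than quoted from this paper:

```latex
% Standard definitions assumed, not quoted from the paper.
% For the MOP  \min_{x \in X} \bigl(f_1(x), \dots, f_m(x)\bigr):
\[
  \bar{x}\ \text{is}\ \varepsilon\text{-weakly efficient} \iff
  \nexists\, x \in X:\ f_i(x) < f_i(\bar{x}) - \varepsilon_i
  \quad (i = 1, \dots, m).
\]
% Weighted-sum scalarization with \lambda \ge 0, \textstyle\sum_i \lambda_i = 1:
\[
  \min_{x \in X} \sum_{i=1}^{m} \lambda_i f_i(x).
\]
% One typical result of this kind: an approximately optimal solution of the
% weighted-sum problem (with tolerance tied to \lambda and \varepsilon)
% is \varepsilon-weakly efficient for the MOP.
```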
A classical theorem of Kuratowski says that every Baire one function on a Gδ subspace of a Polish (= separable completely metrizable) space X can be extended to a Baire one function on X. Kechris and Louveau introduced a finer gradation of Baire one functions into small Baire classes. A Baire one function f is assigned to a class in this hierarchy depending on its oscillation index β(f). We p...
In this paper, by introducing a class of relaxed filtered Krylov subspaces, we propose a subspace method for computing the eigenvalues with the largest real parts, and the corresponding eigenvectors, of non-symmetric matrices. As by-products, generalizations of the Chebyshev–Davidson method for solving eigenvalue problems are also presented. We give a convergence analysis of the complex Chebyshev polynomial, which plays a significant rol...
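A minimal sketch of Chebyshev-filtered subspace iteration, assuming a real symmetric matrix for simplicity; the complex Chebyshev polynomial the paper uses for non-symmetric spectra is not reproduced here, and all names are illustrative.

```python
# Chebyshev filtering damps the spectrum inside [a, b] so that Rayleigh-Ritz
# on the filtered block favors the wanted (here: largest) eigenvalues.
import numpy as np

def cheb_filter(A, V, deg, a, b):
    """Apply the degree-`deg` Chebyshev polynomial mapped from [a, b] to the block V."""
    e, c = (b - a) / 2.0, (b + a) / 2.0           # half-width and center
    Y = (A @ V - c * V) / e                        # T_1((A - cI)/e) V
    for _ in range(2, deg + 1):
        Y, V = 2.0 * (A @ Y - c * Y) / e - V, Y    # three-term recurrence
    return Y

rng = np.random.default_rng(0)
n, k = 200, 5
A = np.diag(np.linspace(0.0, 10.0, n))             # wanted: the largest eigenvalues
V = np.linalg.qr(rng.standard_normal((n, k)))[0]
for _ in range(20):                                # filter, then re-orthonormalize
    V = np.linalg.qr(cheb_filter(A, V, deg=8, a=0.0, b=8.0))[0]
print(np.sort(np.linalg.eigvalsh(V.T @ A @ V)))    # Ritz values near 9.8 .. 10.0
```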
It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace, that is, it will implicitly compute the optimal polynomial. Henc...
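For comparison, a minimal sketch of the classical Chebyshev semi-iterative method for an SPD system, assuming bounds lmin ≤ λ(A) ≤ lmax are known in advance; this is precisely the spectral information CG does not need. The coefficient recurrences follow the standard "Templates" formulation, and all names are illustrative.

```python
import numpy as np

def chebyshev_semi_iteration(A, b, lmin, lmax, iters=60):
    """Solve A x = b with the Chebyshev iteration, given spectral bounds."""
    d, c = (lmax + lmin) / 2.0, (lmax - lmin) / 2.0   # center / half-width
    x = np.zeros_like(b)
    r = b - A @ x
    p, alpha = np.zeros_like(b), 0.0
    for k in range(iters):
        if k == 0:
            p, alpha = r.copy(), 1.0 / d
        else:
            beta = 0.5 * (c * alpha) ** 2 if k == 1 else (c * alpha / 2.0) ** 2
            alpha = 1.0 / (d - beta / alpha)
            p = r + beta * p
        x += alpha * p
        r -= alpha * (A @ p)
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)                 # SPD test matrix
b = rng.standard_normal(50)
lo, hi = np.linalg.eigvalsh(A)[[0, -1]]         # exact bounds, for the demo
x = chebyshev_semi_iteration(A, b, lo, hi)
print(np.linalg.norm(A @ x - b))                # residual shrinks with iters
```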
In this chapter, we will prove that given a set P of n points in ℝᵈ, one can reduce the dimension of the points to k = O(ε⁻² log n) while preserving all pairwise distances up to a factor of 1 ± ε. Surprisingly, this reduction is done by randomly picking a subspace of k dimensions and projecting the points onto this random subspace. One way of thinking about this result is that we are "compressing" the input of size nd (i.e.,...
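A minimal sketch of this random-projection step; the constant in k is one common choice, not necessarily the chapter's, and the names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, eps = 500, 1000, 0.25
k = int(np.ceil(4.0 * np.log(n) / eps**2))       # k = O(eps^-2 log n)

P = rng.standard_normal((n, d))                  # n points in R^d
G = rng.standard_normal((d, k)) / np.sqrt(k)     # scaled Gaussian projection
Q = P @ G                                        # projected points in R^k

i, j = 0, 1
ratio = np.linalg.norm(Q[i] - Q[j]) / np.linalg.norm(P[i] - P[j])
print(ratio)                                     # typically within [1-eps, 1+eps]
```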
Let 1 < p ≠ 2 < ∞, ε > 0, and let T : ℓp(ℓ2) → Lp[0, 1] be an into isomorphism. Then there is a subspace Y ⊂ ℓp(ℓ2), (1 + ε)-isomorphic to ℓp(ℓ2), such that T|Y is a (1 + ε)-isomorphism and T(Y) is Kp-complemented in Lp[0, 1], with Kp depending only on p. Moreover, Kp ≤ (1 + ε)γp if p > 2 and Kp ≤ (1 + ε)γ_{p/(p−1)} if 1 < p < 2, where γr is the Lr norm of a standard Gaussian variable.
In this work, we develop fast algorithms for computations involving finite expansions in Gegenbauer polynomials. We describe a method to convert a linear combination of Gegenbauer polynomials up to degree n into a representation in a different family of Gegenbauer polynomials with generally O(n log(1/ε)) arithmetic operations where ε is a prescribed accuracy. Special cases where source or targe...
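For contrast, a naive change-of-basis baseline through the monomial basis; this is my own sketch, usable only for small n because of conditioning, and the paper's O(n log(1/ε)) fast conversion is not reproduced here. scipy.special.gegenbauer returns each C_k^(α) as a poly1d, whose coefficients give the change-of-basis matrices directly.

```python
import numpy as np
from scipy.special import gegenbauer

def basis_matrix(n, alpha):
    """Column k: monomial coefficients (low -> high) of C_k^{(alpha)}."""
    M = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        M[: k + 1, k] = gegenbauer(k, alpha).coeffs[::-1]   # poly1d stores high -> low
    return M

def convert(coeffs, alpha_src, alpha_dst):
    """Re-expand sum_k coeffs[k] C_k^{(alpha_src)} in the alpha_dst family."""
    n = len(coeffs) - 1
    mono = basis_matrix(n, alpha_src) @ coeffs              # to monomials
    return np.linalg.solve(basis_matrix(n, alpha_dst), mono)

c = convert(np.array([1.0, 0.5, -0.25, 2.0]), alpha_src=0.5, alpha_dst=1.5)
```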
[Chart: number of search results per year]