Evaluation of Functions, Gradients, and Jacobians
Author
Abstract
A function f : D ⊂ Rd → R is ordinarily evaluated by constructing a sequence {x1, x2, . . . , xn}, in which x1, . . . , xd are the given independent variables and each subsequent xk is obtained by evaluating an expression of the form xk = fk(x1, . . . , xk−1), with xn = f(x1, . . . , xd). When such an algorithm is realized as a computer program, the expressions fk can be limited to assignments, arithmetic operations, and functions already programmed or built into the hardware, such as those belonging to a standard library of mathematical functions. The number of steps n of the algorithm may depend on the given values of x1, . . . , xd, but this dependence will be suppressed for simplicity of notation.

Instead of considering the evaluation process as a mapping from a point in Rd to a point in R, it will be viewed as a transformation in Rn of the form x = F(x), where x = (x1, x2, . . . , xn), that is, as a fixed point problem in Rn or the equivalent equation G(x) ≡ x − F(x) = 0. These formulations make it possible to improve the accuracy of the function evaluation and to validate the computed result. In particular, if the expressions fk are differentiable, then the Jacobian J = F′(x) = (∂xi/∂xj) exists and is strictly lower triangular. One has G′(x) = I − J and G′(x)−1 = (I − J)−1 = I + J + · · · + Jm for some m ≤ n. Thus, Newton's method can be applied to G(x) = 0 to improve computed values and to validate them by interval inclusion.

Another use of the matrix J is the computation of the gradient ∇f, a process commonly referred to as automatic differentiation. The forward mode consists of computing the right eigenvectors of J by the power method, while the reverse mode yields a left eigenvector, also by the power method. In either case, accurate matrix-vector multiplication with the aid of a long accumulator and interval validation of results are applicable.

The above results apply immediately to the evaluation of functions f : D ⊂ Rp → Rq and yield the corresponding Jacobian of the transformation. In the case p = q, the function being computed may be an inverse function; that is, one uses the computer to solve the equation f(x) = y for x = f−1(y) = g(y). Since the program will contain a subroutine for f(x), it is not necessary to find the Jacobian of g(y) by differentiation of the entire routine. Once a satisfactory …
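To make the construction concrete, the following Python sketch builds a small code list, the strictly lower triangular Jacobian J = (∂xi/∂xj), and the gradient via the terminating Neumann series (I − J)−1 = I + J + · · · + Jm. The example function f(x1, x2) = x1·x2 + sin(x1), the variable names, and the use of numpy are illustrative assumptions, not material from the paper.

```python
import numpy as np

# Illustrative code list for f(x1, x2) = x1*x2 + sin(x1):
#   x3 = x1*x2,  x4 = sin(x1),  x5 = x3 + x4 = f(x1, x2),
# so that x = (x1, ..., x5) satisfies the fixed point equation x = F(x).
def code_list(x1, x2):
    x3 = x1 * x2
    x4 = np.sin(x1)
    x5 = x3 + x4
    return np.array([x1, x2, x3, x4, x5])

def jacobian_J(x):
    # Strictly lower triangular J = (d x_i / d x_j) of the transformation F.
    J = np.zeros((5, 5))
    J[2, 0] = x[1]          # d x3 / d x1 = x2
    J[2, 1] = x[0]          # d x3 / d x2 = x1
    J[3, 0] = np.cos(x[0])  # d x4 / d x1
    J[4, 2] = 1.0           # d x5 / d x3
    J[4, 3] = 1.0           # d x5 / d x4
    return J

n, d = 5, 2
x = code_list(1.5, 2.0)
J = jacobian_J(x)

# J is strictly lower triangular, hence nilpotent, so the Neumann series
# S = I + J + ... + J^m terminates and equals (I - J)^(-1).
I = np.eye(n)
S, P = I.copy(), I.copy()
for _ in range(n):
    P = P @ J
    S += P
assert np.allclose(S, np.linalg.inv(I - J))

# Forward accumulation: w = (I - J)^(-1) e_j collects d x_i / d x_j for all i;
# its last entry is the gradient component of f with respect to x_j.
grad_forward = np.array([(S @ I[:, j])[-1] for j in range(d)])

# Reverse accumulation: one pass with the transpose gives all components at once.
grad_reverse = (S.T @ I[:, n - 1])[:d]

# Hand-computed gradient of the illustrative f: (x2 + cos(x1), x1).
assert np.allclose(grad_forward, [2.0 + np.cos(1.5), 1.5])
assert np.allclose(grad_reverse, grad_forward)
```

Both accumulations reuse the same matrix data: the forward mode propagates one column per independent variable, while the reverse mode needs only a single transposed pass, which is why it is preferred when d is large.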
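The same machinery supports the iterative improvement mentioned in the abstract: Newton's method applied to G(x) = x − F(x) = 0. The sketch below, again for the same illustrative function and with perturbed intermediate values standing in for rounding error, is a minimal floating-point demonstration only; the paper pairs the Newton step with a long accumulator and interval arithmetic to obtain validated enclosures, which are not reproduced here.

```python
import numpy as np

# G(x) = x - F(x) = 0 for the illustrative code list of
# f(x1, x2) = x1*x2 + sin(x1); c1, c2 are the given inputs.
c1, c2 = 1.5, 2.0

def F(x):
    # Components 1..2 reproduce the inputs, 3..5 evaluate the code list.
    return np.array([c1, c2, x[0] * x[1], np.sin(x[0]), x[2] + x[3]])

def J(x):
    # Strictly lower triangular Jacobian of F at x.
    M = np.zeros((5, 5))
    M[2, 0], M[2, 1] = x[1], x[0]
    M[3, 0] = np.cos(x[0])
    M[4, 2] = M[4, 3] = 1.0
    return M

# Exact intermediate values, then a small perturbation standing in for rounding error.
x_exact = np.array([c1, c2, c1 * c2, np.sin(c1), c1 * c2 + np.sin(c1)])
x = x_exact + 1e-6 * np.random.default_rng(0).standard_normal(5)

# Newton steps x <- x - (I - J(x))^(-1) (x - F(x)); since (I - J) is unit
# lower triangular, the solve amounts to a cheap forward substitution.
for _ in range(3):
    x = x - np.linalg.solve(np.eye(5) - J(x), x - F(x))

print(abs(x[-1] - x_exact[-1]))  # error in the function value after refinement
```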
Similar Resources
More about measures and Jacobians of singular random matrices
This work studies the Jacobians of certain singular transformations and the corresponding measures which support the Jacobian computations.
Fast arithmetic and pairing evaluation on genus 2 curves
We present two algorithms for fast arithmetic in Jacobians of genus 2 curves. The first speeds up the process of “double-and-add” by an estimated 5.5% over the standard algorithm. The second modifies the construction of the functions used to compute the Weil and Tate pairings in a way that saves 6 field multiplications per evaluation.
Complexity in Numerical
The evaluation or approximation of derivatives is an important part of many nonlinear computations. The cost of evaluating first- and second-derivative matrices is often assumed to grow linearly and quadratically with the number of independent variables, respectively. It is shown here that much tighter bounds can be achieved through the exploitation of partial function- and argument-separability in...
Differential Inclusions and Young Measures Involving Prescribed Jacobians
This work presents a general principle in the spirit of convex integration, leading to a method for the characterization of Young measures generated by gradients that satisfy a wide range of constraints on the Jacobian determinant. Two special cases are particularly important in the theories of elasticity and fluid dynamics: (a) the generating gradients have positive Jacobians that are uniforml...
Journal: Reliable Computing
Volume 9, Issue -
Pages -
Publication date: 2003