For Numerical Differentiation, Dimensionality Can Be a Blessing!
Author
Abstract
Finite difference methods, like the mid-point rule, have been applied successfully to the numerical solution of ordinary and partial differential equations. If such formulas are applied to observational data in order to determine derivatives, the results can be disastrous. The reason for this is that measurement errors, and even rounding errors in computer approximations, are strongly amplified in the differentiation process, especially if small step-sizes are chosen and higher derivatives are required. A number of authors have examined the use of various forms of averaging which allow the stable computation of low-order derivatives from observational data. The size of the averaging set acts like a regularization parameter and has to be chosen as a function of the grid size h. In this paper, it is initially shown how first (and higher) order single-variate numerical differentiation of higher-dimensional observational data can be stabilized with less loss of accuracy than occurs for the corresponding differentiation of one-dimensional data. The result is then extended to the multivariate differentiation of higher-dimensional data. The nature of the trade-off between convergence and stability is explicitly characterized, and the complexity of various implementations is examined.
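The noise amplification and the averaging-based stabilization described in the abstract can be illustrated with a minimal sketch. The test function, noise level, and averaging window below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's method):
# central differences on noisy samples of sin(x), with and without a
# simple moving-average pre-smoothing step. The averaging window plays
# the role of the regularization parameter discussed above.

rng = np.random.default_rng(0)
h = 1e-3                                   # small grid size
x = np.arange(0.0, 2 * np.pi, h)
y = np.sin(x) + 1e-4 * rng.standard_normal(x.size)  # "observational" data

def central_diff(values, step):
    """First derivative by the mid-point (central difference) rule."""
    return (values[2:] - values[:-2]) / (2 * step)

def moving_average(values, window):
    """Symmetric averaging used here as a stand-in regularizer."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="same")

exact = np.cos(x[1:-1])
raw = central_diff(y, h)                            # noise amplified by ~1/h
smoothed = central_diff(moving_average(y, 101), h)  # averaged before differencing

interior = slice(200, -200)   # avoid moving-average edge effects
print("max interior error, raw      :", np.max(np.abs((raw - exact)[interior])))
print("max interior error, averaged :", np.max(np.abs((smoothed - exact)[interior])))
```

With these settings the raw central-difference error is dominated by the amplified noise, while the averaged version is several orders of magnitude more accurate, at the cost of a small smoothing bias that grows with the window size.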
Similar resources
For numerical differentiation, dimensionality can be a blessing!
Finite difference methods, such as the mid-point rule, have been applied successfully to the numerical solution of ordinary and partial differential equations. If such formulas are applied to observational data, in order to determine derivatives, the results can be disastrous. The reason for this is that measurement errors, and even rounding errors in computer approximations, are strongly ampli...
Full text
Dimensionality Reduction for Classification: Comparison of Techniques and Dimension Choice
We investigate the effects of dimensionality reduction using different techniques and different dimensions on six two-class data sets with numerical attributes as pre-processing for two classification algorithms. Besides reducing the dimensionality with the use of principal components and linear discriminants, we also introduce four new techniques. After this dimensionality reduction two algori...
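As a rough illustration of the pre-processing workflow sketched in this snippet, the following fragment reduces a synthetic two-class data set with principal components before classification; the data set, classifier, and component counts are assumptions for the example, not values from the paper.

```python
# Sketch of PCA-based dimensionality reduction as pre-processing for a
# two-class classifier (illustrative set-up, not the paper's experiments).
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=500, n_features=40, n_informative=5,
                           n_classes=2, random_state=0)

for n_components in (2, 5, 10, 40):
    model = make_pipeline(PCA(n_components=n_components),
                          LogisticRegression(max_iter=1000))
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{n_components:2d} components: mean CV accuracy = {score:.3f}")
```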
Full text
A hybrid method with optimal stability properties for the numerical solution of stiff differential systems
In this paper, we consider the construction of a new class of numerical methods based on the backward differentiation formulas (BDFs) that are equipped with two off-step points. We represent these methods from the general linear methods (GLMs) point of view, which provides an easy process to improve their stability properties and implementation in a variable stepsize mode. These superioritie...
Full text
Performing a pre-processing step before feature extraction in the classification of hyperspectral image data
Hyperspectral data potentially contain more information than multispectral data because of their higher spectral resolution. However, the stochastic data analysis approaches that have been successfully applied to multispectral data are not as effective for hyperspectral data. Various investigations indicate that the key problem that causes poor performance in the stochastic approaches t...
Full text
Dimensionality Reduction for Uncertainty Quantification of Nuclear Engineering Models
The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and...
Full text