Search results for: nonlinear eigenvectors

Number of results: 223454

Journal: International Journal of Civil Engineering
A. Kaveh, M. Najimi

In this paper, the Rayleigh quotient and the inverse vector iteration method are presented. The latter approach yields the natural frequencies and mode shapes of a structure, and inverse vector iteration with shifting makes it possible to determine the higher modes as well. Some basic theorems of linear algebra are presented and extended to the study of the free vibration of structures. The variation...
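The inverse vector iteration scheme with shifting described in this abstract can be sketched in a few lines of NumPy. The function name, tolerance, and the 2-DOF test matrices below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def inverse_iteration(K, M, shift=0.0, tol=1e-10, max_iter=200):
    """Inverse vector iteration with shifting for the generalized
    eigenproblem K x = lambda M x (hypothetical helper, for illustration)."""
    n = K.shape[0]
    x = np.random.default_rng(0).standard_normal(n)
    A = K - shift * M                      # shifted operator
    lam = shift
    for _ in range(max_iter):
        y = np.linalg.solve(A, M @ x)      # one inverse-iteration step
        y /= np.sqrt(y @ (M @ y))          # M-normalize the iterate
        lam_new = y @ (K @ y)              # Rayleigh quotient (y is M-normalized)
        if abs(lam_new - lam) < tol * max(1.0, abs(lam_new)):
            return lam_new, y
        lam, x = lam_new, y
    return lam, x

# Toy 2-DOF stiffness/mass pair; shift=0 converges to the lowest mode.
K = np.array([[2.0, -1.0], [-1.0, 1.0]])
M = np.eye(2)
lam, phi = inverse_iteration(K, M)
```

Choosing `shift` near a higher natural frequency makes the same loop converge to that mode instead, which is the "shifting" idea the abstract refers to.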

Journal: CoRR 2018
Brendan Gavin Agnieszka Miedlar Eric Polizzi

The linear FEAST algorithm is a method for solving linear eigenvalue problems. It uses complex contour integration to calculate the eigenvectors whose eigenvalues lie inside some user-defined region in the complex plane. This makes it possible to parallelize the process of solving eigenvalue problems by simply dividing the complex plane into a collection of disjoint regions and cal...
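The contour-integration filter at the core of FEAST can be illustrated with a toy dense sketch: approximate the spectral projector P = (1/2πi) ∮ (zI − A)⁻¹ dz on a circle by quadrature, apply it to a random block of vectors, and run Rayleigh–Ritz on the result. The matrix, contour, and node count below are made-up illustrations, not the actual FEAST implementation:

```python
import numpy as np

def contour_subspace(A, center, radius, m, n_quad=16, seed=0):
    """Sketch of a FEAST-style filter: apply a quadrature approximation of
    the spectral projector (1/2*pi*i) * contour integral of (zI - A)^-1
    on a circle to a random n x m block."""
    n = A.shape[0]
    Y = np.random.default_rng(seed).standard_normal((n, m))
    Q = np.zeros((n, m), dtype=complex)
    for k in range(n_quad):
        theta = 2 * np.pi * (k + 0.5) / n_quad
        z = center + radius * np.exp(1j * theta)
        # trapezoid weight: dz / (2*pi*i) on the circular contour
        w = radius * np.exp(1j * theta) / n_quad
        Q += w * np.linalg.solve(z * np.eye(n) - A, Y)
    return Q

# Symmetric test matrix with eigenvalues 1..5; target the two inside |z-2.5|<1.
rng = np.random.default_rng(1)
V, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = V @ np.diag([1.0, 2.0, 3.0, 4.0, 5.0]) @ V.T
Q = contour_subspace(A, center=2.5, radius=1.0, m=2, n_quad=32)
# Quadrature nodes come in conjugate pairs, so Q is real for real A.
Qo, _ = np.linalg.qr(Q.real)
ritz = np.linalg.eigvalsh(Qo.T @ A @ Qo)   # Rayleigh-Ritz on the filtered block
```

Each quadrature node requires an independent linear solve, which is where the parallelism mentioned in the abstract comes from.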

2007
Minghao Cai S.-T. John Yu Moujin Zhang

In this paper, we report theoretical and numerical solutions of linear and nonlinear elastic waves in a thin rod. First, the classical solution for a linear elastic wave in a thin rod is adapted so that it can be compared with the numerical solution of a nonlinear formulation. Based on mass and momentum conservation, we then derive several forms of modeling equations for nonlinear elasti...
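For reference, the classical linear model mentioned above is the one-dimensional wave equation for the axial displacement u(x, t) of a thin rod (standard textbook form, not quoted from the paper):

```latex
\rho \frac{\partial^2 u}{\partial t^2} = E \frac{\partial^2 u}{\partial x^2},
\qquad c_0 = \sqrt{E/\rho},
```

where ρ is the density, E is Young's modulus, and c_0 is the linear wave speed; the nonlinear formulations in the paper generalize this balance of mass and momentum.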

2016
Stefanos Zafeiriou Georgios Tzimiropoulos Maria Petrou

We propose a robust approach to discriminant kernel-based feature extraction for face recognition and verification. We show, for the first time, how to perform the eigen analysis of the within-class scatter matrix directly in the feature space. This eigen analysis provides the eigenspectrum of its range space and the corresponding eigenvectors as well as the eigenvectors spanning its null space...

Journal: JDCTA 2010
Kezheng Lin Ying Xu Yuan Zhong

A novel method of 2DGabor-KDA (kernel Fisher discriminant analysis) for face recognition is proposed. Face images are segmented into several sub-areas according to five particular face parts, features are extracted through the 2DGabor wavelet, average values are calculated from the feature vectors obtained from the corresponding pixels of each test sample, and then the eigenvector...

Journal: Applied Mathematics and Computation 2010
Raffaele Chiappinelli

1. Chiappinelli, Raffaele; Furi, Massimo; Pera, Maria Patrizia. Persistence of the normalized eigenvectors of a perturbed operator in the variational case. Glasg. Math. J. 55 (2013), no. 3, 629–638.
2. Chiappinelli, Raffaele. Variational methods for NLEV approximation near a bifurcation point. Int. J. Math. Math. Sci. 2012, Art. ID 102489, 32 pp.
3. Chiappinelli, Raffaele; Furi, Massimo; Pera, Maria...

2016
Guang-Ho Cha

This paper presents a new nonlinear approximate indexing method for high-dimensional data such as multimedia data. The new indexing method is designed for approximate similarity searches, and all the work is performed in the transformed Gaussian space. In this indexing method, we first map the input space into a feature space via the Gaussian mapping, and then compute the top eigenvectors in the ...
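The "Gaussian mapping followed by top eigenvectors" step is in the spirit of kernel PCA, which can be sketched as follows. The bandwidth `gamma` and the centering step are assumptions for illustration, not details from the paper:

```python
import numpy as np

def gaussian_kernel_top_eigvecs(X, k, gamma=1.0):
    """Sketch: build a Gaussian (RBF) kernel matrix, center it, and return
    its top-k eigenvectors and eigenvalues (kernel-PCA style)."""
    sq = np.sum(X ** 2, axis=1)
    # Pairwise squared distances via the expansion |x - y|^2 = |x|^2 + |y|^2 - 2 x.y
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    Kc = J @ K @ J                         # center the kernel in feature space
    vals, vecs = np.linalg.eigh(Kc)        # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]     # keep the top k by eigenvalue
    return vecs[:, order], vals[order]

# Tiny example: four points in the plane.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V2, vals2 = gaussian_kernel_top_eigvecs(X, 2, gamma=0.5)
```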

2002
Anna Perelomova

Five eigenvectors of the linear thermoviscous flow over a homogeneous background are derived for the quasi-plane geometry of the flow. The corresponding projectors are calculated and applied to obtain the nonlinear evolution equations for the interacting vortical and acoustic modes. The equation for streaming caused by an arbitrary acoustic wave is specified. The correspondence to the known results on stream...

1998
Asim Gangopadhyaya Jeffry V. Mallow Uday P. Sukhatme

For nonrelativistic Hamiltonians which are shape invariant, analytic expressions for the eigenvalues and eigenvectors can be derived using the well known method of supersymmetric quantum mechanics. Most of these Hamiltonians also possess spectrum generating algebras and are hence solvable by an independent group theoretic method. In this paper, we demonstrate the equivalence of the two methods ...
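The shape-invariance property referred to above can be stated compactly (standard supersymmetric quantum mechanics notation, not quoted from the paper): if the partner potentials satisfy

```latex
V_{+}(x; a_0) = V_{-}(x; a_1) + R(a_1), \qquad a_1 = f(a_0),
```

then the spectrum of the Hamiltonian H₋ follows algebraically as E₀⁽⁻⁾ = 0 and Eₙ⁽⁻⁾ = Σₖ₌₁ⁿ R(aₖ), which is the "analytic expressions for the eigenvalues" route the abstract mentions.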

A. K. Wadhwani, Manish Dubey, Monika Saraswat

The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the direction of maximum variance of X in R^N carry the most relevant information about X. These eigenvectors are called principal components [8]. Ass...
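The covariance-eigenvector construction described above is short enough to sketch directly; the function name and the synthetic anisotropic dataset are illustrative, not from the text:

```python
import numpy as np

def pca_components(X, k):
    """PCA via eigen-decomposition of the covariance matrix: return the
    top-k eigenvectors (principal components) and their variances."""
    Xc = X - X.mean(axis=0)                # center the data
    C = np.cov(Xc, rowvar=False)           # N x N covariance matrix
    vals, vecs = np.linalg.eigh(C)         # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]     # top-k directions by variance
    return vecs[:, order], vals[order]

# Synthetic data with most variance along the first axis.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3)) @ np.diag([5.0, 1.0, 0.2])
E, var = pca_components(X, 2)
Z = (X - X.mean(axis=0)) @ E               # project onto the top 2 components
```

Projecting onto the top-k eigenvectors keeps the directions of maximum variance, which is exactly the sense in which PCA reduces dimensionality.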
