Search results for: principal components analysis (PCA)
Number of results: 498,150
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal st...
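The abstract above builds on Oja's Hebbian rule, under which a single linear neuron's weight vector converges to the first principal component of its input. A minimal sketch of that base rule (the synthetic data and learning rate are assumptions for illustration; this is not the recurrent Recursive-PCA network itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data: most variance along a 45-degree direction.
scale = np.array([[3.0, 0.0], [0.0, 0.5]])
theta = np.pi / 4
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
X = rng.normal(size=(5000, 2)) @ scale @ rot

w = rng.normal(size=2)
eta = 0.005                          # small constant learning rate
for x in X:
    y = w @ x                        # neuron output
    w += eta * y * (x - y * w)       # Oja's rule: Hebbian term with decay

# Compare against the top eigenvector of the sample covariance.
top = np.linalg.eigh(np.cov(X.T))[1][:, -1]
print(abs(w @ top) / np.linalg.norm(w))   # should approach 1 as w aligns
```

The decay term `- eta * y * y * w` is what keeps the weight vector bounded (approximately unit norm), which is the "constrained" part of the rule.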
Principal components analysis (PCA) is a classical method for the reduction of dimensionality of data in the form of n observations (or cases) of a vector with p variables. Contemporary data sets often have p comparable to, or even much larger than, n. Our main assertions, in such settings, are (a) that some initial reduction in dimensionality is desirable before applying any PCA-type search for...
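One concrete symptom of the p >> n regime described above: the centered data matrix has rank at most n - 1, so at most n - 1 sample principal components carry any variance, regardless of how large p is. A small numerical sketch (dimensions chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 500                        # far more variables than observations
X = rng.normal(size=(n, p))

Xc = X - X.mean(axis=0)               # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained_var = s**2 / (n - 1)        # variances of the sample PCs
n_nonzero = int(np.sum(explained_var > 1e-10))
print(n_nonzero)                      # at most n - 1 nonzero components
```

Centering removes one degree of freedom, which is why the count is n - 1 rather than n; this rank deficiency is one motivation for an initial reduction step before a PCA-type search.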
A procedure is proposed for the dimensional reduction of time series. Similarly to principal components, the procedure seeks a low-dimensional manifold that minimizes information loss. Unlike principal components, however, the procedure involves dynamical considerations, through the proposal of a predictive dynamical model in the reduced manifold. Hence the minimization of the uncertainty is no...
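The contrast the abstract draws, a static variance criterion versus a dynamical (predictive) one, can be illustrated with a toy score: project a time series onto a candidate subspace and measure the one-step prediction error of a linear model fitted in the reduced coordinates. This is an assumption-laden sketch, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(2)

# A stable 3-D linear dynamical system observed with noise.
A = np.array([[0.9, -0.2, 0.0],
              [0.2,  0.9, 0.0],
              [0.0,  0.0, 0.5]])
x = np.zeros(3)
traj = []
for _ in range(2000):
    x = A @ x + rng.normal(scale=0.1, size=3)
    traj.append(x)
Y = np.array(traj)

def prediction_error(Y, W):
    """One-step error of a least-squares linear predictor in span(W)."""
    Z = Y @ W                                          # reduced coordinates
    B, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)
    return np.mean((Z[1:] - Z[:-1] @ B) ** 2)

# Static choice of subspace: the top-2 principal components.
_, _, Vt = np.linalg.svd(Y - Y.mean(axis=0), full_matrices=False)
W_pca = Vt[:2].T
err = prediction_error(Y, W_pca)
print(err)
```

A dynamical reduction would choose the subspace to minimize a predictive criterion like `prediction_error` directly, rather than inheriting it from a purely variance-based PCA projection.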
The University of Tehran administers a test known as the University of Tehran English Proficiency Test (the UTEPT) to PhD candidates on a yearly basis. By definition, the test can be considered a high-stakes one. The validity of high-stakes tests needs to be known (Roever, 2001). As Messick (1988) maintains, if the validity of a high-stakes test is not known, it might have some undesirable consequen...