Dimensionality reduction to maximize prediction generalization capability

Authors

Abstract

Generalization of time series prediction remains an important open issue in machine learning, wherein earlier methods have either large generalization errors or local minima. We develop an analytically solvable, unsupervised learning scheme that extracts the most informative components for predicting future inputs, termed predictive principal component analysis (PredPCA). Our scheme can effectively remove unpredictable noise and minimize test prediction error through convex optimization. Mathematical analyses demonstrate that, provided with sufficient training samples and sufficiently high-dimensional observations, PredPCA can asymptotically identify hidden states, system parameters, and dimensionalities of canonical nonlinear generative processes, with a global convergence guarantee. We demonstrate its performance using sequential visual inputs comprising handwritten digits, rotating 3D objects, and natural scenes. It reliably estimates distinct hidden states and predicts the outcomes of previously unseen test input data, based exclusively on noisy observations. The simple architecture and low computational cost of PredPCA are highly desirable for neuromorphic hardware.
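The core idea described in the abstract — predict the next input by least squares, then apply PCA to the predicted (hence predictable) signal — can be sketched numerically. This is a minimal illustration under toy assumptions (the synthetic rotating latent state, dimensions, and noise level are invented for demonstration and are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy high-dimensional observations of a 2D rotating latent state.
T, obs_dim, latent_dim = 2000, 20, 2
theta = 0.1 * np.arange(T)
latent = np.stack([np.sin(theta), np.cos(theta)], axis=1)    # (T, 2)
A = rng.standard_normal((latent_dim, obs_dim))
obs = latent @ A + 0.5 * rng.standard_normal((T, obs_dim))   # noisy inputs

past, future = obs[:-1], obs[1:]

# Step 1: least-squares prediction of the next input from the current one.
W, *_ = np.linalg.lstsq(past, future, rcond=None)
predicted = past @ W

# Step 2: PCA on the *predicted* inputs. The top components span the
# predictable subspace, discarding unpredictable observation noise.
predicted -= predicted.mean(axis=0)
cov = predicted.T @ predicted / len(predicted)
eigvals, eigvecs = np.linalg.eigh(cov)
components = eigvecs[:, ::-1][:, :latent_dim]   # top principal components

encoded = obs @ components   # low-dimensional hidden-state estimate
print(encoded.shape)         # (2000, 2)
```

Both steps are ordinary least squares and an eigendecomposition, which is consistent with the abstract's claim of an analytically solvable, convex scheme.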


Related articles

Generalization Bounds for Supervised Dimensionality Reduction

We introduce and study the learning scenario of supervised dimensionality reduction, which couples dimensionality reduction and a subsequent supervised learning step. We present new generalization bounds for this scenario based on a careful analysis of the empirical Rademacher complexity of the relevant hypothesis set. In particular, we show an upper bound on the Rademacher complexity that is i...


Dimensionality reduction approach to multivariate prediction

The authors consider dimensionality reduction methods used for prediction, such as reduced rank regression, principal component regression and partial least squares. They show how it is possible to obtain intermediate solutions by estimating simultaneously the latent variables for the predictors and for the responses. They obtain a continuum of solutions that goes from reduced rank regression t...
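One of the methods that snippet names, principal component regression, fits ordinary least squares on the top principal components of the predictors. A minimal sketch (all data, dimensions, and the choice of k are illustrative assumptions, not from that paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression problem with strongly correlated predictors.
n, p, k = 500, 10, 3                      # samples, predictors, components kept
X = rng.standard_normal((n, p))
X[:, 3:] += X[:, :3] @ rng.standard_normal((3, p - 3))  # induce collinearity
beta = rng.standard_normal(p)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Principal component regression: project the centred predictors onto their
# top-k principal components, then fit least squares in that subspace.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:k].T                    # (n, k) component scores
coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
y_hat = scores @ coef + y.mean()

print(round(float(np.corrcoef(y, y_hat)[0, 1]), 2))
```

Reduced rank regression and partial least squares differ only in how the low-dimensional subspace is chosen, which is the continuum the snippet refers to.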


Margin Based Dimensionality Reduction and Generalization

Linear discriminant analysis (LDA) for dimension reduction has been applied to a wide variety of problems such as face recognition. However, it has a major computational difficulty when the number of dimensions is greater than the sample size. In this paper, we propose a margin based criterion for linear dimension reduction that addresses the above problem associated with LDA. We establish an e...


Evolutionary Approach to Dimensionality Reduction

The excess of data produced by voluminous storage and online devices has become a bottleneck to extracting meaningful information: we are information-rich but knowledge-poor. One of the major problems in extracting knowledge from large databases is the size of the dimension, i.e. the number of features, of the databases. More often than not, it is observed that some features do not affect th...
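An evolutionary approach to this problem typically searches over binary feature masks with a genetic algorithm. A minimal sketch under toy assumptions (the synthetic data, fitness penalty, and GA settings are invented for illustration, not taken from that paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: only the first 3 of 12 features carry signal.
n, p, useful = 300, 12, 3
X = rng.standard_normal((n, p))
y = X[:, :useful].sum(axis=1) + 0.1 * rng.standard_normal(n)

def fitness(mask):
    """Negative least-squares error on the selected features, with a small
    penalty per feature to favour compact subsets."""
    if not mask.any():
        return -np.inf
    Xs = X[:, mask]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return -float(np.mean((y - Xs @ coef) ** 2)) - 0.01 * mask.sum()

# Simple generational GA over binary feature masks.
pop = rng.random((30, p)) < 0.5
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]    # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        child = np.where(rng.random(p) < 0.5, a, b)  # uniform crossover
        child ^= rng.random(p) < 0.02                # bit-flip mutation
        children.append(child)
    pop = np.array(children)
    pop[0] = parents[0]                              # elitism

best = pop[np.argmax([fitness(m) for m in pop])]
print(np.flatnonzero(best))
```

The per-feature penalty in the fitness function encodes exactly the observation in the snippet: features that do not affect the target should be dropped.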


Using FACTS Controllers to Maximize Available Transfer Capability

This paper concentrates on studying the effect of SVC and TCSC controllers on the Available Transfer Capability (ATC). Standard voltage collapse techniques are used to determine the ATC of a test system, considering a variety of system limits. Then, based on second-order sensitivity analysis, optimal locations for these particular FACTS controllers are determined; these techniques are compared ...



Journal

Journal title: Nature Machine Intelligence

Year: 2021

ISSN: 2522-5839

DOI: https://doi.org/10.1038/s42256-021-00306-1