Learning principal directions: Integrated-squared-error minimization

Authors

  • Jong-Hoon Ahn
  • Jong-Hoon Oh
  • Seungjin Choi
Abstract

A common derivation of principal component analysis (PCA) is based on minimization of the squared error between the centered data and a linear model, i.e., the reconstruction error. In fact, minimizing the squared error leads only to principal subspace analysis, where scaled and rotated principal axes of a set of observed data are estimated. In this paper, we introduce and investigate an alternative error measure, the integrated squared error (ISE), whose minimization determines the exact principal axes (without rotational ambiguity) of a set of observed data. We show that exact principal directions emerge from the minimization of ISE. We present a simple EM algorithm, 'EM-ePCA', which is similar to EM-PCA [9] but finds exact principal directions without rotational ambiguity. In addition, we revisit the generalized Hebbian algorithm (GHA) and show that it emerges from integrated-squared-error minimization in a single-layer linear feedforward neural network.
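As a minimal NumPy illustration (my own sketch, not the paper's construction), the following shows the rotational ambiguity of plain squared-error minimization: any orthogonal rotation of a basis for the principal subspace reconstructs the data exactly as well as the true principal axes, so the reconstruction error alone cannot identify the individual axes.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic centered data with distinct variances along known axes.
X = rng.normal(size=(500, 3)) * np.array([3.0, 1.0, 0.3])
X -= X.mean(axis=0)

# Exact principal axes: eigenvectors of the covariance matrix.
C = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(C)   # ascending eigenvalue order
V = eigvecs[:, ::-1][:, :2]            # top-2 principal directions

# Any orthogonal rotation Q of V spans the same subspace ...
Q, _ = np.linalg.qr(rng.normal(size=(2, 2)))
W = V @ Q

# ... and therefore attains the same reconstruction error.
err_V = np.sum((X - X @ V @ V.T) ** 2)
err_W = np.sum((X - X @ W @ W.T) ** 2)
assert np.isclose(err_V, err_W)
```

Per the abstract, minimizing ISE instead of this squared reconstruction error breaks the symmetry and recovers the individual principal directions.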

Related articles

Minimization of Information Loss through Neural Network Learning

In this article, we explore the concept of minimization of information loss (MIL) as a target for neural network learning. We relate MIL to supervised and unsupervised learning procedures such as the Bayesian maximum a posteriori (MAP) discriminator, minimization of distortion measures such as mean squared error (MSE) and cross-entropy (CE), and principal component analysis (PCA). To deal wit...


Maximum Likelihood Structure and Motion Estimation Integrated over Time

Least-squares minimization of the differential epipolar constraint is a fast and efficient technique for estimating structure and motion from a pair of views. Previous work in this area showed how unbiased and consistent estimates could be obtained by minimizing the squared errors. However, it implicitly assumes that the errors along the x and y directions are identical and uncorrelated. This is rarely ...


Using Machine Learning ARIMA to Predict the Price of Cryptocurrencies

The increasing volatility in pricing and the growing potential for profit in digital currency have made predicting the price of cryptocurrency a very attractive research topic. Several studies have already been conducted using various machine-learning models to predict cryptocurrency prices. The study presented in this paper applied a classic Autoregressive Integrated Moving Average (ARIMA) model ...

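The ARIMA family builds on autoregression. As a hedged, self-contained sketch (my own illustration, not the study's model), fitting an AR(1) coefficient, the ARIMA(1,0,0) special case, by least squares on a simulated series looks like this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(1) series: y_t = phi * y_{t-1} + noise.
phi_true = 0.8
y = np.zeros(2000)
for t in range(1, len(y)):
    y[t] = phi_true * y[t - 1] + rng.normal()

# Least-squares estimate of phi from consecutive (lagged) pairs.
phi_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
assert abs(phi_hat - phi_true) < 0.05

# One-step-ahead forecast from the fitted coefficient.
forecast = phi_hat * y[-1]
```

A full ARIMA(p, d, q) model, as used in the study, adds differencing and moving-average terms; in practice one would fit it with a dedicated library rather than by hand.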

Principal Component Analysis and Effective K-Means Clustering

The widely adopted K-means clustering algorithm uses a sum-of-squared-errors objective function. A detailed analysis shows the close relationship between K-means clustering and principal component analysis (PCA), which is extensively utilized in unsupervised dimension reduction. We prove that the continuous solutions of the discrete K-means clustering membership indicators are the data projection...

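The claimed K-means/PCA connection can be sanity-checked numerically: on two well-separated Gaussian blobs, the sign of the projection onto the first principal component reproduces the 2-means partition. A small NumPy sketch (my own illustration; the plain Lloyd loop below is an assumption, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated Gaussian blobs in 2-D, then centered.
X = np.vstack([rng.normal(-4, 1, size=(100, 2)),
               rng.normal(+4, 1, size=(100, 2))])
X -= X.mean(axis=0)

# First principal direction from the covariance eigendecomposition.
_, vecs = np.linalg.eigh(X.T @ X / len(X))
pc1 = vecs[:, -1]

# Plain Lloyd's algorithm for 2-means (one seed point per blob).
centers = np.stack([X[0], X[-1]])
for _ in range(20):
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([X[labels == k].mean(axis=0) for k in range(2)])

# The sign of the projection onto PC1 matches the 2-means split
# (up to an arbitrary swap of the two cluster labels).
split = (X @ pc1 > 0).astype(int)
agreement = max(np.mean(split == labels), np.mean(split != labels))
assert agreement > 0.95
```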


Journal:
  • Neurocomputing

Volume 70  Issue 

Pages  -

Publication date: 2007