Multivariate Regression via Stiefel Manifold Constraints

Authors

  • Gökhan H. Bakir
  • Arthur Gretton
  • Matthias O. Franz
  • Bernhard Schölkopf
Abstract

We introduce a learning technique for regression between high-dimensional spaces. Standard methods typically reduce this task to many one-dimensional problems, with each output dimension considered independently. By contrast, in our approach the feature construction and the regression estimation are performed jointly, directly minimizing a loss function that we specify, subject to a rank constraint. A major advantage of this approach is that the loss is no longer chosen according to the algorithmic requirements, but can be tailored to the characteristics of the task at hand; the features will then be optimal with respect to this objective, and dependence between the outputs can be exploited.
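
The abstract does not spell out the optimization itself. As a rough illustration only (not the authors' kernel-based, general-loss algorithm), the sketch below fits a multivariate linear map under a rank constraint, keeping the output-side factor on the Stiefel manifold, with a squared loss and an alternation of a least-squares step and an orthogonal Procrustes step; the function name and toy data are invented for this example.

```python
import numpy as np

def low_rank_regression(X, Y, rank, n_iters=50, seed=0):
    """Fit Y ~ X @ A @ B.T with B.T @ B = I (B on the Stiefel manifold).

    Alternates a least-squares update of A with an orthogonal Procrustes
    update of B.  Squared loss only -- the paper's point is that other
    losses can be substituted, which this sketch does not cover.
    """
    rng = np.random.default_rng(seed)
    q = Y.shape[1]
    # Random starting point on St(q, rank): Q factor of a Gaussian matrix.
    B, _ = np.linalg.qr(rng.standard_normal((q, rank)))
    for _ in range(n_iters):
        # A-step: with B fixed and orthonormal, the problem reduces to
        # ordinary least squares of the projected targets Y @ B on X.
        A, *_ = np.linalg.lstsq(X, Y @ B, rcond=None)
        # B-step: maximize tr(B.T @ Y.T @ X @ A) over orthonormal B,
        # solved in closed form by the SVD of Y.T @ X @ A (Procrustes).
        U, _, Vt = np.linalg.svd(Y.T @ X @ A, full_matrices=False)
        B = U @ Vt
    return A, B

# Toy usage: 5-dimensional inputs, 4-dimensional outputs, a rank-2 map.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
W_true = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))
Y = X @ W_true + 0.01 * rng.standard_normal((200, 4))
A, B = low_rank_regression(X, Y, rank=2)
print(np.linalg.norm(Y - X @ A @ B.T) / np.linalg.norm(Y))  # relative residual, small here
```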

Similar resources

Least-Squares Regression with Unitary Constraints for Network Behaviour Classification

In this paper, we propose a least-squares regression method [2] with unitary constraints, with applications to classification and recognition. To do this, we employ a kernel to map the input instances to a feature space on a sphere. In a similar fashion, we view the labels associated with the training data as points which have been mapped onto a Stiefel manifold using random rotations. In this m...
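
The snippet is cut off before the construction is fully described. As an assumed reading of "labels ... mapped onto a Stiefel manifold using random rotations" (one plausible interpretation, not necessarily the paper's exact construction), the short sketch below draws a random orthogonal matrix as the Q factor of a Gaussian matrix and applies it to one-hot label vectors, preserving their norms and pairwise angles.

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_samples = 3, 6

# One-hot label vectors as rows (points on the unit sphere).
labels = rng.integers(0, n_classes, size=n_samples)
Y = np.eye(n_classes)[labels]

# Random rotation: the Q factor of a Gaussian matrix is orthogonal,
# i.e. a point on the Stiefel manifold St(n_classes, n_classes).
R, _ = np.linalg.qr(rng.standard_normal((n_classes, n_classes)))

# Rotated label embeddings: unit norms and pairwise angles are preserved.
Y_rot = Y @ R.T
print(np.allclose(np.linalg.norm(Y_rot, axis=1), 1.0))  # True
```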

A completed adaptive de-mixing algorithm on Stiefel manifold for ICA

Based on the one-bit-matching principle, and by turning the de-mixing matrix into an orthogonal matrix via a certain normalization, Ma et al. proposed a one-bit-matching learning algorithm on the Stiefel manifold for independent component analysis [8]. However, this algorithm is not adaptive. In this paper, an algorithm that can extract the kurtosis and its sign for each independent source component directl...
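
The normalization that turns the de-mixing matrix into an orthogonal one is not specified in the snippet. A standard choice, shown here only as an assumed illustration and not necessarily the one used by Ma et al., is the polar-decomposition projection (W W^T)^(-1/2) W, which yields the closest orthogonal matrix in Frobenius norm.

```python
import numpy as np

def nearest_orthogonal(W):
    """Project a square de-mixing matrix onto the orthogonal group.

    From the SVD W = U @ diag(s) @ Vt, the closest orthogonal matrix in
    Frobenius norm is U @ Vt, which equals (W @ W.T)^(-1/2) @ W.
    """
    U, _, Vt = np.linalg.svd(W)
    return U @ Vt

W = np.random.default_rng(0).standard_normal((3, 3))
Q = nearest_orthogonal(W)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```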

A Neural Stiefel Learning based on Geodesics Revisited

In this paper we present an unsupervised learning algorithm for neural networks with p inputs and m outputs whose weight vectors are subject to orthonormality constraints. In this setting the learning algorithm can be regarded as optimization posed on the Stiefel manifold, and we generalize the natural gradient method to this case based on geodesics. By exploiting its geometric property as a quotient space: ...
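
For the square case (as many weight vectors as inputs), a geodesic of the orthogonal group through W in a skew-symmetric direction Omega is W expm(t Omega), so a descent step along it keeps the weights exactly orthonormal. The sketch below shows such a step, with the Euclidean gradient projected to a skew-symmetric direction; it is a generic illustration rather than the paper's natural-gradient algorithm, and the names and toy objective are invented.

```python
import numpy as np
from scipy.linalg import expm

def geodesic_step(W, euclid_grad, lr=0.1):
    """One descent step along a geodesic of the orthogonal group O(n).

    The Euclidean gradient is projected to a skew-symmetric direction
    Omega (tangent space at W), and the update W @ expm(-lr * Omega)
    moves along the geodesic, so W stays exactly orthogonal.
    """
    Omega = W.T @ euclid_grad
    Omega = 0.5 * (Omega - Omega.T)  # skew-symmetric part
    return W @ expm(-lr * Omega)

# Toy objective: minimise 0.5 * ||W - M||_F^2 over orthogonal W
# (its minimiser is the orthogonal polar factor of M).
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
W, _ = np.linalg.qr(rng.standard_normal((4, 4)))
for _ in range(200):
    W = geodesic_step(W, W - M, lr=0.2)
print(np.linalg.norm(W.T @ W - np.eye(4)))  # ~1e-14: W stays numerically orthogonal
```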

Mixtures of multivariate power exponential distributions.

An expanded family of mixtures of multivariate power exponential distributions is introduced. While fitting heavy tails and skewness has received much attention in the model-based clustering literature recently, we investigate the use of a distribution that can deal with both varying tail-weight and peakedness of data. A family of parsimonious models is proposed using an eigen-decomposition of...
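
For reference (the snippet is cut off before the model is written out), the density usually taken as the multivariate power exponential distribution in d dimensions has a shape parameter beta that governs tail weight and peakedness; the mixture and eigen-decomposition details of the paper are not reproduced here.

```latex
f(\mathbf{x};\boldsymbol{\mu},\boldsymbol{\Sigma},\beta)
  = \frac{\beta\,\Gamma(d/2)}
         {\pi^{d/2}\,\Gamma\!\left(\tfrac{d}{2\beta}\right)\,2^{d/(2\beta)}}
    \,|\boldsymbol{\Sigma}|^{-1/2}
    \exp\!\left\{-\tfrac{1}{2}
      \left[(\mathbf{x}-\boldsymbol{\mu})^{\top}\boldsymbol{\Sigma}^{-1}
            (\mathbf{x}-\boldsymbol{\mu})\right]^{\beta}\right\}
```

Here beta = 1 recovers the multivariate Gaussian, beta < 1 gives heavier tails (beta = 1/2 is Laplace-like), and larger beta gives lighter tails with a flatter peak.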

Quadratic programs over the Stiefel manifold

We characterize the optimal solution of a quadratic program over the Stiefel manifold with an objective function in trace formulation. The result is applied to relaxations of HQAP and MTLS. Finally, we show that strong duality holds for the Lagrangian dual, provided some redundant constraints are added to the primal program.
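
The trace formulation is not written out in the snippet; a common form of such programs, given here only as an assumed illustration rather than the exact program the paper characterizes, is the following.

```latex
\min_{X \in \mathbb{R}^{n \times p},\; X^{\top} X = I_p} \operatorname{tr}\!\left(X^{\top} A X B\right)
```

Its special case B = I_p is classical: by the Ky Fan theorem, the maximum of tr(X^T A X) over X^T X = I_p equals the sum of the p largest eigenvalues of the symmetric matrix A, attained when the columns of X span the corresponding eigenspace.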

Publication date: 2004