Regularized Kernel Local Linear Embedding on Dimensionality Reduction for Non-vectorial Data

Authors

  • Yi Guo
  • Junbin Gao
  • Paul Wing Hing Kwan
Abstract

In this paper, we propose a new nonlinear dimensionality reduction algorithm called regularized Kernel Local Linear Embedding (rKLLE) for highly structured data. It is built on the original LLE by introducing a kernel-alignment-type constraint that effectively reduces the solution space and finds embeddings reflecting the prior knowledge. To make the algorithm applicable to non-vectorial data, a kernelized LLE is used to obtain the reconstruction weights. Our experiments on typical non-vectorial data show that rKLLE greatly improves on the results of KLLE.
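The paper's exact formulation is not reproduced on this page, but the kernelized weight step it refers to can be sketched from the standard LLE derivation: for each point, the reconstruction weights over its neighbours depend only on inner products, so a kernel matrix suffices and the data need not be vectorial. The following is a minimal illustrative sketch under that assumption; the function name `klle_weights` and the trace-based regularization constant are hypothetical choices, not the authors' code.

```python
import numpy as np

def klle_weights(K, n_neighbors=5, reg=1e-3):
    """Sketch of LLE reconstruction weights computed from a kernel matrix.

    K : (n, n) Gram matrix K[i, j] = <phi(x_i), phi(x_j)>. Only kernel
    values are used, so x_i may be non-vectorial (strings, graphs, ...).
    """
    n = K.shape[0]
    # Squared feature-space distances: d2_ij = K_ii + K_jj - 2 K_ij
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2.0 * K
    W = np.zeros((n, n))
    for i in range(n):
        # nearest neighbours of point i (excluding i itself)
        idx = np.argsort(d2[i])
        idx = idx[idx != i][:n_neighbors]
        # Local Gram matrix of neighbours centred at phi(x_i):
        # C_jk = <phi(x_j) - phi(x_i), phi(x_k) - phi(x_i)>
        C = (K[np.ix_(idx, idx)] - K[idx, i][:, None]
             - K[i, idx][None, :] + K[i, i])
        # Regularize (standard LLE trick) in case C is singular
        C = C + reg * np.trace(C) / n_neighbors * np.eye(n_neighbors)
        # Solve C w = 1, then rescale so the weights sum to one
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, idx] = w / w.sum()
    return W
```

The rows of the returned matrix satisfy the usual sum-to-one constraint of LLE; the embedding step (and the kernel-alignment regularization that distinguishes rKLLE) would then operate on these weights.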


Related articles

Fast Graph Laplacian Regularized Kernel Learning via Semidefinite-Quadratic-Linear Programming

Kernel learning is a powerful framework for nonlinear data modeling. Using the kernel trick, a number of problems have been formulated as semidefinite programs (SDPs). These include Maximum Variance Unfolding (MVU) (Weinberger et al., 2004) in nonlinear dimensionality reduction, and Pairwise Constraint Propagation (PCP) (Li et al., 2008) in constrained clustering. Although in theory SDPs can be...


Kernel Laplacian Eigenmaps for Visualization of Non-vectorial Data

In this paper, we propose Kernel Laplacian Eigenmaps for nonlinear dimensionality reduction. This method can be extended to any structured input beyond the usual vectorial data, enabling the visualization of a wider range of data in low dimension once suitable kernels are defined. A comparison with related methods on the MNIST handwritten digits data set supports the claims of our approach....


Regularized Embedded Multiple Kernel Dimensionality Reduction for Mine Signal Processing

Traditional multiple kernel dimensionality reduction models are generally based on graph embedding and the manifold assumption. But this assumption might be invalid for some high-dimensional or sparse data due to the curse of dimensionality, which has a negative influence on the performance of multiple kernel learning. In addition, some models might be ill-posed if the rank of matrices in their obj...


Spectral Affine-Kernel Embeddings

In this paper, we propose a controllable embedding method for high- and low-dimensional geometry processing through sparse matrix eigenanalysis. Our approach is equally suitable to perform nonlinear dimensionality reduction on big data, or to offer nonlinear shape editing of 3D meshes and point sets. At the core of our approach is the construction of a multi-Laplacian quadratic form that is asse...


Discriminative Dimensionality Reduction in Kernel Space

Modern nonlinear dimensionality reduction (DR) techniques enable an efficient visual data inspection in the form of scatter plots, but they suffer from the fact that DR is inherently ill-posed. Discriminative dimensionality reduction (DiDi) offers one remedy, since it allows a practitioner to identify what is relevant and what should be regarded as noise by means of auxiliary information such a...



Journal:

Volume   Issue

Pages  -

Publication date: 2009