Kernel two-dimensional ridge regression for subspace clustering

Authors

Abstract

Subspace clustering methods have been extensively studied in recent years. For two-dimensional (2D) data, existing subspace clustering methods usually convert 2D examples into vectors, which severely damages the inherent structural information and relationships of the original data. In this paper, we propose a novel method, named kernel two-dimensional ridge regression (KTRR), for representation learning. KTRR provides a way to learn the most representative features from 2D data. In particular, it performs feature learning and low-dimensional representation construction simultaneously, which allows the two tasks to mutually enhance each other. A kernel is introduced for an enhanced capability of capturing nonlinear relationships in the data. An efficient algorithm is developed for its optimization, with a provably decreasing and convergent objective value. Extensive experimental results confirm the effectiveness and efficiency of our method.
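A minimal sketch of the general idea, assuming a standard kernel self-representation pipeline (RBF kernel, ridge-regularized coefficients, spectral clustering on the induced affinity) rather than the authors' specific 2D KTRR formulation; all function and parameter names below are illustrative only.

# Hypothetical sketch: generic kernel self-representation baseline for
# subspace clustering (not the paper's 2D KTRR model).
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

def kernel_self_representation_clustering(X, n_clusters, lam=0.1, gamma=1.0):
    # X: (n_samples, n_features); each row is one (vectorized) example.
    K = rbf_kernel(X, gamma=gamma)                    # kernel Gram matrix, (n, n)
    n = K.shape[0]
    # Ridge-regularized self-representation in the kernel-induced space:
    #   min_C ||Phi(X) - Phi(X) C||_F^2 + lam * ||C||_F^2
    # has the closed-form solution C = (K + lam * I)^{-1} K.
    C = np.linalg.solve(K + lam * np.eye(n), K)
    # Symmetric affinity built from coefficient magnitudes.
    A = 0.5 * (np.abs(C) + np.abs(C).T)
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed",
                                random_state=0).fit_predict(A)
    return labels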


Similar articles

Robust Subspace Clustering via Thresholding Ridge Regression

In this material, we provide the theoretical analyses to show that the trivial coefficients always correspond to the codes over errors. Lemmas 1–3 show that our errors-removing strategy will perform well when the ℓ_p-norm is enforced over the representation, where p ∈ {1, 2, ∞}. Let x ≠ 0 be a data point in the union of subspaces S_D that is spanned by D = [D_x D_−x], where D_x and D_−x consist of th...
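A minimal sketch of the thresholding idea described above, under the assumption that the self-representation is obtained by ridge regression over a linear Gram matrix and that small-magnitude coefficients are treated as codes over errors and discarded; keep and the other names are illustrative, not taken from the paper.

# Hypothetical sketch: ridge-regression self-representation followed by
# thresholding of small coefficients (illustrative, not the paper's exact procedure).
import numpy as np

def thresholded_ridge_affinity(X, lam=0.1, keep=10):
    # X: (n_samples, n_features). Returns a symmetric affinity matrix.
    n = X.shape[0]
    G = X @ X.T                                   # linear Gram matrix
    C = np.linalg.solve(G + lam * np.eye(n), G)   # ridge self-representation
    np.fill_diagonal(C, 0.0)                      # drop trivial self-links
    C_thr = np.zeros_like(C)
    for j in range(n):
        # Keep only the `keep` largest-magnitude coefficients per column;
        # the remaining small coefficients are treated as codes over errors.
        idx = np.argsort(-np.abs(C[:, j]))[:keep]
        C_thr[idx, j] = C[idx, j]
    return 0.5 * (np.abs(C_thr) + np.abs(C_thr).T)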


Kernel Truncated Regression Representation for Robust Subspace Clustering

Subspace clustering aims to group data points into multiple clusters, each of which corresponds to one subspace. Most existing subspace clustering methods assume that the data can be linearly represented by each other in the input space. In practice, however, this assumption is hard to satisfy. To achieve nonlinear subspace clustering, we propose a novel method which consists of the fol...
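One way to read the (truncated) description above is as a kernelized self-representation problem; a standard formulation of that kind, stated here as background and not necessarily the paper's exact objective, is

$$\min_{C}\ \|\Phi(X)-\Phi(X)C\|_F^2+\lambda\|C\|_F^2,\qquad \Phi(X)=[\phi(x_1),\dots,\phi(x_n)],$$

which depends on the data only through the kernel matrix $K_{ij}=\phi(x_i)^\top\phi(x_j)$ and has the closed-form solution $C^\star=(K+\lambda I)^{-1}K$; small coefficients can then be truncated before building the affinity used for clustering.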


Kernel Ridge Regression via Partitioning

In this paper, we investigate a divide and conquer approach to Kernel Ridge Regression (KRR). Given n samples, the division step involves separating the points based on some underlying disjoint partition of the input space (possibly via clustering), and then computing a KRR estimate for each partition. The conquering step is simple: for each partition, we only consider its own local estimate fo...
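A minimal sketch of the divide-and-conquer idea, assuming the partition is produced by k-means and each local model is a standard RBF kernel ridge regressor; the routing rule, parameter values, and function names are illustrative rather than the paper's construction.

# Hypothetical sketch: partition the inputs, fit one kernel ridge regressor
# per partition, and predict with the local model of the assigned partition.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.kernel_ridge import KernelRidge

def partitioned_krr_fit(X, y, n_parts=4, alpha=1.0, gamma=0.5):
    km = KMeans(n_clusters=n_parts, n_init=10, random_state=0).fit(X)
    models = []
    for p in range(n_parts):
        mask = km.labels_ == p
        krr = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma)
        krr.fit(X[mask], y[mask])                 # local estimate for this partition
        models.append(krr)
    return km, models

def partitioned_krr_predict(km, models, X_new):
    parts = km.predict(X_new)                     # assign each query to a partition
    y_hat = np.empty(X_new.shape[0])
    for p, model in enumerate(models):
        mask = parts == p
        if mask.any():
            y_hat[mask] = model.predict(X_new[mask])
    return y_hat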


An Identity for Kernel Ridge Regression

This paper derives an identity connecting the square loss of ridge regression in on-line mode with the loss of the retrospectively best regressor. Some corollaries about the properties of the cumulative loss of on-line ridge regression are also obtained.
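For reference (this is standard background, not the identity derived in the paper): in on-line mode, ridge regression with regularization $a>0$ predicts at trial $t$ using the regressor that minimizes the regularized square loss over the examples seen so far,

$$w_t=\arg\min_{w}\ a\|w\|^2+\sum_{s<t}\big(y_s-w^\top x_s\big)^2,\qquad \hat y_t=w_t^\top x_t,$$

while the retrospectively best regressor minimizes the same objective over all $T$ examples at once; the paper's identity connects the cumulative square loss $\sum_t(\hat y_t-y_t)^2$ of this on-line procedure with the loss of that best regressor.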


Modelling Issues in Kernel Ridge Regression

Kernel ridge regression is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. This paper investigates the influence of the choice of kernel and the setting of tuning parameters on forecast accuracy. We review several popular kernels, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. We interpret the latter two kern...
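For reference, common textbook forms of the kernels mentioned above (conventions vary; the one-dimensional form of the sinc kernel is shown):

$$k_{\mathrm{poly}}(x,x')=(x^\top x'+c)^d,\qquad k_{\mathrm{Gauss}}(x,x')=\exp\!\Big(-\frac{\|x-x'\|^2}{2\sigma^2}\Big),\qquad k_{\mathrm{sinc}}(x,x')=\frac{\sin\big(a(x-x')\big)}{a(x-x')}.$$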



Journal

Journal title: Pattern Recognition

Year: 2021

ISSN: 1873-5142, 0031-3203

DOI: https://doi.org/10.1016/j.patcog.2020.107749