Asymptotic error bounds for kernel-based Nyström low-rank approximation matrices

Authors

  • Lo-Bin Chang
  • Zhidong Bai
  • Su-Yun Huang
  • Chii-Ruey Hwang
Highlights

• Many kernel-based learning algorithms carry a computational load that scales with the sample size.
• The Nyström low-rank approximation is designed to reduce this computation.
• We propose the spectrum decomposition condition with a theoretical justification.
• Asymptotic error bounds on eigenvalues and eigenvectors are derived.
• Numerical experiments are provided for a covariance kernel and a Wishart matrix.

AMS subject classifications: 60F99, 62H30, 68T10, 68W25

Abstract

Many kernel-based learning algorithms have a computational load that scales with the sample size n, due to the column size of the full kernel Gram matrix K. This article considers the Nyström low-rank approximation, which uses a reduced kernel K̃ of size n × m, consisting of m columns (say columns i₁, i₂, …, iₘ) randomly drawn from K. The approximation takes the form K ≈ K̃U⁻¹K̃ᵀ, where U is the reduced m × m matrix formed by rows i₁, i₂, …, iₘ of K̃. Often m is much smaller than the sample size n, resulting in a thin rectangular reduced kernel and in learning algorithms that scale with the column size m. The quality of a matrix approximation can be assessed by the closeness of its eigenvalues and eigenvectors to those of the original matrix. In this article, asymptotic error bounds on eigenvalues and eigenvectors are derived for the Nyström low-rank approximation matrix.
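To make the construction concrete, here is a minimal NumPy sketch of the approximation described above. The uniform column sampling, function name, and toy Gaussian kernel are our illustrative choices, not code from the paper; a pseudo-inverse stands in for U⁻¹ in case U is numerically singular.

```python
import numpy as np

def nystrom_approximation(K, m, rng=None):
    """Rank-m Nystrom approximation of an n x n PSD kernel matrix K.

    Samples m columns uniformly without replacement and returns
    K_tilde @ pinv(U) @ K_tilde.T, as in the abstract.
    """
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # columns i_1, ..., i_m
    K_tilde = K[:, idx]                         # n x m reduced kernel
    U = K_tilde[idx, :]                         # m x m block at the sampled rows
    # pinv instead of a plain inverse guards against a singular U
    return K_tilde @ np.linalg.pinv(U) @ K_tilde.T

# Toy check with a Gaussian kernel on random points
X = np.random.default_rng(0).standard_normal((300, 5))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)
K_hat = nystrom_approximation(K, m=40, rng=1)
print(np.linalg.norm(K - K_hat) / np.linalg.norm(K))
```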


Similar Resources


Ensemble Nyström Method

A crucial technique for scaling kernel methods to very large data sets reaching or exceeding millions of instances is based on low-rank approximation of kernel matrices. We introduce a new family of algorithms based on mixtures of Nyström approximations, ensemble Nyström algorithms, that yield more accurate low-rank approximations than the standard Nyström method. We give a detailed study of va...
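A minimal sketch of the mixture idea, assuming uniform mixture weights (the paper also studies learned weighting schemes): average several independent Nyström approximations, each built from its own random column sample.

```python
import numpy as np

def ensemble_nystrom(K, m, p, rng=None):
    """Uniform-weight mixture of p independent rank-m Nystrom
    approximations, each from a fresh random column sample."""
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    K_hat = np.zeros_like(K, dtype=float)
    for _ in range(p):
        idx = rng.choice(n, size=m, replace=False)
        C = K[:, idx]                 # n x m reduced kernel
        W = C[idx, :]                 # m x m intersection block
        K_hat += C @ np.linalg.pinv(W) @ C.T
    return K_hat / p                  # uniform mixture weights
```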


Revisiting the Nyström method for improved large-scale machine learning

We reconsider randomized algorithms for the low-rank approximation of symmetric positive semi-definite (SPSD) matrices such as Laplacian and kernel matrices that arise in data analysis and machine learning applications. Our main results consist of an empirical evaluation of the performance quality and running time of sampling and projection methods on a diverse suite of SPSD matrices. Our resul...
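For contrast with the sampling-based sketches above, the following shows one simple representative of the projection family such studies compare (a randomized range finder followed by an SPSD-preserving compression). It is a generic stand-in under our own assumptions, not the paper's exact variants.

```python
import numpy as np

def projected_low_rank(K, m, rng=None):
    """Projection-style rank-m approximation of an SPSD matrix:
    sketch the range with a Gaussian test matrix, orthonormalize,
    then compress K onto that subspace as Q (Q^T K Q) Q^T."""
    rng = np.random.default_rng(rng)
    G = rng.standard_normal((K.shape[0], m))  # Gaussian test matrix
    Q, _ = np.linalg.qr(K @ G)                # basis for the sampled range
    return Q @ (Q.T @ K @ Q) @ Q.T            # stays symmetric PSD
```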


On the numerical rank of radial basis function kernels in high dimension

Low-rank approximations are popular methods to reduce the high computational cost of algorithms involving large-scale kernel matrices. The success of low-rank methods hinges on the matrix rank, and in practice, these methods are effective even for high-dimensional datasets. The practical success has elicited the theoretical analysis of the function rank in this paper, which is an upper bound of...
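The notion of numerical rank invoked here can be made concrete with a small sketch. The relative eigenvalue threshold below is a common working convention, not necessarily the paper's exact definition.

```python
import numpy as np

def numerical_rank(K, tol=1e-8):
    """Count eigenvalues of a symmetric matrix above a relative
    threshold; a common working definition of numerical rank."""
    w = np.linalg.eigvalsh(K)            # real eigenvalues, ascending
    return int(np.sum(w > tol * w[-1]))

# Gaussian (RBF) kernel rank as the data dimension grows
rng = np.random.default_rng(0)
for d in (2, 10, 50):
    X = rng.standard_normal((300, d))
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    print(d, numerical_rank(np.exp(-sq)))
```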


Memory Efficient Kernel Approximation

Scaling kernel machines to massive data sets is a major challenge due to storage and computation issues in handling large kernel matrices, which are usually dense. Recently, many papers have suggested tackling this problem by using a low-rank approximation of the kernel matrix. In this paper, we first make the observation that the structure of shift-invariant kernels changes from low-rank to blo...
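The block structure alluded to above can be sketched as follows: cluster the points, then approximate each diagonal block of K with its own low-rank (here Nyström-style) factorization. This is a deliberate simplification, assuming cluster labels are given and leaving off-diagonal blocks at zero, which the paper's full method also handles.

```python
import numpy as np

def block_lowrank(K, labels, m, rng=None):
    """Block-wise low-rank approximation: a rank-m Nystrom-style
    factorization of each diagonal block of K, given cluster labels.
    Off-diagonal blocks are left at zero in this simplified sketch."""
    rng = np.random.default_rng(rng)
    K_hat = np.zeros_like(K, dtype=float)
    for c in np.unique(labels):
        I = np.flatnonzero(labels == c)      # indices of cluster c
        B = K[np.ix_(I, I)]                  # its diagonal block
        idx = rng.choice(len(I), size=min(m, len(I)), replace=False)
        C = B[:, idx]
        W = C[idx, :]
        K_hat[np.ix_(I, I)] = C @ np.linalg.pinv(W) @ C.T
    return K_hat
```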




Journal:

Volume   Issue

Pages  -

Publication year: 2013