A Block-Sparse Tensor Train Format for Sample-Efficient High-Dimensional Polynomial Regression

Authors

Abstract

Low-rank tensors are an established framework for the parametrization of multivariate polynomials. We propose to extend this framework by including the concept of block-sparsity to efficiently parametrize homogeneous, multivariate polynomials with low-rank tensors. This provides a representation of general multivariate polynomials as a sum of homogeneous polynomials, represented by block-sparse, low-rank tensors. We show that this sum can be concisely represented by a single block-sparse, low-rank tensor. We further prove cases where this format is particularly well suited by showing that, for banded symmetric tensors of homogeneous polynomials, the block sizes in the block-sparse representation are bounded independently of the number of variables. We showcase this format by applying it to high-dimensional least squares regression problems, where it demonstrates improved computational resource utilization and sample efficiency.
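To make the parametrization concrete, here is a minimal, hypothetical sketch of evaluating a multivariate polynomial stored as a tensor train: each TT core is contracted with the monomial feature vector of one variable. All names, shapes, and ranks below are illustrative assumptions; the block-sparsity structure the paper introduces is not modeled here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not the paper's construction): a polynomial in M
# variables whose coefficient tensor C[i1,...,iM] (each ik < d) is stored
# as a tensor train with cores of shape (r_prev, d, r_next).
M, d, r = 6, 3, 2  # number of variables, local degree, TT rank
shapes = [(1, d, r)] + [(r, d, r)] * (M - 2) + [(r, d, 1)]
cores = [0.1 * rng.standard_normal(s) for s in shapes]

def tt_poly_eval(cores, x):
    """Evaluate the TT-parametrized polynomial at a point x in R^M."""
    v = np.ones((1,))
    for G, xk in zip(cores, x):
        phi = xk ** np.arange(G.shape[1])      # monomials [1, xk, xk^2, ...]
        v = v @ np.einsum("idj,d->ij", G, phi)  # contract core with features
    return v.item()

x = rng.uniform(-1.0, 1.0, size=M)
print(tt_poly_eval(cores, x))
```

The cost of one evaluation scales linearly in the number of variables M, which is what makes the low-rank format attractive for high-dimensional regression.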


Similar Articles

Convex Regularization for High-Dimensional Tensor Regression

In this paper we present a general convex optimization approach for solving high-dimensional tensor regression problems under low-dimensional structural assumptions. We consider using convex and weakly decomposable regularizers, assuming that the underlying tensor lies in an unknown low-dimensional subspace. Within our framework, we derive general risk bounds of the resulting estimate under fairl...


Efficient tensor completion: Low-rank tensor train

This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
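For context, the TT format referenced here factors a full tensor into a chain of three-way cores. The completion algorithms of the cited paper are not reproduced; the following is only a sketch of the standard TT-SVD (successive reshapes and truncated SVDs), with illustrative names and sizes of my own choosing.

```python
import numpy as np

def tt_svd(T, max_rank):
    """Sketch of the standard TT-SVD: factor a full tensor into TT cores
    by successive unfoldings and rank-truncated SVDs."""
    cores, r = [], 1
    W = T
    for n in T.shape[:-1]:
        W = W.reshape(r * n, -1)                      # unfold current mode
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        rk = min(max_rank, len(s))                    # truncate the rank
        cores.append(U[:, :rk].reshape(r, n, rk))
        W = s[:rk, None] * Vt[:rk]                    # carry the remainder
        r = rk
    cores.append(W.reshape(r, T.shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full tensor."""
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=([-1], [0]))
    return T[0, ..., 0]  # drop the boundary rank-1 axes

rng = np.random.default_rng(1)
T = rng.standard_normal((4, 4, 4, 4))
cores = tt_svd(T, max_rank=16)       # ranks large enough for exact recovery
approx = tt_to_full(cores)
print(float(np.max(np.abs(T - approx))))
```

With `max_rank` chosen large enough the decomposition is exact up to floating-point error; lowering it yields the low-rank approximations that completion methods exploit.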


High-order Tensor Completion for Data Recovery via Sparse Tensor-train Optimization

In this paper, we aim at the problem of tensor data completion. Tensor-train decomposition is adopted because of its powerful representation ability and linear scalability to tensor order. We propose an algorithm named Sparse Tensor-train Optimization (STTO) which considers incomplete data as a sparse tensor and uses a first-order optimization method to find the factors of the tensor-train decomposition...


High dimensional polynomial interpolation on sparse grids

We study polynomial interpolation on a d-dimensional cube, where d is large. We suggest using the least solution at sparse grids with the extrema of the Chebyshev polynomials. The polynomial exactness of this method is almost optimal. Our error bounds show that the method is universal, i.e., almost optimal for many different function spaces. We report on numerical experiments for d = 10 using ...


Parallelized Tensor Train Learning of Polynomial Classifiers

In pattern classification, polynomial classifiers are well-studied methods, as they are capable of generating complex decision surfaces. Unfortunately, the use of multivariate polynomials is limited to kernels as in support vector machines, because polynomials quickly become impractical for high-dimensional problems. In this paper, we effectively overcome the curse of dimensionality by employing...



Journal

Journal title: Frontiers in Applied Mathematics and Statistics

Year: 2021

ISSN: 2297-4687

DOI: https://doi.org/10.3389/fams.2021.702486