Homotopy Analysis for Tensor PCA
Authors
Abstract
Developing efficient and guaranteed nonconvex algorithms has been an important challenge in modern machine learning. Algorithms with good empirical performance, such as stochastic gradient descent, often lack theoretical guarantees. In this paper, we analyze the class of homotopy or continuation methods for global optimization of nonconvex functions. These methods start from an objective function that is efficient to optimize (e.g., convex) and progressively modify it to obtain the required objective, passing solutions along the homotopy path. For the challenging problem of tensor PCA, we prove global convergence of the homotopy method in the “high noise” regime. The signal-to-noise requirement for our algorithm is tight in the sense that it matches the recovery guarantee for the best degree-4 sum-of-squares algorithm. In addition, we prove a phase transition along the homotopy path for tensor PCA. This allows us to simplify the homotopy method to a local search algorithm, viz. tensor power iterations, with a specific initialization and a noise injection procedure, while retaining the theoretical guarantees.
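The reduction described in the abstract lends itself to a short sketch. Below is a minimal numpy illustration of tensor power iteration with noise injection on the order-3 spiked-tensor model T = λ·v⊗v⊗v + noise; the all-ones contraction used as the "specific initialization", the noise scale tau, and the function name tensor_pca_power_iteration are illustrative assumptions, not the paper's exact prescription.

```python
import numpy as np

def tensor_pca_power_iteration(T, n_iters=50, tau=0.0, rng=None):
    """Rank-1 tensor PCA by power iteration on an order-3 tensor.

    Sketch of the local-search reduction from the abstract: power
    iterations from a specific initialization (here, contracting T
    with the all-ones vector -- an assumption for illustration),
    with optional Gaussian noise injection of scale `tau` per step.
    """
    rng = np.random.default_rng(rng)
    n = T.shape[0]
    ones = np.ones(n)
    # Assumed initialization: v0 proportional to T(I, 1, 1).
    v = np.einsum('ijk,j,k->i', T, ones, ones)
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        # Tensor power step: v <- T(I, v, v), then renormalize.
        v = np.einsum('ijk,j,k->i', T, v, v)
        if tau > 0:
            v += tau * rng.standard_normal(n)  # noise injection
        v /= np.linalg.norm(v)
    return v

# Toy usage on a planted spike; the noise is scaled so the toy sits
# in an easy regime rather than at the paper's SNR threshold.
rng = np.random.default_rng(0)
n, snr = 50, 10.0
v_star = rng.standard_normal(n)
v_star /= np.linalg.norm(v_star)
T = snr * np.einsum('i,j,k->ijk', v_star, v_star, v_star)
T += rng.standard_normal((n, n, n)) / n
v_hat = tensor_pca_power_iteration(T, tau=0.01, rng=1)
print('correlation with planted spike:', abs(v_hat @ v_star))
```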
Similar Resources
On the Exponent of Triple Tensor Product of p-Groups
The non-abelian tensor product of groups, which has its origins in algebraic K-theory as well as in homotopy theory, was introduced by Brown and Loday in 1987. Group-theoretical aspects of non-abelian tensor products have been studied extensively. In particular, some studies focused on the relationship between the exponent of a group and the exponent of its tensor square. On the other hand, com...
Tensor principal component analysis via convex optimization
This paper is concerned with the computation of the principal components for a general tensor, known as the tensor principal component analysis (PCA) problem. We show that the general tensor PCA problem is reducible to its special case where the tensor in question is supersymmetric with an even degree. In that case, the tensor can be embedded into a symmetric matrix. We prove that if the tensor...
Principal Component Analysis with Tensor Train Subspace
Tensor train is a hierarchical tensor network structure that helps alleviate the curse of dimensionality by parameterizing large-scale multidimensional data via a network of low-rank tensors. Associated with such a construction is a notion of a Tensor Train subspace, and in this paper we propose a TTPCA algorithm for estimating this structured subspace from the given data. By maintaining lo...
Computing Tensor Eigenvalues via Homotopy Methods
We introduce the concept of mode-k generalized eigenvalues and eigenvectors of a tensor and prove some properties of such eigenpairs. In particular, we derive an upper bound for the number of equivalence classes of generalized tensor eigenpairs using mixed volume. Based on this bound and the structures of tensor eigenvalue problems, we propose two homotopy continuation-type algorithms to solve ...
Semi-Orthogonal Multilinear PCA with Relaxed Start
Principal component analysis (PCA) is an unsupervised method for learning low-dimensional features with orthogonal projections. Multilinear PCA methods extend PCA to deal with multidimensional data (tensors) directly via tensor-to-tensor projection or tensor-to-vector projection (TVP). However, under the TVP setting, it is difficult to develop an effective multilinear PCA method with the orthog...