Tensor Network Ranks
Abstract
In problems involving approximation, completion, denoising, dimension reduction, estimation, interpolation, modeling, order reduction, regression, etc., we argue that the near-universal practice of assuming that a function, matrix, or tensor (which we will see are all the same object in this context) has low rank may be ill-justified. There are many natural instances where the object in question has high rank with respect to the classical notions of rank: matrix rank, tensor rank, and multilinear rank, the latter two being the most straightforward generalizations of the former. To remedy this, we show that one may vastly expand these classical notions of rank: given any undirected graph G, there is a notion of G-rank associated with G, which provides us with as many different kinds of rank as there are undirected graphs. In particular, the popular tensor network states in physics (e.g., MPS, TTNS, PEPS, MERA) may be regarded as functions of a specific G-rank for various choices of G. Among other things, we will see that a function, matrix, or tensor may have very high matrix, tensor, or multilinear rank and yet very low G-rank for some G. In fact the difference can be several orders of magnitude: the gaps between G-ranks and these classical ranks are arbitrarily large for some important objects in computer science, mathematics, and physics. Furthermore, we show that there is a G such that almost every tensor has G-rank exponentially lower than its rank or the dimension of its ambient space.
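To make the contrast concrete, here is a minimal numerical sketch (not from the paper; it assumes only numpy) using the order-n W-state tensor, which is known to have tensor rank n yet admits a tensor network representation over a path graph G (a matrix product state, i.e., a tensor train) with all bond dimensions equal to 2, so its G-rank is far below its tensor rank:

```python
# Minimal illustration: high tensor rank, low G-rank for G a path graph.
import numpy as np

n, d = 6, 2  # 6 modes, each of dimension 2
W = np.zeros((d,) * n)
for i in range(n):
    idx = [0] * n
    idx[i] = 1          # exactly one mode set to the second basis vector
    W[tuple(idx)] = 1.0

# TT-ranks = ranks of the sequential unfoldings (the path-graph G-ranks).
def tt_ranks(T):
    dims = T.shape
    return [np.linalg.matrix_rank(T.reshape(int(np.prod(dims[:k + 1])), -1))
            for k in range(len(dims) - 1)]

print(tt_ranks(W))  # [2, 2, 2, 2, 2], despite tensor rank n = 6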
Similar Papers
Learning Efficient Tensor Representations with Ring Structure Networks
Tensor train (TT) decomposition is a powerful representation for high-order tensors, which has been successfully applied to various machine learning tasks in recent years. However, since the tensor product is not commutative, permuting the data dimensions changes the solutions and TT-ranks of the TT decomposition, making them inconsistent across orderings. To alleviate this problem, we propose a permutation-symmetric network structu...
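The inconsistency under permutation is easy to reproduce. The following hedged sketch (assuming numpy, with exact TT-ranks computed as ranks of the sequential unfoldings) builds a tensor whose TT-ranks change when two of its modes are swapped:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))

# T[i, j, k, l] = A[i, k] * B[j, l]: the separable pairing is (i,k) vs (j,l).
T = np.einsum('ik,jl->ijkl', A, B)

def tt_ranks(T):
    dims = T.shape
    return [np.linalg.matrix_rank(T.reshape(int(np.prod(dims[:k + 1])), -1))
            for k in range(len(dims) - 1)]

print(tt_ranks(T))                        # [2, 4, 2] in the order (i, j, k, l)
print(tt_ranks(T.transpose(0, 2, 1, 3)))  # [2, 1, 2] in the order (i, k, j, l)
```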
Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion
In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One difficulty with the parameters of the network is that its neuron mathematical model cannot be represented directly. For this reason, a new representation of this network is suggested that resolves this difficulty. In the representation, th...
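For context, the scoring function of the underlying neural tensor network of Socher et al. (2013) can be sketched in a few lines of numpy; the shapes below are illustrative and are not taken from this paper:

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Score a candidate triple (e1, relation, e2).

    e1, e2 : entity embeddings, shape (d,)
    W      : relation-specific bilinear slices, shape (k, d, d)
    V      : standard-layer weights, shape (k, 2d)
    b, u   : bias and output weights, shape (k,)
    """
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)    # e1^T W^[1:k] e2
    linear = V @ np.concatenate([e1, e2]) + b
    return u @ np.tanh(bilinear + linear)

d, k = 4, 3
rng = np.random.default_rng(1)
print(ntn_score(rng.standard_normal(d), rng.standard_normal(d),
                rng.standard_normal((k, d, d)), rng.standard_normal((k, 2 * d)),
                rng.standard_normal(k), rng.standard_normal(k)))
```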
New Ranks for Even-Order Tensors and Their Applications in Low-Rank Tensor Optimization
In this paper, we propose three new tensor decompositions for even-order tensors, corresponding respectively to rank-one decompositions of certain unfolded matrices. These new decompositions lead to three new notions of (even-order) tensor rank, called the M-rank, the symmetric M-rank, and the strongly symmetric M-rank in this paper. We discuss the bounds between these new te...
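The basic ingredient, unfolding an even-order tensor into a matrix whose rank-one decompositions then induce a rank notion for the tensor, can be sketched as follows (assuming numpy; the precise M-rank definitions are in the paper, and the pairing of modes shown here is only one illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 4, 3, 4))   # an order-4 (even-order) tensor

# Pair the first two modes against the last two: a 12 x 12 square unfolding.
M = T.reshape(3 * 4, 3 * 4)
print(np.linalg.matrix_rank(M))         # 12 for a generic tensor
```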
On the minimal ranks of matrix pencils and the existence of a best approximate block-term tensor decomposition
Under the action of the general linear group, the ranks of the matrices A and B forming an m × n pencil A + λB can change, but only in a restricted manner. Specifically, to every pencil one can associate a pair of minimal ranks, which is unique up to a permutation. This notion can be defined for matrix pencils and, more generally, for matrix polynomials of arbitrary degree. The natural hierarchy it i...
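As a small illustration of how ranks vary inside a pencil (a hedged numpy sketch, not taken from the paper): for a square pencil, rank(A + λB) is maximal for generic λ and drops exactly where det(A + λB) vanishes:

```python
import numpy as np

A = np.diag([1.0, 2.0, 0.0])   # rank 2
B = np.diag([0.0, 1.0, 1.0])   # rank 2

for lam in [0.7, -2.0, 3.1]:
    print(lam, np.linalg.matrix_rank(A + lam * B))
# generic lam: rank 3; lam = -2 zeroes the middle entry, so the rank drops to 2
```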
Tensor-Train Ranks for Matrices and Their Inverses
We show that the recent tensor-train (TT) decompositions of matrices arise from their recursive Kronecker-product representations with a systematic use of common bases. The names TTM and QTT used in this case stress the relation with multilevel matrices or with quantization, which artificially increases the number of levels. We then investigate how the tensor-train ranks of a matrix can be related to ...
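The quantization idea can be sketched as follows (a hedged numpy illustration, not the authors' construction): view a 2^d × 2^d matrix as a d-level tensor by splitting its row and column indices into bits and merging each (row-bit, column-bit) pair into one size-4 mode, then read off the TT-ranks. For the identity matrix every TT-rank is 1, far below its matrix rank 2^d:

```python
import numpy as np

d = 4
M = np.eye(2 ** d)                      # matrix rank 2^d = 16

# Split row/column indices into bits, interleave as (i1,j1,...,id,jd), and
# merge each (ik, jk) pair into one size-4 mode: a TTM/QTT-style multilevel view.
perm = [k for pair in zip(range(d), range(d, 2 * d)) for k in pair]
T = M.reshape((2,) * (2 * d)).transpose(perm).reshape((4,) * d)

def tt_ranks(T):
    dims = T.shape
    return [np.linalg.matrix_rank(T.reshape(int(np.prod(dims[:k + 1])), -1))
            for k in range(len(dims) - 1)]

print(tt_ranks(T))  # [1, 1, 1]: the identity is a rank-1 tensor train here
```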