The Geometry of Rank-one Tensor Completion
Abstract
The geometry of the set of restrictions of rank-one tensors to some of their coordinates is studied. This gives insight into the problem of rank-one completion of partial tensors. Particular emphasis is put on the semialgebraic nature of the problem, which arises for real tensors with constraints on the parameters. The algebraic boundary of the completable region is described for tensors parametrized by probability distributions and where the number of observed entries equals the number of parameters. If the observations are on the diagonal of a tensor of format d × ⋯ × d, the complete semialgebraic description of the completable region is found.
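As a purely illustrative complement to the abstract, the sketch below (assuming Python with numpy; the function name rank_one_consistent and the tolerance are ad hoc) checks whether strictly positive observed entries are consistent with some positive rank-one completion: taking logarithms turns the rank-one condition T[i_1, ..., i_d] = v_1[i_1] ··· v_d[i_d] into a linear system in the log-parameters. It works over the positive reals only and does not impose that the parameter vectors are probability distributions, which is precisely the constraint that produces the semialgebraic boundary studied in the paper.

```python
import numpy as np

def rank_one_consistent(shape, observations, tol=1e-9):
    """Check whether strictly positive observed entries of a partial tensor
    are consistent with some positive rank-one completion.

    A positive rank-one tensor has entries T[i_1,...,i_d] = prod_k v_k[i_k],
    so log T is linear in the log-parameters; consistency of the observed
    entries is therefore a linear-algebra question in log-space.

    observations: dict mapping index tuples to positive values.
    """
    d = len(shape)
    offsets = np.concatenate(([0], np.cumsum(shape)))  # parameter layout per mode
    n_params = offsets[-1]

    A = np.zeros((len(observations), n_params))
    b = np.zeros(len(observations))
    for row, (idx, val) in enumerate(observations.items()):
        for mode, i in enumerate(idx):
            A[row, offsets[mode] + i] = 1.0
        b[row] = np.log(val)

    # Least-squares fit of the log-parameters; a (near-)zero residual means
    # the observations admit a positive rank-one completion.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    residual = np.linalg.norm(A @ x - b)
    return residual < tol


# Example: diagonal observations of a 3 x 3 x 3 tensor.
obs = {(0, 0, 0): 0.2, (1, 1, 1): 0.05, (2, 2, 2): 0.1}
print(rank_one_consistent((3, 3, 3), obs))  # True over the positive reals
```

Note that for diagonal observations each equation involves a disjoint set of parameters, so the unconstrained log-linear system is always solvable; the interesting structure described in the abstract comes entirely from requiring the parameters to be probability distributions.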
Similar Articles
Reweighted Low-Rank Tensor Completion and its Applications in Video Recovery
This paper focuses on recovering multi-dimensional data, represented as tensors, from randomly corrupted and incomplete observations. Inspired by reweighted l1-norm minimization for sparsity enhancement, it proposes a reweighted singular value enhancement scheme to promote low tubal rank in the tensor completion process. An efficient iterative decomposition scheme based on t-SVD is proposed whic...
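To make the tubal-rank mechanism concrete, here is a minimal sketch (assuming numpy; the helper name tsvd_shrink is ad hoc) of the basic t-SVD shrinkage operator such methods build on: an FFT along the third mode, per-frontal-slice singular value soft-thresholding, and an inverse FFT. The reweighting of the singular values and the surrounding iterative completion loop described in the paper are not reproduced here.

```python
import numpy as np

def tsvd_shrink(X, tau):
    """Soft-threshold the tubal singular values of a 3-way array X.

    Basic t-SVD shrinkage: FFT along the third mode, per-slice SVD
    shrinkage, inverse FFT.  A reweighted scheme would replace the
    constant tau by slice- and value-dependent weights.
    """
    Xf = np.fft.fft(X, axis=2)
    out = np.empty_like(Xf)
    for k in range(X.shape[2]):
        U, s, Vh = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        s = np.maximum(s - tau, 0.0)          # shrink singular values
        out[:, :, k] = (U * s) @ Vh           # U @ diag(s) @ Vh
    return np.real(np.fft.ifft(out, axis=2))
```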
Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion
In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One difficulty with the parameters of the network is that a representation of its neuron mathematical model is not possible. For this reason, a new representation of the network is suggested that resolves this difficulty. In the representation, th...
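For reference, the scoring layer of the standard neural tensor network (as introduced by Socher et al., which this line of work builds on) combines a relation-specific bilinear tensor term with a linear term; a minimal numpy sketch is below. The function name ntn_score and the random example parameters are illustrative, and the paper's alternative "neuron mathematical model" representation is not reproduced here.

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Score a triple (e1, relation, e2) with the standard NTN layer:
    u^T tanh(e1^T W^{[1:k]} e2 + V [e1; e2] + b).

    e1, e2 : entity embeddings, shape (d,)
    W      : relation-specific tensor, shape (d, d, k)
    V      : relation-specific matrix, shape (k, 2d)
    b, u   : bias and output weights, shape (k,)
    """
    bilinear = np.einsum('i,ijk,j->k', e1, W, e2)   # one slice of W per hidden unit
    linear = V @ np.concatenate([e1, e2])
    return float(u @ np.tanh(bilinear + linear + b))


# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
d, k = 4, 3
print(ntn_score(rng.normal(size=d), rng.normal(size=d),
                rng.normal(size=(d, d, k)), rng.normal(size=(k, 2 * d)),
                rng.normal(size=k), rng.normal(size=k)))
```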
Beyond Low Rank: A Data-Adaptive Tensor Completion Method
Low-rank tensor representation underpins much of the recent progress in tensor completion. In real applications, however, this approach is confronted with two challenging problems, namely (1) tensor rank determination and (2) handling real tensor data that only approximately fulfils the low-rank requirement. To address these two issues, we develop a data-adaptive tensor completion model which explici...
Efficient tensor completion: Low-rank tensor train
This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of the tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
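For context, the tensor train format referred to above factors a d-way tensor into a chain of 3-way cores. The following sketch (assuming numpy; the function name tt_svd and the truncation rule are illustrative, and this is the standard TT-SVD construction rather than the paper's completion algorithms) shows how such cores are obtained by sequential truncated SVDs of matricizations.

```python
import numpy as np

def tt_svd(T, max_rank):
    """Decompose a full tensor T into tensor-train (TT) cores via
    sequential truncated SVDs (the standard TT-SVD construction).

    Returns a list of cores G_k of shape (r_{k-1}, n_k, r_k).
    """
    dims = T.shape
    cores = []
    r_prev = 1
    M = T.reshape(r_prev * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vh = np.linalg.svd(M, full_matrices=False)
        r = int(min(max_rank, max(1, np.sum(s > 1e-12))))   # truncation rank
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        M = (s[:r, None] * Vh[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))
    return cores
```

Contracting the cores in order recovers an approximation to T whose accuracy is controlled by the truncation rank.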
Efficient Sparse Low-Rank Tensor Completion Using the Frank-Wolfe Algorithm
Most tensor problems are NP-hard, and low-rank tensor completion is much more difficult than low-rank matrix completion. In this paper, we propose a time- and space-efficient low-rank tensor completion algorithm by using the scaled latent nuclear norm for regularization and the Frank-Wolfe (FW) algorithm for optimization. We show that all the steps can be performed efficiently. In particular, FW's...
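To illustrate why Frank-Wolfe is attractive in this setting, the sketch below (assuming numpy; shown for the simpler matrix case with a plain nuclear-norm ball rather than the scaled latent nuclear norm used in the paper, and the function name frank_wolfe_completion is ad hoc) performs FW for completion: the linear minimization oracle only needs the leading singular vector pair of the gradient, so each step stays cheap.

```python
import numpy as np

def frank_wolfe_completion(M_obs, mask, radius, n_iters=200):
    """Frank-Wolfe (FW) for matrix completion over a nuclear-norm ball.

    Minimizes 0.5 * ||mask * (X - M_obs)||_F^2 subject to ||X||_* <= radius.
    The linear minimization oracle only needs the leading singular vector
    pair of the gradient (computed here with a full SVD for simplicity).
    """
    X = np.zeros_like(M_obs)
    for t in range(n_iters):
        grad = mask * (X - M_obs)                    # gradient of the loss
        U, s, Vh = np.linalg.svd(grad, full_matrices=False)
        S = -radius * np.outer(U[:, 0], Vh[0])       # LMO solution: -radius * u1 v1^T
        gamma = 2.0 / (t + 2.0)                      # standard FW step size
        X = (1.0 - gamma) * X + gamma * S
    return X
```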