Search results for: tensor decomposition
Number of results: 139824
We derive a CUR-type factorization for tensors in the Tucker format based on the interpolatory decomposition, which we denote the Higher Order Interpolatory Decomposition (HOID). Given a tensor X, the algorithm provides a set of matrices {Cn}, one per mode n, whose columns are extracted from the mode-n tensor unfolding, along with a core tensor G; together, they satisfy error bounds. Compared ...
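A minimal numpy sketch of such an interpolatory construction (not the authors' implementation; the greedy column pivoting and the pseudoinverse-based core are illustrative choices):

```python
import numpy as np

def unfold(X, n):
    # mode-n unfolding: mode n becomes the rows
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def mode_dot(X, M, n):
    # multiply tensor X along mode n by the matrix M
    return np.moveaxis(np.tensordot(M, np.moveaxis(X, n, 0), axes=1), 0, n)

def pivot_columns(A, r):
    """Greedy column pivoting (as in pivoted QR): repeatedly take the
    column with the largest residual norm, then deflate."""
    R, idx = A.copy(), []
    for _ in range(r):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        idx.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(q, q @ R)
    return idx

def hoid(X, ranks):
    """Sketch of an HOID-style decomposition: Cn holds actual columns of
    the mode-n unfolding; the core is G = X x_1 C1^+ x_2 C2^+ ..."""
    Cs = [unfold(X, n)[:, pivot_columns(unfold(X, n), r)]
          for n, r in enumerate(ranks)]
    G = X
    for n, C in enumerate(Cs):
        G = mode_dot(G, np.linalg.pinv(C), n)
    return G, Cs

# exact multilinear-rank-(2,2,2) tensor: the sketch recovers it exactly
rng = np.random.default_rng(2)
X = rng.standard_normal((2, 2, 2))
for n in range(3):
    X = mode_dot(X, rng.standard_normal((4 + n, 2)), n)
G, Cs = hoid(X, (2, 2, 2))
Y = G
for n, C in enumerate(Cs):
    Y = mode_dot(Y, C, n)
assert np.allclose(X, Y)
```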
Tensor decompositions are invaluable tools for analyzing multimodal datasets. In many real-world scenarios, such datasets are far from static; on the contrary, they tend to grow over time. For instance, in an online social network, as we observe new interactions over time, our dataset is updated in its “time” mode. How can we maintain a valid and accurate tensor decomposition of s...
Tensor decomposition is used for many web and user data analysis operations, ranging from clustering, trend detection, and anomaly detection to correlation analysis. However, many tensor decomposition schemes are sensitive to noisy data, an inevitable problem in the real world that can lead to false conclusions. The problem is compounded by overfitting when the user data is sparse. Recent research h...
We develop a higher-order generalization of the LQ decomposition and show that this decomposition plays an important role in likelihood-based estimation and testing for separable, or Kronecker structured, covariance models, such as the multilinear normal model. This role is analogous to that of the LQ decomposition in likelihood inference for the multivariate normal model. Additionally, this hi...
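The higher-order generalization itself is not reproduced here, but the ordinary matrix LQ decomposition it builds on can be sketched via a QR factorization of the transpose:

```python
import numpy as np

def lq(A):
    """Matrix LQ decomposition via QR of the transpose:
    A = L Q with L lower triangular and Q having orthonormal rows."""
    Q, R = np.linalg.qr(A.T)
    return R.T, Q.T

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 5))
L, Q = lq(A)
assert np.allclose(A, L @ Q)          # exact factorization
assert np.allclose(L, np.tril(L))     # L is lower triangular
assert np.allclose(Q @ Q.T, np.eye(3))  # rows of Q are orthonormal
```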
Tensor decomposition is an important tool for big data analysis. In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method which enjoy these strong properties. We present the first guarantees for the online tensor power method, which has a...
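A basic batch tensor power iteration for a symmetric 3-way tensor can be sketched as follows; the robust, online, and differentially private variants in the abstract add machinery on top of this core loop:

```python
import numpy as np

def tensor_power_method(T, iters=100, seed=0):
    """Power iteration for a symmetric 3-way tensor: repeatedly map
    v <- T(I, v, v) and normalize; returns the eigenvalue/eigenvector."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = np.einsum('ijk,j,k->i', T, v, v)
        v /= np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)
    return lam, v

# rank-1 symmetric tensor 2 * u (x) u (x) u: iteration recovers lam and u
u = np.array([3.0, 4.0]) / 5.0
T = 2.0 * np.einsum('i,j,k->ijk', u, u, u)
lam, v = tensor_power_method(T)
assert np.isclose(lam, 2.0)
assert np.allclose(np.abs(v), np.abs(u))
```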
In many applications, such as data compression, imaging, or genomic data analysis, it is important to approximate a given tensor by one that is sparsely representable. For matrices, i.e. 2-tensors, such a representation can be obtained via the singular value decomposition, which allows one to compute the best rank-k approximations. For t-tensors with t > 2, many generalizations of the singular val...
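For the matrix case, the best rank-k approximation via the truncated SVD (the Eckart-Young theorem) is a few lines:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of a matrix in the Frobenius norm,
    via the truncated singular value decomposition."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
A2 = best_rank_k(A, 2)
assert np.linalg.matrix_rank(A2) == 2
```

The approximation error equals the norm of the discarded singular values, which is what makes the truncation optimal.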
This compendium on tensor approximation (TA) gives an overview of typical tensor approximation notation and definitions. TA is a tool for approximating higher-order data. Precisely speaking, TA is a higher-order extension of the matrix singular value decomposition and generalizes the factorization of multidimensional datasets into a set of bases and coefficients. TA consists ...
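One concrete instance of such a bases-plus-coefficients factorization is the higher-order SVD (HOSVD), sketched here in numpy:

```python
import numpy as np

def unfold(X, n):
    # mode-n unfolding: mode n becomes the rows
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def mode_dot(X, M, n):
    # multiply tensor X along mode n by the matrix M
    return np.moveaxis(np.tensordot(M, np.moveaxis(X, n, 0), axes=1), 0, n)

def hosvd(X):
    """Higher-order SVD: one orthonormal basis per mode (the factor
    matrices) plus a core tensor of coefficients."""
    U = [np.linalg.svd(unfold(X, n), full_matrices=False)[0]
         for n in range(X.ndim)]
    G = X
    for n, Un in enumerate(U):
        G = mode_dot(G, Un.T, n)
    return G, U

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 4, 5))
G, U = hosvd(X)
Y = G
for n, Un in enumerate(U):
    Y = mode_dot(Y, Un, n)   # exact reconstruction at full ranks
assert np.allclose(X, Y)
```

Truncating the columns of each basis (and the core accordingly) gives the usual quasi-optimal Tucker compression.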
Decompositions of tensors into factor matrices, which interact through a core tensor, have found numerous applications in signal processing and machine learning. A more general tensor model which represents data as an ordered network of sub-tensors of order-2 or order-3 has, so far, not been widely considered in these fields, although this so-called tensor network decomposition has been long st...
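A well-known example of such an ordered network of order-2/order-3 sub-tensors is the tensor-train format; a sketch of the sequential-SVD construction (function names are ours, not from any specific library):

```python
import numpy as np

def tt_svd(X, eps=1e-12):
    """Tensor-train decomposition: an ordered chain of order-3 cores,
    built by sequential SVDs of reshapes, dropping tiny singular values."""
    cores, r = [], 1
    C = X.reshape(r * X.shape[0], -1)
    for k in range(X.ndim - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        keep = max(1, int((s > eps * s[0]).sum()))
        cores.append(U[:, :keep].reshape(r, X.shape[k], keep))
        r = keep
        C = (s[:keep, None] * Vt[:keep]).reshape(r * X.shape[k + 1], -1)
    cores.append(C.reshape(r, X.shape[-1], 1))
    return cores

def tt_full(cores):
    # contract the chain back into a dense tensor
    Y = cores[0]
    for G in cores[1:]:
        Y = np.tensordot(Y, G, axes=([-1], [0]))
    return Y.reshape([c.shape[1] for c in cores])

rng = np.random.default_rng(4)
X = rng.standard_normal((2, 3, 4))
cores = tt_svd(X)
assert np.allclose(tt_full(cores), X)
```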
We present a probabilistic model for tensor decomposition in which one or more tensor modes may have side information about the mode entities in the form of their features and/or their adjacency network. We consider a Bayesian approach based on the Canonical PARAFAC (CP) decomposition and enrich this single-layer decomposition approach with a two-layer decomposition. The second layer fits a factor mode...
Higher-order tensor analysis is a multi-disciplinary tool widely used in numerous application areas involving data analysis, such as psychometrics, chemometrics, and signal processing, to mention just a few. The parallel factor (PARAFAC) decomposition, also known by the acronym CP (standing for “CANDECOMP/PARAFAC” or “canonical polyadic”), is the most popular tensor decomposition. Its widespr...
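A plain alternating-least-squares fit of the CP/PARAFAC model for a 3-way tensor can be sketched as follows (an illustrative baseline, not any specific paper's algorithm):

```python
import numpy as np

def unfold(X, n):
    # mode-n unfolding in C (row-major) order
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def khatri_rao(A, B):
    # column-wise Kronecker product, matching the C-order unfolding above
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(X, r, iters=500, seed=0):
    """CP/PARAFAC by alternating least squares:
    X ~= sum_k a_k o b_k o c_k (outer products of factor columns)."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, r)) for s in X.shape)
    for _ in range(iters):
        A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

def cp_full(A, B, C):
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# exact rank-2 target: ALS recovers it to high accuracy
rng = np.random.default_rng(5)
X = cp_full(*(rng.standard_normal((s, 2)) for s in (3, 4, 5)))
A, B, C = cp_als(X, 2)
assert np.linalg.norm(cp_full(A, B, C) - X) / np.linalg.norm(X) < 1e-6
```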