Search results for: low rank
Number of results: 1,260,992
We study the problem of learning mixtures of low-rank models, i.e. reconstructing multiple low-rank matrices from unlabelled linear measurements of each. This enriches two widely studied settings, matrix sensing and mixed regression, by bringing latent variables (i.e. unknown labels) and structural priors (i.e. low-rank structures) into consideration. To cope with the non-convexity issues arising from heterogeneous data and low-complexity s...
Historically, analysis for multiscale PDEs is largely unified, while numerical schemes tend to be equation-specific. In this paper, we propose a framework for computing multiscale problems through random sampling. This is achieved by incorporating randomized SVD solvers and manifold learning techniques to numerically reconstruct the low-rank features of the PDEs. We use the radiative transfer equation and elliptic equations with rough me...
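The randomized SVD machinery this abstract leans on can be illustrated with a minimal range-finder sketch (in the spirit of standard randomized SVD; the function name, Gaussian test matrix, and oversampling parameter below are illustrative assumptions, not the paper's solver):

```python
import numpy as np

def randomized_svd(A, k, oversample=5):
    # Sketch the range of A with a Gaussian test matrix, then compute an
    # exact SVD of the much smaller projected matrix B = Q^T A.
    rng = np.random.default_rng(0)
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)           # orthonormal basis for the sketch
    B = Q.T @ A                              # small matrix with A's dominant spectrum
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]    # rank-k factors of A

# Exactly rank-1 test matrix: the rank-1 factors reconstruct it exactly
A = np.outer(np.arange(1.0, 6.0), np.arange(1.0, 5.0))
U, s, Vt = randomized_svd(A, 1)
approx = (U * s) @ Vt
```

For a matrix of exact rank k, the sketch captures the whole range with probability one, so the truncated factors reproduce the matrix to machine precision.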
For a prime p and a matrix A ∈ Zn×n, write A as A = p(A quo p) + (A rem p), where the remainder and quotient operations are applied element-wise. Write the p-adic expansion of A as A = A[0] + pA[1] + p2A[2] + · · ·, where each A[i] ∈ Zn×n has entries in [0, p − 1]. Upper bounds are proven for the Z-ranks of A rem p and A quo p. Also, upper bounds are proven for the Z/pZ-rank of A[i] for all ...
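The element-wise quotient/remainder recurrence that generates the digit matrices A[i] can be sketched as follows (a hypothetical helper using numpy integer arithmetic, for matrices with nonnegative entries):

```python
import numpy as np

def padic_digits(A, p, terms):
    # A = A[0] + p*A[1] + p^2*A[2] + ..., each digit matrix has entries in [0, p-1]
    digits = []
    for _ in range(terms):
        digits.append(A % p)   # A rem p, applied element-wise
        A = A // p             # A quo p, applied element-wise
    return digits

A = np.array([[7, 12], [5, 9]])
digits = padic_digits(A, 3, 3)
recon = sum((3 ** i) * d for i, d in enumerate(digits))  # recovers A
```

Each step peels off one base-p digit of every entry simultaneously, which is exactly the element-wise reading of A = p(A quo p) + (A rem p).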
5.1. Proof of results in Section 3.1. Under degree-corrected block models, let us denote by Ā the conditional expectation of A given the degree parameters θ = (θ1, ..., θn)^T. Note that if θi ≡ 1 then Ā = EA. Since Ā depends on θ, its eigenvalues and eigenvectors may not have a closed form. Nevertheless, we can approximate them using ρi and ūi from Lemma 3. To do so, we need the following lemma.
Nuclear norm minimization (NNM) has recently gained significant attention for its use in rank minimization problems. Similar to compressed sensing, using null space characterizations, recovery thresholds for NNM have been studied in [12, 4]. However, simulations show that these thresholds are far from optimal, especially in the low-rank region. In this paper we apply the recent analysis of Stojnic...
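A standard building block inside NNM solvers is singular value thresholding, the proximal operator of the nuclear norm; a minimal sketch for a small dense matrix (this is an illustration of the operator, not the recovery analysis of the paper):

```python
import numpy as np

def svt(Y, tau):
    # Proximal operator of tau * nuclear norm: soft-threshold the singular values
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

Y = np.diag([5.0, 3.0, 0.5])
X = svt(Y, 1.0)  # singular values 5, 3, 0.5 shrink to 4, 2, 0, so the rank drops
```

Shrinking the small singular values to zero is what lets nuclear norm penalties act as a convex surrogate for rank.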
This paper extends the Weighted Low Rank Approximation (WLRA) approach to linearly structured matrices. In the case of Hankel matrices with a special block structure, an equivalent unconstrained optimization problem is derived and an algorithm for solving it is proposed.
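A classical baseline for Hankel structured low-rank approximation is the Cadzow alternating-projections heuristic, sketched below (the window length L, target rank r, and iteration count are illustrative choices, and this is not the WLRA algorithm of the paper):

```python
import numpy as np

def cadzow(y, L, r, iters=50):
    # Alternate two projections: form the L x K Hankel matrix of y, truncate it
    # to rank <= r via SVD, then average anti-diagonals to restore Hankel structure.
    N = len(y)
    K = N - L + 1
    for _ in range(iters):
        H = np.array([y[i:i + K] for i in range(L)])        # H[i, j] = y[i + j]
        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        H = (U[:, :r] * s[:r]) @ Vt[:r]                      # rank-r truncation
        # entries with i + j = k lie on one anti-diagonal; average them
        y = np.array([H[::-1].diagonal(k - L + 1).mean() for k in range(N)])
    return y

# A geometric sequence has Hankel rank 1, so it is a fixed point of the iteration
y = 2.0 ** np.arange(6)
z = cadzow(y, 3, 1)
```

Each iteration is cheap but the method is only a heuristic; structured approaches such as WLRA optimize over the structure directly instead of projecting back and forth.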
Recall that in Lecture 13, a randomized algorithm was described for computing a low-rank approximation to the eigendecomposition of a matrix A. A drawback of this method is that the matrix A must be accessed multiple times (twice), which may not be possible in streaming models where A cannot be stored in memory [1]. For the streaming model, we require a single-pass algorithm, where A is accesse...
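One way to meet the single-pass requirement is to maintain two random sketches while the rows of A stream by, in the spirit of sketch-based streaming schemes; the sketch sizes and the reconstruction formula below are illustrative assumptions, not the lecture's exact algorithm:

```python
import numpy as np

def one_pass_low_rank(rows, m, n, k, p=10):
    # Maintain a range sketch Y = A @ Omega and a co-range sketch W = Psi @ A.
    # Each row of A is seen exactly once; A itself is never stored.
    rng = np.random.default_rng(1)
    l, s = k + p, 2 * (k + p)
    Omega = rng.standard_normal((n, l))   # right test matrix
    Psi = rng.standard_normal((s, m))     # left test matrix
    Y = np.zeros((m, l))
    W = np.zeros((s, n))
    for i, a in enumerate(rows):
        Y[i] = a @ Omega
        W += np.outer(Psi[:, i], a)
    Q, _ = np.linalg.qr(Y)                            # basis for the range sketch
    X = np.linalg.lstsq(Psi @ Q, W, rcond=None)[0]    # solve (Psi Q) X = Psi A
    return Q, X                                       # A ~= Q @ X

m, n, k = 50, 30, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # exactly rank 3
Q, X = one_pass_low_rank(iter(A), m, n, k)
```

Because the co-range sketch W is updated additively, the rows can arrive in any order; for an exactly rank-k input the reconstruction Q @ X recovers A.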
Abstract. We propose a novel and constructive algorithm that decomposes an arbitrary tensor into a finite sum of orthonormal rank-1 outer factors. The algorithm, named TTr1SVD, works by converting the tensor into a rank-1 tensor train (TT) series via singular value decomposition (SVD). TTr1SVD naturally generalizes the SVD to the tensor regime and delivers elegant notions of tensor rank and err...
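The nested-SVD idea behind such a decomposition can be illustrated for a 3-way tensor (a simplified sketch of the rank-1 expansion via successive SVDs, not the authors' exact TTr1SVD implementation):

```python
import numpy as np

def rank1_terms(T, tol=1e-12):
    # Unfold the tensor along mode 1, SVD it, then SVD each right singular
    # vector reshaped over the remaining two modes: T = sum w * a (x) b (x) c.
    n1, n2, n3 = T.shape
    terms = []
    U, s, Vt = np.linalg.svd(T.reshape(n1, n2 * n3), full_matrices=False)
    for i in range(len(s)):
        if s[i] < tol:
            continue
        U2, s2, Vt2 = np.linalg.svd(Vt[i].reshape(n2, n3), full_matrices=False)
        for j in range(len(s2)):
            if s2[j] < tol:
                continue
            terms.append((s[i] * s2[j], U[:, i], U2[:, j], Vt2[j]))
    return terms

T = np.arange(24, dtype=float).reshape(2, 3, 4)
terms = rank1_terms(T)
recon = sum(w * np.einsum('i,j,k->ijk', a, b, c) for w, a, b, c in terms)
```

Summing the rank-1 outer products reproduces the tensor exactly, and the factors inherit orthonormality from the two SVDs, which is the property the abstract's error analysis builds on.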