Search results for: low rank

Number of results: 1260992

2016
Emil Kieri

In low-rank approximation, separation of variables is used to reduce the amount of data in computations with high-dimensional functions. Such techniques have proved their value, e.g., in quantum mechanics and recommendation algorithms. It is also possible to fold a low-dimensional grid into a high-dimensional object, and use low-rank techniques to compress the data. Here, we consider low-rank t...
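
A minimal sketch of the basic building block this abstract refers to: the best rank-r approximation of a matrix in the Frobenius norm, obtained by truncating its SVD (Eckart–Young). The matrix, rank, and noise level below are illustrative choices, not taken from the paper.

```python
import numpy as np

def truncated_svd(A, r):
    """Return the best rank-r approximation of A together with its factors."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    U_r, s_r, Vt_r = U[:, :r], s[:r], Vt[:r, :]
    return (U_r * s_r) @ Vt_r, (U_r, s_r, Vt_r)

rng = np.random.default_rng(0)
# A 256 x 256 matrix that is approximately rank 8, plus a little noise.
A = rng.standard_normal((256, 8)) @ rng.standard_normal((8, 256))
A += 0.01 * rng.standard_normal((256, 256))

A_r, _ = truncated_svd(A, r=8)
print("relative error:", np.linalg.norm(A - A_r) / np.linalg.norm(A))
```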

Journal: SIAM J. Matrix Analysis Applications, 2007
Othmar Koch Christian Lubich

For the low rank approximation of time-dependent data matrices and of solutions to matrix differential equations, an increment-based computational approach is proposed and analyzed. In this method, the derivative is projected onto the tangent space of the manifold of rank-r matrices at the current approximation. With an appropriate decomposition of rank-r matrices and their tangent matrices, th...
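
A hedged sketch of the object the abstract mentions: the orthogonal projection onto the tangent space of the rank-r manifold at Y = U S V^T, onto which the derivative is projected. The increment-based integrator analyzed in the paper is not reproduced here; dimensions are illustrative.

```python
import numpy as np

def tangent_projection(U, V, Z):
    """Project Z onto the tangent space at Y = U S V^T (U, V orthonormal)."""
    UtZ = U.T @ Z          # r x n
    ZV = Z @ V             # m x r
    return ZV @ V.T + U @ UtZ - U @ (UtZ @ V) @ V.T

rng = np.random.default_rng(1)
m, n, r = 40, 30, 4
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
Z = rng.standard_normal((m, n))

P_Z = tangent_projection(U, V, Z)
# An orthogonal projection is idempotent up to round-off.
print(np.allclose(tangent_projection(U, V, P_Z), P_Z))
```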

2003
Nathan Srebro Tommi S. Jaakkola

We study the common problem of approximating a target matrix with a matrix of lower rank. We provide a simple and efficient (EM) algorithm for solving weighted low-rank approximation problems, which, unlike their unweighted counterparts, do not in general admit a closed-form solution. In addition, we analyze the nature of locally optimal solutions that arise in this context, demonstrate the utility ...
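
A minimal sketch of the EM-style iteration for weighted low-rank approximation, assuming weights in [0, 1]: blend the target with the current low-rank estimate according to the weights, then take an unweighted truncated SVD. Shapes, rank, and the fixed iteration count are illustrative, not the paper's settings.

```python
import numpy as np

def weighted_low_rank(A, W, r, n_iter=100):
    X = np.zeros_like(A)
    for _ in range(n_iter):
        # E-step: fill in low-weight entries with the current estimate.
        filled = W * A + (1.0 - W) * X
        # M-step: unweighted best rank-r approximation of the filled matrix.
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r, :]
    return X

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))
W = (rng.random(A.shape) < 0.6).astype(float)   # 0/1 weights, e.g. missing data
X = weighted_low_rank(A, W, r=5)
print("weighted error:", np.linalg.norm(W * (A - X)))
```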

2014
Alex Kulesza N. Raj Rao Satinder P. Singh

Spectral learning methods have recently been proposed as alternatives to slow, non-convex optimization algorithms like EM for a variety of probabilistic models in which hidden information must be inferred by the learner. These methods are typically controlled by a rank hyperparameter that sets the complexity of the model; when the model rank matches the true rank of the process generating the d...
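
A rough, generic illustration of what the rank hyperparameter controls in spectral methods: an empirical co-occurrence (Hankel-like) matrix is estimated from observed pairs and truncated to rank k via the SVD. This is not the paper's estimator or analysis; symbol count, sequence, and k are made up for the example.

```python
import numpy as np

def rank_k_moment(pairs, n_symbols, k):
    """Estimate P(x_t, x_{t+1}) from observed pairs and truncate it to rank k."""
    P = np.zeros((n_symbols, n_symbols))
    for a, b in pairs:
        P[a, b] += 1.0
    P /= len(pairs)
    U, s, Vt = np.linalg.svd(P)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(3)
# Toy observation sequence over 6 symbols.
seq = rng.integers(0, 6, size=5000)
pairs = list(zip(seq[:-1], seq[1:]))
P_k = rank_k_moment(pairs, n_symbols=6, k=2)
print(P_k.shape)
```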

Journal: Foundations and Trends in Machine Learning, 2016
Madeleine Udell Corinne Horn Reza Bosagh Zadeh Stephen P. Boyd

Principal components analysis (PCA) is a well-known technique for approximating a data set, represented by a matrix, with a low-rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types. This framework encompasses many well-known techniques in data analysis, such as nonnegative matrix factorization, matrix...
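
A hedged sketch of the alternating-minimization view behind such generalized low-rank models, specialized here to quadratic loss with ridge regularization (which recovers a regularized PCA). Losses for Boolean, categorical, or ordinal columns are not shown; the data, rank, and regularization weight are illustrative.

```python
import numpy as np

def glrm_quadratic(A, r, lam=1e-2, n_iter=50):
    m, n = A.shape
    rng = np.random.default_rng(0)
    X = rng.standard_normal((m, r))
    Y = rng.standard_normal((r, n))
    I = lam * np.eye(r)
    for _ in range(n_iter):
        # Ridge-regularized least squares in X with Y fixed, then vice versa.
        X = A @ Y.T @ np.linalg.inv(Y @ Y.T + I)
        Y = np.linalg.inv(X.T @ X + I) @ X.T @ A
    return X, Y

A = np.random.default_rng(4).standard_normal((60, 30))
X, Y = glrm_quadratic(A, r=5)
print("residual:", np.linalg.norm(A - X @ Y))
```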

2013
Ryan Kennedy

While datasets are frequently represented as matrices, real-world data is imperfect and entries are often missing. In many cases, the data are very sparse and the matrix must be filled in before any subsequent work can be done. This optimization problem, known as matrix completion, can be made well-defined by assuming the matrix to be low rank. The resulting rank-minimization problem is NP-hard,...
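
A minimal sketch of one common convex relaxation of the NP-hard rank minimization mentioned above: nuclear-norm minimization solved by iterated singular-value soft-thresholding (the SoftImpute idea). The threshold, iteration count, and sampling rate are illustrative, not taken from the text.

```python
import numpy as np

def soft_impute(A, mask, tau=1.0, n_iter=200):
    """Complete A on the unobserved entries (mask == False) via soft-thresholded SVDs."""
    X = np.zeros_like(A)
    for _ in range(n_iter):
        filled = np.where(mask, A, X)
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        s = np.maximum(s - tau, 0.0)          # soft-threshold the singular values
        X = (U * s) @ Vt
    return X

rng = np.random.default_rng(5)
A = rng.standard_normal((40, 4)) @ rng.standard_normal((4, 40))
mask = rng.random(A.shape) < 0.5             # observe roughly half the entries
X = soft_impute(A, mask)
print("error on missing entries:", np.linalg.norm((1 - mask) * (A - X)))
```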

2006
Hueihan Jhuang Lior Wolf

Recently, some research has tried to incorporate the 2D structure of images into the dimensionality reduction process, e.g., 2DPCA [3] and CSA [4]. Some work uses higher-order tensors to represent image ensembles, where the factors may include different faces, facial expressions, viewpoints and illuminations [5, 6]. All these efforts indeed provide good performance but tend to separate the dimensionality reduc...
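
A brief sketch of the 2DPCA idea cited above as [3]: instead of vectorizing images, form an image scatter matrix from the rows of each image and project every image onto its leading eigenvectors. Image sizes and the number of components are illustrative; this is not the paper's own method.

```python
import numpy as np

def two_d_pca(images, d):
    """images: array of shape (N, h, w); returns projections of shape (N, h, d)."""
    centred = images - images.mean(axis=0)
    # Image scatter matrix of size w x w: sum over images of A_i^T A_i.
    G = np.einsum('nij,nik->jk', centred, centred) / len(images)
    _, eigvecs = np.linalg.eigh(G)
    X = eigvecs[:, ::-1][:, :d]          # top-d eigenvectors (eigh is ascending)
    return images @ X

rng = np.random.default_rng(6)
images = rng.standard_normal((100, 32, 24))
features = two_d_pca(images, d=5)
print(features.shape)   # (100, 32, 5)
```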

2018
Zhengyu Chen Gauri Jagatap Seyedehsara Nayer Chinmay Hegde Namrata Vaswani

In this paper, we introduce a principled algorithmic approach for Fourier ptychographic imaging of dynamic, time-varying targets. To the best of our knowledge, this setting has not been explicitly addressed in the ptychography literature. We argue that such a setting is very natural, and that our methods provide an important first step towards helping reduce the sample complexity (and hence acq...
