Querying a Matrix through Matrix-Vector Products

Authors

Abstract

We consider algorithms with access to an unknown matrix M ∈ F^{n×d} via matrix-vector products, namely, the algorithm chooses vectors v^1, …, v^q and observes Mv^1, …, Mv^q. Here the v^i can be randomized as well as chosen adaptively as a function of Mv^1, …, Mv^{i-1}. Motivated by applications of sketching in distributed computation, linear algebra, and streaming models, as well as connections to areas such as communication complexity and property testing, we initiate the study of the number q of queries needed to solve various fundamental problems. We study problems in three broad categories, including linear algebra, statistics problems, and graph problems. For example, we consider the number of queries required to approximate the rank, trace, maximum eigenvalue, and norms of M; to compute the AND/OR/Parity of each column or row of M; to decide whether there are identical columns or rows in M, or whether M is symmetric, diagonal, or unitary; or to decide whether a graph defined by M is connected or triangle-free. We also show separations for algorithms that are allowed to obtain matrix-vector products only by querying vectors on the right, versus algorithms that can query vectors on both the left and the right. We show further separations depending on the underlying field the matrix-vector product occurs in, and, for graph problems, on the form of the matrix (bipartite adjacency versus signed edge-vertex incidence matrix) used to represent the graph. Surprisingly, very few works discuss this model, and we believe a thorough investigation of this model would be beneficial to a number of different application areas.
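The query model above can be made concrete with a small sketch. Below, a hypothetical `MatVecOracle` class (an assumption of this illustration, not code from the paper) hides the matrix M and exposes only right matrix-vector products. As an example of one of the problems mentioned in the abstract, trace approximation, we use Hutchinson's classical estimator, which needs only such products: for random sign vectors v, the expectation of vᵀMv equals tr(M).

```python
import numpy as np

# Hypothetical oracle: the algorithm never sees M directly,
# only the result of right matrix-vector products M @ v.
class MatVecOracle:
    def __init__(self, M):
        self._M = M
        self.queries = 0  # count q, the quantity studied in the paper

    def query(self, v):
        self.queries += 1
        return self._M @ v

def estimate_trace(oracle, n, q, rng):
    """Hutchinson's estimator: average v @ (M @ v) over q random sign vectors."""
    total = 0.0
    for _ in range(q):
        v = rng.choice([-1.0, 1.0], size=n)
        total += v @ oracle.query(v)
    return total / q

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
M = A + A.T  # symmetric test matrix
oracle = MatVecOracle(M)
est = estimate_trace(oracle, n, q=50, rng=rng)
```

Note that the estimator is non-adaptive (each v is drawn independently of previous answers); the paper's model also allows adaptive and left-sided queries, which this sketch does not exercise.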


Similar articles

Restarted GMRES with Inexact Matrix-Vector Products

This paper discusses how to control the accuracy of inexact matrix-vector products in restarted GMRES. We will show that the GMRES iterations can be performed with relatively low accuracy. Furthermore, we will study how to compute the residual at restart and propose suitable strategies to control the accuracy of the matrix-vector products in this computation.


Semantic Compositionality through Recursive Matrix-Vector Spaces

Single-word vector space models have been very successful at learning lexical information. However, they cannot capture the compositional meaning of longer phrases, preventing them from a deeper understanding of language. We introduce a recursive neural network (RNN) model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length. Our mode...


Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent

We propose a generic method for iteratively approximating various second-order gradient steps - Newton, Gauss-Newton, Levenberg-Marquardt, and natural gradient - in linear time per iteration, using special curvature matrix-vector products that can be computed in O(n). Two recent acceleration techniques for on-line learning, matrix momentum and stochastic meta-descent (SMD), implement this appro...


Iterative linear system solvers with approximate matrix-vector products

There are classes of linear problems for which a matrix-vector product is a time consuming operation because an expensive approximation method is required to compute it to a given accuracy. One important example is simulations in lattice QCD with Neuberger fermions where a matrix multiply requires the product of the matrix sign function of a large sparse matrix times a vector. The recent intere...


Evaluating products of matrix pencils and collapsing matrix products

This paper describes three numerical methods to collapse a formal product of p pairs of matrices P = ∏_{k=0}^{p-1} E_k^{-1} A_k down to the product of a single pair Ê^{-1} Â. In the setting of linear relations, the product formally extends to the case in which some of the E_k's are singular and it is impossible to explicitly form P as a single matrix. The methods differ in op count, work space, and...



Journal

Journal title: ACM Transactions on Algorithms

Year: 2021

ISSN: 1549-6333, 1549-6325

DOI: https://doi.org/10.1145/3470566