Dense-Sparse Matrix Multiplication: Algorithms and Performance Evaluation
Authors
Abstract
In this paper, we address the dense-sparse matrix product (DSMP) problem, i.e. where the first matrix is dense and the second is sparse. We first present initial versions of loop nest structured algorithms corresponding to the most widely used sparse matrix storage formats, i.e. DNS, CSR, CSC and COO. Afterwards, we derive several versions obtained by applying loop interchange, loop invariant motion and loop unrolling to the previous loop nest algorithms. Theoretical comparisons are then made, from several angles, between the designed versions. Our contribution is validated through a series of experiments on a set of sparse matrices of different sizes and densities.

Keywords: algorithm complexity; compressed storage format; loop nest optimization; performance evaluation; sparse matrix product
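To make the setting concrete, the following is a minimal sketch of a DSMP kernel where the dense matrix A (m x k) multiplies a sparse matrix B (k x n) stored in CSR form, with row-access hoisting as a simple example of loop invariant motion. The function name and loop order are illustrative assumptions, not the paper's exact algorithm variants.

```python
def dsmp_csr(A, values, col_idx, row_ptr, n):
    """Compute C = A * B, with A a dense m x k list of lists and B a sparse
    k x n matrix given in CSR form (values, col_idx, row_ptr)."""
    m = len(A)
    k = len(row_ptr) - 1
    C = [[0.0] * n for _ in range(m)]
    for i in range(m):
        Ai = A[i]          # loop invariant motion: hoist row accesses
        Ci = C[i]
        for j in range(k):
            aij = Ai[j]
            if aij == 0.0:
                continue
            # walk the nonzeros of row j of B
            for p in range(row_ptr[j], row_ptr[j + 1]):
                Ci[col_idx[p]] += aij * values[p]
    return C
```

A usage example: for A = [[1, 2], [3, 4]] and B = diag(5, 6) in CSR (values = [5, 6], col_idx = [0, 1], row_ptr = [0, 1, 2]), the result is [[5, 12], [15, 24]]. Interchanging the i and j loops, or unrolling the inner loop, yields the kind of derived versions the paper compares.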
Similar Articles
Experimental Evaluation of Multi-Round Matrix Multiplication on MapReduce
This paper proposes a Hadoop library, named M3, for performing dense and sparse matrix multiplication in MapReduce. The library features multi-round MapReduce algorithms that allow trading off the number of rounds against the amount of data shuffled in each round and the amount of memory required by reduce functions. We claim that multi-round MapReduce algorithms are preferable in cloud settings to tradi...
Cache Oblivious Dense and Sparse Matrix Multiplication Based on Peano Curves
Cache oblivious algorithms are designed to benefit from any existing cache hierarchy—regardless of cache size or architecture. In matrix computations, cache oblivious approaches are usually obtained from block-recursive approaches. In this article, we extend an existing cache oblivious approach for matrix operations, which is based on Peano space-filling curves, for multiplication of sparse and...
Block Algorithms for Sparse Matrix by Dense Matrix Multiplication
Sparse matrix computations appear in many linear algebra kernels of scientific applications. The study, evaluation and optimization of sparse matrix codes is more complex than the dense case. Moreover, the irregularity of some memory accesses and the a priori lack of knowledge of the number of iterations to be performed in some loops (both depending on the sparsity pattern) limit the success of pr...
Matrix Multiplication Algorithm Selection with Support Vector Machines
We present a machine learning technique for the algorithm selection problem, specifically focusing on algorithms for dense matrix multiplication. Dense matrix multiplication is a core component of many high-performance computing and machine learning algorithms [1], but the performance of matrix multiplication algorithms can vary significantly based on input parameters and hardware architecture....
Investigating the Effects of Hardware Parameters on Power Consumptions in SPMV Algorithms on Graphics Processing Units (GPUs)
Although sparse matrix-vector multiplication (SpMV) algorithms are simple, they form important parts of linear algebra algorithms in mathematics and physics. As these algorithms can be run in parallel, Graphics Processing Units (GPUs) have been considered among the best candidates to run them. In recent years, power consumption has been considered as one of the metr...