Optimizing Sparse Linear Algebra Through Automatic Format Selection and Machine Learning

Cited by: 0
Authors
Stylianou, Christodoulos [1 ]
Weiland, Michele [1 ]
Affiliations
[1] Univ Edinburgh, EPCC, Edinburgh, Midlothian, Scotland
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
sparse matrix storage formats; machine learning; automatic format selection;
DOI
10.1109/IPDPSW59300.2023.00125
CLC Number
TP3 [Computing Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
Sparse matrices are an integral part of scientific simulations. As hardware evolves, new sparse matrix storage formats are proposed that aim to exploit optimizations specific to the new hardware. In the era of heterogeneous computing, users are often required to use multiple formats to keep their applications optimal across the different available hardware, resulting in longer development times and higher maintenance overhead. A potential solution to this problem is a lightweight auto-tuner driven by Machine Learning (ML) that selects for the user, from a pool of available formats, the optimal format matching the characteristics of the sparsity pattern, the target hardware, and the operation to execute. In this paper, we introduce Morpheus-Oracle, a library that provides a lightweight ML auto-tuner capable of accurately predicting the optimal format across multiple backends, targeting the major HPC architectures and aiming to eliminate any format-selection input from the end-user. On more than 2000 real-life matrices, we achieve an average classification accuracy and balanced accuracy of 92.63% and 80.22% respectively across the available systems. The adoption of the auto-tuner results in average speedups of 1.1x on CPUs and 1.5x to 8x on NVIDIA and AMD GPUs, with maximum speedups reaching up to 7x and 1000x respectively.
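The format-selection idea described in the abstract can be illustrated with a minimal sketch (not the Morpheus-Oracle API): extract a handful of features from the sparsity pattern and let a trained classifier pick a storage format. The feature set, the candidate format list, and the use of scikit-learn's DecisionTreeClassifier are illustrative assumptions; the training labels below are synthetic placeholders so the example runs end to end, whereas in practice they would come from offline benchmarking of each format on the target hardware.

```python
import numpy as np
import scipy.sparse as sp
from sklearn.tree import DecisionTreeClassifier

FORMATS = ["COO", "CSR", "DIA", "ELL"]  # hypothetical candidate format pool

def pattern_features(A):
    """Cheap, hardware-independent features of the sparsity pattern."""
    nnz_per_row = np.diff(A.indptr)          # valid because A is stored as CSR
    return np.array([
        A.shape[0],            # number of rows
        A.shape[1],            # number of columns
        A.nnz,                 # number of non-zeros
        nnz_per_row.mean(),    # average non-zeros per row
        nnz_per_row.std(),     # variation in row density
        nnz_per_row.max(),     # longest row
    ])

# Offline phase: train on matrices whose fastest format would normally be
# determined by benchmarking each candidate; labels here are placeholders.
rng = np.random.default_rng(0)
train = [sp.random(500, 500, density=d, format="csr", random_state=i)
         for i, d in enumerate(rng.uniform(0.001, 0.05, size=40))]
X = np.stack([pattern_features(A) for A in train])
y = rng.integers(0, len(FORMATS), size=len(train))   # placeholder labels
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)

# Online phase: the auto-tuner predicts a format for an unseen matrix.
A_new = sp.random(1000, 1000, density=0.01, format="csr", random_state=7)
choice = FORMATS[int(clf.predict(pattern_features(A_new).reshape(1, -1))[0])]
print("Predicted storage format:", choice)
```

The sketch only mirrors the feature-extraction-plus-classifier structure of the idea; per the abstract, the actual tuner is trained per backend on measurements from more than 2000 real-life matrices.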
Pages: 734-743
Number of Pages: 10