OMPGPT: A Generative Pre-trained Transformer Model for OpenMP

Cited by: 0
Authors
Chen, Le [1 ]
Bhattacharjee, Arijit [1 ]
Ahmed, Nesreen [2 ]
Hasabnis, Niranjan [2 ]
Oren, Gal [3 ]
Vo, Vy [2 ]
Jannesari, Ali [1 ]
Affiliations
[1] Iowa State Univ, Ames, IA 50011 USA
[2] Intel Labs, Hillsboro, OR USA
[3] Technion Israel Inst Technol, NRCN, Haifa, Israel
Keywords
Large language model; OpenMP; HPC
DOI
10.1007/978-3-031-69577-3_9
Chinese Library Classification (CLC) Number
TP3 [Computing technology, computer technology]
Discipline Classification Code
0812
Abstract
Large language models (LLMs) such as ChatGPT have significantly advanced the field of Natural Language Processing (NLP). This trend led to the development of code-based large language models such as StarCoder, WizardCoder, and CodeLlama, which are trained extensively on vast repositories of code and programming languages. While the generic abilities of these code LLMs are helpful for many programmers in tasks like code generation, the area of high-performance computing (HPC) has a narrower set of requirements that make a smaller, more domain-specific model a smarter choice. This paper presents OMPGPT, a novel domain-specific model meticulously designed to harness the inherent strengths of language models for OpenMP pragma generation. Furthermore, we leverage prompt engineering techniques from the NLP domain to create Chain-of-OMP, an innovative strategy designed to enhance OMPGPT's effectiveness. Our extensive evaluations demonstrate that OMPGPT outperforms existing large language models specialized in OpenMP tasks while maintaining a notably smaller size, aligning it more closely with the typical hardware constraints of HPC environments. We consider our contribution a pivotal bridge, connecting the advantages of language models with the specific demands of HPC tasks.
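To make the task concrete: OpenMP pragma generation takes serial code (typically a loop) and produces the directive that parallelizes it. The sketch below is a hypothetical illustration of that input/output pairing in C; the loop and the suggested pragma are illustrative assumptions, not output of OMPGPT or material from the paper.

#include <stdio.h>

#define N 1000000

static double a[N], b[N], c[N];

int main(void) {
    /* Input context a pragma-generation model would see: a serial loop.  */
    /* The directive on the next line is the kind of completion such a    */
    /* model is expected to produce (illustrative only).                  */
    #pragma omp parallel for
    for (int i = 0; i < N; i++) {
        c[i] = a[i] + b[i];
    }
    printf("c[0] = %f\n", c[0]);
    return 0;
}

Compiled with an OpenMP-capable compiler (e.g., gcc -fopenmp), the directive distributes the loop iterations across threads; without -fopenmp the pragma is ignored and the program remains correct serial C. Chain-of-OMP, per the abstract, is a prompt-engineering strategy layered on top of this generation task; its concrete prompting steps are described in the paper itself.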
Pages: 121-134
Page count: 14
Related Papers
50 records in total
  • [1] ShellGPT: Generative Pre-trained Transformer Model for Shell Language Understanding
    Shi, Jie
    Jiang, Sihang
    Xu, Bo
    Liang, Jiaqing
    Xiao, Yanghua
    Wang, Wei
    [J]. 2023 IEEE 34TH INTERNATIONAL SYMPOSIUM ON SOFTWARE RELIABILITY ENGINEERING, ISSRE, 2023, : 671 - 682
  • [2] Generative Pre-Trained Transformer for Cardiac Abnormality Detection
    Gaudilliere, Pierre Louis
    Sigurthorsdottir, Halla
    Aguet, Clementine
    Van Zaen, Jerome
    Lemay, Mathieu
    Delgado-Gonzalo, Ricard
    [J]. 2021 COMPUTING IN CARDIOLOGY (CINC), 2021,
  • [3] The application of Chat Generative Pre-trained Transformer in nursing education
    Liu, Jialin
    Liu, Fan
    Fang, Jinbo
    Liu, Siru
    [J]. NURSING OUTLOOK, 2023, 71 (06)
  • [4] The impact of Chat Generative Pre-trained Transformer (ChatGPT) on medical education
    Heng, Jonathan J. Y.
    Teo, Desmond B.
    Tan, L. F.
    [J]. POSTGRADUATE MEDICAL JOURNAL, 2023, 99 (1176) : 1125 - 1127
  • [5] Enhancing rumor detection with data augmentation and generative pre-trained transformer
    Askarizade, Mojgan
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2025, 262
  • [6] BioGPT: generative pre-trained transformer for biomedical text generation and mining
    Luo, Renqian
    Sun, Liai
    Xia, Yingce
    Qin, Tao
    Zhang, Sheng
    Poon, Hoifung
    Liu, Tie-Yan
    [J]. BRIEFINGS IN BIOINFORMATICS, 2022, 23 (06)
  • [7] Generative Pre-trained Transformer for Pediatric Stroke Research: A Pilot Study
    Fiedler, Anna K.
    Zhang, Kai
    Lal, Tia S.
    Jiang, Xiaoqian
    Fraser, Stuart M.
    [J]. PEDIATRIC NEUROLOGY, 2024, 160
  • [8] Industrial-generative pre-trained transformer for intelligent manufacturing systems
    Wang, Han
    Liu, Min
    Shen, Weiming
    [J]. IET COLLABORATIVE INTELLIGENT MANUFACTURING, 2023, 5 (02)
  • [9] Enhancing clinical reasoning with Chat Generative Pre-trained Transformer: a practical guide
    Hirosawa, Takanobu
    Shimizu, Taro
    [J]. DIAGNOSIS, 2024, 11 (01) : 102 - 105
  • [10] Chat generative pre-trained transformer (ChatGPT): potential implications for rheumatology practice
    Nune, Arvind
    Iyengar, Karthikeyan P.
    Manzo, Ciro
    Barman, Bhupen
    Botchu, Rajesh
    [J]. RHEUMATOLOGY INTERNATIONAL, 2023, 43 : 1379 - 1380