OMPGPT: A Generative Pre-trained Transformer Model for OpenMP

Times Cited: 0
Authors
Chen, Le [1 ]
Bhattacharjee, Arijit [1 ]
Ahmed, Nesreen [2 ]
Hasabnis, Niranjan [2 ]
Oren, Gal [3 ]
Vo, Vy [2 ]
Jannesari, Ali [1 ]
Affiliations
[1] Iowa State Univ, Ames, IA 50011 USA
[2] Intel Labs, Hillsboro, OR USA
[3] Technion Israel Inst Technol, NRCN, Haifa, Israel
Keywords
Large Language Model; OpenMP; HPC
DOI
10.1007/978-3-031-69577-3_9
CLC Number (Chinese Library Classification)
TP3 [Computing Technology, Computer Technology]
Subject Classification Number
0812
Abstract
Large language models (LLMs) such as ChatGPT have significantly advanced the field of Natural Language Processing (NLP). This trend has led to the development of code-based large language models such as StarCoder, WizardCoder, and CodeLlama, which are trained extensively on vast repositories of code and programming languages. While the generic abilities of these code LLMs are helpful for many programmers in tasks like code generation, the area of high-performance computing (HPC) has a narrower set of requirements that make a smaller, more domain-specific model a smarter choice. This paper presents OMPGPT, a novel domain-specific model meticulously designed to harness the inherent strengths of language models for OpenMP pragma generation. Furthermore, we leverage prompt engineering techniques from the NLP domain to create Chain-of-OMP, an innovative strategy designed to enhance OMPGPT's effectiveness. Our extensive evaluations demonstrate that OMPGPT outperforms existing large language models specialized in OpenMP tasks while maintaining a notably smaller size, aligning it more closely with the typical hardware constraints of HPC environments. We consider our contribution a pivotal bridge connecting the advantages of language models with the specific demands of HPC tasks.
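To illustrate the kind of task the abstract refers to, the sketch below shows a plain C loop together with the sort of OpenMP directive a pragma-generation model would be asked to produce for it. The specific pragma and its clauses are an assumed, hand-written example of such output, not text reproduced from the paper.

#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void) {
    static double a[N], b[N], c[N];
    double sum = 0.0;

    /* Initialize the input arrays. */
    for (int i = 0; i < N; i++) {
        a[i] = 0.5 * i;
        b[i] = 2.0 * i;
    }

    /* Illustrative example: given the serial loop below, a pragma-generation
       model such as OMPGPT would be prompted to emit an OpenMP directive.
       The directive shown here (parallel for with a reduction clause) is an
       assumed example of such output, chosen because the loop accumulates
       into sum. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        c[i] = a[i] + b[i];
        sum += c[i];
    }

    printf("sum = %f\n", sum);
    return 0;
}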
Pages: 121-134
Number of Pages: 14