Plan, Generate and Match: Scientific Workflow Recommendation with Large Language Models

Cited by: 2
Authors
Gu, Yang [1 ]
Cao, Jian [1 ]
Guo, Yuan [1 ]
Qian, Shiyou [1 ]
Guan, Wei [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Scientific Workflow Recommendation; Large Language Models; Planning; Prompting;
DOI
10.1007/978-3-031-48421-6_7
CLC Classification
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
Recommending scientific workflows from public repositories that meet users' natural-language requirements is becoming increasingly important in the scientific community. However, existing methods that rely on direct text matching struggle with complex queries and consequently perform poorly. Large language models (LLMs) have recently exhibited exceptional planning and reasoning abilities. We propose "Plan, Generate and Match" (PGM), a scientific workflow recommendation method that leverages LLMs. PGM consists of three stages: planning with an LLM upon receiving a user query, generating a structured workflow specification guided by the resulting solution steps, and matching these plans and specifications against candidate workflows. Through its planning mechanism, PGM uses few-shot prompting to automatically generate well-considered steps that guide the recommendation of reliable workflows. This method is the first exploration of LLMs in the scientific workflow domain. Experimental results on real-world benchmarks demonstrate that PGM outperforms state-of-the-art methods with statistical significance, highlighting its potential for addressing complex requirements.
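The three-stage pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the `llm` function is a hypothetical stand-in (in practice it would invoke a real LLM with few-shot prompts), and the token-overlap scorer is a simple proxy for the paper's matching step.

```python
def llm(prompt: str) -> str:
    """Hypothetical LLM stand-in: returns canned responses keyed on the stage.
    A real system would call a large language model with few-shot prompts."""
    if prompt.startswith("PLAN:"):
        return "1. Fetch protein sequence\n2. Run BLAST alignment\n3. Summarize hits"
    return "steps: fetch_sequence -> blast_align -> summarize"


def plan(query: str) -> str:
    # Stage 1: decompose the user query into solution steps via few-shot prompting.
    return llm("PLAN: " + query)


def generate(steps: str) -> str:
    # Stage 2: produce a structured workflow specification guided by the steps.
    return llm("GENERATE: " + steps)


def match(spec: str, candidates: dict) -> str:
    # Stage 3: rank candidate workflows against the specification.
    # Token overlap is used here as a simple stand-in for the matching model.
    spec_tokens = set(spec.lower().replace("->", " ").split())

    def score(desc: str) -> int:
        return len(spec_tokens & set(desc.lower().split()))

    return max(candidates, key=lambda name: score(candidates[name]))


def recommend(query: str, candidates: dict) -> str:
    steps = plan(query)        # Plan
    spec = generate(steps)     # Generate
    return match(spec, candidates)  # Match


candidates = {
    "wf_blast": "fetch_sequence blast_align summarize",
    "wf_image": "load_image segment classify",
}
print(recommend("Align my protein sequence against a database", candidates))
# prints "wf_blast"
```

The split into three functions mirrors the stage separation the abstract emphasizes: the intermediate plan and specification are explicit artifacts, so each can be inspected or reused during matching.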
Pages: 86-102
Page count: 17
Related Papers
50 in total
  • [1] Instruct Large Language Models to Generate Scientific Literature Survey Step by Step
    Lai, Yuxuan
    Wu, Yupeng
    Wang, Yidan
    Hu, Wenpeng
    Zheng, Chen
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT V, NLPCC 2024, 2025, 15363 : 484 - 496
  • [2] A survey on large language models for recommendation
    Wu, Likang
    Zheng, Zhi
    Qiu, Zhaopeng
    Wang, Hao
    Gu, Hongchao
    Shen, Tingjia
    Qin, Chuan
    Zhu, Chen
    Zhu, Hengshu
    Liu, Qi
    Xiong, Hui
    Chen, Enhong
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2024, 27 (05)
  • [3] Tutorial on Large Language Models for Recommendation
    Hua, Wenyue
    Li, Lei
    Xu, Shuyuan
    Chen, Li
    Zhang, Yongfeng
    PROCEEDINGS OF THE 17TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2023, 2023, : 1281 - 1283
  • [4] Scientific Workflow Clustering and Recommendation
    Cheng, Zehui
    Zhou, ZhangBing
    Wang, Xiaolei
    2015 11TH INTERNATIONAL CONFERENCE ON SEMANTICS, KNOWLEDGE AND GRIDS (SKG), 2015, : 272 - 274
  • [5] Leveraging Large Language Models for Sequential Recommendation
    Harte, Jesse
    Zorgdrager, Wouter
    Louridas, Panos
    Katsifodimos, Asterios
    Jannach, Dietmar
    Fragkoulis, Marios
    PROCEEDINGS OF THE 17TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2023, 2023, : 1096 - 1102
  • [6] Large Language Models as Evaluators for Recommendation Explanations
    Zhang, Xiaoyu
    Li, Yishan
    Wang, Jiayin
    Sun, Bowen
    Ma, Weizhi
    Sun, Peijie
    Zhang, Min
    PROCEEDINGS OF THE EIGHTEENTH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2024, 2024, : 33 - 42
  • [7] Fairness identification of large language models in recommendation
    Liu, Wei
    Liu, Baisong
    Qin, Jiangcheng
    Zhang, Xueyuan
    Huang, Weiming
    Wang, Yangyang
    SCIENTIFIC REPORTS, 2025, 15 (01)
  • [8] Large Language Models as Recommendation Systems in Museums
    Trichopoulos, Georgios
    Konstantakis, Markos
    Alexandridis, Georgios
    Caridakis, George
    ELECTRONICS, 2023, 12 (18)
  • [9] Large language models and scientific publishing
    Rousseau, Ronald
    Yang, Liying
    Bollen, Johan
    Shen, Zhesi
    JOURNAL OF DATA AND INFORMATION SCIENCE, 2023, 8 (01) : 1 - 1