Introducing pre-trained transformers for high entropy alloy informatics

Cited: 0
Author: Kamnis, Spyros [1]
Affiliation: [1] Castolin Eutectic UK, Newcastle upon Tyne, NE29 8SE, United Kingdom
Keywords: Compendex
DOI: not available
Abstract: Machine learning