A Heideggerian analysis of generative pretrained transformer models

Cited by: 1
Authors
Floroiu, Iustin [1 ]
Timisica, Daniela [1 ,2 ]
Affiliations
[1] Natl Inst Res & Dev Informat ICI Bucharest, Bucharest, Romania
[2] Natl Univ Sci & Technol Politehn Bucharest, Bucharest, Romania
Keywords
Martin Heidegger; GPT; Artificial Intelligence; Dasein; Turing test
DOI
10.33436/v34i1y202402
Chinese Library Classification
TP39 [Computer Applications]
Discipline codes
081203; 0835
Abstract
To better understand the emergence of new large language models, and their bearing on the future development of artificial general intelligence, it is essential to analyse the existential implications of these algorithms. Given the rapid pace of advances in deep learning, generative pretrained transformers (GPT) are the closest existing approximation of highly autonomous, intelligent programs: they exhibit creativity and articulate a coherent worldview model of a kind not seen before. For these reasons, this article applies Heidegger's concept of Dasein to recent advances in computational intelligence. The analytical methods described here are intended to sidestep the hard problems that cognitive science poses for computational intelligence and to offer an accurate model of the mental representation and hierarchisation of emergent intelligent algorithms.
Pages: 13-22
Page count: 10