A Heideggerian analysis of generative pretrained transformer models

Cited by: 1
Authors
Floroiu, Iustin [1 ]
Timisica, Daniela [1 ,2 ]
Affiliations
[1] Natl Inst Res & Dev Informat ICI Bucharest, Bucharest, Romania
[2] Natl Univ Sci & Technol Politehn Bucharest, Bucharest, Romania
Keywords
Martin Heidegger; GPT; Artificial Intelligence; Dasein; TURING TEST;
DOI
10.33436/v34i1y202402
CLC number
TP39 [Computer Applications];
Discipline classification codes
081203 ; 0835 ;
Abstract
To better understand the emergence of new large language models, and their possible role in the future development of artificial general intelligence, it is essential to analyse the existential implications of these algorithms. Given the rapid pace of technological advances in deep learning, generative pretrained transformers (GPT) are the closest approximation yet to highly autonomous and intelligent programs: they exhibit creativity and convey an accurate worldview model of a kind never seen before. For these reasons, this article proposes an analysis of Heidegger's concept of Dasein in the broader context of advances in computational intelligence. The analytical methods described here are intended to bypass the complex problems that the cognitive sciences pose for computational intelligence and to build a highly accurate model of the mental representation and hierarchisation of emergent intelligent algorithms.
Pages: 13-22
Page count: 10
Related papers
50 records in total
  • [31] medigan: a Python library of pretrained generative models for medical image synthesis
    Osuala, Richard
    Skorupko, Grzegorz
    Lazrak, Noussair
    Garrucho, Lidia
    Garcia, Eloy
    Joshi, Smriti
    Jouide, Socayna
    Rutherford, Michael
    Prior, Fred
    Kushibar, Kaisar
    Diaz, Oliver
    Lekadir, Karim
    [J]. JOURNAL OF MEDICAL IMAGING, 2023, 10 (06)
  • [32] Constructing Chinese taxonomy trees from understanding and generative pretrained language models
    Guo, Jianyu
    Chen, Jingnan
    Ren, Li
    Zhou, Huanlai
    Xu, Wenbo
    Jia, Haitao
    [J]. PEERJ COMPUTER SCIENCE, 2024, 10
  • [33] Utilizing Artificial Intelligence and Chat Generative Pretrained Transformer to Answer Questions About Clinical Scenarios in Neuroanesthesiology
    Blacker, Samuel N.
    Kang, Mia
    Chakraborty, Indranil
    Chowdhury, Tumul
    Williams, James
    Lewis, Carol
    Zimmer, Michael
    Wilson, Brad
    Lele, Abhijit V.
    [J]. JOURNAL OF NEUROSURGICAL ANESTHESIOLOGY, 2024, 36 (04) : 346 - 351
  • [34] Ethical Considerations of Artificial Intelligence in Health Care: Examining the Role of Generative Pretrained Transformer-4
    Sheth, Suraj
    Baker, Hayden P.
    Prescher, Hannes
    Strelzow, Jason A.
    [J]. JOURNAL OF THE AMERICAN ACADEMY OF ORTHOPAEDIC SURGEONS, 2024, 32 (05) : 205 - 210
  • [35] Chatbot Generative Pretrained Transformer (ChatGPT) responses to questions about orthodontics in an updated version: Authors' response
    Kilinc, Delal Dara
    Mansiz, Duygu
    [J]. AMERICAN JOURNAL OF ORTHODONTICS AND DENTOFACIAL ORTHOPEDICS, 2024, 165 (06) : 614 - 616
  • [36] Utility of Allergen-Specific Patient-Directed Handouts Generated by Chat Generative Pretrained Transformer
    Chandra, Aditi
    Davis, Matthew J.
    Hamann, Dathan
    Hamann, Carsten R.
    [J]. DERMATITIS, 2023, : 448 - 449
  • [37] Are Generative Pretrained Transformer 4 Responses to Developmental Dysplasia of the Hip Clinical Scenarios Universal? An International Review
    Luo, Shaoting
    Canavese, Federico
    Aroojis, Alaric
    Andreacchio, Antonio
    Anticevic, Darko
    Bouchard, Maryse
    Castaneda, Pablo
    De Rosa, Vincenzo
    Fiogbe, Michel Armand
    Frick, Steven L.
    Hui, James H.
    Johari, Ashok N.
    Loro, Antonio
    Lyu, Xuemin
    Matsushita, Masaki
    Omeroglu, Hakan
    Roye, David P.
    Shah, Maulin M.
    Yong, Bicheng
    Li, Lianyong
    [J]. JOURNAL OF PEDIATRIC ORTHOPAEDICS, 2024, 44 (06) : e504 - e511
  • [38] Korean automatic spacing using pretrained transformer encoder and analysis
    Hwang, Taewook
    Jung, Sangkeun
    Roh, Yoon-Hyung
    [J]. ETRI JOURNAL, 2021, 43 (06) : 1049 - 1057
  • [39] Extraction of Substance Use Information From Clinical Notes:Generative Pretrained Transformer-Based Investigation
    Shah-Mohammadi, Fatemeh
    Finkelstein, Joseph
    [J]. JMIR MEDICAL INFORMATICS, 2024, 12
  • [40] ENHANCING PREDICTIVE MODELS FOR PE IN ICU PATIENTS USING PRETRAINED GENERATIVE ADVERSARIAL NETWORKS
    Rivera, Troy
    Patel, Sharad
    Green, Adam
    Puri, Nitin
    [J]. CRITICAL CARE MEDICINE, 2024, 52