Large-scale modeling of wordform learning and representation

Cited by: 36
Authors
Sibley, Daragh E. [1 ]
Kello, Christopher T. [1 ]
Plaut, David C. [2 ]
Elman, Jeffrey L. [3 ]
Affiliations
[1] George Mason Univ, Dept Psychol, Fairfax, VA 22030 USA
[2] Carnegie Mellon Univ, Dept Psychol, Ctr Neural Basis Cognit, Pittsburgh, PA 15213 USA
[3] Univ Calif San Diego, Dept Cognit Sci, San Diego, CA 92103 USA
Keywords
large-scale connectionist modeling; sequence encoder; simple recurrent network; lexical processing; orthography; phonology; wordforms
DOI
10.1080/03640210802066964
Chinese Library Classification: B84 [Psychology]
Discipline codes: 04; 0402
Abstract
The forms of words as they appear in text and speech are central to theories and models of lexical processing. Nonetheless, current methods for simulating their learning and representation fail to approach the scale and heterogeneity of real wordform lexicons. A connectionist architecture termed the sequence encoder is used to learn nearly 75,000 wordform representations through exposure to strings of stress-marked phonemes or letters. First, the mechanisms and efficacy of the sequence encoder are demonstrated and shown to overcome problems with traditional slot-based codes. Two large-scale simulations are then reported in which models learned to represent lexicons of either phonological or orthographic wordforms. In doing so, the models learned the statistics of their lexicons, as shown by better processing of well-formed pseudowords than of ill-formed (scrambled) pseudowords and by accounting for variance in well-formedness ratings. Finally, it is discussed how the sequence encoder may be integrated into broader models of lexical processing.
Pages: 741-754
Page count: 14
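
The sequence encoder described in the abstract is, at its core, an encoder network that compresses a variable-length symbol string into a fixed-width vector, paired with a decoder network that reconstructs the string from that vector. The sketch below illustrates that general idea in PyTorch; it is not the authors' implementation, and the layer sizes, the use of nn.RNN, the tanh code layer, and the toy training loop are all assumptions made for illustration.

```python
# Minimal sketch of a sequence-encoder-style architecture in PyTorch.
# Illustrates the general idea only (encoder SRN -> fixed-width wordform
# code -> decoder SRN that reconstructs the symbol string); it is NOT the
# implementation from Sibley, Kello, Plaut, & Elman. Layer sizes, nn.RNN,
# the tanh code layer, and the training loop are all assumptions.
import torch
import torch.nn as nn

class SequenceEncoder(nn.Module):
    def __init__(self, n_symbols: int, hidden: int = 200, code: int = 100):
        super().__init__()
        self.embed = nn.Embedding(n_symbols, hidden)
        self.encoder = nn.RNN(hidden, hidden, batch_first=True)  # simple recurrent net
        self.to_code = nn.Linear(hidden, code)                   # fixed-width wordform code
        self.decoder = nn.RNN(code, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_symbols)              # predict each symbol

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, time) integer-coded letters or stress-marked phonemes
        _, h = self.encoder(self.embed(seq))                     # final hidden state
        code = torch.tanh(self.to_code(h[-1]))                   # the wordform representation
        # feed the code at every time step to drive reconstruction
        steps = code.unsqueeze(1).expand(-1, seq.size(1), -1)
        out, _ = self.decoder(steps)
        return self.readout(out)                                 # (batch, time, n_symbols)

# Toy usage: train the model to reconstruct a handful of fake "words".
vocab = 30
model = SequenceEncoder(vocab)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
words = torch.randint(1, vocab, (8, 6))                          # 8 fake 6-symbol words
for _ in range(100):
    logits = model(words)
    loss = nn.functional.cross_entropy(logits.reshape(-1, vocab), words.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the code layer has a fixed width regardless of string length, the learned vectors sidestep the alignment problems of traditional slot-based codes, which is the property the abstract emphasizes.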