Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech

Cited: 35
Authors
Huebner, Philip A. [1 ]
Willits, Jon A. [2 ]
Affiliations
[1] Univ Calif Riverside, Interdept Neurosci Grad Program, Riverside, CA 92521 USA
[2] Univ Calif Riverside, Dept Psychol, Riverside, CA 92521 USA
Source
FRONTIERS IN PSYCHOLOGY | 2018, Vol. 9
Keywords
semantic development; language learning; neural networks; statistical learning; REPRESENTATIONS; NETWORKS; MODEL; STATISTICS; LANGUAGE; MEMORY; CONNECTIONISM; COOCCURRENCE; RECOGNITION; ACQUISITION;
DOI
10.3389/fpsyg.2018.00133
Chinese Library Classification: B84 [Psychology]
Discipline Classification Code: 04; 0402
Abstract
Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary "deep learning" approaches have been criticized as incapable of learning the kind of abstract and structured knowledge that many think is required for the acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks, a Simple Recurrent Network (SRN) and a Long Short-Term Memory (LSTM) network, to predict word sequences in a 5-million-word corpus of speech directed to children ages 0-3 years old, and assessed what semantic knowledge they acquired. We found that the learned internal representations encode various abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of its similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. We found that the LSTM and the SRN learn very similar kinds of representations, but the LSTM achieved higher levels of performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state-of-the-art in machine learning. We found that Skip-gram achieves performance relatively similar to the LSTM, but represents words more in terms of thematic than taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into the emergence of many properties of the developing semantic system.
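The next-word-prediction setup the abstract describes can be illustrated with a minimal Elman-style SRN forward pass. This is a sketch only: the vocabulary, layer sizes, and initialization below are illustrative assumptions, not details from the paper, and no training step is shown.

```python
import math
import random

def softmax(xs):
    """Convert raw scores into a probability distribution."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

class SimpleRecurrentNetwork:
    """Minimal Elman-style SRN: one-hot word in, distribution over next word out."""

    def __init__(self, vocab_size, hidden_size, seed=0):
        rng = random.Random(seed)
        def mat(rows, cols):
            return [[rng.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]
        self.W_xh = mat(hidden_size, vocab_size)   # input -> hidden weights
        self.W_hh = mat(hidden_size, hidden_size)  # context (previous hidden) -> hidden
        self.W_hy = mat(vocab_size, hidden_size)   # hidden -> output weights
        self.h = [0.0] * hidden_size               # context units start at zero

    def step(self, word_index):
        # hidden = tanh(W_xh @ one_hot(word) + W_hh @ h_prev);
        # the one-hot input just selects one column of W_xh
        new_h = []
        for i in range(len(self.h)):
            act = self.W_xh[i][word_index]
            act += sum(w * h for w, h in zip(self.W_hh[i], self.h))
            new_h.append(math.tanh(act))
        self.h = new_h
        # output = softmax(W_hy @ hidden): predicted distribution over the next word
        logits = [sum(w * h for w, h in zip(row, self.h)) for row in self.W_hy]
        return softmax(logits)

# Usage: feed a toy child-directed utterance word by word (hypothetical vocabulary)
vocab = {"look": 0, "at": 1, "the": 2, "doggy": 3}
srn = SimpleRecurrentNetwork(vocab_size=len(vocab), hidden_size=8)
for w in ["look", "at", "the"]:
    probs = srn.step(vocab[w])
# probs is the (untrained) network's distribution over the next word
```

In the paper's setup, the weights would be trained so that the output distribution assigns high probability to the word that actually comes next; the hidden-layer activations, examined after training, are what carry the emergent semantic structure.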
Pages: 18