Tracking Child Language Development With Neural Network Language Models

Cited by: 1
Authors
Sagae, Kenji [1 ]
Affiliation
[1] Univ Calif Davis, Dept Linguist, Davis, CA 95616 USA
Source
FRONTIERS IN PSYCHOLOGY | 2021, Vol. 12
Keywords
child language assessment; natural language processing; computational linguistics; language model; IPSyn; neural network; PRODUCTIVE SYNTAX; INDEX;
DOI
10.3389/fpsyg.2021.674402
Chinese Library Classification
B84 [Psychology]
Discipline Code
04; 0402
Abstract
Recent work on the application of neural networks to language modeling has shown that models based on certain neural architectures can capture syntactic information from utterances and sentences even when not given an explicitly syntactic objective. We examine whether a fully data-driven model of language development that uses a recurrent neural network encoder for utterances can track how child language utterances change over the course of language development in a way that is comparable to what is achieved using established language assessment metrics that use language-specific information carefully designed by experts. Given only transcripts of child language utterances from the CHILDES Database and no pre-specified information about language, our model captures not just the structural characteristics of child language utterances, but how these structures reflect language development over time. We establish an evaluation methodology with which we can examine how well our model tracks language development compared to three known approaches: Mean Length of Utterance, the Developmental Sentence Score, and the Index of Productive Syntax. We discuss the applicability of our model to data-driven assessment of child language development, including how a fully data-driven approach supports the possibility of increased research in multilingual and cross-lingual issues.
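The abstract evaluates the data-driven model against three established metrics, the simplest of which is Mean Length of Utterance (MLU): the average length of a child's utterances in a transcript. As a rough illustration, a word-based MLU can be computed from a list of utterance strings as follows (a minimal sketch; standard MLU is computed over morphemes using CHAT-annotated transcripts, and the function name here is hypothetical, not from the paper):

```python
def mlu_words(utterances):
    """Mean Length of Utterance, approximated by counting
    whitespace-separated words rather than morphemes."""
    if not utterances:
        return 0.0
    total_words = sum(len(u.split()) for u in utterances)
    return total_words / len(utterances)

sample = ["more juice", "I want the big ball", "no"]
print(mlu_words(sample))  # (2 + 5 + 1) / 3 ≈ 2.67
```

Morpheme-based MLU (e.g., counting "ball-s" as two units) tracks early grammatical development more sensitively, which is why tools such as CLAN compute it from morphological tiers rather than raw word tokens.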
Pages: 14