Less is more: why all paradigms are defective, and why that is a good thing

Cited by: 11
Authors
Janda, Laura A. [1 ]
Tyers, Francis M. [2 ]
Affiliations
[1] UiT The Arctic University of Norway, HSL Faculty, Tromsø, Norway
[2] National Research University Higher School of Economics, School of Linguistics, Moscow, Russia
Keywords
morphology; paradigm; Russian; corpus; computational experiment
DOI
10.1515/cllt-2018-0031
Chinese Library Classification
H0 [Linguistics]
Subject Classification Codes
030303; 0501; 050102
Abstract
Only a fraction of lexemes are encountered in all their paradigm forms in any corpus, or even in the lifetime of any speaker. This raises the question of how native speakers confidently produce and comprehend word forms they have never witnessed. We present the results of an experiment using a recurrent neural network computational learning model. In particular, we compare the model's production of unencountered forms under two types of training data: full paradigms vs. single word forms for Russian nouns, verbs, and adjectives. In the long run, the model performs better when exposed to the more naturalistic training on single word forms, even though the full-paradigm training data is much larger, since it includes every form of every word. We discuss why "defective" paradigms may be better for human learners as well.
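The abstract contrasts two training regimes for a recurrent neural network that produces inflected forms. The sketch below is our own illustration of that contrast, not the authors' implementation: a toy character-level GRU encoder-decoder in PyTorch is trained once on full paradigms and once on single attested forms, then asked to produce a paradigm cell it may never have seen. The toy paradigm table, tag labels, model size, and hyperparameters are all assumptions made for illustration.

```python
# Illustrative sketch only (not the paper's model or data): a character-level
# GRU encoder-decoder maps (lemma, morphological tag) -> inflected form.
import random
import torch
import torch.nn as nn

# Toy paradigm table standing in for the Russian nouns, verbs, and adjectives.
PARADIGMS = {
    "kniga": {"Nom.Sg": "kniga", "Gen.Sg": "knigi", "Dat.Sg": "knige",
              "Nom.Pl": "knigi", "Gen.Pl": "knig"},
    "stol":  {"Nom.Sg": "stol", "Gen.Sg": "stola", "Dat.Sg": "stolu",
              "Nom.Pl": "stoly", "Gen.Pl": "stolov"},
}

def full_paradigm_data():
    """Regime 1: every cell of every paradigm is in the training data."""
    return [(l, t, f) for l, cells in PARADIGMS.items() for t, f in cells.items()]

def single_form_data(k=2, seed=0):
    """Regime 2: only k corpus-attested forms per lemma are in the training data."""
    rng = random.Random(seed)
    return [(l, t, cells[t]) for l, cells in PARADIGMS.items()
            for t in rng.sample(list(cells), k)]

# Shared vocabulary: characters, tag symbols, and special tokens.
chars = sorted({c for l, cells in PARADIGMS.items()
                for s in [l] + list(cells.values()) for c in s})
tags = sorted({t for cells in PARADIGMS.values() for t in cells})
itos = ["<pad>", "<bos>", "<eos>"] + chars + tags
stoi = {s: i for i, s in enumerate(itos)}

def enc_in(lemma, tag):
    return [stoi[c] for c in lemma] + [stoi[tag]]

def enc_out(form):
    return [stoi["<bos>"]] + [stoi[c] for c in form] + [stoi["<eos>"]]

class Inflector(nn.Module):
    def __init__(self, vocab, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.enc = nn.GRU(dim, dim, batch_first=True)
        self.dec = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab)

    def forward(self, src, tgt_in):
        _, h = self.enc(self.emb(src))          # encode lemma + tag
        y, _ = self.dec(self.emb(tgt_in), h)    # teacher-forced decoding
        return self.out(y)

def train(model, data, epochs=300, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for lemma, tag, form in data:
            src = torch.tensor([enc_in(lemma, tag)])
            tgt = torch.tensor([enc_out(form)])
            loss = loss_fn(model(src, tgt[:, :-1])[0], tgt[0, 1:])
            opt.zero_grad(); loss.backward(); opt.step()

@torch.no_grad()
def produce(model, lemma, tag, max_len=12):
    """Greedily decode a form, possibly for a never-encountered paradigm cell."""
    _, h = model.enc(model.emb(torch.tensor([enc_in(lemma, tag)])))
    tok, out = torch.tensor([[stoi["<bos>"]]]), []
    for _ in range(max_len):
        y, h = model.dec(model.emb(tok), h)
        tok = model.out(y)[:, -1].argmax(-1, keepdim=True)
        if itos[tok.item()] == "<eos>":
            break
        out.append(itos[tok.item()])
    return "".join(out)

if __name__ == "__main__":
    for name, data in [("full paradigms", full_paradigm_data()),
                       ("single forms", single_form_data())]:
        torch.manual_seed(0)
        model = Inflector(len(itos))
        train(model, data)
        # Gen.Pl of "stol" may be absent from the single-form training data.
        print(f"{name:>14}: stol + Gen.Pl -> {produce(model, 'stol', 'Gen.Pl')}")
```

Training on only corpus-attested forms forces the model to generalize across lemmas rather than memorize complete tables, which is the behavior the paper probes at a much larger scale with Russian corpus data.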
Pages: 109-141
Page count: 33