Translation of word vectors by exploiting translatable context pairs

Cited by: 0
Authors
Ishiwatari S. [1 ]
Kaji N. [2 ]
Yoshinaga N. [3 ]
Toyoda M. [3 ]
Kitsuregawa M. [3 ,4 ]
Institutions
[1] Graduate School of Information Science and Technology, University of Tokyo
[2] Institute of Industrial Science, University of Tokyo
[3] National Institute of Informatics, University of Tokyo
Source
Transactions of the Japanese Society for Artificial Intelligence, Vol. 31, 2016
Funding
Japan Society for the Promotion of Science
Keywords
Bilingual dictionary induction; Cross-lingual projection; Distributional semantics; Word representation;
DOI
10.1527/tjsai.AI30-A
Abstract
While vector-based representations of word meanings (word vectors) have been widely used in a variety of natural language processing applications, they are not meant to capture the similarity between words in different languages. This prevents word vectors from being used in multilingual applications such as cross-lingual information retrieval and machine translation. To solve this problem, we propose a method that learns a cross-lingual projection of word representations from one language into another. Our method utilizes translatable context pairs obtained from a bilingual dictionary and surface similarity as bonus terms of the objective function. In the experiments, we evaluated the effectiveness of the proposed method on four languages: Japanese, Chinese, English, and Spanish. The experiments show that our method outperforms existing methods without any additional supervision. © 2016, Japanese Society for Artificial Intelligence. All rights reserved.
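The abstract describes the method at a high level: a cross-lingual projection of word vectors learned with supervision from a bilingual dictionary, with translatable context pairs and surface similarity added as bonus terms of the objective. As a rough illustration only (this is not the authors' implementation; the bonus terms are omitted and all function and variable names are hypothetical), the following Python sketch fits a linear projection from seed translation pairs by regularised least squares and uses it to retrieve candidate translations by cosine similarity.

```python
# Minimal sketch of a dictionary-supervised cross-lingual projection.
# Assumption: src_vecs and tgt_vecs are dicts mapping word -> 1-D numpy vector;
# seed_pairs is a list of (source_word, target_word) dictionary translations.
import numpy as np

def learn_projection(src_vecs, tgt_vecs, seed_pairs, l2=1e-3):
    """Fit W minimising sum ||x_s W - y_t||^2 + l2 ||W||^2 over seed pairs."""
    X = np.stack([src_vecs[s] for s, t in seed_pairs])   # (n, d_src)
    Y = np.stack([tgt_vecs[t] for s, t in seed_pairs])   # (n, d_tgt)
    d = X.shape[1]
    # Ridge-regularised closed form: (X^T X + l2 I) W = X^T Y
    W = np.linalg.solve(X.T @ X + l2 * np.eye(d), X.T @ Y)
    return W                                              # (d_src, d_tgt)

def translate(word, W, src_vecs, tgt_vecs, k=5):
    """Return the k target words nearest (by cosine) to the projected vector."""
    q = src_vecs[word] @ W
    q = q / (np.linalg.norm(q) + 1e-12)
    words, rows = zip(*tgt_vecs.items())
    M = np.stack(rows)
    M = M / (np.linalg.norm(M, axis=1, keepdims=True) + 1e-12)
    scores = M @ q
    top = np.argsort(-scores)[:k]
    return [(words[i], float(scores[i])) for i in top]
```

As a hypothetical usage example, once both embedding sets are loaded, `W = learn_projection(src_vecs, tgt_vecs, seed_pairs)` followed by `translate("gato", W, src_vecs, tgt_vecs)` would return the nearest English vectors to the projected Spanish vector; the paper's additional bonus terms are meant to supply extra signal beyond such plain dictionary supervision.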
Pages: 10
Related papers
50 items in total
  • [1] Accurate Cross-lingual Projection between Count-based Word Vectors by Exploiting Translatable Context Pairs
    Graduate School of Information Science and Technology, University of Tokyo, Japan
    CoNLL - Conference on Computational Natural Language Learning, Proceedings, : 300 - 304
  • [2] To Swap or Not to Swap? Exploiting Dependency Word Pairs for Reordering in Statistical Machine Translation
    Hadiwinoto, Christian
    Liu, Yang
    Ng, Hwee Tou
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 2943 - 2949
  • [3] Learned in Translation: Contextualized Word Vectors
    McCann, Bryan
    Bradbury, James
    Xiong, Caiming
    Socher, Richard
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [4] Context Vectors Are Reflections of Word Vectors in Half the Dimensions
    Assylbekov, Zhenisbek
    Takhanov, Rustem
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2019, 66 : 225 - 242
  • [5] Context Vectors Are Reflections of Word Vectors in Half the Dimensions
    Assylbekov, Zhenisbek
    Takhanov, Rustem
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 5115 - 5119
  • [6] Context and Word Choosing in Translation
    Zhang, Xiangyang (张向阳)
    Science & Technology Information (Academic Research) (科技信息(学术研究)), 2007, (23) : 130 - 132
  • [7] Exploiting the translation context for multilingual WSD
    Specia, Lucia
    Volpe Nunes, Maria das Gracas
    TEXT, SPEECH AND DIALOGUE, PROCEEDINGS, 2006, 4188 : 269 - 276
  • [8] Learning Word Embeddings without Context Vectors
    Zobnin, Alexey
    Elistratova, Evgenia
    4TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP (REPL4NLP-2019), 2019, : 244 - 249
  • [9] Context Enhanced Word Vectors for Sentiment Analysis
    Ye, Zhe
    Li, Fang
    SOCIAL MEDIA PROCESSING, SMP 2017, 2017, 774 : 256 - 267
  • [10] Exploiting Sentential Context for Neural Machine Translation
    Wang, Xing
    Tu, Zhaopeng
    Wang, Longyue
    Shi, Shuming
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 6197 - 6203