Cross-Lingual Knowledge Transferring by Structural Correspondence and Space Transfer

Cited by: 3
Authors
Wang, Deqing [1 ]
Wu, Junjie [2 ,3 ,4 ]
Yang, Jingyuan [5 ]
Jing, Baoyu [6 ]
Zhang, Wenjie [1 ]
He, Xiaonan [7 ]
Zhang, Hui [1 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Econ & Management, Beijing 100191, Peoples R China
[3] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[4] Beihang Univ, Beijing Key Lab Emergency Support Simulat Technol, Beijing 100191, Peoples R China
[5] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
[6] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
[7] Baidu Inc, Dept Search, Beijing 100094, Peoples R China
Keywords
Task analysis; Machine translation; Analytical models; Transfer learning; Dictionaries; Electronic mail; Time complexity; Cross-lingual sentiment classification; space transfer; structural correspondence learning (SCL); SENTIMENT CLASSIFICATION;
DOI
10.1109/TCYB.2021.3051005
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Cross-lingual sentiment analysis (CLSA) aims to leverage label-rich resources in the source language to improve models for a resource-scarce domain in the target language, where monolingual machine-learning approaches usually suffer from the unavailability of sentiment knowledge. Recently, the transfer learning paradigm, which can transfer sentiment knowledge from resource-rich languages, for example, English, to resource-poor languages, for example, Chinese, has gained particular interest. Along this line, in this article, we propose semisupervised learning with SCL and space transfer (ssSCL-ST), a semisupervised transfer learning approach that makes use of structural correspondence learning (SCL) as well as space transfer for cross-lingual sentiment analysis. The key idea behind ssSCL-ST, at a high level, is to exploit the intrinsic sentiment knowledge in the target-language domain and to reduce the loss of valuable knowledge incurred during transfer via semisupervised learning. ssSCL-ST also features pivot set extension and space transfer, which help to enhance the efficiency of knowledge transfer and improve classification accuracy in the target-language domain. Extensive experimental results demonstrate the superiority of ssSCL-ST over state-of-the-art approaches, without using any parallel corpora.
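For orientation, the sketch below shows plain structural correspondence learning, the building block the abstract refers to, following the classic recipe of Blitzer et al. (2006); it is not the ssSCL-ST algorithm itself, and the pivot choice, toy data, and projection rank (pivot_idx, theta, the value 5) are illustrative assumptions.

```python
# A minimal SCL sketch on assumed toy bag-of-words data; this is the classic
# Blitzer et al. (2006) recipe, NOT the paper's ssSCL-ST method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy term matrices: rows = documents, columns = a shared 50-word vocabulary.
X_src = rng.random((100, 50))    # labeled source-language documents
X_tgt = rng.random((80, 50))     # unlabeled target-language documents
y_src = rng.integers(0, 2, 100)  # source sentiment labels (0 = neg, 1 = pos)

pivot_idx = np.arange(10)        # hypothetical pivot features
nonpivot_idx = np.setdiff1d(np.arange(50), pivot_idx)

# Step 1: on unlabeled source + target data, fit one linear predictor per
# pivot that guesses the pivot's presence from the non-pivot features.
X_all = np.vstack([X_src, X_tgt])
targets = (X_all[:, pivot_idx] > X_all[:, pivot_idx].mean(axis=0)).astype(float)
W, *_ = np.linalg.lstsq(X_all[:, nonpivot_idx], targets, rcond=None)

# Step 2: an SVD of the stacked pivot-predictor weights yields the low-rank
# correspondence projection theta (rank 5 is an arbitrary choice here).
U, _, _ = np.linalg.svd(W, full_matrices=False)
theta = U[:, :5]

# Step 3: augment the original features with the projected correspondence
# features, train on source labels, and predict in the target language.
def augment(X):
    return np.hstack([X, X[:, nonpivot_idx] @ theta])

clf = LogisticRegression(max_iter=1000).fit(augment(X_src), y_src)
pred_tgt = clf.predict(augment(X_tgt))
```

In a cross-lingual instantiation such as Prettenhofer and Stein's CL-SCL (entry [1] in the related papers below), the pivots would be word-translation pairs drawn from a bilingual dictionary rather than arbitrary column indices, which is what lets the learned subspace bridge the two languages.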
Pages: 6555 - 6566
Number of pages: 12
Related Papers
50 records in total
  • [1] Cross-Lingual Adaptation Using Structural Correspondence Learning
    Prettenhofer, Peter
    Stein, Benno
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2012, 3 (01)
  • [2] Cross-Lingual Knowledge Transfer for Clinical Phenotyping
    Papaioannou, Jens-Michalis
    Grundmann, Paul
    van Aken, Betty
    Samaras, Athanasios
    Kyparissidis, Ilias
    Giannakoulas, George
    Gers, Felix
    Loeser, Alexander
    LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 900 - 909
  • [3] Monolingual and Cross-Lingual Knowledge Transfer for Topic Classification
    Karpov, D.
    Burtsev, M.
    Journal of Mathematical Sciences, 2024, 285 (1) : 36 - 48
  • [4] CL2CM: Improving Cross-Lingual Cross-Modal Retrieval via Cross-Lingual Knowledge Transfer
    Wang, Yabing
    Wang, Fan
    Dong, Jianfeng
    Luo, Hao
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 6, 2024, : 5651 - 5659
  • [5] Cross-lingual distillation for domain knowledge transfer with sentence transformers
    Piperno, Ruben
    Bacco, Luca
    Dell'Orletta, Felice
    Merone, Mario
    Pecchia, Leandro
    KNOWLEDGE-BASED SYSTEMS, 2025, 311
  • [6] Conversations Powered by Cross-Lingual Knowledge
    Sun, Weiwei
    Meng, Chuan
    Meng, Qi
    Ren, Zhaochun
    Ren, Pengjie
    Chen, Zhumin
    de Rijke, Maarten
    SIGIR '21 - PROCEEDINGS OF THE 44TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2021, : 1442 - 1451
  • [7] Cross-lingual transfer of knowledge in distributional language models: Experiments in Hungarian
    Novak, Attila
    Novak, Borbala
    ACTA LINGUISTICA ACADEMICA, 2022, 69 (04) : 405 - 449
  • [8] UNSUPERVISED CROSS-LINGUAL KNOWLEDGE TRANSFER IN DNN-BASED LVCSR
    Swietojanski, Pawel
    Ghoshal, Arnab
    Renals, Steve
    2012 IEEE WORKSHOP ON SPOKEN LANGUAGE TECHNOLOGY (SLT 2012), 2012, : 246 - 251
  • [9] Analyzing the Evaluation of Cross-Lingual Knowledge Transfer in Multilingual Language Models
    Rajaee, Sara
    Monz, Christof
    PROCEEDINGS OF THE 18TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 2895 - 2914