Context-Aware Dynamic Word Embeddings for Aspect Term Extraction

Cited by: 1
Authors
Xu, Jingyun [1 ,2 ]
Xie, Jiayuan [1 ,2 ]
Cai, Yi [1 ,2 ,3 ]
Lin, Zehang [4 ]
Leung, Ho-Fung [5 ]
Li, Qing [4 ]
Chua, Tat-Seng [6 ,7 ]
Affiliations
[1] South China Univ Technol SCUT, Sch Software Engn, Guangzhou 510006, Peoples R China
[2] South China Univ Technol SCUT, Key Lab Big Data & Intelligent Robot SCUT, Minist Educ, Guangzhou 510006, Peoples R China
[3] Pazhou Lab, Guangdong Artificial Intelligence & Digital Econ L, Guangzhou 510335, Peoples R China
[4] Hong Kong Polytech Univ, Dept Comp, Kowloon, Hong Kong, Peoples R China
[5] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
[6] Natl Univ Singapore, Sch Comp, Singapore 119077, Singapore
[7] Sea NExT Joint Lab, Singapore 117417, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Aspect term extraction; attention mechanism; sentiment analysis; word embedding;
DOI
10.1109/TAFFC.2023.3262941
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The aspect term extraction (ATE) task aims to extract aspect terms, which describe a part or an attribute of a product, from review sentences. Most existing works rely on either general or domain embeddings to address this problem. Despite promising results, most methods still ignore the relative importance of general and domain embeddings, which degrades performance. Moreover, word embeddings are tied to the downstream task, and how to regularize word embeddings to capture context-aware information remains an unresolved problem. To address these issues, we first propose context-aware dynamic word embeddings (CDWE), which simultaneously consider the general meanings, domain-specific meanings, and context information of words. Based on CDWE, we propose an attention-based convolutional neural network for ATE, called ADWE-CNN, which adaptively captures the aforementioned meanings of words by using an attention mechanism to assign different importance to the respective embeddings. Experimental results show that ADWE-CNN achieves performance comparable with state-of-the-art approaches. Ablation studies have been conducted to explore the benefit of each component. Our code is publicly available at http://github.com/xiejiajia2018/ADWE-CNN.
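The core idea in the abstract — using attention to assign per-token weights to the general, domain, and context embeddings before combining them into one dynamic embedding — can be sketched roughly as follows. This is a minimal NumPy sketch under our own assumptions: the function name, the learned scoring vector `w`, and the dot-product scoring are illustrative stand-ins, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_word_embedding(general, domain, context, w):
    """Attention-weighted combination of three embeddings per token.

    general, domain, context: (seq_len, dim) arrays for the same sentence.
    w: (dim,) scoring vector (hypothetical learned parameter).
    Returns a (seq_len, dim) array of context-aware dynamic embeddings.
    """
    # Stack the three views: (seq_len, 3, dim).
    stacked = np.stack([general, domain, context], axis=1)
    # Score each view per token, then normalize into attention weights.
    scores = stacked @ w                 # (seq_len, 3)
    alpha = softmax(scores, axis=1)      # per-token importance of each view
    # Weighted sum over the three views: (seq_len, dim).
    return (alpha[..., None] * stacked).sum(axis=1)
```

Because the weights are a softmax, each output coordinate is a convex combination of the corresponding coordinates of the three input embeddings; a downstream CNN (as in ADWE-CNN) would then consume these combined embeddings for sequence labeling.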
Pages: 144 - 156 (13 pages)