Beyond Word Embeddings: Temporal Representations of Words using Google Trends

Cited: 0
Authors
Haque, Md Enamul [1 ]
Maiti, Aniruddha [2 ]
Tozal, Mehmet Engin [3 ]
Affiliations
[1] Stanford Univ, Spencer Ctr Vis Res, Palo Alto, CA 94303 USA
[2] Temple Univ, Dept Comp & Informat Sci, Philadelphia, PA 19140 USA
[3] Univ Louisiana Lafayette, Sch Comp & Informat, Lafayette, LA 70504 USA
Keywords
DOI
10.1109/ICSC50631.2021.00055
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The essence of word representation in a vector space is to preserve similarity between words. Traditional measures of word similarity retain the contextual or semantic affinity among words. In this study, we propose an alternative word embedding scheme that considers the temporal relationships among words. We employ Google Trends search queries along with their respective time series to represent words in a vector space. Our experiments show that the proposed representation is capable of incorporating temporal context that is otherwise unavailable in conventional word representations.
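The idea in the abstract can be sketched as follows: each word is represented by its search-interest time series, and words whose interest rises and falls together come out close in the resulting vector space. The words and series below are illustrative, not from the paper; real interest-over-time data could be fetched (e.g., via the unofficial pytrends library), but a synthetic example keeps the sketch self-contained.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length time series."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical weekly search-interest series (0-100, the scale Google Trends reports).
# Words with similar temporal behavior end up close in this vector space.
trends = {
    "flu":       [80, 75, 60, 30, 10,  5,  8, 40],
    "cough":     [78, 70, 55, 28, 12,  6, 10, 38],
    "sunscreen": [ 5,  8, 15, 40, 85, 90, 70, 20],
}

print(cosine(trends["flu"], trends["cough"]))      # high: similar seasonality
print(cosine(trends["flu"], trends["sunscreen"]))  # low: opposite seasonality
```

Note that this captures a relationship ("flu" and "cough" peak together) that a purely semantic embedding trained on co-occurrence text would represent differently, which is the temporal context the abstract refers to.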
Pages: 280-287
Page count: 8
Related Papers
50 records
  • [1] Words as a window: Using word embeddings to explore the learned representations of Convolutional Neural Networks
    Dharmaretnam, Dhanush
    Foster, Chris
    Fyshe, Alona
    NEURAL NETWORKS, 2021, 137 : 63 - 74
  • [2] Historical representations of social groups across 200 years of word embeddings from Google Books
    Charlesworth, Tessa E. S.
    Caliskan, Aylin
    Banaji, Mahzarin R.
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2022, 119 (28)
  • [3] Beyond word embeddings: A survey
    Incitti, Francesca
    Urli, Federico
    Snidaro, Lauro
    INFORMATION FUSION, 2023, 89 : 418 - 436
  • [4] Temporal Lecture Video Fragmentation Using Word Embeddings
    Galanopoulos, Damianos
    Mezaris, Vasileios
    MULTIMEDIA MODELING, MMM 2019, PT II, 2019, 11296 : 254 - 265
  • [5] Hash Embeddings for Efficient Word Representations
    Svenstrup, Dan
    Hansen, Jonas Meinertz
    Winther, Ole
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [6] Beyond Word Frequency: Bursts, Lulls, and Scaling in the Temporal Distributions of Words
    Altmann, Eduardo G.
    Pierrehumbert, Janet B.
    Motter, Adilson E.
    PLOS ONE, 2009, 4 (11): : A31 - A37
  • [7] Word Embeddings of Monosemous Words in Dictionary for Word Sense Disambiguation
    Sasaki, Minoru
    SEMAPRO 2018: THE TWELFTH INTERNATIONAL CONFERENCE ON ADVANCES IN SEMANTIC PROCESSING, 2018, : 4 - 7
  • [8] Clustering of Words using Dictionary-Learnt Word Representations
    Menon, Remya R. K.
    Gargi, S.
    Samili, S.
    2016 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATIONS AND INFORMATICS (ICACCI), 2016, : 1539 - 1545
  • [9] Beyond word embeddings: learning entity and concept representations from large scale knowledge bases
    Shalaby, Walid
    Zadrozny, Wlodek
    Jin, Hongxia
    INFORMATION RETRIEVAL JOURNAL, 2019, 22 (06): : 525 - 542