A Low-Dimensional Vector Representation for Words using an Extreme Learning Machine

Cited by: 0
Authors
Lauren, Paula [1 ]
Qu, Guangzhi [1 ]
Huang, Guang-Bin [2 ]
Watta, Paul [3 ]
Lendasse, Amaury [4 ]
Affiliations
[1] Oakland Univ, CSE Dept, Rochester, MI 48063 USA
[2] Nanyang Technol Univ, Sch EEE, Singapore, Singapore
[3] Univ Mich Dearborn, Dept ECE, Dearborn, MI USA
[4] Univ Iowa, Dept MIE, Iowa City, IA USA
Funding
National Science Foundation (USA);
Keywords
MODEL;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Word embeddings are low-dimensional vector representations of words that incorporate context. Two popular methods are word2vec and global vectors (GloVe). Word2vec is a single-hidden-layer feedforward neural network (SLFN) with an auto-encoder influence that computes a word-context matrix, using backpropagation for training. GloVe first computes the word-context matrix and then performs matrix factorization on it to arrive at word embeddings. Backpropagation is the typical training method for SLFNs, but it is time-consuming and requires iterative tuning. Extreme learning machines (ELMs) retain the universal approximation capability of SLFNs while using a randomly generated hidden-layer weight matrix in lieu of backpropagation. In this research, we propose an efficient method for generating word embeddings that applies an ELM-based auto-encoder architecture to a word-context matrix. Word similarity is evaluated using the cosine similarity measure on a dozen selected words, and the results are reported.
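The ELM auto-encoder idea described in the abstract can be sketched as follows: project the word-context matrix through a fixed random hidden layer, solve for the output weights in closed form (a least-squares fit, with no backpropagation), and use the learned output weights to map words into a low-dimensional space. This is a minimal illustrative sketch, not the authors' implementation; the hidden-layer size, activation, and use of the output weights' transpose as the embedding map are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_autoencoder_embeddings(X, n_hidden=50):
    """ELM auto-encoder sketch: random hidden weights, closed-form output weights.

    X is a (n_words, n_contexts) word-context co-occurrence matrix.
    Returns a (n_words, n_hidden) embedding matrix.
    """
    n_contexts = X.shape[1]
    W = rng.standard_normal((n_contexts, n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ X                     # least-squares output weights: H @ beta ~ X
    return X @ beta.T                                # project words via the learned mapping

def cosine_similarity(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy word-context matrix: 100 "words" over 300 context features
X = rng.poisson(1.0, size=(100, 300)).astype(float)
E = elm_autoencoder_embeddings(X, n_hidden=50)
print(E.shape)                                       # (100, 50)
print(cosine_similarity(E[0], E[1]))                 # similarity between two embedded words
```

Because the only fitted quantity is the pseudoinverse solve for `beta`, training cost is a single linear-algebra step rather than many gradient iterations, which is the efficiency argument the abstract makes for ELMs over backpropagation.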
Pages: 1817-1822
Page count: 6
Related Papers
50 records in total
  • [1] Low-Dimensional Vector Representation Learning for Text Visualization Using Task-Oriented Dialogue Dataset
    Hwang T.
    Jung S.
    Roh Y.-H.
    Journal of Computing Science and Engineering, 2022, 16 (03) : 165 - 177
  • [2] Learning as formation of low-dimensional representation spaces
    Edelman, S
    Intrator, N
    PROCEEDINGS OF THE NINETEENTH ANNUAL CONFERENCE OF THE COGNITIVE SCIENCE SOCIETY, 1997, : 199 - 204
  • [3] Learning Low-Dimensional Representation of Bivariate Histogram Data
    Vaiciukynas, Evaldas
    Ulicny, Matej
    Pashami, Sepideh
    Nowaczyk, Slawomir
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2018, 19 (11) : 3723 - 3735
  • [4] Low-dimensional topology, low-dimensional field theory and representation theory
    Fuchs, Juergen
    Schweigert, Christoph
    REPRESENTATION THEORY - CURRENT TRENDS AND PERSPECTIVES, 2017, : 255 - 267
  • [5] An influence maximization algorithm based on low-dimensional representation learning
    Liu, Yuening
    Qiu, Liqing
    Sun, Chengai
    APPLIED INTELLIGENCE, 2022, 52 (14) : 15865 - 15882
  • [6] Low-dimensional Representation of OCT Volumes with Supervised Contrastive Learning
    Marginean, Anca
    Bianca, Vesa
    Nicoara, Simona Delia
    Muntean, George
    2022 IEEE 18TH INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING, ICCP, 2022, : 47 - 54
  • [8] Learning Interpretable Low-dimensional Representation via Physical Symmetry
    Liu, Xuanjie
    Chin, Daniel
    Huang, Yichen
    Xia, Gus
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] Low-Dimensional Representation Learning from Imbalanced Data Streams
    Korycki, Lukasz
    Krawczyk, Bartosz
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2021, PT I, 2021, 12712 : 629 - 641
  • [10] Low-dimensional representation of error covariance
    Tippett, MK
    Cohn, SE
    Todling, R
    Marchesin, D
    TELLUS SERIES A-DYNAMIC METEOROLOGY AND OCEANOGRAPHY, 2000, 52 (05) : 533 - 553