A Low-Dimensional Vector Representation for Words using an Extreme Learning Machine

Cited by: 0
Authors
Lauren, Paula [1 ]
Qu, Guangzhi [1 ]
Huang, Guang-Bin [2 ]
Watta, Paul [3 ]
Lendasse, Amaury [4 ]
Institutions
[1] Oakland Univ, CSE Dept, Rochester, MI 48063 USA
[2] Nanyang Technol Univ, Sch EEE, Singapore, Singapore
[3] Univ Mich Dearborn, Dept ECE, Dearborn, MI USA
[4] Univ Iowa, Dept MIE, Iowa City, IA USA
Funding
U.S. National Science Foundation;
Keywords
MODEL;
DOI
None
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Word embeddings are low-dimensional vector representations of words that incorporate context. Two popular methods are word2vec and global vectors (GloVe). Word2vec is a single-hidden-layer feedforward neural network (SLFN) with an auto-encoder influence that computes a word-context matrix, using backpropagation for training. GloVe computes the word-context matrix first and then performs matrix factorization on that matrix to arrive at word embeddings. Backpropagation is the typical training method for SLFNs, but it is time consuming and requires iterative tuning. Extreme learning machines (ELMs) have the universal approximation capability of SLFNs while using a randomly generated hidden-layer weight matrix in lieu of backpropagation. In this research, we propose an efficient method for generating word embeddings that uses an ELM-based auto-encoder architecture operating on a word-context matrix. Word similarity is evaluated using the cosine similarity measure on a dozen varied words, and the results are reported.
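The abstract's pipeline (word-context matrix → ELM auto-encoder with random hidden weights → closed-form output weights → cosine similarity) can be sketched as below. This is a minimal illustration, not the paper's implementation: the toy matrix is random, and taking the hidden-layer activations as the embeddings is an assumption for demonstration purposes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy word-context co-occurrence matrix: rows = words, columns = context
# features (random here purely for illustration).
vocab = ["king", "queen", "man", "woman"]
X = rng.random((len(vocab), 10))

# ELM auto-encoder: the hidden-layer weights are random and never trained,
# which is what removes the need for backpropagation.
d = 3                                    # embedding dimensionality
W = rng.standard_normal((X.shape[1], d)) # random input-to-hidden weights
b = rng.standard_normal(d)               # random hidden biases
H = np.tanh(X @ W + b)                   # hidden activations, shape (4, 3)

# Output weights solved in closed form by least squares so that H @ beta
# reconstructs X (the auto-encoder objective).
beta, *_ = np.linalg.lstsq(H, X, rcond=None)

# Assumption: use the hidden activations as the low-dimensional embeddings.
embeddings = H

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim = cosine(embeddings[vocab.index("king")], embeddings[vocab.index("queen")])
```

Because the output weights come from a single least-squares solve rather than iterative gradient descent, training cost is dominated by one linear-algebra step, which is the efficiency argument the abstract makes for ELMs.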
Pages: 1817-1822 (6 pages)
Related Papers (50 in total)
  • [31] Low-Dimensional Learning Control using Generic Signal Parametrizations
    Willems, Jeroen
    Kikken, Edward
    Depraetere, Bruno
    IFAC PAPERSONLINE, 2019, 52 (29): : 280 - 285
  • [32] Low-Dimensional Learning for Complex Robots
    O'Flaherty, Rowland
    Egerstedt, Magnus
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2015, 12 (01) : 19 - 27
  • [33] Learning Low-Dimensional Temporal Representations
    Su, Bing
    Wu, Ying
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [34] Learning Low-Dimensional Models of Microscopes
    Debarnot, Valentin
    Escande, Paul
    Mangeat, Thomas
    Weiss, Pierre
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2021, 7 (07) : 178 - 190
  • [35] Estimation of voting behavior in election using support vector machine, extreme learning machine and deep learning
    Tanyildizi, Nural Imik
    Tanyildizi, Harun
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (20): : 17329 - 17342
  • [36] Mixed Multi-relational Representation Learning for Low-Dimensional Knowledge Graph Embedding
    Thanh Le
    Chi Tran
    Bac Le
    INTELLIGENT INFORMATION AND DATABASE SYSTEMS, ACIIDS 2022, PT I, 2022, 13757 : 428 - 441
  • [38] Learning Low-Dimensional Signal Models
    Carin, Lawrence
    Baraniuk, Richard G.
    Cevher, Volkan
    Dunson, David
    Jordan, Michael I.
    Sapiro, Guillermo
    Wakin, Michael B.
    IEEE SIGNAL PROCESSING MAGAZINE, 2011, 28 (02) : 39 - 51
  • [39] Tensor-Based Low-Dimensional Representation Learning for Multi-View Clustering
    Cheng, Miaomiao
    Jing, Liping
    Ng, Michael K.
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (05) : 2399 - 2414
  • [40] Recommendation Model Based on Polarization Relation Representation and Low-Dimensional Data Association Learning
    Cai X.
    Hong T.
    Cao Y.
    Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 2022, 50 (01): : 122 - 131