SeqCondenser: Inductive Representation Learning of Sequences by Sampling Characteristic Functions

Citations: 0
Authors
Chenebaux, Maixent [1 ]
Cazenave, Tristan [2 ]
Affiliations
[1] Vectors Grp, Paris, France
[2] Univ Paris Dauphine PSL, CNRS, LAMSADE, Paris, France
Source
Keywords
DOI
10.1007/978-3-031-70563-2_1
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this work, we introduce SeqCondenser, a neural network layer that compresses a variable-length input sequence into a fixed-size vector representation. The SeqCondenser layer samples the empirical characteristic function and its derivatives for each input dimension, and uses an attention mechanism to determine the associated probability distribution. We argue that the features extracted through this process effectively represent the entire sequence and that the SeqCondenser layer is particularly well-suited for inductive sequence classification tasks, such as text and time series classification. Our experiments show that SCoMo, a SeqCondenser-based architecture, outperforms the state-of-the-art inductive methods on nearly all examined text classification datasets and also outperforms the current best transductive method on one dataset.
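The core operation described in the abstract — sampling an empirical characteristic function under an attention-derived probability distribution to get a fixed-size vector from a variable-length sequence — can be illustrated with a minimal sketch. The function name `weighted_ecf`, the softmax attention, and all shapes below are illustrative assumptions, not the authors' implementation (which also samples derivatives of the characteristic function):

```python
import numpy as np

def weighted_ecf(x, scores, t):
    """Sample a weighted empirical characteristic function (illustrative sketch).

    x      : (n,) real values of one input dimension along the sequence
    scores : (n,) unnormalized attention scores over sequence positions
    t      : (m,) sample points for the characteristic function

    Returns (m,) complex samples of phi(t) = sum_j w_j * exp(i * t * x_j),
    where w = softmax(scores) is the attention-derived distribution.
    """
    w = np.exp(scores - scores.max())
    w /= w.sum()                          # attention weights, sum to 1
    # Outer product t_k * x_j gives an (m, n) grid of phases.
    phases = np.exp(1j * np.outer(t, x))
    return phases @ w                     # (m,) complex samples

# Fixed-size representation: real and imaginary parts at m sample points,
# independent of the sequence length n.
x = np.array([0.5, -1.2, 2.0, 0.3])       # a length-4 sequence
scores = np.zeros(4)                      # uniform attention -> plain ECF
t = np.linspace(0.0, 2.0, 8)
phi = weighted_ecf(x, scores, t)
features = np.concatenate([phi.real, phi.imag])  # length 2*m = 16, fixed
```

Note that `phi[0]` is always exactly 1 (since a characteristic function satisfies phi(0) = 1), so the output size depends only on the number of sample points `m`, not on the sequence length — which is what makes the representation suitable for inductive classification over variable-length inputs.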
Pages: 3-16
Page count: 14