Learning Time Series Counterfactuals via Latent Space Representations

Cited by: 7
Authors
Wang, Zhendong [1 ]
Samsten, Isak [1 ]
Mochaourab, Rami [2 ]
Papapetrou, Panagiotis [1 ]
Affiliations
[1] Stockholm Univ, Stockholm, Sweden
[2] RISE Res Inst Sweden, Stockholm, Sweden
Source
DISCOVERY SCIENCE (DS 2021) | 2021 / Vol. 12986
Keywords
Time series classification; Interpretability; Counterfactual explanations; Deep learning;
DOI
10.1007/978-3-030-88942-5_29
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Counterfactual explanations provide sample-based explanations of the feature changes required to transform an original sample so that its classification flips from an undesired state to a desired one; they thereby make the model interpretable. Previous work on LatentCF presents an algorithm for image data that employs auto-encoder models to transform original samples directly into counterfactuals within a latent space representation. In our paper, we adapt this approach to time series classification and propose an improved algorithm named LatentCF++, which introduces additional constraints in the counterfactual generation process. We conduct an extensive experiment on a total of 40 datasets from the UCR archive, comparing against current state-of-the-art methods. Based on our evaluation metrics, we show that the LatentCF++ framework can, with high probability, generate valid counterfactuals and achieve explanations comparable to the current state-of-the-art. Our proposed approach can also generate counterfactuals that are considerably closer to the decision boundary in terms of margin difference.
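Below is a minimal sketch of the kind of latent-space counterfactual search the abstract describes, assuming pre-trained Keras `encoder`, `decoder`, and `classifier` models; the function name, the validity threshold `tau`, the learning rate, and the loss are illustrative assumptions, not the authors' reference implementation.

```python
# Sketch (not the LatentCF++ reference code): gradient-based perturbation of a
# sample's latent code until the decoded sample is classified as the target class.
import tensorflow as tf

def latent_counterfactual(x, encoder, decoder, classifier,
                          target_class=1, tau=0.5, lr=1e-3, max_iter=1000):
    """Perturb z = encoder(x) until decoder(z) is predicted as `target_class`
    with probability above `tau` (assumed validity threshold)."""
    z = tf.Variable(encoder(x))                       # latent code of the original sample
    optimizer = tf.keras.optimizers.Adam(learning_rate=lr)
    for _ in range(max_iter):
        with tf.GradientTape() as tape:
            x_cf = decoder(z)                         # candidate counterfactual in input space
            prob = classifier(x_cf)[:, target_class]  # predicted probability of the desired class
            loss = tf.reduce_mean((1.0 - prob) ** 2)  # push the prediction toward the target class
        grads = tape.gradient(loss, [z])
        optimizer.apply_gradients(zip(grads, [z]))
        if tf.reduce_min(prob) > tau:                 # stop once the decoded sample crosses
            break                                     # the decision threshold (validity check)
    return decoder(z).numpy()
```

The early-stopping check against `tau` mirrors, under these assumptions, the kind of additional validity constraint that LatentCF++ introduces into the generation loop: optimization halts only once the decoded time series is actually classified as the desired class.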
Pages: 369-384
Page count: 16