Predicting User Preferences of Dimensionality Reduction Embedding Quality

Cited by: 1
Authors
Morariu C. [1 ]
Bibal A. [2 ,3 ]
Cutura R. [1 ]
Frénay B. [3]
Sedlmair M. [1 ]
Affiliations
[1] University of Stuttgart, Germany
[2] Université Catholique de Louvain, Belgium
[3] University of Namur, Belgium
Keywords
Dimensionality reduction; Human-centered computing; Manifold learning
DOI
10.1109/TVCG.2022.3209449
Abstract
A plethora of dimensionality reduction techniques have emerged over the past decades, leaving researchers and analysts with a wide variety of choices for reducing their data, all the more so given that some techniques come with additional hyper-parameters (e.g., t-SNE, UMAP). Recent studies show that people often use dimensionality reduction as a black box, regardless of the specific properties the method preserves. Hence, 2D embeddings are usually evaluated and compared qualitatively: embeddings are set side by side, and human judgment decides which is best. In this work, we propose a quantitative way of evaluating embeddings that nonetheless places human perception at the center. We run a comparative study in which we ask people to select 'good' and 'misleading' views among scatterplots of low-dimensional embeddings of image datasets, simulating the way people usually select embeddings. We use the study data as labels for a set of quality metrics fed to a supervised machine learning model whose purpose is to discover and quantify exactly what people look for when deciding between embeddings. With the model as a proxy for human judgments, we use it to rank embeddings on new datasets, explain why they are relevant, and quantify the degree of subjectivity in people's preferences. © 2022 IEEE.
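The pipeline the abstract describes — compute quality metrics per embedding, train a supervised model on human "good"/"misleading" labels, then rank unseen embeddings with the model — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the metric choice (trustworthiness at two neighborhood sizes), the classifier (a random forest), and the stand-in labels are all assumptions.

```python
# Sketch of the preference-prediction pipeline: metrics as features,
# human labels as targets, model predictions used for ranking.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.manifold import trustworthiness

rng = np.random.default_rng(0)
X, _ = load_digits(return_X_y=True)  # an image dataset, as in the study

def metric_vector(X_high, X_low):
    """Quality metrics used as model features (a small illustrative subset)."""
    return np.array([
        trustworthiness(X_high, X_low, n_neighbors=5),
        trustworthiness(X_high, X_low, n_neighbors=15),
    ])

# Candidate embeddings: a PCA projection plus increasingly noisy versions
# of it, standing in for embeddings from different DR methods/parameters.
base = PCA(n_components=2, random_state=0).fit_transform(X)
candidates = [base + s * rng.standard_normal(base.shape)
              for s in (0.0, 0.5, 2.0, 8.0)]

features = np.array([metric_vector(X, E) for E in candidates])
# Stand-in for the crowdsourced judgments: 1 = "good", 0 = "misleading".
labels = np.array([1, 1, 0, 0])

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features, labels)

# Rank candidates by predicted probability of being judged "good".
scores = model.predict_proba(features)[:, 1]
ranking = np.argsort(scores)[::-1]
```

With the fitted model as a proxy for human judgment, `scores` can be computed for embeddings of entirely new datasets, and feature importances of the classifier indicate which metrics drive the predicted preference.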
Pages: 745-755 (10 pages)
Related Papers
50 in total
  • [1] Unsupervised Adaptive Embedding for Dimensionality Reduction
    Wang, Jingyu
    Xie, Fangyuan
    Nie, Feiping
    Li, Xuelong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (11) : 6844 - 6855
  • [2] Maximal Linear Embedding for Dimensionality Reduction
    Wang, Ruiping
    Shan, Shiguang
    Chen, Xilin
    Chen, Jie
    Gao, Wen
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2011, 33 (09) : 1776 - 1792
  • [3] Dimensionality Reduction for Graph of Words Embedding
    Gibert, Jaume
    Valveny, Ernest
    Bunke, Horst
    GRAPH-BASED REPRESENTATIONS IN PATTERN RECOGNITION, 2011, 6658 : 22 - 31
  • [4] Spline embedding for nonlinear dimensionality reduction
    Xiang, Shiming
    Nie, Feiping
    Zhang, Changshui
    Zhang, Chunxia
    MACHINE LEARNING: ECML 2006, PROCEEDINGS, 2006, 4212 : 825 - 832
  • [5] Dimensionality reduction for graph of words embedding
    Gibert J.
    Valveny E.
    Bunke H.
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2011, 6658 LNCS : 22 - 31
  • [6] Robust jointly sparse embedding for dimensionality reduction
    Lai, Zhihui
    Chen, Yudong
    Mo, Dongmei
    Wen, Jiajun
    Kong, Heng
    NEUROCOMPUTING, 2018, 314 : 30 - 38
  • [7] Word Embedding of Dimensionality Reduction for Document Clustering
    Zhu, Pengyu
    Lang, Qi
    Liu, Xiaodong
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 4371 - 4376
  • [8] Nonlinear dimensionality reduction by locally linear embedding
    Roweis, ST
    Saul, LK
    SCIENCE, 2000, 290 (5500) : 2323 - +
  • [9] Sketching, Embedding, and Dimensionality Reduction for Information Spaces
    Abdullah, Amirali
    Kumar, Ravi
    McGregor, Andrew
    Vassilvitskii, Sergei
    Venkatasubramanian, Suresh
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 51, 2016, 51 : 948 - 956
  • [10] Dimensionality Reduction by Using Sparse Reconstruction Embedding
    Huang, Shaoli
    Cai, Cheng
    Zhang, Yang
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING-PCM 2010, PT II, 2010, 6298 : 167 - 178