Multi-View Graph Matching for 3D Model Retrieval

Cited by: 4
Authors
Su, Yu-Ting [1 ]
Li, Wen-Hui [1 ]
Nie, Wei-Zhi [1 ]
Liu, An-An [1 ]
Affiliation
[1] Tianjin Univ, 92 Weijin Rd, Tianjin 300072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
3D model retrieval; graph matching; unsupervised learning; OBJECT RETRIEVAL; RECOGNITION; CLASSIFICATION; SEARCH;
DOI
10.1145/3387920
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
3D model retrieval has been widely utilized in numerous domains, such as computer-aided design, digital entertainment, and virtual reality. Recently, many graph-based methods have been proposed to address this task by using multi-view information of 3D models. However, these methods are always constrained by many-to-many graph matching for the similarity measure between pairwise models. In this article, we propose a multi-view graph matching method (MVGM) for 3D model retrieval. The proposed method decomposes the complicated multi-view graph-based similarity measure into multiple single-view graph-based similarity measures followed by fusion. First, we present a method for single-view graph generation, and we further propose a novel method for the similarity measure in a single-view graph by leveraging both node-wise context and model-wise context. Then, we propose multi-view fusion with diffusion, which can collaboratively integrate multiple single-view similarities w.r.t. different viewpoints and adaptively learn their weights, to compute the multi-view similarity between pairwise models. In this way, the proposed method avoids the difficulty in the definition and computation of the traditional high-order graph. Moreover, this method is unsupervised and does not require a large-scale 3D dataset for model learning. We conduct evaluations on four popular and challenging datasets. The extensive experiments demonstrate the superiority and effectiveness of the proposed method compared against the state of the art. In particular, this unsupervised method achieves competitive performance against the most recent supervised and deep learning methods.
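As a rough illustration of the pipeline sketched in the abstract, the Python snippet below builds one similarity matrix per viewpoint, applies a simple diffusion step to each, and then fuses the per-view matrices with adaptive weights. It is a minimal sketch assuming each model is described by one feature vector per rendered viewpoint; the cosine-based affinity, the diffusion update, and the standard-deviation weighting heuristic are illustrative stand-ins, not the MVGM formulation of the paper.

import numpy as np

def single_view_affinity(features, view):
    """Pairwise affinity among all models for one shared viewpoint (cosine mapped to [0, 1])."""
    n = len(features)
    s = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            a, b = features[i][view], features[j][view]
            cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
            s[i, j] = 0.5 * (cos + 1.0)            # keep affinities non-negative
    return s

def diffuse(s, alpha=0.9, iters=20):
    """Propagate similarities along the affinity graph (generic diffusion update)."""
    p = s / s.sum(axis=1, keepdims=True)           # row-stochastic transition matrix
    f = s.copy()
    for _ in range(iters):
        f = alpha * (p @ f @ p.T) + (1.0 - alpha) * s
    return f

def fuse_views(features, n_views):
    """Fuse per-viewpoint similarities; views with more spread-out affinities get larger weights."""
    mats = [diffuse(single_view_affinity(features, v)) for v in range(n_views)]
    weights = np.array([m.std() for m in mats])    # crude proxy for view discriminability
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, mats))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_models, n_views, dim = 6, 4, 32
    # Toy data standing in for per-view descriptors of 3D models.
    feats = [[rng.normal(size=dim) for _ in range(n_views)] for _ in range(n_models)]
    sim = fuse_views(feats, n_views)
    print("retrieval order for model 0:", np.argsort(-sim[0]))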
Pages: 20
Related Papers
50 records in total
  • [1] Hierarchical Graph Structure Learning for Multi-View 3D Model Retrieval
    Su, Yuting
    Li, Wenhui
    Liu, Anan
    Nie, Weizhi
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 913 - 919
  • [2] View-Based 3D Model Retrieval via Multi-graph Matching
    Nie, Weizhi
    Liu, Anan
    Hao, Yahui
    Su, Yuting
    NEURAL PROCESSING LETTERS, 2018, 48 (03) : 1395 - 1404
  • [3] 3D Object Retrieval Based on Multi-View Latent Variable Model
    Liu, An-An
    Nie, Wei-Zhi
    Su, Yu-Ting
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2019, 29 (03) : 868 - 880
  • [4] Multi-Modal Clique-Graph Matching for View-Based 3D Model Retrieval
    Liu, An-An
    Nie, Wei-Zhi
    Gao, Yue
    Su, Yu-Ting
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2016, 25 (05) : 2103 - 2116
  • [5] A Compact Multi-View Descriptor for 3D Object Retrieval
    Daras, Petros
    Axenopoulos, Apostolos
    CBMI: 2009 INTERNATIONAL WORKSHOP ON CONTENT-BASED MULTIMEDIA INDEXING, 2009, : 115 - 119
  • [6] 3D model retrieval based on multi-view attentional convolutional neural network
    Liu, An-An
    Zhou, He-Yu
    Li, Meng-Jie
    Nie, Wei-Zhi
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (7-8) : 4699 - 4711
  • [7] Graph-based characteristic view set extraction and matching for 3D model retrieval
    Liu, Anan
    Wang, Zhongyang
    Nie, Weizhi
    Su, Yuting
    INFORMATION SCIENCES, 2015, 320 : 429 - 442
  • [8] Multi-view expressive graph neural networks for 3D CAD model classification
    Li, Shuang
    Corney, Jonathan
    COMPUTERS IN INDUSTRY, 2023, 151