KD-INR: Time-Varying Volumetric Data Compression via Knowledge Distillation-Based Implicit Neural Representation

Cited by: 0
Authors
Han, Jun [1 ,2 ]
Zheng, Hao [3 ,4 ]
Bi, Chongke [5 ]
Affiliations
[1] Chinese Univ Hong Kong, Sch Data Sci, Shenzhen 518172, Peoples R China
[2] Hong Kong Univ Sci & Technol, Hong Kong 999077, Peoples R China
[3] Univ Notre Dame, Notre Dame, IN 46556 USA
[4] Univ Louisiana Lafayette, Sch Comp & Informat, Lafayette, LA 70504 USA
[5] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Time-varying data compression; implicit neural representation; knowledge distillation; volume visualization; MULTILEVEL TECHNIQUES; SUPERRESOLUTION; REDUCTION;
DOI
10.1109/TVCG.2945
CLC Classification
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
Traditional deep learning algorithms assume that all data are available during training, which poses a challenge for large-scale time-varying data. To address this issue, we propose a data reduction pipeline called knowledge distillation-based implicit neural representation (KD-INR) for compressing large-scale time-varying data. The approach consists of two stages: spatial compression and model aggregation. In the first stage, each time step is compressed with an implicit neural representation that uses bottleneck layers and a sampling strategy that preserves features of interest. In the second stage, an offline knowledge distillation algorithm extracts knowledge from the trained per-time-step models and aggregates it into a single model. We evaluated our approach on a variety of time-varying volumetric data sets. Both quantitative and qualitative results, such as PSNR, LPIPS, and rendered images, demonstrate that KD-INR surpasses state-of-the-art approaches, including learning-based (i.e., CoordNet, NeurComp, and SIREN) and lossy compression (i.e., SZ3, ZFP, and TTHRESH) methods, at compression ratios ranging from hundreds to tens of thousands.
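The abstract's second stage (model aggregation) can be illustrated with a minimal offline-distillation sketch: frozen per-time-step "teacher" networks supply target values, and a single "student" network over (x, y, z, t) is trained to reproduce them. Everything here is an illustrative assumption, not the paper's actual architecture or training recipe: the teacher is a stand-in analytic function, and the student is a tiny one-hidden-layer NumPy MLP trained with plain SGD.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher(coords, t):
    # Stand-in for a trained per-time-step INR (hypothetical, for illustration):
    # maps spatial coordinates (x, y, z) at time t to a scalar volume value.
    x, y, z = coords[:, 0], coords[:, 1], coords[:, 2]
    return np.sin(3 * x + t) * np.cos(2 * y) + 0.5 * z

# Student: one-hidden-layer tanh MLP over (x, y, z, t).
W1 = rng.normal(0, 0.5, (4, 64))
b1 = np.zeros(64)
W2 = rng.normal(0, 0.5, (64, 1))
b2 = np.zeros(1)

def student_forward(inp):
    h = np.tanh(inp @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

# Offline distillation: teachers stay frozen; the student fits their outputs
# across all time steps by minimizing mean squared error with SGD.
lr = 0.05
for step in range(4000):
    coords = rng.uniform(-1, 1, (256, 3))
    t = rng.choice([0.0, 0.5, 1.0], size=256)   # three toy "time steps"
    inp = np.column_stack([coords, t])
    target = teacher(coords, t)                 # frozen teacher labels
    h, pred = student_forward(inp)
    err = pred - target
    # Backpropagate the (scaled) MSE gradient through the two layers.
    gW2 = h.T @ err[:, None] / len(err)
    gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)   # tanh' = 1 - tanh^2
    gW1 = inp.T @ dh / len(err)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1

# Evaluate how closely the single student tracks the per-time-step teachers.
coords = rng.uniform(-1, 1, (512, 3))
t = rng.choice([0.0, 0.5, 1.0], size=512)
_, pred = student_forward(np.column_stack([coords, t]))
mse = float(np.mean((pred - teacher(coords, t)) ** 2))
```

After training, the student alone stands in for all time steps, which is the aggregation idea: one compact model replaces the collection of per-time-step representations. The paper's actual method additionally uses bottleneck layers and feature-preserving sampling in stage one, which this sketch omits.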
Pages: 6826-6838
Page count: 13
Related Papers
50 records
  • [1] STSR-INR: Spatiotemporal super-resolution for multivariate time-varying volumetric data via implicit neural representation
    Tang, Kaiyuan
    Wang, Chaoli
    COMPUTERS & GRAPHICS-UK, 2024, 119
  • [2] Adaptive Volumetric Data Compression Based on Implicit Neural Representation
    Yang, Yumeng
    Jiao, Chenyue
    Gao, Xin
    Tian, Xiaoxian
    Bi, Chongke
    17TH INTERNATIONAL SYMPOSIUM ON VISUAL INFORMATION COMMUNICATION AND INTERACTION, VINCI 2024, 2024,
  • [3] ECNR: Efficient Compressive Neural Representation of Time-Varying Volumetric Datasets
    Tang, Kaiyuan
    Wang, Chaoli
    2024 IEEE 17TH PACIFIC VISUALIZATION CONFERENCE, PACIFICVIS, 2024, : 72 - 81
  • [4] Efficient Neural Data Compression for Machine Type Communications via Knowledge Distillation
    Hussien, Mostafa
    Xu, Yi Tian
    Wu, Di
    Liu, Xue
    Dudek, Gregory
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 1169 - 1174
  • [5] A unified framework for exploring time-varying volumetric data based on block correspondence
    Lu, Kecheng
    Wang, Chaoli
    Wu, Keqin
    Gong, Minglun
    Wang, Yunhai
    VISUAL INFORMATICS, 2019, 3 (04) : 157 - 165
  • [6] Real Time Volumetric MRI From Ultra-Sparse Samples Via Implicit Neural Representation Learning
    Yesiloglu, R.
    Shen, L.
    Johansson, A.
    Cao, Y.
    Balter, J.
    Xing, L.
    Liu, L.
    MEDICAL PHYSICS, 2022, 49 (06) : E291 - E291
  • [7] CoordNet: Data Generation and Visualization Generation for Time-Varying Volumes via a Coordinate-Based Neural Network
    Han, Jun
    Wang, Chaoli
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2023, 29 (12) : 4951 - 4963
  • [8] Stability of Inertial Neural Network with Time-Varying Delays Via Sampled-Data Control
    Wang, Jingfeng
    Tian, Lixin
    NEURAL PROCESSING LETTERS, 2019, 50 (02) : 1123 - 1138
  • [9] Synchronization of Switched Neural Networks with Time-varying Delays via Sampled-data Control
    Han, Yuchen
    Lian, Jie
    ASIAN JOURNAL OF CONTROL, 2019, 21 (03) : 1260 - 1269