Role of Mixup in Topological Persistence-Based Knowledge Distillation for Wearable Sensor Data

Cited by: 0
Authors
Jeon, Eun Som [1 ]
Choi, Hongjun [2 ]
Buman, Matthew P. [3 ]
Turaga, Pavan [4 ,5 ]
Affiliations
[1] Seoul Natl Univ Sci & Technol, Dept Comp Sci & Engn, Seoul 01811, South Korea
[2] Lawrence Livermore Natl Lab, Livermore, CA 94550 USA
[3] Arizona State Univ, Coll Hlth Solut, Phoenix, AZ 85004 USA
[4] Arizona State Univ, Geometr Media Lab, Sch Arts Media & Engn, Tempe, AZ 85281 USA
[5] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85281 USA
Funding
US National Institutes of Health; US National Science Foundation
Keywords
Wearable sensors; Sensors; Feature extraction; Training; Data models; Computational modeling; Knowledge engineering; Temperature distribution; Analytical models; Training data; Knowledge distillation (KD); time series; topological persistence; wearable sensor data;
DOI
10.1109/JSEN.2024.3517653
Chinese Library Classification (CLC)
TM (Electrical Engineering); TN (Electronics and Communication Technology)
Discipline classification codes
0808; 0809
Abstract
The analysis of wearable sensor data has enabled many successes in several applications. To represent high-sampling-rate time series with sufficient detail, topological data analysis (TDA) has been considered, and TDA has been found to complement other time-series features. Nonetheless, because extracting topological features through TDA is time-consuming and computationally expensive, it is difficult to deploy topological knowledge in machine learning and various applications. To tackle this problem, knowledge distillation (KD) can be adopted: a technique for model compression and transfer learning that produces a smaller model by transferring knowledge from a larger network. By leveraging multiple teachers in KD, both time-series and topological features can be transferred, and ultimately a superior student that uses only time-series data is distilled. Meanwhile, mixup has become a popular and robust data-augmentation technique for enhancing model performance during training. Mixup and KD employ similar learning strategies: in KD, the student model learns from the smoothed distribution generated by the teacher model, while mixup creates smoothed labels by blending two labels. This shared smoothness is the link that connects the two methods. Although the interplay between mixup and KD has been widely studied, most prior work focuses on image-based analysis only, and it remains to be understood how mixup behaves in KD when incorporating multimodal data, such as both time-series and topological knowledge from wearable sensor data. In this article, we analyze the role of mixup in KD with time series as well as topological persistence, employing multiple teachers. We present a comprehensive analysis of various methods in KD and mixup, supported by empirical results on wearable sensor data.
We observe that applying mixup when training a student in KD improves performance, and we suggest a general set of recommendations for obtaining an enhanced student.
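The shared-smoothness idea described in the abstract can be sketched numerically: mixup blends two one-hot labels into a soft target, while KD trains the student against the teacher's temperature-softened distribution. The snippet below is a minimal illustrative sketch, not the paper's implementation; the Beta parameter `alpha`, the temperature `T`, and the loss weight `w` are assumed values chosen only for demonstration.

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    # Sample lambda ~ Beta(alpha, alpha) and blend both inputs and labels;
    # the blended label ym is the "smoothed" target mixup provides.
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

def softmax(z, T=1.0):
    # Temperature-scaled softmax; larger T yields a smoother distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, mixed_label, T=4.0, w=0.7):
    # Standard KD objective: cross-entropy against the teacher's softened
    # distribution (scaled by T^2) plus cross-entropy against the mixed label.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = -np.sum(p_t * np.log(p_s + 1e-12)) * T * T
    hard = -np.sum(mixed_label * np.log(softmax(student_logits) + 1e-12))
    return w * soft + (1 - w) * hard
```

Both terms in `kd_loss` supervise the student with a smoothed target distribution, which is the common mechanism linking mixup and KD that the article analyzes.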
Pages: 5853-5865
Page count: 13
Related Papers
50 items in total
  • [1] Robustness of topological persistence in knowledge distillation for wearable sensor data
    Jeon, Eun Som
    Choi, Hongjun
    Shukla, Ankita
    Wang, Yuan
    Buman, Matthew P.
    Lee, Hyunglae
    Turaga, Pavan
EPJ DATA SCIENCE, 2024, 13 (01)
  • [2] Topological persistence guided knowledge distillation for wearable sensor data
    Jeon, Eun Som
    Choi, Hongjun
    Shukla, Ankita
    Wang, Yuan
    Lee, Hyunglae
    Buman, Matthew P.
    Turaga, Pavan
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 130
  • [3] Topological Knowledge Distillation for Wearable Sensor Data
    Jeon, Eun Som
    Choi, Hongjun
    Shukla, Ankita
    Wang, Yuan
    Buman, Matthew P.
    Turaga, Pavan
    2022 56TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2022, : 837 - 842
  • [4] Constrained Adaptive Distillation Based on Topological Persistence for Wearable Sensor Data
    Jeon, Eun Som
    Choi, Hongjun
    Shukla, Ankita
    Wang, Yuan
    Buman, Matthew P.
    Turaga, Pavan
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72 : 1 - 14
  • [5] Uncertainty-Aware Topological Persistence Guided Knowledge Distillation on Wearable Sensor Data
    Jeon, Eun Som
    Buman, Matthew P.
    Turaga, Pavan
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (18): : 30413 - 30429
  • [6] Role of Data Augmentation Strategies in Knowledge Distillation for Wearable Sensor Data
    Jeon, Eun Som
    Som, Anirudh
    Shukla, Ankita
    Hasanaj, Kristina
    Buman, Matthew P.
    Turaga, Pavan
    IEEE INTERNET OF THINGS JOURNAL, 2021, 9 (14): : 12848 - 12860
  • [7] Multivariate Data Analysis Using Persistence-Based Filtering and Topological Signatures
    Rieck, Bastian
    Mara, Hubert
    Leitte, Heike
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2012, 18 (12) : 2382 - 2391
  • [8] Understanding the Role of Mixup in Knowledge Distillation: An Empirical Study
    Choi, Hongjun
    Jeon, Eun Som
    Shukla, Ankita
    Turaga, Pavan
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 2318 - 2327
  • [9] Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach
    Park, Hyunseok
    Magee, Christopher L.
    PLOS ONE, 2017, 12 (01):
  • [10] A Study of the Locality of Persistence-Based Queries and Its Implications for the Efficiency of Localized Data Structures
    Klacansky, Pavol
    Gyulassy, Attila
    Bremer, Peer-Timo
    Pascucci, Valerio
    2022 IEEE 15TH PACIFIC VISUALIZATION SYMPOSIUM (PACIFICVIS 2022), 2022, : 121 - 130