Multiple-Level Distillation for Video Fine-Grained Accident Detection

Citations: 0
Authors
Yu, Hongyang [1 ]
Zhang, Xinfeng [2 ]
Wang, Yaowei [1 ]
Huang, Qingming [2 ]
Yin, Baocai [1 ,3 ]
Affiliations
[1] Peng Cheng Lab, Shenzhen 518066, Peoples R China
[2] Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100039, Peoples R China
[3] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Video accident detection; fine-grained accident detection; knowledge distillation; multiple-level distillation; EVENT DETECTION;
DOI
10.1109/TCSVT.2023.3338743
CLC number
TM [Electrotechnics]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
Accident detection in surveillance or dashcam videos is a common task in video-based traffic accident analysis. However, because accidents occur sparsely and randomly in the real world, recorded accident data are far scarcer than the training data available for standard detection tasks such as object detection or instance detection. Moreover, the limited yet diverse accident data make it difficult to model accident patterns for fine-grained accident detection, which analyzes accidents in detail. Extra prior information should therefore be introduced, such as general vision features, which provide relatively effective information for many vision tasks. Large models can produce such general vision features by training on abundant data at the cost of considerable computing time and resources, and even though accident video data are domain-specific, large models can still extract useful general vision features from them. Thus, in this paper, we apply knowledge distillation to fine-grained accident detection, which analyzes the spatio-temporal existence and severity of accidents, in order to reduce computational complexity (by distilling into a small model) while maintaining good performance under limited accident data. Knowledge distillation provides extra general vision feature information from a pre-trained large model. Conventional knowledge distillation guides the student network to learn the same representations as the teacher network through logit mimicking or feature imitation. However, such single-level distillation focuses on only one aspect, mimicking either the classification logits or the deep features. Fine-grained accident detection requires multiple tasks with different focuses, such as multi-class accident classification, temporal-spatial accident region detection, and accident severity estimation. Thus, in this paper, multiple-level distillation is proposed for the different modules to generate a unified video feature covering all tasks in fine-grained accident detection analysis.
Extensive experimental results on a fine-grained accident detection dataset, which provides more detailed annotations of accidents, demonstrate that our method can effectively model video features for multiple tasks.
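As a hedged illustration of the distillation terms the abstract contrasts, the sketch below combines logit mimicking (KL divergence on temperature-softened logits) with feature imitation (MSE on intermediate features) into one multi-level loss. The function names, the weighting scheme, and the NumPy formulation are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(z, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def logit_distill_loss(teacher_logits, student_logits, temperature=2.0):
    """Logit mimicking: KL(teacher || student) on softened class distributions."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1).mean()
    return float(kl * temperature * temperature)

def feature_distill_loss(teacher_feat, student_feat):
    """Feature imitation: MSE between teacher and student intermediate features."""
    return float(((teacher_feat - student_feat) ** 2).mean())

def multi_level_loss(t_logits, s_logits, t_feat, s_feat, alpha=0.5):
    """Illustrative multi-level objective: weighted sum of both distillation terms."""
    return (alpha * logit_distill_loss(t_logits, s_logits)
            + (1.0 - alpha) * feature_distill_loss(t_feat, s_feat))
```

A single-level scheme would keep only one of the two terms; the multi-level idea in the abstract is that different modules (classification, region detection, severity estimation) each receive the supervision signal that matches their task.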
Pages: 4445-4457
Page count: 13