ZSTAD: Zero-Shot Temporal Activity Detection

Citations: 19
Authors
Zhang, Lingling [1 ,2 ]
Chang, Xiaojun [3 ]
Liu, Jun [1 ,4 ]
Luo, Minnan [1 ,4 ]
Wang, Sen [5 ]
Ge, Zongyuan [3 ]
Hauptmann, Alexander [6 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Comp Sci & Technol, Xian, Peoples R China
[2] Minist Educ, Key Lab Intelligent Networks & Network Secur, Xian, Peoples R China
[3] Monash Univ, Fac Informat Technol, Melbourne, Vic, Australia
[4] Xi An Jiao Tong Univ, Natl Engn Lab Big Data Analyt, Xian, Peoples R China
[5] Univ Queensland, Sch Informat Technol & Elect Engn, Brisbane, Qld, Australia
[6] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
Funding
Australian Research Council; National Natural Science Foundation of China;
DOI
10.1109/CVPR42600.2020.00096
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
An integral part of video analysis and surveillance is temporal activity detection, which aims to simultaneously recognize and localize activities in long untrimmed videos. Currently, the most effective temporal activity detection methods are based on deep learning, and they typically perform very well when trained on large-scale annotated videos. However, these methods are limited in real applications because videos of certain activity classes may be unavailable and data annotation is time-consuming. To address this challenging problem, we propose a novel task setting called zero-shot temporal activity detection (ZSTAD), in which activities that have never been seen during training can still be detected. We design an end-to-end deep network based on R-C3D as the architecture for this solution. The proposed network is optimized with an innovative loss function that considers the embeddings of activity labels and their superclasses while learning the common semantics of seen and unseen activities. Experiments on both the THUMOS'14 and Charades datasets show promising performance in detecting unseen activities.
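The zero-shot mechanism the abstract describes, matching visual features against semantic embeddings of activity labels so that unseen classes remain detectable, can be illustrated with a minimal sketch. The label names, embedding values, and feature dimensions below are illustrative assumptions, not the paper's actual R-C3D features or learned projections.

```python
import numpy as np

def zero_shot_scores(segment_feature, label_embeddings):
    """Score one video-segment feature against each activity-label embedding
    by cosine similarity. Because unseen labels live in the same semantic
    embedding space as seen ones, the highest-scoring label can be a class
    never observed during training."""
    f = segment_feature / np.linalg.norm(segment_feature)
    E = label_embeddings / np.linalg.norm(label_embeddings, axis=1, keepdims=True)
    return E @ f  # one cosine score per label

# Toy example: three activity labels embedded in a 4-d semantic space.
labels = ["long_jump", "high_jump", "billiards"]
embeddings = np.array([
    [0.9, 0.1, 0.0, 0.1],   # long_jump
    [0.8, 0.3, 0.1, 0.0],   # high_jump (semantically close to long_jump)
    [0.0, 0.1, 0.9, 0.2],   # billiards
])
# A visual feature already projected into the label-embedding space.
segment = np.array([0.85, 0.2, 0.05, 0.05])

scores = zero_shot_scores(segment, embeddings)
print(labels[int(np.argmax(scores))])  # nearest label in embedding space
```

In the full method, the projection from raw video features into this space is learned end-to-end, and the loss additionally pulls labels toward their superclass embeddings so that seen and unseen activities share common semantics.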
Pages: 876-885
Page count: 10
Related Papers
50 records in total
  • [21] Synthetic Feature Assessment for Zero-Shot Object Detection
    Dai, Xinmiao
    Wang, Chong
    Li, Haohe
    Lin, Sunqi
    Dong, Li
    Wu, Jiafei
    Wang, Jun
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 444 - 449
  • [22] Zero-Shot Stance Detection via Contrastive Learning
    Liang, Bin
    Chen, Zixiao
    Gui, Lin
    He, Yulan
    Yang, Min
    Xu, Ruifeng
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 2738 - 2747
  • [23] Unknown Attack Detection Based on Zero-Shot Learning
    Zhang, Zhun
    Liu, Qihe
    Qiu, Shilin
    Zhou, Shijie
    Zhang, Cheng
    IEEE ACCESS, 2020, 8 : 193981 - 193991
  • [24] ZERO-SHOT DETECTION WITH TRANSFERABLE OBJECT PROPOSAL MECHANISM
    Shao, Yilan
    Li, Yanan
    Wang, Donghui
    2019 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2019, : 3666 - 3670
  • [25] Knowledge Enhanced Zero-Shot Visual Relationship Detection
    Ding, Nan
    Lai, Yong
    Liu, Jie
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT III, KSEM 2024, 2024, 14886 : 3 - 15
  • [26] Zero-Shot Detection of Machine-Generated Codes
    Yang, Xianjun
    Zhang, Kexun
    Chen, Haifeng
    Petzold, Linda
    Wang, William Yang
    Cheng, Wei
    arXiv, 2023,
  • [27] Zero-Shot Detection of AI-Generated Images
    Cozzolino, Davide
    Poggi, Giovanni
    Niessner, Matthias
    Verdoliva, Luisa
    COMPUTER VISION-ECCV 2024, PT XVIII, 2025, 15076 : 54 - 72
  • [28] Incrementally Zero-Shot Detection by an Extreme Value Analyzer
    Zheng, Sixiao
    Fu, Yanwei
    Hou, Yanxi
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 8992 - 8999
  • [29] Zero-Shot Temporal Action Detection by Learning Multimodal Prompts and Text-Enhanced Actionness
    Raza, Asif
    Yang, Bang
    Zou, Yuexian
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (11) : 11000 - 11012
  • [30] Zero-Shot Anomaly Detection via Batch Normalization
    Li, Aodong
    Qiu, Chen
    Kloft, Marius
    Smyth, Padhraic
    Rudolph, Maja
    Mandt, Stephan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,