Concept Drift Detection for Multivariate Data Streams and Temporal Segmentation of Daylong Egocentric Videos

Citations: 6
Authors
Nagar, Pravin [1 ]
Khemka, Mansi [2 ]
Arora, Chetan [3 ]
Affiliations
[1] Indraprastha Inst Informat Technol Delhi, Delhi, India
[2] Columbia Univ, New York, NY 10027 USA
[3] Indian Inst Technol Delhi, Delhi, India
Keywords
Temporal segmentation; Concept drift detection; Egocentric video; Multivariate data; Long videos
DOI
10.1145/3394171.3413713
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The long and unconstrained nature of egocentric videos makes temporal segmentation an essential pre-processing step for many higher-level inference tasks. Activities of the wearer in an egocentric video typically span hours and are often separated by slow, gradual changes. Furthermore, the change of camera viewpoint due to the wearer's head motion causes frequent and extreme, but spurious, scene changes. The continuous nature of boundaries makes it difficult to apply traditional Markov Random Field (MRF) pipelines relying on temporal discontinuity, whereas deep Long Short-Term Memory (LSTM) networks gather context only up to a few hundred frames, rendering them ineffective for egocentric videos. In this paper, we present a novel unsupervised temporal segmentation technique especially suited for day-long egocentric videos. We formulate the problem as detecting concept drift in a time-varying, non-i.i.d. sequence of frames. Statistically bounded thresholds are calculated to detect concept drift between two temporally adjacent multivariate data segments with different underlying distributions, while establishing guarantees on false positives. Since the derived threshold indicates confidence in the prediction, it can also be used to control the granularity of the output segmentation. Using our technique, we report significantly improved state-of-the-art F-measures for day-long egocentric video datasets, as well as photostream datasets derived from them: HUJI (73.01%, 59.44%), UTEgo (58.41%, 60.61%) and Disney (67.63%, 68.83%).
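The abstract's central mechanism, comparing two temporally adjacent multivariate segments and declaring a boundary only when their statistics differ by more than a threshold carrying a false-positive guarantee, can be sketched as below. This is an illustrative sketch only, not the authors' actual statistic: the Hoeffding-style bound, the [0, 1] feature normalisation, the union bound over dimensions, the window sizes, and the `delta` parameter are all assumptions made here for the example.

```python
# Illustrative sketch (not the paper's exact method): flag concept drift between two
# temporally adjacent windows of per-frame feature vectors by thresholding the gap
# between their window means with a Hoeffding-style bound, so that false positives
# occur with probability at most `delta` under the no-drift hypothesis. Features are
# assumed independent and normalised to [0, 1] per dimension.
import numpy as np


def hoeffding_threshold(n_ref: int, n_cur: int, delta: float) -> float:
    """Per-dimension bound on |mean(ref) - mean(cur)| that holds with probability
    >= 1 - delta when both windows come from the same distribution."""
    harmonic = 1.0 / n_ref + 1.0 / n_cur
    return float(np.sqrt(0.5 * harmonic * np.log(2.0 / delta)))


def detect_drift(reference: np.ndarray, current: np.ndarray, delta: float = 0.05) -> bool:
    """Return True if the two adjacent segments appear to have different distributions.

    reference, current : arrays of shape (n_frames, n_features), values in [0, 1].
    delta              : tolerated false-positive probability; a smaller delta raises
                         the threshold and yields a coarser segmentation.
    """
    n_features = reference.shape[1]
    gap = np.abs(reference.mean(axis=0) - current.mean(axis=0))   # per-dimension mean gap
    eps = hoeffding_threshold(len(reference), len(current),
                              delta / n_features)                 # union bound over dimensions
    return bool(np.any(gap > eps))                                # drift if any dimension exceeds it


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seg_a  = rng.uniform(0.0, 0.4, size=(600, 16))   # features of one activity
    seg_a2 = rng.uniform(0.0, 0.4, size=(600, 16))   # another window of the same activity
    seg_b  = rng.uniform(0.5, 0.9, size=(600, 16))   # features after an activity change
    print(detect_drift(seg_a, seg_a2))  # expected False: same underlying distribution
    print(detect_drift(seg_a, seg_b))   # expected True: drift, i.e. a segment boundary
```

As the abstract notes, the confidence parameter doubles as a granularity control: tightening delta raises the threshold, so only larger distributional changes are reported as segment boundaries.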
Pages: 1065-1074
Page count: 10
Related Papers
50 records in total
  • [1] Temporal Segmentation of Egocentric Videos
    Poleg, Yair
    Arora, Chetan
    Peleg, Shmuel
    2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 2537 - 2544
  • [2] Handling Concept Drift in Data Streams by Using Drift Detection Methods
    Patil, Malini M.
    DATA MANAGEMENT, ANALYTICS AND INNOVATION, ICDMAI 2018, VOL 2, 2019, 839 : 155 - 166
  • [3] On learning guarantees to unsupervised concept drift detection on data streams
    de Mello, Rodrigo F.
    Vaz, Yule
    Grossi, Carlos H.
    Bifet, Albert
    EXPERT SYSTEMS WITH APPLICATIONS, 2019, 117 : 90 - 102
  • [4] Nacre: Proactive Recurrent Concept Drift Detection in Data Streams
    Wu, Ocean
    Koh, Yun Sing
    Dobbie, Gillian
    Lacombe, Thomas
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [5] Concept drift robust adaptive novelty detection for data streams
    Cejnek, Matous
    Bukovsky, Ivo
    NEUROCOMPUTING, 2018, 309 : 46 - 53
  • [6] Online Clustering for Novelty Detection and Concept Drift in Data Streams
    Garcia, Kemilly Dearo
    Poel, Mannes
    Kok, Joost N.
    de Carvalho, Andre C. P. L. F.
    PROGRESS IN ARTIFICIAL INTELLIGENCE, PT II, 2019, 11805 : 448 - 459
  • [7] Classification of concept drift data streams
    Padmalatha, E.
    Reddy, C. R. K.
    Rani, B. Padmaja
    2014 INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND APPLICATIONS (ICISA), 2014
  • [8] Intrusion detection in the IoT data streams using concept drift localization
    Chu, Renjie
    Jin, Peiyuan
    Qiao, Hanli
    Feng, Quanxi
    AIMS MATHEMATICS, 2024, 9 (01) : 1535 - 1561
  • [9] Accumulating regional density dissimilarity for concept drift detection in data streams
    Liu, Anjin
    Lu, Jie
    Liu, Feng
    Zhang, Guangquan
    PATTERN RECOGNITION, 2018, 76 : 256 - 272
  • [10] A Multiscale Concept Drift Detection Method for Learning from Data Streams
    Wang, XueSong
    Kang, Qi
    Zhou, MengChu
    Yao, SiYa
    2018 IEEE 14TH INTERNATIONAL CONFERENCE ON AUTOMATION SCIENCE AND ENGINEERING (CASE), 2018, : 786 - 790