A Smart Data Annotation Tool for Multi-Sensor Activity Recognition

Cited by: 0
Authors
Diete, Alexander [1 ]
Sztyler, Timo [1 ]
Stuckenschmidt, Heiner [1 ]
Institution
[1] Univ Mannheim, Mannheim, Germany
Keywords
DOI
Not available
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Annotation of multimodal data sets is often a time-consuming and challenging task, as many approaches require accurate labeling. This holds in particular for video recordings, where frame-exact labels are often required. For that purpose, we created an annotation tool for data sets that combine video and inertial sensor data. In contrast to most existing approaches, however, we focus on semi-supervised labeling support to infer labels for the whole data set. More precisely, after a small set of instances has been labeled, our system is able to provide labeling recommendations, which in turn makes learning image features more feasible by reducing the labeling time for single frames. We rely on the inertial sensors of a wristband to support the labeling of the video recordings. To this end, we apply template matching in the context of dynamic time warping to identify the time intervals of certain actions. To investigate the feasibility of our approach, we focus on a real-world scenario: we gathered a data set describing an order-picking scenario at a logistics company. In this context, we focus on the picking process, as the selection of the correct items can be prone to errors. Preliminary results show that we are able to identify 69% of the grabbing-motion periods.
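The DTW-based template matching described above can be illustrated with a minimal sketch: a labeled motion segment serves as a template, and a window is slid across the sensor stream, flagging intervals whose DTW cost falls below a threshold. The function names, the 1-D signal, and the threshold parameter are illustrative assumptions, not the authors' implementation.

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW cost between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]


def match_template(signal, template, threshold):
    """Slide a template-sized window across the signal and return the
    start indices whose DTW cost to the template is below the threshold."""
    w = len(template)
    hits = []
    for start in range(len(signal) - w + 1):
        if dtw_distance(signal[start:start + w], template) < threshold:
            hits.append(start)
    return hits
```

In practice one would run this per sensor axis (or on a magnitude signal) and merge overlapping hits into candidate intervals, which the annotator then confirms or rejects in the tool.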
Pages: 6
Related Papers (50 total)
  • [1] Smart Annotation Tool for Multi-sensor Gait-based Daily Activity Data
    Martindale, Christine F.
    Roth, Nils
    Hannink, Julius
    Sprager, Sebastijan
    Eskofier, Bjoern M.
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS WORKSHOPS (PERCOM WORKSHOPS), 2018,
  • [2] Deep Neural Networks for Activity Recognition with Multi-Sensor Data in a Smart Home
    Park, Jiho
    Jang, Kiyoung
    Yang, Sung-Bong
    [J]. 2018 IEEE 4TH WORLD FORUM ON INTERNET OF THINGS (WF-IOT), 2018, : 155 - 160
  • [3] A Multi-Sensor Setting Activity Recognition Simulation Tool
    Takeda, Shingo
    Okita, Tsuyoshi
    Lago, Paula
    Inoue, Sozo
    [J]. PROCEEDINGS OF THE 2018 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING AND PROCEEDINGS OF THE 2018 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS (UBICOMP/ISWC'18 ADJUNCT), 2018, : 1444 - 1448
  • [4] Random k-Labelsets Method for Human Activity Recognition with Multi-Sensor Data in Smart Home
    Jethanandani, Manan
    Perumal, Thinagaran
    Sharma, Abhishek
    [J]. 2019 IEEE 16TH INDIA COUNCIL INTERNATIONAL CONFERENCE (IEEE INDICON 2019), 2019,
  • [5] Multi-sensor Based Gestures Recognition with a Smart Finger Ring
    Roshandel, Mehran
    Munjal, Aarti
    Moghadam, Peyman
    Tajik, Shahin
    Ketabdar, Hamed
    [J]. HUMAN-COMPUTER INTERACTION: ADVANCED INTERACTION MODALITIES AND TECHNIQUES, PT II, 2014, 8511 : 316 - 324
  • [6] Classification Model for Multi-Sensor Data Fusion Apply for Human Activity Recognition
    Arnon, Paranyu
    [J]. 2014 INTERNATIONAL CONFERENCE ON COMPUTER, COMMUNICATIONS, AND CONTROL TECHNOLOGY (I4CT), 2014, : 415 - 419
  • [7] Emotion-relevant activity recognition based on smart cushion using multi-sensor fusion
    Gravina, Raffaele
    Li, Qimeng
    [J]. INFORMATION FUSION, 2019, 48 : 1 - 10
  • [8] Multi-Sensor Fusion for Activity Recognition-A Survey
    Aguileta, Antonio A.
    Brena, Ramon F.
    Mayora, Oscar
    Molino-Minero-Re, Erik
    Trejo, Luis A.
    [J]. SENSORS, 2019, 19 (17)
  • [9] A System for Activity Recognition Using Multi-Sensor Fusion
    Gao, Lei
    Bourke, Alan K.
    Nelson, John
    [J]. 2011 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2011, : 7869 - 7872
  • [10] Developing smart multi-sensor monitoring for tool wear in stamping process
    Shanbhag, V. V.
    Pereira, M. P.
    Voss, B.
    Ubhayaratne, I.
    Rolfe, B. F.
    [J]. 38TH INTERNATIONAL DEEP DRAWING RESEARCH GROUP ANNUAL CONFERENCE (IDDRG 2019), 2019, 651