InHARD - Industrial Human Action Recognition Dataset in the Context of Industrial Collaborative Robotics

Cited by: 18
Authors
Dallel, Mejdi [1 ]
Havard, Vincent [2 ]
Baudry, David [2 ]
Savatier, Xavier [3 ]
Affiliations
[1] Univ Rouen Normandy, CESI, LINEACT Lab, Rouen, France
[2] CESI, LINEACT Lab, Rouen, France
[3] ESIGELEC, IRSEEM Lab, Rouen, France
Keywords
Human Action Recognition; Dataset; Deep Learning; Human-Robot Collaboration (HRC); Industry 4.0; RGB+D; Skeleton; LSTM; RNN
DOI
10.1109/ichms49158.2020.9209531
Chinese Library Classification
TP3 [Computing Technology, Computer Technology]
Discipline code
0812
Abstract
Nowadays, humans and robots are working ever more closely together, which increases business productivity and product quality, leading to efficiency and growth. However, human-robot collaboration remains rather static: robots move to a predefined position, then humans perform their tasks while being assisted by the robots. To achieve dynamic collaboration, robots need to understand human intentions and learn to recognize the actions being performed, thereby complementing human capabilities and relieving workers of arduous tasks. Consequently, there is a need for a human action recognition dataset suited to machine learning algorithms. Currently available depth-based and RGB+D+S-based human action recognition datasets have several limitations, including the lack of training samples with distinct class labels, limited camera views, limited diversity of subjects, and, most importantly, the absence of actual industrial human actions performed in an industrial environment; existing action recognition datasets cover simple daily, mutual, or health-related actions. Therefore, in this paper we introduce an RGB+S dataset named the "Industrial Human Action Recognition Dataset" (InHARD), recorded in a real-world setting for industrial human action recognition, with over 2 million frames collected from 16 distinct subjects. The dataset contains 13 different industrial action classes and over 4800 action samples. Its introduction should enable the study and development of various learning techniques for human action analysis inside industrial environments involving human-robot collaboration.
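The keywords list LSTM and RNN among the studied techniques. As a rough illustration of how a skeleton-based baseline for this kind of data could be set up, the sketch below defines a minimal LSTM sequence classifier in PyTorch. The joint count, clip length, and all module and variable names are assumptions made for illustration; only the 13 action classes come from the abstract, and this is not the authors' reference implementation.

```python
import torch
import torch.nn as nn

# Assumed skeleton layout: NUM_JOINTS is a placeholder, not specified in the record.
NUM_JOINTS = 17
# 13 industrial action classes, as reported in the abstract.
NUM_CLASSES = 13

class SkeletonLSTM(nn.Module):
    """Minimal skeleton-sequence classifier sketch (hypothetical baseline)."""
    def __init__(self, num_joints=NUM_JOINTS, hidden=128, num_classes=NUM_CLASSES):
        super().__init__()
        # Each frame is flattened to (x, y, z) coordinates per joint.
        self.lstm = nn.LSTM(input_size=num_joints * 3, hidden_size=hidden,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):              # x: (batch, frames, joints * 3)
        _, (h_n, _) = self.lstm(x)     # keep the final hidden state
        return self.head(h_n[-1])      # class logits, shape (batch, num_classes)

if __name__ == "__main__":
    model = SkeletonLSTM()
    clips = torch.randn(4, 120, NUM_JOINTS * 3)  # 4 dummy clips of 120 frames
    logits = model(clips)
    print(logits.shape)                          # torch.Size([4, 13])
```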
Pages: 393 - 398
Number of pages: 6
Related papers
50 records in total
  • [41] Enhancing flexibility and safety: collaborative robotics for material handling in end-of-line industrial operations
    Dmytriyev, Yevheniy
    Carnevale, Marco
    Giberti, Hermes
    [J]. 5TH INTERNATIONAL CONFERENCE ON INDUSTRY 4.0 AND SMART MANUFACTURING, ISM 2023, 2024, 232 : 2588 - 2597
  • [42] Online human motion analysis in industrial context: A review
    Benmessabih, Toufik
    Slama, Rim
    Havard, Vincent
    Baudry, David
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 131
  • [43] Characteristic Kernels on Structured Domains Excel in Robotics and Human Action Recognition
    Danafar, Somayeh
    Gretton, Arthur
    Schmidhuber, Juergen
    [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT I: EUROPEAN CONFERENCE, ECML PKDD 2010, 2010, 6321 : 264 - 279
  • [44] ACTION RESEARCH INTO THE USE OF PARAMETRIC ASSOCIATIVE CAD SYSTEMS IN AN INDUSTRIAL CONTEXT
    Salehi, Vahid
    McMahon, Chris
    [J]. ICED 09 - THE 17TH INTERNATIONAL CONFERENCE ON ENGINEERING DESIGN, VOL 5: DESIGN METHODS AND TOOLS, PT 1, 2009, : 133 - +
  • [45] Semi-Supervised Learning Approach for Fine Grained Human Hand Action Recognition in Industrial Assembly
    Sturm F.
    Sathiyababu R.
    Hergenroether E.
    Siegel M.
    [J]. Computer Science Research Notes, 2023, 31 (1-2): 340 - 350
  • [46] Human-robot collaborative interaction with human perception and action recognition
    Yu, Xinyi
    Zhang, Xin
    Xu, Chengjun
    Ou, Linlin
    [J]. NEUROCOMPUTING, 2024, 563
  • [47] Cross-Modal Analysis of Human Detection for Robotics: An Industrial Case Study
    Linder, Timm
    Vaskevicius, Narunas
    Schirmer, Robert
    Arras, Kai O.
    [J]. 2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 971 - 978
  • [48] Motion Context: A New Representation for Human Action Recognition
    Zhang, Ziming
    Hu, Yiqun
    Chan, Syin
    Chia, Liang-Tien
    [J]. COMPUTER VISION - ECCV 2008, PT IV, PROCEEDINGS, 2008, 5305 : 817 - 829
  • [49] Digital twin of an industrial workstation: A novel method of an auto-labeled data generator using virtual reality for human action recognition in the context of human-robot collaboration
    Dallel, Mejdi
    Havard, Vincent
    Dupuis, Yohan
    Baudry, David
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 118
  • [50] Context-Aware Cyber-Physical Assistance Systems in Industrial Systems: A Human Activity Recognition Approach
    Roth, Elisa
    Monks, Mirco
    Bohne, Thomas
    Pumplun, Luisa
    [J]. PROCEEDINGS OF THE 2020 IEEE INTERNATIONAL CONFERENCE ON HUMAN-MACHINE SYSTEMS (ICHMS), 2020, : 217 - 222