Towards cognition-augmented human-centric assembly: A visual computation perspective

Times Cited: 0
Authors
Pang, Jiazhen [1 ]
Zheng, Pai [1 ]
Fan, Junming [1 ]
Liu, Tianyuan [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Ind & Syst Engn, Hong Kong, Peoples R China
Keywords
Cognitive assistance; Human-centric assembly; Computer vision; Metaverse; Cloud service; Large language model; Brain-computer interface; PARTS RECOGNITION; SYSTEM; REGISTRATION; SIMULATION; MACHINE; REALITY
DOI
10.1016/j.rcim.2024.102852
Chinese Library Classification (CLC)
TP39 [Computer applications]
Discipline classification codes
081203; 0835
Abstract
Human-centric assembly is emerging as a promising paradigm for achieving mass personalization in the context of Industry 5.0, as it fully capitalizes on human flexibility supported by robot assistance. However, in small-batch, highly customized assembly tasks, frequent changes in production procedures pose significant cognitive challenges. Leveraging computer vision technology to augment human cognition offers a feasible way to address this. This review therefore explores the cognitive characteristics of human beings and classifies existing computer vision technologies in a manner that supports a discussion of the future development of cognition-augmented human-centric assembly. The concept of cognition-augmented assembly is first proposed based on the brain's functional structure, namely the frontal, parietal, temporal, and occipital lobes. Corresponding to these brain regions, cognitive issues in spatiality, memory, knowledge, and decision-making are summarized. Studies on visual computation for assembly published between 2014 and 2023 are categorized into four groups aimed at addressing these cognitive challenges: position registration, multi-layer recognition, contextual perception, and mixed-reality fusion. The applications and limitations of current computer vision technology are discussed. Furthermore, considering rapidly evolving technologies such as the metaverse, cloud services, large language models, and brain-computer interfaces, future trends in computer vision are outlined for augmenting human cognition with respect to these cognitive issues.
Pages: 21