Efficient and Lightweight Visual Tracking with Differentiable Neural Architecture Search

Cited by: 0
Authors
Gao, Peng [1 ,2 ]
Liu, Xiao [1 ]
Sang, Hong-Chuan [1 ]
Wang, Yu [1 ,3 ]
Wang, Fei [4 ]
Affiliations
[1] Qufu Normal Univ, Sch Cyber Sci & Engn, Qufu 273165, Peoples R China
[2] Yuntian Grp, Dezhou 253700, Peoples R China
[3] Qufu Normal Univ, Network & Informat Ctr, Qufu 273165, Peoples R China
[4] Harbin Inst Technol, Sch Elect & Informat Engn, Shenzhen 518055, Peoples R China
Funding
China Postdoctoral Science Foundation
Keywords
visual tracking; deep learning; neural networks; neural architecture search; object tracking; network
DOI
10.3390/electronics12173623
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Over the last decade, Siamese network architectures have emerged as the dominant tracking paradigm and have driven significant progress. These architectures consist of a backbone network and a head network. The backbone comprises two identical feature-extraction sub-branches, one for the target template and one for the search candidate. The head network takes both the template and candidate features as inputs and produces a local similarity score for the target object at each location of the search candidate. Despite the promising results attained in visual tracking, developing efficient and lightweight models remains challenging due to the inherent complexity of the task. Specifically, manual model design relies heavily on the knowledge and experience of domain experts, and existing tracking approaches achieve excellent performance at the cost of large numbers of parameters and vast amounts of computation. A novel Siamese tracking approach called TrackNAS, based on neural architecture search, is proposed to reduce the complexity of the neural architecture applied in visual tracking. First, following the Siamese network principle, backbone and head search spaces are constructed, which together constitute the search space for the network architecture. Next, under given resource constraints, a network architecture that meets the tracking performance requirements is obtained by optimizing a hybrid search strategy that combines distributed and joint approaches. Then, an evolutionary method is used to lighten the searched architecture, facilitating deployment to devices with resource (FLOPs) constraints. Finally, to verify the performance of TrackNAS, comparison and ablation experiments are conducted on several large-scale visual tracking benchmarks, including OTB100, VOT2018, UAV123, LaSOT, and GOT-10k.
The results indicate that the proposed TrackNAS achieves competitive accuracy and robustness, while its parameter count and computational cost are far smaller than those of other advanced Siamese trackers, meeting the requirements for lightweight deployment on resource-constrained devices.
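The abstract describes the head network as producing a local similarity score for the target at each location of the search candidate. A minimal sketch of such a cross-correlation head is shown below; the shapes, function name, and naive sliding-window loop are illustrative choices of this note, not details taken from the paper (real trackers compute this as a convolution over deep backbone features).

```python
import numpy as np

def siamese_response_map(template_feat, search_feat):
    """Naive cross-correlation head: slide the template feature over the
    search feature and record a similarity score at each spatial location.

    template_feat: (C, th, tw) feature map of the target template.
    search_feat:   (C, sh, sw) feature map of the search candidate.
    Returns a (sh - th + 1, sw - tw + 1) response map whose peak marks
    the most likely target location.
    """
    C, th, tw = template_feat.shape
    _, sh, sw = search_feat.shape
    out_h, out_w = sh - th + 1, sw - tw + 1
    response = np.empty((out_h, out_w))
    for y in range(out_h):
        for x in range(out_w):
            # Inner product between the template and the current window.
            window = search_feat[:, y:y + th, x:x + tw]
            response[y, x] = np.sum(window * template_feat)
    return response
```

In practice both feature maps come from the shared backbone branches, and the argmax of the response map gives the predicted target position in the search region.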
Pages: 21