EfficientWord-Net: An Open Source Hotword Detection Engine Based on Few-Shot Learning

Cited by: 1
Authors
Chidhambararajan, R. [1 ]
Rangapur, Aman [1 ]
Chakkaravarthy S., Sibi [1 ]
Cherukuri, Aswani Kumar [3 ]
Cruz, Meenalosini Vimal [4 ]
Ilango, S. Sudhakar [2 ]
Affiliations
[1] VIT AP Univ, Ctr Excellence Artificial Intelligence & Robot AI, Sch Comp Sci & Engn, Amaravati, Andhra Pradesh, India
[2] VIT AP Univ, Sch Comp Sci & Engn, Amaravati, Andhra Pradesh, India
[3] VIT Univ, Sch Informat Technol & Engn, Vellore, Tamil Nadu, India
[4] Georgia Southern Univ, Allen E Paulson Coll Engn & Comp, Dept Informat Technol, Statesboro, GA USA
Keywords
Deep learning; hotword detection; one-shot learning; Siamese neural network;
DOI
10.1142/S0219649222500599
Chinese Library Classification (CLC)
G25 [Library Science, Librarianship]; G35 [Information Science, Information Work];
Subject Classification Codes
1205 ; 120501 ;
Abstract
Voice assistants like Siri, Google Assistant and Alexa are widely used across the globe for home automation. They are woken up by unique phrases, also known as hotwords, such as "Hey Alexa!", "Ok, Google!" and "Hey, Siri!", before performing an action. Hotword detectors are lightweight real-time engines whose purpose is to detect the hotwords uttered by the user. However, existing engines either require thousands of training samples or are closed source and charge a fee. This paper addresses this problem by presenting the design and implementation of a lightweight, easy-to-implement hotword detection engine based on few-shot learning. The engine detects a hotword uttered by the user in real time with just a few training samples of that hotword. This approach is efficient compared with existing implementations, because adding a new hotword to existing systems requires enormous amounts of positive and negative training samples, and the model must be retrained for every hotword, making those implementations inefficient in terms of computation and cost. The architecture proposed in this paper achieves an accuracy of 95.40%.
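The few-shot matching idea described in the abstract can be sketched as follows. Note this is an illustrative sketch only, not the paper's actual EfficientWord-Net implementation: the embedding dimensionality, similarity threshold, and the synthetic "embeddings" (which stand in for the output of a trained Siamese encoder) are all assumptions made for the demo.

```python
import math
import random

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

class FewShotHotwordDetector:
    """Matches incoming audio embeddings against a handful of enrolled
    reference embeddings of the hotword -- no per-hotword retraining."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.references = []

    def enroll(self, embeddings):
        # Store the few reference embeddings, one per recorded sample.
        self.references.extend(embeddings)

    def score(self, embedding):
        # Best match against any enrolled reference.
        return max(cosine_similarity(embedding, ref) for ref in self.references)

    def detect(self, embedding):
        return self.score(embedding) >= self.threshold

# Demo with synthetic 64-dimensional vectors standing in for the output
# of a trained Siamese embedding network.
random.seed(0)
hotword = [random.gauss(0, 1) for _ in range(64)]

def noisy(vec, scale=0.05):
    """Simulate a new utterance: the same embedding plus small noise."""
    return [x + random.gauss(0, scale) for x in vec]

detector = FewShotHotwordDetector(threshold=0.9)
detector.enroll([noisy(hotword) for _ in range(4)])  # just four samples

print(detector.detect(noisy(hotword)))                           # hotword utterance
print(detector.detect([random.gauss(0, 1) for _ in range(64)]))  # unrelated audio
```

The key property this illustrates is the one the abstract claims: adding a new hotword only requires enrolling a few reference embeddings, with no model retraining per hotword.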
Pages: 16
Related Papers
50 items
  • [1] EfficientWord-Net: An Open Source Hotword Detection Engine Based on Few-Shot Learning
    Chidhambararajan, R.
    Rangapur, Aman
    Sibi Chakkaravarthy, S.
    Cherukuri, Aswani Kumar
    Cruz, Meenalosini Vimal
    Ilango, S. Sudhakar
    Journal of Information and Knowledge Management, 2022, 21 (04):
  • [2] Few-Shot Learning with Novelty Detection
    Bjerge, Kim
    Bodesheim, Paul
    Karstoft, Henrik
    DEEP LEARNING THEORY AND APPLICATIONS, PT I, DELTA 2024, 2024, 2171 : 340 - 363
  • [3] Voltage Sag Source Identification Based on Few-Shot Learning
    Sun, Haotian
    Yi, Hao
    Yang, Guangyu
    Zhuo, Fang
    Hu, Anping
    IEEE ACCESS, 2019, 7 : 164398 - 164406
  • [4] Scaling Few-Shot Learning for the Open World
    Lin, Zhipeng
    Yang, Wenjing
    Wang, Haotian
    Chi, Haoang
    Lan, Long
    Wang, Ji
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13846 - 13854
  • [5] Meta-BN Net for few-shot learning
    Gao, Wei
    Shao, Mingwen
    Shu, Jun
    Zhuang, Xinkai
    FRONTIERS OF COMPUTER SCIENCE, 2023, 17 (01)
  • [6] Insulator Anomaly Detection Method Based on Few-Shot Learning
    Wang, Zhaoyang
    Gao, Qiang
    Li, Dong
    Liu, Junjie
    Wang, Hongwei
    Yu, Xiao
    Wang, Yipin
    IEEE ACCESS, 2021, 9 : 94970 - 94980
  • [7] Few-Shot Learning for Misinformation Detection Based on Contrastive Models
    Zheng, Peng
    Chen, Hao
    Hu, Shu
    Zhu, Bin
    Hu, Jinrong
    Lin, Ching-Sheng
    Wu, Xi
    Lyu, Siwei
    Huang, Guo
    Wang, Xin
    ELECTRONICS, 2024, 13 (04)
  • [8] Few-shot learning for defect detection in manufacturing
    Zajec, Patrik
    Rozanec, Joze M.
    Theodoropoulos, Spyros
    Fontul, Mihail
    Koehorst, Erik
    Fortuna, Blaz
    Mladenic, Dunja
    INTERNATIONAL JOURNAL OF PRODUCTION RESEARCH, 2024, 62 (19) : 6979 - 6998