Sensor-Based Human Activity Recognition in Smart Homes Using Depthwise Separable Convolutions

Cited by: 3
Authors
Alghazzawi, Daniyal [1 ]
Rabie, Osama [1 ]
Bamasaq, Omaima [2 ]
Albeshri, Aiiad [2 ]
Asghar, Muhammad Zubair [3 ]
Affiliations
[1] King Abdulaziz Univ, Fac Comp & Informat Technol, Dept Informat Syst, Jeddah, Saudi Arabia
[2] King Abdulaziz Univ, Fac Comp & Informat Technol, Dept Comp Sci, Jeddah, Saudi Arabia
[3] Gomal Univ, Inst Comp & Informat Technol ICIT, Dera Ismail Khan, KP, Pakistan
Keywords
Human Activity Recognition; Smart Homes; Depthwise Separable Convolutions; Sensors; Hybrid
DOI
10.22967/HCIS.2022.12.050
CLC classification
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Recent advances in consumer electronic devices have driven the adoption of smart home sensing applications, stimulating demand for related services and products. The ever-increasing volume of sensor data consequently calls for advanced deep learning methods for the automated identification of human activity. Over the years, several deep learning models that learn to categorize human activities have been proposed, many of them based on convolutional neural networks. To tackle the human activity recognition (HAR) problem in smart homes, we propose a depthwise separable convolutional neural network (DS-CNN). Instead of standard 2D convolution layers, the network uses depthwise separable convolution layers. DS-CNN performs particularly well on limited datasets; it also reduces the number of trainable parameters and improves learning efficiency through a compact network. We evaluated our technique on benchmark HAR smart home datasets, and the results show that it outperforms the current state of the art. This study shows that using depthwise separable convolutions significantly improves performance (accuracy = 92.96, precision = 91.6, recall = 90, F-score = 93) compared with a classical CNN and baseline methods.
Pages: 19
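
Note: the following is an illustrative sketch, not code from the paper. It contrasts a standard 2D convolution with a depthwise separable block (depthwise convolution followed by a 1x1 pointwise convolution) in PyTorch, to show where the parameter savings described in the abstract come from. The channel sizes (32 to 64), the 3x3 kernel, and the BatchNorm/ReLU ordering are hypothetical choices for illustration, not the authors' DS-CNN architecture.

```python
# Illustrative sketch only (assumed layer sizes, not the paper's model):
# a depthwise separable convolution block compared with a standard Conv2d.
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise conv (one filter per input channel, groups=in_ch) followed by
    a 1x1 pointwise conv that mixes channels.
    Parameters: in_ch*k*k + in_ch*out_ch, versus in_ch*out_ch*k*k for a
    standard convolution with the same kernel size."""
    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=padding, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

if __name__ == "__main__":
    x = torch.randn(1, 32, 28, 28)                       # hypothetical 32-channel feature map
    standard = nn.Conv2d(32, 64, 3, padding=1, bias=False)
    separable = DepthwiseSeparableConv(32, 64)

    n_std = sum(p.numel() for p in standard.parameters())
    n_sep = sum(p.numel() for p in separable.depthwise.parameters()) + \
            sum(p.numel() for p in separable.pointwise.parameters())
    print("standard conv params: ", n_std)               # 32*64*3*3 = 18432
    print("separable conv params:", n_sep)               # 32*3*3 + 32*64 = 2336
    print(separable(x).shape)                            # torch.Size([1, 64, 28, 28])
```

For a k x k kernel, the separable block needs in_ch*k^2 + in_ch*out_ch weights instead of in_ch*out_ch*k^2, which is the compactness the abstract credits for better learning efficiency on the limited sensor datasets typical of smart home HAR.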