Human Activity Recognition for the Identification of Bullying and Cyberbullying Using Smartphone Sensors

Cited by: 4
Authors
Gattulli, Vincenzo [1 ]
Impedovo, Donato [1 ]
Pirlo, Giuseppe [1 ]
Sarcinella, Lucia [1 ]
Affiliation
[1] Univ Bari Aldo Moro, Dipartimento Informat, I-70125 Bari, Italy
Keywords
human activity recognition; deep learning; machine learning; smartphone; bullying; cyberbullying
DOI
10.3390/electronics12020261
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The smartphone is an excellent source of data: its sensor values can be extracted and, through Machine Learning approaches, used for anomaly detection in human behavior. This work exploits Human Activity Recognition (HAR) models and techniques to identify the human activity performed while a user fills out a questionnaire via a smartphone application, which aims to classify users as Bullies, Cyberbullies, Victims of Bullying, or Victims of Cyberbullying. The purpose of the work is to discuss a new smartphone methodology that combines the final label elicited from the cyberbullying/bullying questionnaire (Bully, Cyberbully, Bullying Victim, and Cyberbullying Victim) with the human activity recognized while the individual fills out the questionnaire. The paper starts with a state-of-the-art analysis of HAR and proceeds to the design of a model that can recognize everyday actions and discriminate them from actions attributable to alleged bullying activity. Five activities were considered for recognition: Walking, Jumping, Sitting, Running, and Falling. The best-performing HAR model is then applied to the dataset derived from the "Smartphone Questionnaire Application" experiment to perform the analysis described above.
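The HAR setup the abstract describes can be sketched as a standard sensing pipeline: window the tri-axial accelerometer stream, extract features per window, and classify each window into one of the five activities. The code below is a minimal illustrative sketch, not the authors' implementation; the window size, the mean/std features, and the nearest-centroid classifier are all assumptions chosen for brevity.

```python
import numpy as np

# Illustrative sketch only: windowing + simple features + nearest-centroid
# classification over the five activities named in the abstract.
ACTIVITIES = ["Walking", "Jumping", "Sitting", "Running", "Falling"]

def make_windows(signal, size=128, step=64):
    """Slice a (n_samples, 3) tri-axial signal into overlapping windows."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

def extract_features(window):
    """Per-axis mean and standard deviation of one window -> 6-dim vector."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

class NearestCentroidHAR:
    """Assigns a window to the activity with the closest mean feature vector."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        y = np.array(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        # Euclidean distance from each feature vector to each centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

# Synthetic demo data: each activity simulated with a distinct motion intensity.
rng = np.random.default_rng(0)
X_train, y_train = [], []
for i, act in enumerate(ACTIVITIES):
    sig = rng.normal(0.0, 0.2 + i, size=(1024, 3))  # toy tri-axial stream
    for w in make_windows(sig):
        X_train.append(extract_features(w))
        y_train.append(act)
X_train = np.array(X_train)

model = NearestCentroidHAR().fit(X_train, y_train)
print(model.predict(X_train[:1]))
```

A real system would replace the synthetic signals with labeled accelerometer recordings and the centroid rule with one of the deep/classical models the paper surveys; the pipeline shape (window, featurize, classify) stays the same.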
Pages: 13