Infant Cry Classification Using Semi-supervised K-Nearest Neighbor Approach

Cited by: 1
Authors
Mahmoud, Amany Mounes [1 ]
Swilem, Sarah Mohamed [1 ]
Alqarni, Abrar Saeed [1 ]
Haron, Fazilah [2 ]
Affiliations
[1] Taibah Univ, Coll Comp Sci & Engn, Medina, Saudi Arabia
[2] Prince Muqrin Univ, Coll Comp & Cyber Sci, Medina, Saudi Arabia
Keywords
Cry; Classification; Machine learning; KNN; SSKNN; Semi-supervised learning; Self-training
DOI
10.1109/DeSE51703.2020.9450239
CLC Number
TP [Automation technology, Computer technology]
Subject Classification Code
0812
Abstract
Infants cry for many different reasons, and understanding what a cry means is a challenge for many parents: it is hard to know precisely why an infant is crying. The purpose of our study is to determine whether an infant's cry is due to hunger or not, using semi-supervised machine learning techniques. Two datasets are commonly used in the literature, the Dunstan Baby Language and the Baby Chillanto databases. The total length of each dataset is only between 8 and 32 minutes, which is very short. For this reason, we propose a semi-supervised learning approach (also known as self-training) that can enlarge the training set by classifying unlabeled data from Google AudioSet. We chose the k-nearest neighbors (KNN) classifier to determine whether a cry is due to hunger or not. KNN is known to perform poorly when trained with limited data, so we propose a semi-supervised k-nearest neighbor (SSKNN) method that exploits unlabeled data to increase the size of the training set. For feature extraction, we use Mel-Frequency Cepstral Coefficients (MFCC). To evaluate the semi-supervised approach, we use the supervised KNN as our baseline model and compare the accuracy of the two approaches. The SSKNN yields better accuracy, 94%, compared with 87% for the supervised KNN.
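The following is a minimal sketch of the self-training idea described in the abstract, not the authors' implementation: a KNN classifier is first trained on a small labeled set, then confident pseudo-labels from an unlabeled pool are folded into training via scikit-learn's SelfTrainingClassifier. The 13-dimensional feature vectors here are synthetic stand-ins for MFCC features (which would in practice come from a library such as librosa), and all dataset sizes and hyperparameters are illustrative assumptions.

```python
# Sketch of semi-supervised (self-training) KNN vs. plain supervised KNN.
# Features are synthetic placeholders for MFCC vectors; sizes and thresholds
# are assumptions, not values from the paper.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic 13-dimensional "MFCC" vectors for two classes:
# non-hunger cry (0) vs. hunger cry (1).
X_labeled = np.vstack([rng.normal(0.0, 1.0, (60, 13)),
                       rng.normal(1.5, 1.0, (60, 13))])
y_labeled = np.array([0] * 60 + [1] * 60)

X_train, X_test, y_train, y_test = train_test_split(
    X_labeled, y_labeled, test_size=0.3, random_state=0)

# Unlabeled pool (standing in for unlabeled audio clips);
# scikit-learn marks unlabeled samples with the label -1.
X_unlabeled = rng.normal(0.75, 1.2, (300, 13))
X_ss = np.vstack([X_train, X_unlabeled])
y_ss = np.concatenate([y_train, -np.ones(len(X_unlabeled), dtype=int)])

# Baseline: supervised KNN trained on the labeled data only.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("KNN accuracy:  ", accuracy_score(y_test, knn.predict(X_test)))

# Self-training KNN: pseudo-labels above the confidence threshold are
# added to the training set over successive iterations.
ssknn = SelfTrainingClassifier(KNeighborsClassifier(n_neighbors=5),
                               threshold=0.8).fit(X_ss, y_ss)
print("SSKNN accuracy:", accuracy_score(y_test, ssknn.predict(X_test)))
```

With real cry recordings, the synthetic feature matrices above would be replaced by MFCC vectors extracted from each audio clip, and the unlabeled pool would be drawn from Google AudioSet as the abstract describes.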
Pages: 305-310
Page count: 6