Muscle Categorization Using PDF Estimation and Naive Bayes Classification

Cited: 0
Authors:
Adel, Tameem M. [1]
Smith, Benn E.
Stashuk, Daniel W. [1]
Affiliations:
[1] Univ Waterloo, Dept Syst Design Engn, Waterloo, ON N2L 3G1, Canada
Keywords:
PATTERN DISCOVERY; QUANTITATIVE ELECTROMYOGRAPHY; DECOMPOSITION;
DOI:
Not available
Chinese Library Classification (CLC):
R318 [Biomedical Engineering]
Discipline code:
0831
Abstract:
The structure of motor unit potentials (MUPs) and their times of occurrence provide information about the motor units (MUs) that created them. As such, electromyographic (EMG) data can be used to categorize muscles as normal or suffering from a neuromuscular disease. Using pattern discovery (PD) allows clinicians to understand the rationale underlying a given muscle characterization; i.e., PD is transparent. However, PD requires discretization, which leads to some loss of accuracy. In this work, characterization techniques based on estimating probability density functions (PDFs) for each muscle category are implemented. Characterization probabilities for each motor unit potential train (MUPT) are obtained from these PDFs, and Bayes rule is then used to aggregate the MUPT characterization probabilities into muscle-level probabilities. Even though this technique is not as transparent as PD, its accuracy is higher than that of discrete PD. Ultimately, the goal is to use a technique based on both PDFs and PD and make it as transparent and as efficient as possible, but first it was necessary to thoroughly assess how accurate a fully continuous approach can be. Gaussian PDF estimation achieved improvements in muscle categorization accuracy over PD, and further improvements resulted from using feature value histograms to choose more representative PDFs; for instance, using a log-normal distribution to represent skewed histograms.
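The pipeline described in the abstract (per-category PDF estimation, per-MUPT naive Bayes posteriors, and Bayes-rule aggregation to the muscle level) can be sketched as below. This is a minimal illustration, not the authors' implementation: the category names, feature values, distribution parameters, and the conditional-independence assumption across MUPTs are all hypothetical.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian density, fitted per feature per muscle category."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def lognormal_pdf(x, mu, sigma):
    """Log-normal density, for positively skewed feature histograms."""
    return gaussian_pdf(math.log(x), mu, sigma) / x

def mupt_posterior(features, class_params, priors):
    """Per-MUPT posterior over categories via naive Bayes:
    P(c | x) proportional to P(c) * prod_j pdf_j(x_j | c)."""
    posts = {}
    for c, params in class_params.items():
        lik = priors[c]
        for x, (pdf, mu, sigma) in zip(features, params):
            lik *= pdf(x, mu, sigma)
        posts[c] = lik
    z = sum(posts.values())
    return {c: p / z for c, p in posts.items()}

def muscle_posterior(mupt_feature_sets, class_params, priors):
    """Aggregate MUPT posteriors to a muscle-level posterior via Bayes rule,
    treating MUPTs as conditionally independent given the category."""
    log_post = {c: math.log(priors[c]) for c in priors}
    for feats in mupt_feature_sets:
        p = mupt_posterior(feats, class_params, priors)
        for c in p:
            # divide out the prior so it is counted only once overall
            log_post[c] += math.log(p[c] / priors[c])
    z = sum(math.exp(v) for v in log_post.values())
    return {c: math.exp(v) / z for c, v in log_post.items()}

# Hypothetical example: two features per MUPT (e.g. a duration-like value
# modeled as Gaussian, an amplitude-like value modeled as log-normal).
priors = {"normal": 0.5, "myopathic": 0.5}
class_params = {
    "normal":    [(gaussian_pdf, 10.0, 2.0), (lognormal_pdf, 6.0, 0.3)],
    "myopathic": [(gaussian_pdf,  6.0, 2.0), (lognormal_pdf, 5.5, 0.3)],
}
mupts = [[9.5, 500.0], [11.0, 420.0], [10.2, 380.0]]
post = muscle_posterior(mupts, class_params, priors)
```

With these illustrative parameters the three MUPTs all lie near the "normal" densities, so the aggregated muscle-level posterior favors the normal category.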
Pages: 2619-2622 (4 pages)