Machine learning and EEG can classify passive viewing of discrete categories of visual stimuli but not the observation of pain

Times Cited: 0
Authors
Mari, Tyler [1 ]
Henderson, Jessica [1 ]
Ali, S. Hasan [1 ]
Hewitt, Danielle [1 ]
Brown, Christopher [1 ]
Stancak, Andrej [1 ]
Fallon, Nicholas [1 ]
Affiliations
[1] Univ Liverpool, Inst Populat Hlth, Dept Psychol, 2-21 Eleanor Rathbone Bldg, Bedford St South, Liverpool L69 7ZA, England
Keywords
Empathy; Electroencephalography; N170; Event-related potential; External validation; Prediction models; Racial bias; Face; Classification; Recognition; Calibration; Components
DOI
10.1186/s12868-023-00819-y
Chinese Library Classification (CLC)
Q189 [Neuroscience]
Subject Classification Code
071006
Abstract
Previous studies have demonstrated the potential of machine learning (ML) in classifying physical pain from non-pain states using electroencephalographic (EEG) data. However, the application of ML to EEG data to categorise the observation of pain versus non-pain images of human facial expressions or scenes depicting pain being inflicted has not been explored. The present study aimed to address this by training Random Forest (RF) models on cortical event-related potentials (ERPs) recorded while participants passively viewed faces displaying either pain or neutral expressions, as well as action scenes depicting pain or matched non-pain (neutral) scenarios. Ninety-one participants were recruited across three samples, which included a model development group (n = 40) and a cross-subject validation group (n = 51). Additionally, 25 participants from the model development group completed a second experimental session, providing a within-subject temporal validation sample. The analysis of ERPs revealed an enhanced N170 component in response to faces compared to action scenes. Moreover, an increased late positive potential (LPP) was observed during the viewing of pain scenes compared to neutral scenes. Additionally, an enhanced P3 response was found when participants viewed faces displaying pain expressions compared to neutral expressions. Subsequently, three RF models were developed to classify images into faces and scenes, neutral and pain scenes, and neutral and pain expressions. The RF model achieved classification accuracies of 75%, 64%, and 69% for cross-validation, cross-subject, and within-subject classifications, respectively, along with reasonably calibrated predictions for the classification of face versus scene images. However, the RF model was unable to classify pain versus neutral stimuli above chance levels when presented with subsequent tasks involving images from either category. These results expand upon previous findings by externally validating the use of ML in classifying ERPs related to different categories of visual images, namely faces and scenes. The results also indicate the limitations of ML in distinguishing pain and non-pain connotations using ERP responses to the passive viewing of visually similar images.
Pages: 16
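The abstract describes training Random Forest classifiers on ERP features and checking the calibration of their predicted probabilities. Below is a minimal, illustrative sketch of that kind of pipeline using scikit-learn on synthetic data; the feature layout, trial counts, and parameter values are assumptions for illustration, not the authors' actual analysis.

```python
# Minimal, illustrative sketch (not the authors' pipeline): a Random Forest
# classifier trained on synthetic per-trial "ERP feature" vectors, with a
# cross-validated accuracy estimate and a simple calibration check of the
# predicted probabilities. All shapes and parameter values are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 64))      # 400 trials x 64 features (synthetic stand-in)
y = rng.integers(0, 2, size=400)    # 0 = neutral, 1 = pain (synthetic labels)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print("5-fold CV accuracy:", cross_val_score(rf, X, y, cv=5).mean())

# Hold out a test split to inspect how well the predicted probabilities
# match observed class frequencies (a basic calibration curve).
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)
rf.fit(X_tr, y_tr)
prob_pain = rf.predict_proba(X_te)[:, 1]
prob_true, prob_pred = calibration_curve(y_te, prob_pain, n_bins=5)
print("Calibration bins (mean predicted vs. observed):", list(zip(prob_pred, prob_true)))
```

In a setup like the one described above, the feature matrix would instead hold per-trial ERP measures (e.g., mean amplitudes in the N170, P3, and LPP windows across electrodes), and the cross-subject and within-subject validation samples would supply the held-out data rather than a random split.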
Related Articles (4 records)
[1] Mari T, Henderson J, Ali SH, Hewitt D, Brown C, Stancak A, Fallon N. Machine learning and EEG can classify passive viewing of discrete categories of visual stimuli but not the observation of pain. BMC Neuroscience, 2023, 24.
[2] Ruffle J, Tinkler L, Emmett C, Aziz Q, Farmer A, Yiannakou Y. Machine learning can accurately classify chronic constipation patients by symptom burden using pain measures alone. Gut, 2019, 68: A218-A219.
[3] Ruffle JK, Tinkler L, Emmett C, Aziz Q, Farmer AD, Yiannakou Y. Machine learning with pain measures alone can accurately classify chronic constipation patients to high or low total symptom burden. Gastroenterology, 2019, 156(6): S590-S591.
[4] Amin HU, Ullah R, Reza MF, Malik AS. Single-trial extraction of event-related potentials (ERPs) and classification of visual stimuli by ensemble use of discrete wavelet transform with Huffman coding and machine learning techniques. Journal of NeuroEngineering and Rehabilitation, 2023, 20(1).