Question Classification Using Universal Sentence Encoder and Deep Contextualized Transformer

Cited by: 0
Authors
Arif, Najam [1 ]
Latif, Seemab [1 ]
Latif, Rabia [2 ]
Affiliations
[1] Natl Univ Sci & Technol NUST, Sch Elect Engn & Comp Sci, Islamabad, Pakistan
[2] Prince Sultan Univ, Coll Comp & Informat Sci, Riyadh, Saudi Arabia
Keywords
Covid-19; Machine learning; Multi-class; Question Answer systems (QAs); Text classification; Universal sentence encoder;
DOI
10.1109/DESE54285.2021.9719473
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Question classification is one of the most vital steps in automatic Question Answering systems; it is also known as answer type classification, identification, or prediction. Precise identification of the answer type eliminates irrelevant candidate answers from the pool of candidates available for a question, so high accuracy in the question classification phase translates into more accurate answers. This paper proposes an approach, named Question Sentence Embedding (QSE), for question classification using semantic features. Extracting a large number of features does not always solve the problem. Our approach simplifies the feature extraction stage: it does not extract features such as named entities, which appear in few questions because of their short length, nor hypernyms and hyponyms of a word, which require a WordNet extension and make the system more dependent on external resources. We use the Universal Sentence Encoder with a Transformer encoder to obtain a fixed-size sentence-level embedding vector for each question, and then compute semantic similarity among these vectors to classify questions into their predefined categories. Since the global COVID-19 pandemic has made people especially curious to ask questions about COVID, our experimental dataset is the publicly available COVID-Q dataset. The approach achieves an accuracy of 69% on COVID questions and outperforms the baseline method, demonstrating the efficacy of QSE.
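The abstract describes the pipeline only at a high level, so the following minimal Python sketch illustrates one way the described idea could look in practice: questions are encoded into fixed-size vectors with the transformer-based Universal Sentence Encoder from TensorFlow Hub, and each new question is assigned to the predefined category whose centroid embedding is most similar. The TF Hub model URL, the toy labelled questions standing in for COVID-Q, and the centroid-plus-cosine classification rule are illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch (illustrative, not the authors' implementation).
# Assumed: transformer-based USE model from TF Hub, a toy labelled set
# standing in for COVID-Q, and cosine similarity to category centroids.

import numpy as np
import tensorflow_hub as hub

# Transformer variant of the Universal Sentence Encoder (512-dim sentence vectors).
encoder = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/5")

# Toy labelled questions standing in for the COVID-Q training split (assumed format).
train = {
    "Transmission": ["How does COVID-19 spread?",
                     "Can COVID spread through the air?"],
    "Prevention":   ["How can I protect myself from COVID-19?",
                     "Do masks prevent COVID?"],
}

def embed(texts):
    """Return L2-normalised USE embeddings for a list of sentences."""
    vecs = encoder(texts).numpy()
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

# One normalised centroid vector per predefined category.
centroids = {}
for label, questions in train.items():
    mean = embed(questions).mean(axis=0)
    centroids[label] = mean / np.linalg.norm(mean)

def classify(question):
    """Assign the category whose centroid has the highest cosine similarity."""
    q = embed([question])[0]
    return max(centroids, key=lambda label: float(np.dot(q, centroids[label])))

print(classify("Is it possible to catch the virus on a bus?"))  # expected: 'Transmission'
```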
Pages: 206-211
Page count: 6
Related Papers
50 records in total
  • [1] Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task
    Laskar, Md Tahmid Rahman
    Huang, Jimmy
    Hoque, Enamul
    PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2020), 2020, : 5505 - 5514
  • [2] Semantic Sentiment Classification for COVID-19 Tweets Using Universal Sentence Encoder
    Fattoh, Ibrahim Eldesouky
    Alsheref, Fahad Kamal
    Ead, Waleed M.
    Youssef, Ahmed Mohamed
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [3] Question Classification for the Travel Domain using Deep Contextualized Word Embedding Models
    Weerakoon, Charmy
    Ranathunga, Surangika
    MORATUWA ENGINEERING RESEARCH CONFERENCE (MERCON 2021) / 7TH INTERNATIONAL MULTIDISCIPLINARY ENGINEERING RESEARCH CONFERENCE, 2021, : 573 - 578
  • [4] Universal Sentence Encoder for English
    Cer, Daniel
    Yang, Yinfei
    Kong, Sheng-yi
    Hua, Nan
    Limtiaco, Nicole
    St John, Rhomni
    Constant, Noah
    Guajardo-Cespedes, Mario
    Yuan, Steve
    Tar, Chris
    Sung, Yun-Hsuan
    Strope, Brian
    Kurzweil, Ray
    CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018): PROCEEDINGS OF SYSTEM DEMONSTRATIONS, 2018, : 169 - 174
  • [5] Transformer encoder with multiscale deep learning for pain classification using physiological signals
    Lu, Zhenyuan
    Ozek, Burcu
    Kamarthi, Sagar
    FRONTIERS IN PHYSIOLOGY, 2023, 14
  • [6] Automatic Short Answer Grading Using Universal Sentence Encoder
    Chakraborty, Chandralika
    Sethi, Rohan
    Chauhan, Vidushi
    Sarma, Bhairab
    Chakraborty, Udit Kumar
    LEARNING IN THE AGE OF DIGITAL AND GREEN TRANSITION, ICL2022, VOL 1, 2023, 633 : 511 - 518
  • [7] Discrete Cosine Transform as Universal Sentence Encoder
    Almarwani, Nada
    Diab, Mona
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021, : 419 - 426
  • [8] Multilingual Universal Sentence Encoder for Semantic Retrieval
    Yang, Yinfei
    Cer, Daniel
    Ahmad, Amin
    Guo, Mandy
    Law, Jax
    Constant, Noah
    Abrego, Gustavo Hernandez
    Yuan, Steve
    Tar, Chris
    Sung, Yun-Hsuan
    Strope, Brian
    Kurzweil, Ray
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020): SYSTEM DEMONSTRATIONS, 2020, : 87 - 94
  • [9] MathUSE: Mathematical information retrieval system using universal sentence encoder model
    Dadure, Pankaj
    Pakray, Partha
    Bandyopadhyay, Sivaji
    JOURNAL OF INFORMATION SCIENCE, 2024, 50 (01) : 66 - 84