Improving BERT model for requirements classification by bidirectional LSTM-CNN deep model
Cited by: 11
Authors: Kaur K. [1]; Kaur P. [1]
Affiliations: [1] Department of Computer Science, Guru Nanak Dev University, Amritsar
Keywords: BERT; Bidirectional LSTM; Convolutional neural network; Deep learning; Requirements classification; Requirements engineering; Software requirements; Transfer learning
DOI: 10.1016/j.compeleceng.2023.108699
Abstract:
In the last decade, requirements classification has emerged as a hot research topic in Requirements Engineering (RE). Early identification of software requirements helps the development team in the design of software systems. Manual identification and classification of these requirements is a time-consuming and labor-intensive task. To address this issue, machine and deep learning techniques have been studied for the automatic classification of requirements. Furthermore, an efficient word embedding representation of the input data is a major concern in automatic approaches. This research work presents a novel requirements classification model, called BERT-BiCNN, that integrates Bidirectional Encoder Representations from Transformers (BERT) with a Bidirectional Long Short-Term Memory (BiLSTM) network and a Convolutional Neural Network (CNN) layer. In BERT-BiCNN, BERT serves as the word embedding layer, extracting the full combination of contextual semantics; the BiLSTM captures contextual information; and the CNN reduces the dimensionality of the feature space by selecting the most important features. The effectiveness of BERT-BiCNN is evaluated on the PROMISE dataset. Comparative analysis shows that the proposed approach outperforms six recent deep learning based architectures in both binary and multi-class classification. © 2023 Elsevier Ltd
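The abstract describes a BERT embedding layer feeding a BiLSTM whose output passes through a CNN layer before classification. The paper's exact layer sizes and hyperparameters are not given here; the sketch below is a hypothetical PyTorch head in that spirit, with all dimensions (768-d BERT embeddings, 128 hidden units, 64 filters, kernel size 3) chosen as illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class BertBiCNNHead(nn.Module):
    """Hypothetical BiLSTM + CNN classification head sitting on top of
    BERT token embeddings, loosely following the BERT-BiCNN description.
    All layer sizes are assumptions, not values from the paper."""
    def __init__(self, embed_dim=768, hidden=128, n_filters=64, kernel=3, n_classes=2):
        super().__init__()
        # BiLSTM captures contextual information in both directions
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        # 1-D convolution selects local feature patterns over the sequence
        self.conv = nn.Conv1d(2 * hidden, n_filters, kernel_size=kernel)
        # max pooling keeps the strongest response per filter (feature selection)
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, embeddings):                 # (batch, seq_len, embed_dim)
        ctx, _ = self.bilstm(embeddings)           # (batch, seq_len, 2*hidden)
        feats = self.conv(ctx.transpose(1, 2))     # (batch, n_filters, seq_len-kernel+1)
        pooled = self.pool(feats).squeeze(-1)      # (batch, n_filters)
        return self.fc(pooled)                     # class logits

head = BertBiCNNHead()
dummy = torch.randn(4, 32, 768)  # stand-in for a BERT last-hidden-state batch
logits = head(dummy)
print(logits.shape)
```

In practice the `dummy` tensor would be replaced by the last hidden state of a pretrained BERT encoder (e.g. via the Hugging Face `transformers` library), with BERT either frozen or fine-tuned jointly with the head.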