A Self-Attention Integrated Learning Model for Landing Gear Performance Prediction

Cited by: 3
Authors
Lin, Lin [1 ]
Tong, Changsheng [1 ]
Guo, Feng [1 ]
Fu, Song [1 ]
Lv, Yancheng [1 ]
He, Wenhui [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Mechatron Engn, Harbin 150001, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
performance prediction; feature selection; data distribution; integrated learning; self-attention; DESIGN;
DOI
10.3390/s23136219
CLC Number
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
The landing gear structure is subjected to large loads during aircraft takeoff and landing, so an accurate prediction of landing gear performance helps ensure flight safety. However, machine-learning-based landing gear performance prediction relies heavily on the dataset, whose feature dimensionality and data distribution have a great impact on prediction accuracy. To address these issues, a novel MCA-MLPSA model is developed. First, an MCA (multiple correlation analysis) method is proposed to select key features. Second, a heterogeneous multilearner integration framework is proposed that makes use of different base learners. Third, an MLPSA (multilayer perceptron with self-attention) model is proposed to adaptively capture the data distribution and adjust the weight of each base learner. Finally, the excellent prediction performance of the proposed MCA-MLPSA is validated by a series of experiments on the landing gear data.
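The record carries no implementation details, but the gating idea the abstract describes can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch version of an MLP-with-self-attention weighting network; every name in it (the MLPSA class, the per-feature token embedding, mean pooling, the layer sizes) is an assumption made for illustration, not the authors' code. It applies self-attention across a sample's features and emits softmax weights that fuse the predictions of several heterogeneous base learners.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPSA(nn.Module):
    # Hypothetical gating network: self-attention over per-feature tokens,
    # then an MLP head that outputs one weight per base learner.
    def __init__(self, n_learners, d_model=32, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)  # each scalar feature -> one token
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Sequential(nn.Linear(d_model, 64), nn.ReLU(),
                                  nn.Linear(64, n_learners))

    def forward(self, x, base_preds):
        # x: (batch, n_features) raw features; base_preds: (batch, n_learners)
        tokens = self.embed(x.unsqueeze(-1))      # (batch, n_features, d_model)
        attended, _ = self.attn(tokens, tokens, tokens)
        pooled = attended.mean(dim=1)             # (batch, d_model)
        w = F.softmax(self.head(pooled), dim=-1)  # per-sample learner weights
        return (w * base_preds).sum(dim=-1)       # weighted ensemble prediction

# Usage with dummy data: 8 samples, 10 features, 3 base learners.
model = MLPSA(n_learners=3)
x = torch.randn(8, 10)
base_preds = torch.randn(8, 3)  # e.g. outputs of an SVR, a GBDT, and an MLP
y_hat = model(x, base_preds)    # shape (8,): fused predictions

Because the weights are recomputed from each input sample, the ensemble can shift trust between base learners as the data distribution changes, which is the adaptive behavior the abstract attributes to MLPSA.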
Pages: 24
Related Papers
50 records in total
  • [31] Homogeneous Learning: Self-Attention Decentralized Deep Learning
    Sun, Yuwei
    Ochiai, Hideya
    IEEE ACCESS, 2022, 10 : 7695 - 7703
  • [32] Convergence of Deep Learning and Forensic Methodologies Using Self-attention Integrated EfficientNet Model for Deep Fake Detection
    Rimjhim Padam Singh
    Nichenametla Hima Sree
    Koti Leela Sai Praneeth Reddy
    Kandukuri Jashwanth
    SN Computer Science, 5 (8)
  • [33] High generalization performance structured self-attention model for knapsack problem
    Ding, Man
    Han, Congying
    Guo, Tiande
    DISCRETE MATHEMATICS ALGORITHMS AND APPLICATIONS, 2021, 13 (06)
  • [34] A self-attention sequential model for long-term prediction of video streams
    Ge, Yunfeng
    Li, Hongyan
    Shi, Keyi
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (03) : 88 - 102
  • [35] Self-attention transformer model for pan evaporation prediction: a case study in Australia
    Abed, Mustafa
    Imteaz, Monzur Alam
    Huang, Yuk Feng
    Ahmed, Ali Najah
    JOURNAL OF HYDROINFORMATICS, 2024, 26 (10) : 2538 - 2556
  • [36] Integrated Multi-Head Self-Attention Transformer model for electricity demand prediction incorporating local climate variables
    Ghimire, Sujan
    Nguyen-Huy, Thong
    AL-Musaylh, Mohanad S.
    Deo, Ravinesh C.
    Casillas-Perez, David
    Salcedo-Sanz, Sancho
    ENERGY AND AI, 2023, 14
  • [37] Self-attention with Functional Time Representation Learning
    Xu, Da
    Ruan, Chuanwei
    Kumar, Sushant
    Korpeoglu, Evren
    Achan, Kannan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [38] A self-attention dynamic graph convolution network model for traffic flow prediction
    Liao, Kaili
    Zhou, Wuneng
    Wu, Wanpeng
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024
  • [39] SELF-ATTENTION BASED MODEL FOR PUNCTUATION PREDICTION USING WORD AND SPEECH EMBEDDINGS
    Yi, Jiangyan
    Tao, Jianhua
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019 : 7270 - 7274
  • [40] Compressed Self-Attention for Deep Metric Learning
    Chen, Ziye
    Gong, Mingming
    Xu, Yanwu
    Wang, Chaohui
    Zhang, Kun
    Du, Bo
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 3561 - 3568