Neutron: an attention-based neural decompiler

Citations: 0
Authors
Ruigang Liang
Ying Cao
Peiwei Hu
Kai Chen
Affiliations
[1] SKLOIS,
[2] Institute of Information Engineering,
[3] Chinese Academy of Sciences,
[4] School of Cyber Security,
[5] University of Chinese Academy of Sciences
Keywords
Decompilation; LSTM; Attention; Translation;
DOI: not available
Abstract
Decompilation aims to analyze and transform low-level program language (PL) code, such as binary or assembly code, into an equivalent high-level PL. Decompilation plays a vital role in cyberspace security fields such as software vulnerability discovery and analysis and malicious code detection and analysis, as well as in software engineering fields such as source code analysis, optimization, and cross-language, cross-operating-system migration. Unfortunately, existing decompilers mainly rely on experts to write rules, which leads to bottlenecks such as low scalability, development difficulty, and long development cycles. Moreover, the generated high-level PL code often violates coding conventions, and its readability remains relatively low. These problems hinder the efficiency of advanced applications (e.g., vulnerability discovery) built on decompiled high-level PL code.

In this paper, we propose a decompilation approach based on the attention-based neural machine translation (NMT) mechanism, which converts low-level PL into high-level PL while improving legibility and preserving functional similarity. To compensate for the information asymmetry between low-level and high-level PLs, we design a translation method based on the basic operations of low-level PL. This method improves the generalization of the NMT model and captures the translation rules between PLs more accurately and efficiently. In addition, we implement a neural decompilation framework called Neutron. Evaluation on two practical applications shows that Neutron achieves an average program accuracy of 96.96%, outperforming the traditional NMT model.
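The attention mechanism at the heart of such an NMT decompiler lets the decoder weight encoder states (one per low-level PL token) differently at each output step. The following is a minimal, self-contained sketch of dot-product attention; the function names, the toy two-dimensional vectors, and the framing of encoder states as "assembly token" encodings are illustrative assumptions, not details from the paper:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, encoder_states):
    # Dot-product score between the decoder query and each encoder state.
    scores = [sum(q * h for q, h in zip(query, state)) for state in encoder_states]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of the encoder states.
    dim = len(encoder_states[0])
    context = [sum(w * state[i] for w, state in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# Toy example: three encoded "assembly tokens", one decoder query.
encoder_states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.0]
weights, context = attention(query, encoder_states)
```

In this toy run the first and third encoder states score equally against the query and therefore receive equal, dominant attention weights, illustrating how the decoder can focus on the low-level tokens most relevant to the high-level token being emitted.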