Neutron: an attention-based neural decompiler

Cited by: 0
Authors
Ruigang Liang
Ying Cao
Peiwei Hu
Kai Chen
Affiliations
[1] SKLOIS
[2] Institute of Information Engineering
[3] Chinese Academy of Sciences
[4] School of Cyber Security
[5] University of Chinese Academy of Sciences
Source
Cybersecurity, 2021, 4(1)
Keywords
Decompilation; LSTM; Attention; Translation
DOI
Not available
Abstract
Decompilation analyzes and transforms low-level programming language (PL) code, such as binary or assembly code, into an equivalent high-level PL. It plays a vital role in cyberspace-security tasks such as software vulnerability discovery and analysis and malicious code detection and analysis, as well as in software-engineering tasks such as source code analysis, optimization, and cross-language, cross-operating-system migration. Unfortunately, existing decompilers rely mainly on hand-written expert rules, which leads to low scalability, high development difficulty, and long development cycles. The high-level PL code they generate often violates coding conventions, and its readability remains relatively low. These problems reduce the efficiency of downstream applications (e.g., vulnerability discovery) that build on decompiled high-level PL code.

In this paper, we propose a decompilation approach based on attention-based neural machine translation (NMT), which converts low-level PL into high-level PL that is legible and functionally similar to the original. To compensate for the information asymmetry between low-level and high-level PLs, we design a translation method based on the basic operations of the low-level PL. This method improves the generalization of the NMT model and captures the translation rules between PLs more accurately and efficiently. We implement this approach in a neural decompilation framework called Neutron. Evaluation on two practical applications shows that Neutron achieves an average program accuracy of 96.96%, outperforming the traditional NMT model.
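The abstract names LSTM and attention as the core NMT machinery but gives no architectural detail. The sketch below (PyTorch) illustrates the general shape of such a system: an LSTM encoder reads low-level PL tokens (e.g., assembly) and an LSTM decoder with dot-product (Luong-style) attention emits high-level PL tokens (e.g., C). All module names, dimensions, vocabularies, and the greedy decoding loop are illustrative assumptions, not Neutron's actual implementation.

```python
# Minimal sketch of an attention-based LSTM seq2seq model for
# assembly -> C token translation. Sizes and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                       # src: (batch, src_len)
        outputs, state = self.lstm(self.emb(src))
        return outputs, state                     # outputs: (batch, src_len, hid)

class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim * 2, vocab_size)

    def forward(self, tgt_tok, state, enc_outputs):
        # tgt_tok: (batch, 1) -- one high-level-PL token per decoding step
        dec_out, state = self.lstm(self.emb(tgt_tok), state)
        # Dot-product attention over all encoder states
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))  # (b, 1, src_len)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)                 # (b, 1, hid)
        logits = self.out(torch.cat([dec_out, context], dim=-1))
        return logits, state

# Toy usage: greedily "translate" a 10-token assembly snippet into C tokens.
ASM_VOCAB, C_VOCAB, BOS = 500, 300, 1             # hypothetical vocab sizes
enc, dec = Encoder(ASM_VOCAB), AttnDecoder(C_VOCAB)
src = torch.randint(2, ASM_VOCAB, (1, 10))        # fake assembly token ids
enc_outputs, state = enc(src)
tok = torch.tensor([[BOS]])
for _ in range(20):                               # greedy decoding loop
    logits, state = dec(tok, state, enc_outputs)
    tok = logits.argmax(-1)
```

A real pipeline would additionally need trained token vocabularies for both PLs and the paper's decomposition of low-level code into basic operations before translation; the loop above only shows the greedy token-by-token generation that a standard attention-based NMT decoder performs.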