Password Guessing Based on LSTM Recurrent Neural Networks

Cited by: 6
Authors
Xu, Lingzhi [1 ]
Ge, Can [1 ]
Qiu, Weidong [1 ]
Huang, Zheng [1 ]
Guo, Jie [1 ]
Lian, Huijuan [1 ]
Gong, Zheng [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Sch Informat Secur Engn, Shanghai, Peoples R China
[2] South China Normal Univ, Sch Comp Sci, Guangzhou, Guangdong, Peoples R China
Keywords
password guessing; recurrent neural network; LSTM;
DOI
10.1109/CSE-EUC.2017.155
CLC Classification Number (Chinese Library Classification)
TP301 [Theory, Methods];
Discipline Classification Code
081202;
Abstract
Passwords are frequently used in data encryption and user authentication. Since people tend to choose meaningful words or numbers as their passwords, many passwords are easy to guess. This paper introduces a password guessing method based on Long Short-Term Memory (LSTM) recurrent neural networks. After training our LSTM neural network on 30 million passwords from the leaked RockYou dataset, the 3.35 billion generated passwords covered 81.52% of the remaining RockYou dataset. Compared with the PCFG and Markov methods, this method achieves a higher coverage rate.
Pages: 785 - 788
Number of pages: 4
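
The abstract describes training a character-level LSTM on RockYou passwords and then sampling candidate guesses from the trained model. Below is a minimal sketch of that idea in PyTorch; the layer sizes, the use of a newline character as the start/end marker, and the stoi/itos vocabulary mappings are illustrative assumptions, not details reported in the paper.

    # Minimal character-level LSTM password model (illustrative sketch only).
    # Hyperparameters and the sampling scheme are assumptions, not the paper's
    # reported configuration.
    import torch
    import torch.nn as nn

    class CharLSTM(nn.Module):
        def __init__(self, vocab_size, embed_dim=64, hidden_dim=256, num_layers=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers, batch_first=True)
            self.proj = nn.Linear(hidden_dim, vocab_size)

        def forward(self, x, state=None):
            emb = self.embed(x)                 # (batch, seq, embed_dim)
            out, state = self.lstm(emb, state)  # (batch, seq, hidden_dim)
            return self.proj(out), state        # logits over the next character

    def sample_password(model, stoi, itos, end_token="\n", max_len=20, device="cpu"):
        """Draw one candidate password by sampling characters until the end token."""
        model.eval()
        chars, state = [], None
        x = torch.tensor([[stoi[end_token]]], device=device)  # assumed start marker
        with torch.no_grad():
            for _ in range(max_len):
                logits, state = model(x, state)
                probs = torch.softmax(logits[0, -1], dim=-1)
                idx = torch.multinomial(probs, 1).item()
                if itos[idx] == end_token:
                    break
                chars.append(itos[idx])
                x = torch.tensor([[idx]], device=device)
        return "".join(chars)

In practice such a model would be trained with a cross-entropy loss on next-character prediction over the training split of RockYou, and sample_password would be called repeatedly (the paper reports generating 3.35 billion guesses) before measuring coverage against the held-out portion of the dataset.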