A generic LSTM neural network architecture to infer heterogeneous model transformations

Cited by: 13
Authors
Burgueno, Loli [1 ,2 ]
Cabot, Jordi [1 ,3 ]
Li, Shuai [2 ]
Gerard, Sebastien [2 ]
Affiliations
[1] Open Univ Catalonia, IN3, Barcelona, Spain
[2] Univ Paris Saclay, CEA, Inst LIST, Gif Sur Yvette, France
[3] ICREA, Barcelona, Spain
Source
SOFTWARE AND SYSTEMS MODELING | 2022, Vol. 21, Issue 01
Keywords
Model manipulation; Code generation; Model transformation; Artificial intelligence; Machine learning; Neural networks
DOI
10.1007/s10270-021-00893-y
CLC classification
TP31 [Computer software]
Subject classification codes
081202; 0835
Abstract
Models capture relevant properties of systems. During the models' life-cycle, they are subjected to manipulations with different goals such as managing software evolution, performing analysis, increasing developers' productivity, and reducing human errors. Typically, these manipulation operations are implemented as model transformations. Examples of these transformations are (i) model-to-model transformations for model evolution, model refactoring, model merging, model migration, model refinement, etc., (ii) model-to-text transformations for code generation and (iii) text-to-model ones for reverse engineering. These operations are usually manually implemented, using general-purpose languages such as Java, or domain-specific languages (DSLs) such as ATL or Acceleo. Even when using such DSLs, transformations are still time-consuming and error-prone. We propose using the advances in artificial intelligence techniques to learn these manipulation operations on models and automate the process, freeing the developer from building specific pieces of code. In particular, our proposal is a generic neural network architecture suitable for heterogeneous model transformations. Our architecture comprises an encoder-decoder long short-term memory with an attention mechanism. It is fed with pairs of input-output examples and, once trained, given an input, automatically produces the expected output. We present the architecture and illustrate the feasibility and potential of our approach through its application in two main operations on models: model-to-model transformations and code generation. The results confirm that neural networks are able to faithfully learn how to perform these tasks as long as enough data are provided and no contradictory examples are given.
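
Illustrative note: the abstract describes an encoder-decoder LSTM with an attention mechanism, trained on input-output example pairs. The short PyTorch sketch below only illustrates the general shape of such an architecture; the layer names, dimensions, toy vocabulary sizes and the dot-product (Luong-style) attention variant are assumptions, not the authors' implementation.

# Minimal sketch of an encoder-decoder LSTM with dot-product attention,
# in the spirit of the architecture described in the abstract above.
# All names, sizes and the attention variant are illustrative assumptions.
import torch
import torch.nn as nn

class Seq2SeqLSTM(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Project the concatenation of decoder state and attention context.
        self.out = nn.Linear(hid_dim * 2, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the serialized input (e.g., a source model as a token sequence).
        enc_out, state = self.encoder(self.src_emb(src_ids))      # (B, S, H)
        # Decode the target sequence, initialized with the encoder's final state.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)   # (B, T, H)
        # Dot-product attention of each decoder step over all encoder steps.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))      # (B, T, S)
        context = torch.bmm(torch.softmax(scores, dim=-1), enc_out)
        return self.out(torch.cat([dec_out, context], dim=-1))    # (B, T, V)

# Toy usage: random token ids stand in for serialized input/output examples.
model = Seq2SeqLSTM(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (2, 15))
tgt = torch.randint(0, 1000, (2, 12))
print(model(src, tgt).shape)  # torch.Size([2, 12, 1000])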
Pages: 139-156
Page count: 18
Related papers (50 in total)
  • [41] The LSTM Neural Network Based on Memristor. Chu, Ziqi; Xu, Hui; Liu, Haijun. 2020 3RD INTERNATIONAL CONFERENCE ON COMPUTER INFORMATION SCIENCE AND APPLICATION TECHNOLOGY (CISAT), 2020, 1634.
  • [42] Intelligent Evaluation Method of Architecture and Interior Art Design Education Based on LSTM Neural Network. You, Y.; Ding, H. Computer-Aided Design and Applications, 2023, 20(S10): 124-134.
  • [43] A Direct Data Aware LSTM Neural Network Architecture for Complete Remaining Trace and Runtime Prediction. Gunnarsson, Bjorn Rafn; vanden Broucke, Seppe; De Weerdt, Jochen. IEEE TRANSACTIONS ON SERVICES COMPUTING, 2023, 16(04): 2330-2342.
  • [44] Heterogeneous Transformer: A Scale Adaptable Neural Network Architecture for Device Activity Detection. Li, Yang; Chen, Zhilin; Wang, Yunqi; Yang, Chenyang; Ai, Bo; Wu, Yik-Chung. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22(05): 3432-3446.
  • [45] A Heterogeneous Architecture for the Vision Processing Unit with a Hybrid Deep Neural Network Accelerator. Liu, Peng; Yang, Zikai; Kang, Lin; Wang, Jian. MICROMACHINES, 2022, 13(02).
  • [46] The Heterogeneous Deep Neural Network Processor With a Non-von Neumann Architecture. Shin, Dongjoo; Yoo, Hoi-Jun. PROCEEDINGS OF THE IEEE, 2020, 108(08): 1245-1260.
  • [47] Generic Architecture for Minimizing Drive Tests in Heterogeneous Networks. Hiltunen, Tuomas; Mondal, Riaz Uddin; Turkka, Jussi; Ristaniemi, Tapani. 2015 IEEE 82ND VEHICULAR TECHNOLOGY CONFERENCE (VTC FALL), 2015.
  • [48] MIRAI architecture for heterogeneous network. Mizuno, Mitsuhiko; Wu, Gang; Havinga, Paul J. M. Journal of the Communications Research Laboratory, 2001, 48(04): 11-22.
  • [49] Dynamically scalable, heterogeneous and generic architecture for a grid of workstations. Purusothaman, T.; Annadurai, S.; Vijay Ganesh, H.; Chockalingam, C. T.; Uthra Kumar, B. Journal of Grid Computing, 2004, 2(3): 239-246.
  • [50] MIRAI architecture for heterogeneous network. Wu, G.; Mizuno, M.; Havinga, P. J. M. IEEE COMMUNICATIONS MAGAZINE, 2002, 40(02): 126-134.