Linearity of word-level representations of multiple-valued networks

Cited by: 0
Authors
Yanushkevich, SN [1]
Shmerko, VP
Malyugin, VD
Dziurzanski, P
Tomaszewska, AM
Affiliations
[1] Univ Calgary, Dept Elect & Comp Engn, Calgary, AB, Canada
[2] Russian Acad Sci, Control Problems Inst, Moscow 117901, Russia
[3] Tech Univ Szczecin, Fac Comp Sci, Szczecin, Poland
Keywords
multiple-valued logic; spectral technique; linear decision diagrams
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We study a boundary case of combinational Multiple-Valued Logic (MVL) network representations, namely, linear word-level expressions and Linear word-level Decision Diagrams (LDDs). The latter models have several useful properties: they are linear and planar, require less memory than other circuit formats, and are compatible with traditional word-level Decision Diagrams (DDs). The essential point of our approach is to represent every level of an MVL network by a linear word-level expression that is mapped to an LDD; hence, an arbitrary MVL network is represented by a set of LDDs. We prove that, under the condition of linearity, the word-level DD model loses the ability to optimize the initial form. Two types of linear word-level expressions are studied. The first type of word-level description is based on an arithmetic (spectral) transform of the integer-valued equivalent of the initial multi-output function. In the second type, we obtain the word-level description with digit-wise operations by mapping the initial functions to an integer-valued equivalent. With both approaches, we successfully simulate MVL networks with about 250 levels and 8000 ternary gates. We test both models and show that the LDD representation of an MVL circuit consumes ten times less memory than standard formats.
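The word-level encoding mentioned in the abstract (a multi-output function replaced by a single integer-valued equivalent whose outputs are recoverable digit-wise) can be illustrated with a small sketch. The Python fragment below shows only the general packing idea under assumed base-3 weights and example ternary gates; it is not the authors' arithmetic (spectral) transform or LDD construction.

# Illustrative sketch (assumed encoding, not the paper's construction):
# pack a 2-input, 3-output ternary level into one integer-valued
# word-level function by weighting the output digits with powers of 3,
# then recover each output digit-wise.

from itertools import product

def out0(a, b): return min(a, b)       # ternary MIN gate (example choice)
def out1(a, b): return max(a, b)       # ternary MAX gate (example choice)
def out2(a, b): return (a + b) % 3     # mod-3 sum gate (example choice)

def word_level(a, b):
    # Integer-valued equivalent of the three-output level.
    return out0(a, b) + 3 * out1(a, b) + 9 * out2(a, b)

def unpack(w):
    # Digit-wise recovery of the individual ternary outputs.
    return w % 3, (w // 3) % 3, (w // 9) % 3

for a, b in product(range(3), repeat=2):
    assert unpack(word_level(a, b)) == (out0(a, b), out1(a, b), out2(a, b))
print("all 9 ternary input pairs packed and recovered correctly")

In the paper's setting, such an integer-valued equivalent is then expressed as a linear word-level expression and mapped to an LDD; the sketch stops at the packing step.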
Pages: 129-158
Page count: 30