K-means Data Clustering with Memristor Networks

Cited by: 88
Authors
Jeong, YeonJoo [1 ,2 ]
Lee, Jihang [1 ,2 ]
Moon, John [1 ]
Shin, Jong Hoon [1 ]
Lu, Wei D. [1 ]
Affiliations
[1] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
[2] Univ Michigan, Dept Mat Sci & Engn, Ann Arbor, MI 48109 USA
Funding
US National Science Foundation
Keywords
Unsupervised learning; Euclidean distance; neuromorphic computing; analog switching; RRAM; Ta2O5; FEATURE-EXTRACTION; NEURAL-NETWORKS; CLASSIFICATION; DIMENSIONALITY; DEVICE;
DOI
10.1021/acs.nanolett.8b01526
CLC Classification
O6 [Chemistry]
Discipline Code
0703
Abstract
Memristor-based neuromorphic networks have been actively studied as a promising candidate to overcome the von Neumann bottleneck in future computing applications. Several recent studies have demonstrated the capability of memristor networks to perform supervised as well as unsupervised learning, where features inherent in the input are identified and analyzed by comparison with features stored in the memristor network. However, although in some cases the stored feature vectors can be normalized so that the winning neurons can be found directly from the (input vector)·(stored vector) dot products, in many other cases normalization of the feature vectors is not trivial or practically feasible, and calculation of the actual Euclidean distance between the input vector and the stored vector is required. Here we report the experimental implementation of memristor crossbar hardware systems that allow direct comparison of Euclidean distances without normalizing the weights. The experimental system enables the unsupervised K-means clustering algorithm through online learning, and produces high classification accuracy (93.3%) for the standard IRIS data set. The approaches and devices can be used in other unsupervised learning systems, and significantly broaden the range of problems a memristor-based network can solve.
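To make the abstract's point concrete: when all stored vectors share the same norm, the winner of a distance comparison can be found from dot products alone, because ||x - w||^2 = ||x||^2 - 2 x·w + ||w||^2; without that normalization the full Euclidean distance must be evaluated. The sketch below is a plain software illustration of online K-means with Euclidean-distance winner selection, not the authors' crossbar implementation; the function name online_kmeans, the learning-rate parameter lr, and the random test data are assumptions made only for this example.

```python
import numpy as np

# Illustrative software sketch only (not the paper's memristor crossbar):
# online K-means with Euclidean-distance winner selection.
# 'online_kmeans', 'lr', and the random test data are assumptions for the example.

def online_kmeans(samples, k, lr=0.05, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize the k centroids from randomly chosen input samples.
    centroids = samples[rng.choice(len(samples), size=k, replace=False)].astype(float)
    for _ in range(epochs):
        for x in rng.permutation(samples):
            # Winner = centroid with the smallest Euclidean distance to x.
            # Since ||x - c||^2 = ||x||^2 - 2 x.c + ||c||^2, an argmax over the
            # dot products x.c picks the same winner only if every ||c|| is
            # equal, i.e. only if the stored vectors are normalized.
            winner = np.argmin(np.linalg.norm(centroids - x, axis=1))
            # Online update: nudge the winning centroid toward the input.
            centroids[winner] += lr * (x - centroids[winner])
    return centroids

# Example: 150 random 4-dimensional samples (IRIS has 4 features, 3 classes).
data = np.random.default_rng(1).normal(size=(150, 4))
print(online_kmeans(data, k=3))
```

Replacing the argmin over distances with an argmax over dot products would give the same winner only if the centroids were first rescaled to equal length, which is exactly the normalization step the paper's hardware avoids having to enforce.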
Pages: 4447-4453
Number of pages: 7
Related Papers
50 records in total (items 41-50 shown)
  • [41] Modified K-means Algorithm for Big Data Clustering
    Sengupta, Debapriya
    Roy, Sayantan Singha
    Ghosh, Sarbani
    Dasgupta, Ranjan
    [J]. PROCEEDINGS 2017 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND COMPUTATIONAL INTELLIGENCE (CSCI), 2017, : 1443 - 1448
  • [42] Selection of K in K-means clustering
    Pham, DT
    Dimov, SS
    Nguyen, CD
    [J]. PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART C-JOURNAL OF MECHANICAL ENGINEERING SCIENCE, 2005, 219 (01) : 103 - 119
  • [43] Geodesic K-means Clustering
    Asgharbeygi, Nima
    Maleki, Arian
    [J]. 19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 3450 - 3453
  • [44] Stability of k-means clustering
    Ben-David, Shai
    Pal, David
    Simon, Hans Ulrich
    [J]. LEARNING THEORY, PROCEEDINGS, 2007, 4539 : 20 - +
  • [45] Discriminative k-Means Clustering
    Arandjelovic, Ognjen
    [J]. 2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013,
  • [46] Balanced K-Means for Clustering
    Malinen, Mikko I.
    Franti, Pasi
    [J]. STRUCTURAL, SYNTACTIC, AND STATISTICAL PATTERN RECOGNITION, 2014, 8621 : 32 - 41
  • [47] On autonomous k-means clustering
    Elomaa, T
    Koivistoinen, H
    [J]. FOUNDATIONS OF INTELLIGENT SYSTEMS, PROCEEDINGS, 2005, 3488 : 228 - 236
  • [48] On the Optimality of k-means Clustering
    Dalton, Lori A.
    [J]. 2013 IEEE INTERNATIONAL WORKSHOP ON GENOMIC SIGNAL PROCESSING AND STATISTICS (GENSIPS 2013), 2013, : 70 - 71
  • [49] Transformed K-means Clustering
    Goel, Anurag
    Majumdar, Angshul
    [J]. 29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021), 2021, : 1526 - 1530
  • [50] K-Means Clustering Explained
    Emerson, Robert Wall
    [J]. JOURNAL OF VISUAL IMPAIRMENT & BLINDNESS, 2024, 118 (01) : 65 - 66