Deep neural networks for texture classification - A theoretical analysis

Cited by: 60
Authors
Basu, Saikat [1 ]
Mukhopadhyay, Supratik [1 ]
Karki, Manohar [1 ]
DiBiano, Robert [1 ]
Ganguly, Sangram [4 ]
Nemani, Ramakrishna [2 ]
Gayaka, Shreekant [3 ]
Affiliations
[1] Louisiana State Univ, Baton Rouge, LA 70803 USA
[2] NASA, Ames Res Ctr, Moffett Field, CA 94035 USA
[3] Appl Mat Inc, Santa Clara, CA USA
[4] NASA, Ames Res Ctr, Bay Area Environm Res Inst, Moffett Field, CA 94035 USA
Keywords
Deep neural network; Texture classification; VC dimension
DOI
10.1016/j.neunet.2017.10.001
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We investigate the use of Deep Neural Networks for the classification of image datasets where texture features are important for generating class-conditional discriminative representations. To this end, we first derive the size of the feature space for some standard textural features extracted from the input dataset and then use the theory of Vapnik-Chervonenkis (VC) dimension to show that hand-crafted feature extraction creates low-dimensional representations that help reduce the overall excess error rate. As a corollary to this analysis, we derive for the first time upper bounds on the VC dimension of Convolutional Neural Networks as well as Dropout and Dropconnect networks, and the relation between the excess error rates of Dropout and Dropconnect networks. The concept of intrinsic dimension is used to validate the intuition that texture-based datasets are inherently higher dimensional than handwritten digit or other object recognition datasets and hence harder for neural networks to shatter. We then derive the mean distance from the centroid to the nearest and farthest sampling points in an n-dimensional manifold and show that the Relative Contrast of the sample data vanishes as the dimensionality of the underlying vector space tends to infinity. (C) 2017 Elsevier Ltd. All rights reserved.
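The abstract's claim that Relative Contrast vanishes as dimensionality grows can be illustrated numerically. The sketch below is not taken from the paper; it assumes the standard definition of relative contrast, (D_max - D_min) / D_min, for Euclidean distances from the centroid of the unit cube to uniformly sampled points, and simply shows the contrast shrinking as the dimension n increases.

```python
import numpy as np

def relative_contrast(n_dims, n_points=1000, seed=0):
    """Relative contrast (D_max - D_min) / D_min of Euclidean distances
    from the cube centroid to points drawn uniformly from [0, 1]^n.
    Illustrative only; not the paper's derivation."""
    rng = np.random.default_rng(seed)
    points = rng.random((n_points, n_dims))
    query = np.full(n_dims, 0.5)  # centroid of the unit cube
    dists = np.linalg.norm(points - query, axis=1)
    return (dists.max() - dists.min()) / dists.min()

# Contrast is large in low dimensions and tends toward 0 as n grows,
# consistent with the vanishing-Relative-Contrast statement above.
for n in (2, 10, 100, 1000, 10000):
    print(f"n = {n:>5d}  relative contrast ~ {relative_contrast(n):.3f}")
```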
Pages: 173-182
Number of pages: 10
Related papers
50 items in total
  • [21] Landscape Classification with Deep Neural Networks
    Buscombe, Daniel
    Ritchie, Andrew C.
    GEOSCIENCES, 2018, 8 (07)
  • [22] Selective Classification for Deep Neural Networks
    Geifman, Yonatan
    El-Yaniv, Ran
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [23] Shortcut Convolutional Neural Networks for Classification of Gender and Texture
    Zhang, Ting
    Li, Yujian
    Liu, Zhaoying
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, PT II, 2017, 10614 : 30 - 39
  • [24] Seed Texture Classification by Random Forest and Neural Networks
    Aygun, Sercan
    Yalcin, Hulya
    Gunes, Ece Olcay
    2017 25TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2017,
  • [25] Multi-target deep neural networks: Theoretical analysis and implementation
    Zeng, Zeng
    Liang, Nanying
    Yang, Xulei
    Hoi, Steven
    NEUROCOMPUTING, 2018, 273 : 634 - 642
  • [26] Deep convolutional neural networks for regular texture recognition
    Liu, Ni
    Rogers, Mitchell
    Cui, Hua
    Liu, Weiyu
    Li, Xizhi
    Delmas, Patrice
    PEERJ COMPUTER SCIENCE, 2022, 8
  • [27] Deep neural networks approach to skin lesions classification - a comparative analysis
    Kwasigroch, Arkadiusz
    Mikolajczyk, Agnieszka
    Grochowski, Michal
    2017 22ND INTERNATIONAL CONFERENCE ON METHODS AND MODELS IN AUTOMATION AND ROBOTICS (MMAR), 2017, : 1043 - 1048
  • [29] A comparative study and analysis of LSTM deep neural networks for heartbeats classification
    Hiriyannaiah, Srinidhi
    Siddesh, G. M.
    Kiran, M. H. M.
    Srinivasa, K. G.
    HEALTH AND TECHNOLOGY, 2021, 11 (03) : 663 - 671
  • [30] On the Depth of Deep Neural Networks: A Theoretical View
    Sun, Shizhao
    Chen, Wei
    Wang, Liwei
    Liu, Xiaoguang
    Liu, Tie-Yan
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 2066 - 2072