Performance of Robust Two-dimensional Principal Component for Classification

Cited by: 0
Authors
Herwindiati, Dyah E. [1 ]
Isa, Sani M. [1 ]
Hendryli, Janson [1 ]
Affiliations
[1] Tarumanagara Univ, Fac Informat Technol, Jakarta, Indonesia
Keywords
2DPCA; PCA; outlier; robust; sensitivity; vector variance; Wishart distribution
DOI: not available
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
This paper discusses robust dimension reduction for the classification of two-dimensional data, where classification is carried out with reference to the original data. Assigning class membership is difficult when several variables carry the same information and can be written as near linear combinations of other variables. The standard remedy is dimension reduction, and one of its most common forms is principal component analysis (PCA). Two-dimensional PCA (2DPCA) is a variant of PCA in which images are treated directly as 2D matrices, so the image covariance matrix can be constructed directly from the original image matrices. The presence of outliers has been shown to pose a serious problem for dimension reduction: the first component, which captures the greatest variation, is often pulled toward the anomalous observations. The robust minimum vector variance (MVV) estimator, combined with a two-dimensional projection approach, is used to address this problem. Computational experiments show that the robust method performs well for matrix data classification.
Pages: 434-440 (7 pages)
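
The paper's implementation is not included in this record. As a rough illustration of the plain 2DPCA step described in the abstract, the Python sketch below builds the image covariance directly from the 2D image matrices and projects each image onto the leading eigenvectors. The robust MVV weighting proposed in the paper is not reproduced here; the function name `two_dpca` and the random stand-in data are illustrative assumptions only.

```python
import numpy as np

def two_dpca(images, k):
    """Plain (non-robust) 2DPCA sketch.

    images : array of shape (n, h, w), each slice a 2D image matrix
    k      : number of projection directions to keep
    Returns the (w, k) projection matrix and the (n, h, k) feature matrices.
    """
    mean_image = np.mean(images, axis=0)        # (h, w) mean image
    centered = images - mean_image              # center each image matrix
    # Image covariance built directly from the 2D matrices: average of A^T A
    cov = np.mean([a.T @ a for a in centered], axis=0)   # (w, w)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :k]                 # top-k eigenvectors as columns
    features = centered @ W                     # each image -> (h, k) feature matrix
    return W, features

# Usage on random data standing in for n grayscale images of size h x w.
# A robust variant would replace the plain mean/covariance above with estimates
# computed on a "clean" subset, e.g. one chosen by an MVV-type criterion.
rng = np.random.default_rng(0)
images = rng.normal(size=(100, 32, 32))
W, feats = two_dpca(images, k=5)
print(W.shape, feats.shape)   # (32, 5) (100, 32, 5)
```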