Feature Selection for Classification with QAOA

Cited by: 3
Authors
Turati, Gloria [1]
Dacrema, Maurizio Ferrari [1]
Cremonesi, Paolo [1]
Affiliations
[1] Politecnico di Milano, Milan, Italy
Keywords
QAOA; Feature selection; QUBO; Classification
DOI
10.1109/QCE53715.2022.00117
Chinese Library Classification (CLC)
TP301 [Theory and methods]
Discipline code
081202
Abstract
Feature selection is of great importance in Machine Learning, where it can be used to reduce the dimensionality of classification, ranking and prediction problems. Removing redundant and noisy features can improve both the accuracy and the scalability of the trained models. However, feature selection is a computationally expensive task whose solution space grows combinatorially. In this work, we consider in particular a quadratic feature selection problem that can be tackled with the Quantum Approximate Optimization Algorithm (QAOA), which has already been employed for combinatorial optimization. First, we express the feature selection problem in the QUBO formulation, which is then mapped to an Ising spin Hamiltonian. We then apply QAOA to find the ground state of this Hamiltonian, which corresponds to the optimal selection of features. In our experiments, we consider seven real-world datasets with dimensionality up to 21 and run QAOA both on a quantum simulator and, for the smaller datasets, on the 7-qubit IBM quantum computer ibm-perth. We use the set of selected features to train a classification model and evaluate its accuracy. Our analysis shows that it is possible to tackle the feature selection problem with QAOA and that currently available quantum devices can be used effectively. Future studies could test a wider range of classification models and improve the effectiveness of QAOA by exploring better-performing optimizers for its classical step.
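The abstract describes a three-step pipeline: build a quadratic (QUBO) feature selection objective, map it to an Ising Hamiltonian, and search for the ground state with QAOA. The sketch below illustrates one way such a pipeline can be assembled with qiskit-optimization and qiskit-algorithms on a simulator. The QUBO coefficients (feature-label mutual information as the linear importance term, pairwise feature correlation as the quadratic redundancy penalty, balanced by a trade-off parameter ALPHA), the dataset, and the parameter values are illustrative assumptions, not the exact formulation or data used in the paper; library APIs may also differ across versions (a V1 Sampler, i.e. qiskit < 2.0, is assumed).

```python
# Minimal sketch: quadratic (QUBO) feature selection solved with QAOA.
# Assumptions, not taken from the paper: mutual information as the
# importance (reward) term, absolute pairwise correlation as the
# redundancy (penalty) term, and a hypothetical trade-off ALPHA.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.feature_selection import mutual_info_classif
from qiskit.primitives import Sampler                      # V1 reference sampler
from qiskit_algorithms.minimum_eigensolvers import QAOA
from qiskit_algorithms.optimizers import COBYLA
from qiskit_optimization import QuadraticProgram
from qiskit_optimization.algorithms import MinimumEigenOptimizer

ALPHA = 0.5  # hypothetical importance/redundancy trade-off

X, y = load_wine(return_X_y=True)
X = X[:, :6]  # keep the problem small enough for a QAOA simulation
n = X.shape[1]

importance = mutual_info_classif(X, y)                     # linear terms (reward)
redundancy = np.abs(np.corrcoef(X, rowvar=False))          # quadratic terms (penalty)

# QUBO: minimize  -ALPHA * sum_i I_i x_i  +  (1 - ALPHA) * sum_{i<j} R_ij x_i x_j
qp = QuadraticProgram("feature_selection")
for i in range(n):
    qp.binary_var(name=f"x{i}")
linear = {f"x{i}": -ALPHA * importance[i] for i in range(n)}
quadratic = {(f"x{i}", f"x{j}"): (1 - ALPHA) * redundancy[i, j]
             for i in range(n) for j in range(i + 1, n)}
qp.minimize(linear=linear, quadratic=quadratic)

# QAOA searches for the ground state of the Ising Hamiltonian that
# MinimumEigenOptimizer derives from the QUBO; the ground state
# bitstring encodes the selected feature subset.
qaoa = QAOA(sampler=Sampler(), optimizer=COBYLA(maxiter=100), reps=1)
result = MinimumEigenOptimizer(qaoa).solve(qp)
selected = [i for i, bit in enumerate(result.x) if bit > 0.5]
print("selected feature indices:", selected)
```

On real hardware, the reference Sampler would be replaced by a backend-aware sampler for the target device; the abstract reports runs on the 7-qubit ibm-perth machine for the smaller datasets, while larger instances were handled on a simulator.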
Pages: 782-785
Number of pages: 4