A refined approximation for Euclidean k-means

Cited by: 8
Authors
Grandoni, Fabrizio [1]
Ostrovsky, Rafail [2]
Rabani, Yuval [3]
Schulman, Leonard J. [4]
Venkat, Rakesh [5]
Affiliations
[1] IDSIA, USI SUPSI, Lugano, Switzerland
[2] Univ Calif Los Angeles, Los Angeles, CA 90024 USA
[3] Hebrew Univ Jerusalem, Jerusalem, Israel
[4] CALTECH, Pasadena, CA 91125 USA
[5] IIT Hyderabad, Kandi, India
Funding
US National Science Foundation
Keywords
Approximation algorithms; Euclidean k-means; Euclidean facility location; Integrality gaps; LOCAL SEARCH YIELDS;
DOI
10.1016/j.ipl.2022.106251
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject classification code
0812
Abstract
In the Euclidean k-Means problem we are given a collection D of n points in a Euclidean space and a positive integer k. Our goal is to identify a collection of k points in the same space (centers) so as to minimize the sum of the squared Euclidean distances between each point of D and its closest center. This problem is known to be APX-hard, and the current best approximation ratio is 6.357, achieved by a primal-dual algorithm based on a standard LP relaxation of the problem [Ahmadian et al., FOCS'17, SICOMP'20]. In this note we show how a minor modification of Ahmadian et al.'s analysis leads to a slightly improved 6.12903 approximation. As a related result, we also show that the aforementioned LP has integrality gap at least (16 + √5)/15 > 1.2157. © 2022 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
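As a quick illustration of the objective defined in the abstract, the following minimal Python sketch (not the paper's algorithm; the point set D, the centers C, and the helper name kmeans_cost are hypothetical examples) evaluates the k-means cost of a fixed set of centers and numerically checks the integrality-gap bound (16 + √5)/15 > 1.2157 quoted above.

# Minimal sketch of the Euclidean k-means objective: the cost of a center set C
# on a point set D is the sum over points of the squared distance to the nearest center.
import math

def kmeans_cost(points, centers):
    """Sum of squared Euclidean distances from each point to its closest center."""
    total = 0.0
    for p in points:
        total += min(sum((pi - ci) ** 2 for pi, ci in zip(p, c)) for c in centers)
    return total

if __name__ == "__main__":
    D = [(0.0, 0.0), (1.0, 0.0), (10.0, 0.0), (11.0, 0.0)]   # hypothetical data
    C = [(0.5, 0.0), (10.5, 0.0)]                            # k = 2 candidate centers
    print(kmeans_cost(D, C))                                 # 4 * 0.25 = 1.0

    # Numerical check of the integrality-gap bound stated in the abstract.
    print((16 + math.sqrt(5)) / 15)                          # ≈ 1.21574 > 1.2157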
Pages: 7
Related papers
50 records in total
  • [1] Improved Coresets for Euclidean k-Means
    Cohen-Addad, Vincent
    Larsen, Kasper Green
    Saulpic, David
    Schwiegelshohn, Chris
    Sheikh-Omar, Omar Ali
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [2] On Euclidean k-Means Clustering with α-Center Proximity
    Deshpande, Amit
    Louis, Anand
    Singh, Apoorv Vikram
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [3] Clustering Stable Instances of Euclidean k-means
    Dutta, Abhratanu
    Vijayaraghavan, Aravindan
    Wang, Alex
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [4] Refined Learning Bounds for Kernel and Approximate k-Means
    Liu, Yong
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [5] K-means Clustering Algorithm with Refined Initial Center
    Chen, Xuhui
    Xu, Yong
    [J]. PROCEEDINGS OF THE 2009 2ND INTERNATIONAL CONFERENCE ON BIOMEDICAL ENGINEERING AND INFORMATICS, VOLS 1-4, 2009, : 2203 - 2206
  • [6] Local search yields approximation schemes for k-means and k-median in Euclidean and minor-free metrics
    Cohen-Addad, Vincent
    Klein, Philip N.
    Mathieu, Claire
    [J]. 2016 IEEE 57TH ANNUAL SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE (FOCS), 2016, : 353 - 364
  • [7] Local search yields approximation schemes for k-means and k-median in Euclidean and minor-free metrics
    Cohen-Addad, Vincent
    Klein, Philip N.
    Mathieu, Claire
    [J]. SIAM JOURNAL ON COMPUTING, 2019, 48 (02) : 644 - 667
  • [8] Kernel K-Means Sampling for Nyström Approximation
    He, Li
    Zhang, Hong
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (05) : 2108 - 2120
  • [9] Approximation of Kernel k-Means for Streaming Data
    Havens, Timothy C.
    [J]. 2012 21ST INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR 2012), 2012, : 509 - 512
  • [10] Fast k-means algorithms with constant approximation
    Song, MJ
    Rajasekaran, S
    [J]. ALGORITHMS AND COMPUTATION, 2005, 3827 : 1029 - 1038