Learning Hard-Constrained Models with One Sample

Cited by: 0
Authors
Galanis, Andreas [1 ]
Kalavasis, Alkis [2 ]
Kandiros, Anthimos Vardis [3 ]
Affiliations
[1] Univ Oxford, Oxford, England
[2] Yale Univ, New Haven, CT USA
[3] MIT, Cambridge, MA USA
Keywords
Complexity; Colorings; Algorithm; Dynamics; Entropy; Number
DOI
Not available
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
We consider the problem of estimating the parameters of a Markov Random Field with hard constraints using a single sample. As our main running examples, we use the k-SAT and proper coloring models, as well as general H-coloring models; for all of these we obtain both positive and negative results. In contrast to the soft-constrained case, we show in particular that single-sample estimation is not always possible, and that the existence of an estimator is related to the existence of non-satisfiable instances. Our algorithms are based on the pseudo-likelihood estimator. We show variance bounds for this estimator using coupling techniques inspired, in the case of k-SAT, by Moitra's sampling algorithm (JACM, 2019); our positive results for colorings build on this new coupling approach. For q-colorings on graphs with maximum degree d, we give a linear-time estimator when q > d + 1, whereas the problem is non-identifiable when q ≤ d + 1. For general H-colorings, we show that standard conditions that guarantee sampling, such as Dobrushin's condition, are insufficient for one-sample learning; on the positive side, we provide a general condition that is sufficient to guarantee linear-time learning and obtain applications for proper colorings and permissive models. For the k-SAT model on formulas with maximum degree d, we provide a linear-time estimator when k ≳ 6.45 log d, whereas the problem becomes non-identifiable when k ≲ log d.
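The positive results above rest on the pseudo-likelihood estimator, which replaces the intractable joint likelihood by a product of single-site conditional probabilities. As a minimal illustrative sketch of that idea only — using the hardcore (independent-set) model with fugacity λ rather than the paper's k-SAT or coloring estimators, and with the function name, graph representation, and toy instance all invented here — the one-sample estimate maximizes the product of per-vertex conditionals, which in this model reduces to a simple ratio of counts:

```python
def pseudo_likelihood_lambda(adj, x):
    """One-sample pseudo-likelihood estimate of the fugacity lambda
    for the hardcore model (illustrative sketch, not the paper's
    k-SAT or coloring estimators).

    adj: dict mapping each vertex to a list of its neighbors
    x:   dict mapping each vertex to 0/1 occupancy; x must be an
         independent set (the hard constraint)

    A vertex whose neighborhood is fully unoccupied is "free": its
    conditional law is Bernoulli(lambda / (1 + lambda)). Maximizing
    the product of these conditionals over free vertices gives
    lambda_hat = (# occupied free vertices) / (# empty free vertices).
    """
    n_occ = n_free_empty = 0
    for v, nbrs in adj.items():
        if all(x[u] == 0 for u in nbrs):  # free (unconstrained) vertex
            if x[v] == 1:
                n_occ += 1
            else:
                n_free_empty += 1
    if n_free_empty == 0:
        # Degenerate sample: the pseudo-likelihood has no finite maximizer.
        raise ValueError("estimator undefined: no empty free vertex")
    return n_occ / n_free_empty

# Toy usage: a 6-cycle with a single occupied vertex.
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
x = {0: 1, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0}
print(pseudo_likelihood_lambda(adj, x))  # lambda_hat = 1/3 on this instance
```

Note that the estimator only looks at local neighborhoods, which is what makes it linear-time; the hard part, which the paper addresses via coupling arguments, is bounding its variance from a single correlated sample.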
Pages: 3184-3196 (13 pages)