Online and Distribution-Free Robustness: Regression and Contextual Bandits with Huber Contamination

Cited: 9
Authors
Chen, Sitan [1 ]
Koehler, Frederic [1 ]
Moitra, Ankur [2 ]
Yau, Morris [2 ]
Affiliations
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
[2] MIT, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Keywords
robust statistics; regression; contextual bandits; online learning; Huber contamination; asymptotics
DOI
10.1109/FOCS52979.2021.00072
CLC classification number
TP301 [Theory and Methods]
Discipline classification code
081202
Abstract
In this work we revisit two classic high-dimensional online learning problems, namely linear regression and contextual bandits, from the perspective of adversarial robustness. Existing works in algorithmic robust statistics make strong distributional assumptions that ensure that the input data is evenly spread out or comes from a nice generative model. Is it possible to achieve strong robustness guarantees even without any distributional assumptions, where the sequence of tasks we are asked to solve is chosen adaptively and adversarially? We answer this question in the affirmative for both linear regression and contextual bandits. In fact, our algorithms succeed where conventional methods fail. In particular, we show strong lower bounds against Huber regression and, more generally, any convex M-estimator. Our approach is based on a novel alternating minimization scheme that interleaves ordinary least squares with a simple convex program that finds the optimal reweighting of the distribution under a spectral constraint. Our results achieve essentially optimal dependence on the contamination level η, reach the optimal breakdown point, and naturally apply to infinite-dimensional settings where the feature vectors are represented implicitly via a kernel map.
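The alternating scheme described in the abstract can be sketched roughly as follows. This is a hypothetical, simplified illustration, not the paper's exact algorithm: the function name robust_alternating_regression, the constant spectral_slack, and the particular convex program (down-weight large-residual points while keeping the reweighted second-moment matrix spectrally dominated by a multiple of the full one) are illustrative assumptions standing in for the precise formulation and constants in the paper.

# Sketch only: alternate (1) weighted least squares with (2) a convex reweighting
# step under a spectral (PSD) constraint. Constants and constraints are
# illustrative choices, not the paper's exact program. Requires numpy and cvxpy.
import numpy as np
import cvxpy as cp


def robust_alternating_regression(X, y, eta, n_iters=5, spectral_slack=2.0):
    n, d = X.shape
    Sigma = X.T @ X / n                 # full (unweighted) second-moment matrix
    w = np.full(n, 1.0 / n)             # start from uniform weights
    theta = np.zeros(d)
    for _ in range(n_iters):
        # Step 1: ordinary least squares on the reweighted data.
        sw = np.sqrt(w)
        theta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)

        # Step 2: convex reweighting. Minimize the weighted squared residuals
        # over weights that keep at least a (1 - eta) fraction of the mass and
        # whose reweighted second-moment matrix stays spectrally bounded.
        resid_sq = (y - X @ theta) ** 2
        w_var = cp.Variable(n, nonneg=True)
        weighted_moment = X.T @ cp.diag(w_var) @ X   # sum_i w_i x_i x_i^T, affine in w
        constraints = [
            w_var <= 1.0 / n,                          # no point above uniform weight
            cp.sum(w_var) >= 1.0 - eta,                # discard at most an eta fraction
            weighted_moment << spectral_slack * Sigma, # spectral (PSD) constraint
        ]
        objective = cp.Minimize(cp.sum(cp.multiply(resid_sq, w_var)))
        cp.Problem(objective, constraints).solve(solver=cp.SCS)
        w = np.clip(w_var.value, 0.0, 1.0 / n)
    return theta


if __name__ == "__main__":
    # Toy usage: corrupt an eta fraction of the responses and recover theta_star.
    rng = np.random.default_rng(0)
    n, d, eta = 400, 5, 0.1
    theta_star = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = X @ theta_star + 0.1 * rng.normal(size=n)
    y[: int(eta * n)] += 50.0
    print(np.linalg.norm(robust_alternating_regression(X, y, eta) - theta_star))

The reweighting step is a small semidefinite program (the PSD constraint lives in d dimensions), so each iteration is cheap relative to the least-squares solve; this mirrors the abstract's description of interleaving ordinary least squares with a simple convex program.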
Pages: 684 - 695
Number of pages: 12
Related papers
50 records in total
  • [31] Distribution-Free Estimation of the Box-Cox Regression Model with Censoring
    Chen, Songnian
    ECONOMETRIC THEORY, 2012, 28 (03) : 680 - 695
  • [32] Distribution-Free Consistency Results in Nonparametric Discrimination and Regression Function Estimation
    Devroye, L. P.
    Wagner, T. J.
    ANNALS OF STATISTICS, 1980, 8 (02) : 231 - 239
  • [33] Distribution-free tolerance intervals with nomination samples: Applications to mercury contamination in fish
    Nourmohammadi, Mohammad
    Jozani, Mohammad Jafari
    Johnson, Brad C.
    STATISTICAL METHODOLOGY, 2015, 26 : 16 - 33
  • [34] Exactly distribution-free inference in instrumental variables regression with possibly weak instruments
    Andrews, Donald W. K.
    Marmer, Vadim
    JOURNAL OF ECONOMETRICS, 2008, 142 (01) : 183 - 200
  • [35] A novel distribution-free hybrid regression model for manufacturing process efficiency improvement
    Chakraborty, Tanujit
    Chakraborty, Ashis Kumar
    Chattopadhyay, Swarup
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2019, 362 : 130 - 142
  • [36] Distribution-Free Tests for Parallelism and Concurrence in Two-Sample Regression Problem
    Rao, K. S. M.
    Gore, A. P.
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 1981, 5 (03) : 281 - 286
  • [37] Distribution-Free Model-Agnostic Regression Calibration via Nonparametric Methods
    Liu, Shang
    Cai, Zhongze
    Li, Xiaocheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [38] Distribution-Free Derivation of Dispersion of Regression Coefficients Without the Null Hypothesis
    Anderson, O.
    Bauer, R. K.
    METRIKA, 1963, 6 (01) : 10 - 17
  • [39] Image-to-Image Regression with Distribution-Free Uncertainty Quantification and Applications in Imaging
    Angelopoulos, Anastasios N.
    Kohli, Amit
    Bates, Stephen
    Jordan, Michael I.
    Malik, Jitendra
    Alshaabi, Thayer
    Upadhyayula, Srigokul
    Romano, Yaniv
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022 : 717 - 730
  • [40] Distribution-free risk assessment of regression-based machine learning algorithms
    Singh, Sukrita
    Sarna, Neeraj
    Li, Yuanyuan
    Lin, Yang
    Orfanoudaki, Agni
    Berger, Michael
    13TH SYMPOSIUM ON CONFORMAL AND PROBABILISTIC PREDICTION WITH APPLICATIONS, 2024, 230 : 44 - 64