Loss Minimization Yields Multicalibration for Large Neural Networks

Cited by: 0
Authors
Blasiok, Jaroslaw [1 ]
Gopalan, Parikshit [2 ]
Hu, Lunjia [3 ]
Kalai, Adam Tauman [4 ]
Nakkiran, Preetum [2 ]
Affiliations
[1] Swiss Fed Inst Technol, Zurich, Switzerland
[2] Apple, Palo Alto, CA USA
[3] Stanford Univ, Stanford, CA 94305 USA
[4] Microsoft Res, Cambridge, MA USA
Source
15TH INNOVATIONS IN THEORETICAL COMPUTER SCIENCE CONFERENCE, ITCS 2024 | 2024
Keywords
Multi-group fairness; loss minimization; neural networks
DOI
10.4230/LIPIcs.ITCS.2024.17
CLC Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Multicalibration is a notion of fairness for predictors that requires them to provide calibrated predictions across a large set of protected groups. Multicalibration is known to be a goal distinct from loss minimization, even for simple predictors such as linear functions. In this work, we consider the setting where the protected groups can be represented by neural networks of size k, and the predictors are neural networks of size n > k. We show that minimizing the squared loss over all neural nets of size n implies multicalibration for all but a bounded number of unlucky values of n. We also give evidence that our bound on the number of unlucky values is tight, given our proof technique. Previously, results of this flavor, that loss minimization yields multicalibration, were known only for predictors that were near the ground truth, and hence were rather limited in applicability. Unlike these, our results rely on the expressivity of neural nets and make use of the representation of the predictor.
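The abstract describes multicalibration as requiring calibrated predictions across groups, where groups are represented by small (size-k) neural networks and the predictor by a larger (size-n) network. Below is a minimal sketch of how a multicalibration violation might be estimated empirically. The function name multicalibration_error, the binning of predictions, and the use of arbitrary [0, 1]-valued callables as stand-ins for the group networks and the predictor are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def multicalibration_error(f, groups, X, y, n_bins=10):
    """Estimate a binned multicalibration error of predictor f with respect
    to a collection of [0, 1]-valued group functions.

    Illustrative estimator: for each group c and each prediction bin B, it
    measures the weighted average residual E[c(x) * (y - f(x)) * 1{f(x) in B}]
    and reports the largest absolute value over all groups and bins.
    """
    preds = f(X)                      # predictions assumed to lie in [0, 1]
    residuals = y - preds             # calibration residuals
    bins = np.clip((preds * n_bins).astype(int), 0, n_bins - 1)

    worst = 0.0
    for c in groups:                  # each c: X -> [0, 1], e.g. a small neural net
        weights = c(X)
        for b in range(n_bins):
            mask = bins == b
            if mask.any():
                # average over the full sample, so rare bins contribute little
                err = np.abs(np.mean(weights * residuals * mask))
                worst = max(worst, err)
    return worst

# Toy usage: simple callables stand in for trained networks.
rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 3))
y = (X[:, 0] > 0.5).astype(float)
f = lambda X: X[:, 0]                                   # stand-in for a size-n predictor
groups = [lambda X: X[:, 1], lambda X: 1.0 - X[:, 2]]   # stand-ins for size-k group nets
print(multicalibration_error(f, groups, X, y))
```

A small value of this quantity for every group and bin is one common empirical proxy for multicalibration; the paper's theoretical guarantee concerns predictors obtained by squared-loss minimization over size-n networks, not this particular estimator.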
Pages: 21