Analysis of fine-tuning measures in models with extended Higgs sectors

Cited by: 2
|
Authors
Boer, Daniel [1 ]
Peeters, Ruud [1 ]
Zeinstra, Sybrand [2 ]
Affiliations
[1] Univ Groningen, Van Swinderen Inst Particle Phys & Grav, Nijenborgh 4, NL-9747 AG Groningen, Netherlands
[2] Westfalische Wilhelms Univ Munster, Inst Theoret Phys, Wilhelm Klemm Str 9, D-48149 Munster, Germany
Keywords
BOUNDS;
DOI
10.1016/j.nuclphysb.2019.114695
CLC classification
O412 [Relativity and field theory]; O572.2 [Particle physics];
Abstract
In the literature, measures of fine-tuning have been discussed as tools to assess the feasibility of theories beyond the Standard Model. In this paper we focus on two specific measures and investigate what kind of fine-tuning they actually quantify. First we apply both measures to the two-Higgs-doublet model, for which the numerical results can be analyzed in terms of available analytic expressions. After drawing various conclusions about the fine-tuning measures, we investigate a particular left-right symmetric model for which it has been claimed that it suffers from a high amount of fine-tuning already at tree level. We reach a different conclusion, although left-right symmetric models may require a modest amount of fine-tuning once phenomenological constraints are imposed. Our analysis shows that the two considered measures can probe different aspects of fine-tuning and that both are useful if applied and interpreted in the appropriate way. (C) 2019 The Author(s). Published by Elsevier B.V.
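The abstract does not state which two measures are analyzed. For orientation only, a widely used measure of this type is the Barbieri-Giudice logarithmic sensitivity of an observable with respect to a model's input parameters; the definition below is an illustrative sketch and is not taken from the paper itself.

\begin{equation}
  % Barbieri-Giudice-type sensitivity (illustrative; the measures studied
  % in the paper may differ in detail)
  \Delta_{\mathrm{BG}}
    = \max_i \left| \frac{\partial \ln \mathcal{O}}{\partial \ln p_i} \right|
    = \max_i \left| \frac{p_i}{\mathcal{O}} \,
      \frac{\partial \mathcal{O}}{\partial p_i} \right|,
\end{equation}

where \(\mathcal{O}\) is an observable such as the electroweak scale \(m_Z^2\) and the \(p_i\) are the fundamental parameters of the model; a large value signals that the observable is obtained only through delicate cancellations among the inputs.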
Pages: 17