Astroconformer: The prospects of analysing stellar light curves with transformer-based deep learning models

Cited by: 2
Authors
Pan, Jia-Shu [1 ,2 ]
Ting, Yuan-Sen [1 ,3 ,4 ]
Yu, Jie [3 ,5 ,6 ]
Affiliations
[1] Australian Natl Univ, Res Sch Astron & Astrophys, Cotter Rd, Weston, ACT 2611, Australia
[2] Nanjing Univ, Sch Astron & Space Sci, Nanjing 210093, Peoples R China
[3] Australian Natl Univ, Sch Comp, Acton, ACT 2601, Australia
[4] Ohio State Univ, Dept Astron, Columbus, OH 43210 USA
[5] Max Planck Inst Solar Syst Res, Justus von Liebig Weg 3, D-37077 Gottingen, Germany
[6] Heidelberg Inst Theoret Studies HITS gGmbH, Schloss Wolfsbrunnenweg 35, D-69118 Heidelberg, Germany
Funding
Australian Research Council
Keywords
asteroseismology; methods: data analysis; magnetic fields; stars; cores
DOI
10.1093/mnras/stae068
Chinese Library Classification
P1 [Astronomy]
Subject Classification Code
0704
Abstract
Stellar light curves contain valuable information about oscillations and granulation, offering insights into stars' internal structures and evolutionary states. Traditional asteroseismic techniques, primarily focused on power spectral analysis, often overlook the crucial phase information in these light curves. Addressing this gap, recent machine learning applications, particularly those using Convolutional Neural Networks (CNNs), have made strides in inferring stellar properties from light curves. However, CNNs are limited by their localized feature extraction capabilities. In response, we introduce Astroconformer, a Transformer-based deep learning framework specifically designed to capture long-range dependencies in stellar light curves. Our empirical analysis centres on estimating surface gravity (log g), using a data set derived from single-quarter Kepler light curves with log g values ranging from 0.2 to 4.4. Astroconformer demonstrates superior performance, achieving a root-mean-square error (RMSE) of 0.017 dex at log g ≈ 3 in data-rich regimes and up to 0.1 dex in sparser regions. This performance surpasses both K-nearest neighbour models and advanced CNNs. Ablation studies highlight the influence of receptive field size on model effectiveness, with larger fields correlating with improved results. Astroconformer also excels in extracting ν_max with high precision: it achieves a relative median absolute error below 2 per cent for 90-d red giant light curves, and the error remains under 3 per cent for 30-d light curves, whose oscillations are undetectable by a conventional pipeline in 30 per cent of cases. Furthermore, the attention mechanisms in Astroconformer align closely with the characteristics of stellar oscillations and granulation observed in light curves.
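
To make the architecture concrete: the abstract's central idea is to treat a light curve as a long token sequence so that self-attention can relate widely separated segments, capturing dependencies beyond a CNN's local receptive field. The PyTorch snippet below is a minimal sketch of that idea, not the authors' published Astroconformer code; the patch-based embedding, layer sizes, mean pooling, and omission of a positional encoding are all simplifying assumptions made here.

import torch
import torch.nn as nn

class LightCurveTransformer(nn.Module):
    """Toy Transformer regressor mapping a 1-D flux series to a scalar (e.g. log g)."""
    def __init__(self, patch=20, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        # Embed non-overlapping flux patches as tokens; self-attention can then
        # relate distant parts of the light curve in a single step.
        self.embed = nn.Conv1d(1, d_model, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)
        # NOTE: a positional encoding is omitted for brevity; a real model
        # needs one so that attention is aware of temporal order.

    def forward(self, flux):
        # flux: (batch, n_points) of normalized flux, e.g. one Kepler quarter
        tokens = self.embed(flux.unsqueeze(1)).transpose(1, 2)  # (B, T, d_model)
        return self.head(self.encoder(tokens).mean(dim=1)).squeeze(-1)  # (B,)

model = LightCurveTransformer()
print(model(torch.randn(8, 2000)).shape)  # torch.Size([8]) -- one estimate per curve

Since the model's name suggests a Conformer-style block that mixes convolution with self-attention, the plain Transformer encoder above should be read only as a schematic of the attention component, not as the paper's exact design.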
Pages: 5890-5903
Number of pages: 14