Astroconformer: The prospects of analysing stellar light curves with transformer-based deep learning models

Cited: 2
Authors
Pan, Jia-Shu [1 ,2 ]
Ting, Yuan-Sen [1 ,3 ,4 ]
Yu, Jie [3 ,5 ,6 ]
Affiliations
[1] Australian Natl Univ, Res Sch Astron & Astrophys, Cotter Rd, Weston, ACT 2611, Australia
[2] Nanjing Univ, Sch Astron & Space Sci, Nanjing 210093, Peoples R China
[3] Australian Natl Univ, Sch Comp, Acton, ACT 2601, Australia
[4] Ohio State Univ, Dept Astron, Columbus, OH 43210 USA
[5] Max Planck Inst Solar Syst Res, Justus von Liebig Weg 3, D-37077 Gottingen, Germany
[6] Heidelberg Inst Theoret Studies HITS gGmbH, Schloss Wolfsbrunnenweg 35, D-69118 Heidelberg, Germany
Funding
Australian Research Council
Keywords
asteroseismology; methods: data analysis; magnetic fields; stars; cores
DOI
10.1093/mnras/stae068
Chinese Library Classification
P1 [Astronomy]
Discipline code
0704
Abstract
Stellar light curves contain valuable information about oscillations and granulation, offering insights into stars' internal structures and evolutionary states. Traditional asteroseismic techniques, primarily focused on power spectral analysis, often overlook the crucial phase information in these light curves. Addressing this gap, recent machine learning applications, particularly those using Convolutional Neural Networks (CNNs), have made strides in inferring stellar properties from light curves. However, CNNs are limited by their localized feature extraction capabilities. In response, we introduce Astroconformer, a Transformer-based deep learning framework specifically designed to capture long-range dependencies in stellar light curves. Our empirical analysis centres on estimating surface gravity (log g), using a data set derived from single-quarter Kepler light curves with log g values ranging from 0.2 to 4.4. Astroconformer demonstrates superior performance, achieving a root-mean-square error (RMSE) of 0.017 dex at log g ≈ 3 in data-rich regimes and up to 0.1 dex in sparser areas. This performance surpasses both K-nearest neighbour models and advanced CNNs. Ablation studies highlight the influence of receptive field size on model effectiveness, with larger fields correlating to improved results. Astroconformer also excels in extracting ν_max with high precision, achieving less than 2 per cent relative median absolute error for 90-d red giant light curves. Notably, the error remains under 3 per cent for 30-d light curves, whose oscillations are undetectable by a conventional pipeline in 30 per cent of cases. Furthermore, the attention mechanisms in Astroconformer align closely with the characteristics of stellar oscillations and granulation observed in light curves.
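The abstract's core architectural claim is that self-attention gives every timestep a global receptive field over the light curve, whereas a CNN kernel only sees a local window. A minimal numpy sketch of single-head self-attention illustrates this; the random projection matrices stand in for learned weights, and the toy flux series and feature embedding are illustrative assumptions, not the paper's actual preprocessing or the Astroconformer architecture.

```python
import numpy as np

def self_attention(x, d_k=16, seed=0):
    """Single-head self-attention over a 1-D light-curve embedding.

    x: (T, d) array of per-timestep features. The (T, T) score matrix
    lets every output timestep attend to every input timestep, i.e. a
    global receptive field, unlike a CNN kernel's local window.
    """
    rng = np.random.default_rng(seed)
    T, d = x.shape
    # Random projections stand in for learned query/key/value weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) for _ in range(3))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(d_k)                 # (T, T) pairwise interactions
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over all timesteps
    return weights @ V                              # (T, d_k) attended features

# Toy "light curve": 200 timesteps, embedded into 8 lagged-flux features.
flux = np.sin(np.linspace(0, 20, 200))
x = np.stack([np.roll(flux, k) for k in range(8)], axis=1)
out = self_attention(x)
print(out.shape)  # (200, 16)
```

In a full Conformer-style model these attention layers are interleaved with convolutional modules, combining global context with local feature extraction; this sketch isolates only the attention step.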
Pages: 5890-5903
Page count: 14