Forecasting Earnings Using k-Nearest Neighbors

Cited by: 1
Authors
Easton, Peter D. [1 ]
Kapons, Martin M. [2 ]
Monahan, Steven J. [3 ]
Schutt, Harm H. [4 ]
Weisbrod, Eric H. [5 ]
Affiliations
[1] Univ Notre Dame, Mendoza Coll Business, Dept Accountancy, Notre Dame, IN 46556 USA
[2] Univ Amsterdam, Amsterdam Business Sch, Accounting Sect, Amsterdam, Netherlands
[3] Univ Utah, David Eccles Sch Business, Sch Accounting, Salt Lake City, UT USA
[4] Tilburg Univ, Tilburg Sch Econ & Management, Dept Accountancy, Tilburg, Netherlands
[5] Univ Kansas, Sch Business, Accounting Acad Area, Lawrence, KS USA
Source
ACCOUNTING REVIEW | 2024, Vol. 99, No. 3
Keywords
earnings; forecasting; machine learning; nearest neighbors; TIME-SERIES PROPERTIES; IMPLIED COST; MODELS; PERFORMANCE;
DOI
10.2308/TAR-2021-0478
Chinese Library Classification
F8 [Public Finance, Banking];
Discipline Code
0202;
Abstract
We use a simple k-nearest neighbors algorithm (hereafter, k-NN*) to forecast earnings. k-NN* forecasts of one-, two-, and three-year-ahead earnings are more accurate than those generated by popular extant forecasting approaches. k-NN* forecasts of two- and three-year-ahead (one-year-ahead) EPS and aggregate three-year EPS are more (less) accurate than those generated by analysts. The association between the unexpected earnings implied by k-NN* and the contemporaneous market-adjusted return (i.e., the earnings association coefficient (EAC)) is positive and exceeds the EAC on unexpected earnings implied by alternate approaches. A trading strategy that is long (short) firms for which k-NN* predicts positive (negative) earnings growth earns positive risk-adjusted returns that exceed those earned by similar trading strategies based on alternate forecasts. The k-NN* algorithm generates an empirically reliable ex ante indicator of forecast accuracy that identifies situations when the k-NN* EAC is larger and the k-NN* trading strategy is more profitable.
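The abstract describes the approach only at a high level. As a rough illustration of the general k-NN forecasting mechanics (matching a target firm's recent EPS trajectory against a pool of historical firm-years and averaging the neighbors' realized next-period earnings), a minimal sketch might look like the following. The function name, distance choice, and toy data are hypothetical; this is not the authors' exact specification:

```python
import math

def knn_forecast(history, candidates, k=3):
    """Forecast next-period EPS for a target firm as the mean of the
    realized next-period EPS of the k firm-years whose past EPS
    trajectories are closest (Euclidean distance) to the target's.

    history    -- list of the target firm's recent EPS values
    candidates -- list of (past_eps_list, next_eps) pairs from other firm-years
    """
    def dist(a, b):
        # Euclidean distance between two equal-length EPS trajectories
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Keep the k candidates nearest to the target trajectory
    neighbors = sorted(candidates, key=lambda c: dist(history, c[0]))[:k]
    # Forecast = mean of the neighbors' realized next-period earnings
    return sum(next_eps for _, next_eps in neighbors) / k

# Hypothetical pool of historical firm-years: (past EPS path, next-year EPS)
pool = [
    ([1.0, 1.1, 1.3], 1.5),
    ([0.9, 1.2, 1.4], 1.6),
    ([2.0, 1.0, 0.5], 0.3),
    ([1.1, 1.2, 1.5], 1.7),
]
print(knn_forecast([1.0, 1.2, 1.4], pool, k=3))  # mean of the 3 closest paths
```

In practice the paper also compares such forecasts against analyst forecasts and mechanical time-series models; the sketch above only conveys the neighbor-matching step.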
Pages: 115-140
Page count: 26
Related Papers
50 items in total
  • [21] Electrical load forecasting: A deep learning approach based on K-nearest neighbors
    Dong, Yunxuan
    Ma, Xuejiao
    Fu, Tonglin
    [J]. APPLIED SOFT COMPUTING, 2021, 99
  • [22] Fast agglomerative clustering using information of k-nearest neighbors
    Chang, Chih-Tang
    Lai, Jim Z. C.
    Jeng, M. D.
    [J]. PATTERN RECOGNITION, 2010, 43 (12) : 3958 - 3968
  • [23] RSSI-based Localization Using K-Nearest Neighbors
    Achroufene, Achour
    [J]. AD HOC & SENSOR WIRELESS NETWORKS, 2023, 56 (1-2) : 105 - 135
  • [24] Toward Predicting Medical Conditions Using k-Nearest Neighbors
    Tayeb, Shahab
    Pirouz, Matin
    Sun, Johann
    Hall, Kaylee
    Chang, Andrew
    Li, Jessica
    Song, Connor
    Chauhan, Apoorva
    Ferra, Michael
    Sager, Theresa
    Zhan, Justin
    Latifi, Shahram
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2017, : 3897 - 3903
  • [25] Movie Recommender System Using K-Nearest Neighbors Variants
    Airen, Sonu
    Agrawal, Jitendra
    [J]. National Academy Science Letters, 2022, 45 : 75 - 82
  • [26] Classification using the local probabilistic centers of k-nearest neighbors
    Li, Bo Yu
    Chen, Yun Wen
    [J]. 18TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 3, PROCEEDINGS, 2006, : 1220 - +
  • [27] Information theoretic clustering using a k-nearest neighbors approach
    Vikjord, Vidar V.
    Jenssen, Robert
    [J]. PATTERN RECOGNITION, 2014, 47 (09) : 3070 - 3081
  • [28] Classifications of Motor Imagery Tasks Using K-Nearest Neighbors
    Aldea, Roxana
    Fira, Monica
    Lazar, Anca
    [J]. 2014 12TH SYMPOSIUM ON NEURAL NETWORK APPLICATIONS IN ELECTRICAL ENGINEERING (NEUREL), 2014, : 115 - 119
  • [29] A Placement Prediction System Using K-Nearest Neighbors Classifier
    Giri, Animesh
    Bhagavath, M. Vignesh V.
    Pruthvi, Bysani
    Dubey, Naini
    [J]. 2016 SECOND INTERNATIONAL CONFERENCE ON COGNITIVE COMPUTING AND INFORMATION PROCESSING (CCIP), 2016,