Efficient Index Learning via Model Reuse and Fine-tuning

Cited: 0
Authors
Liu, Guanli [1 ]
Qi, Jianzhong [1 ]
Kulik, Lars [1 ]
Soga, Kazuya [1 ]
Borovica-Gajic, Renata [1 ]
Rubinstein, Benjamin I. P. [1 ]
Affiliations
[1] Univ Melbourne, Melbourne, Vic, Australia
Funding
Australian Research Council
Keywords
learned index; model reuse; fine-tuning
DOI
10.1109/ICDEW58674.2023.00015
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Learned indices built with machine learning techniques have shown potential as alternatives to traditional indices such as B-trees in both query time and memory footprint. However, a well-fitted learned index requires significant training time to fit models and tune parameters, while fast training methods, i.e., ones that train in a single pass, may not learn the data distribution well. To balance fitness to the data distribution against building efficiency, in this paper we apply pre-trained models and fine-tuning, accelerating the building of learned indices by 30.4% and improving lookup efficiency by up to 24.4% on real datasets and 22.5% on skewed datasets.
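As a rough illustration of the mechanism the abstract describes, the Python sketch below builds a toy single-model learned index (a linear model predicting a key's rank, plus a bounded local search) and seeds training on a new dataset with weights pre-trained on a similar one, so that only a few fine-tuning steps are needed. This is a minimal sketch, not the authors' implementation; fit, LearnedIndex, and all parameter values are illustrative assumptions.

import numpy as np

def fit(keys, w=0.0, b=0.0, epochs=500, lr=0.5):
    """Fit relative rank ~= w * x + b by gradient descent, where x is the
    key normalized to [0, 1]. Passing pre-trained (w, b) with a small
    `epochs` value is the fine-tuning step (illustrative, not the paper's code)."""
    keys = np.sort(keys)
    x = (keys - keys[0]) / (keys[-1] - keys[0])   # keys -> [0, 1]
    y = np.arange(len(keys)) / (len(keys) - 1)    # ranks -> [0, 1]
    for _ in range(epochs):
        err = (w * x + b) - y                     # gradient steps on MSE
        w -= lr * 2.0 * np.mean(err * x)
        b -= lr * 2.0 * np.mean(err)
    return w, b

class LearnedIndex:
    def __init__(self, keys, w, b):
        self.keys = np.sort(keys)
        self.w, self.b = w, b
        # The worst-case prediction error over the stored keys bounds
        # the local search window, so lookups stay exact.
        preds = self._pos(self.keys)
        self.eps = int(np.max(np.abs(preds - np.arange(len(self.keys))))) + 1

    def _pos(self, key):
        x = (key - self.keys[0]) / (self.keys[-1] - self.keys[0])
        return np.clip((self.w * x + self.b) * (len(self.keys) - 1),
                       0, len(self.keys) - 1)

    def lookup(self, key):
        # Predict a position, then binary-search a small window around it.
        p = int(round(float(self._pos(key))))
        lo = max(0, p - self.eps)
        hi = min(len(self.keys), p + self.eps + 1)
        i = lo + int(np.searchsorted(self.keys[lo:hi], key))
        return i if i < len(self.keys) and self.keys[i] == key else -1

rng = np.random.default_rng(0)
base = rng.uniform(0.0, 1e6, 100_000)        # "old" dataset
w, b = fit(base, epochs=500)                 # pre-training from scratch
new = rng.uniform(0.0, 1e6, 100_000)         # new, similarly distributed data
w, b = fit(new, w=w, b=b, epochs=20)         # fine-tuning: reuse pre-trained weights
idx = LearnedIndex(new, w, b)
sorted_new = np.sort(new)
assert idx.lookup(sorted_new[1234]) == 1234

The point of the sketch is the last block: because the new dataset resembles the one the weights were pre-trained on, 20 fine-tuning epochs suffice where training from scratch needed 500, which is the build-time saving the abstract quantifies.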
Pages: 60-66
Page count: 7
Related Papers (50 in total)
  • [1] Radiya-Dixit, Evani; Wang, Xin. How fine can fine-tuning be? Learning efficient language models. International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 108, 2020: 2435-2442.
  • [2] Talukder, Md. Alamin; Islam, Md. Manowarul; Uddin, Md. Ashraf; Akhter, Arnisha; Pramanik, Md. Alamgir Jalil; Aryal, Sunil; Almoyad, Muhammad Ali Abdullah; Hasan, Khondokar Fida; Moni, Mohammad Ali. An efficient deep learning model to categorize brain tumor using reconstruction and fine-tuning. Expert Systems with Applications, 2023, 230.
  • [3] Fickinger, Arnaud; Hu, Hengyuan; Amos, Brandon; Russell, Stuart; Brown, Noam. Scalable Online Planning via Reinforcement Learning Fine-Tuning. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021.
  • [4] Ghalandari, Demian Gholipour; Hokamp, Chris; Ifrim, Georgiana. Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1 (Long Papers): 1267-1280.
  • [5] Vrbancic, Grega; Podgorelec, Vili. Transfer Learning With Adaptive Fine-Tuning. IEEE Access, 2020, 8: 196197-196211.
  • [6] Zhang, Dechi; Wan, Weibing. Fine-Tuning via Mask Language Model Enhanced Representations Based Contrastive Learning and Application. Computer Engineering and Applications, 2024, 60(17): 129-138.
  • [7] Zhang, Zhen-Ru; Tan, Chuanqi; Xu, Haiyang; Wang, Chengyu; Huang, Jun; Huang, Songfang. Towards Adaptive Prefix Tuning for Parameter-Efficient Language Model Fine-tuning. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 2: 1239-1248.
  • [8] Eisenschlos, Julian; Ruder, Sebastian; Czapla, Piotr; Kardas, Marcin; Gugger, Sylvain; Howard, Jeremy. MultiFiT: Efficient Multi-lingual Language Model Fine-tuning. Proceedings of EMNLP-IJCNLP 2019: 5702-5707.
  • [9] Zhu, Ligeng; Hu, Lanxiang; Lin, Ji; Wang, Wei-Chen; Chen, Wei-Ming; Gan, Chuang; Han, Song. PockEngine: Sparse and Efficient Fine-tuning in a Pocket. 56th IEEE/ACM International Symposium on Microarchitecture (MICRO 2023): 1381-1394.
  • [10] Vucetic, Danilo; Tayaranian, Mohammadreza; Ziaeefard, Maryam; Clark, James J.; Meyer, Brett H.; Gross, Warren J. Efficient Fine-Tuning of BERT Models on the Edge. 2022 IEEE International Symposium on Circuits and Systems (ISCAS 2022): 1838-1842.