Reducing cold start delay in serverless computing using lightweight virtual machines

Times Cited: 0
Authors
Karamzadeh, Amirmohammad [1 ]
Shameli-Sendi, Alireza [1 ]
Affiliations
[1] Shahid Beheshti Univ SBU, Fac Comp Sci & Engn, Tehran, Iran
Keywords
Serverless computing; Function as a service; Cold start delay; Lightweight virtual machine; Machine learning;
DOI
10.1016/j.jnca.2024.104030
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
In recent years, serverless computing has gained considerable attention in academic, professional, and business circles. Unique features such as code development flexibility and the cost-efficient pay-as-you-go pricing model have led to predictions of widespread adoption of serverless services. Major players in the cloud computing sector, including Amazon, Google, and Microsoft, have made significant advances in serverless offerings. However, serverless computing faces complex challenges, two prominent ones being the latency caused by cold starts and the security vulnerabilities associated with container escapes. These challenges hinder the smooth execution of isolated functions, a concern amplified by sandboxing technologies such as Google gVisor and Kata Containers. While the adoption of lightweight virtual machines has alleviated concerns about container escape vulnerabilities, the primary issue remains the increased delay experienced during cold starts of serverless functions. The purpose of this research is to propose an architecture that reduces cold start overhead while using lightweight virtual machines within a commercial serverless stack, yielding a setup that closely resembles real-world deployments. This research employs supervised learning to predict function invocations from the execution patterns of other functions in the same application, so that the target function can be invoked proactively before the actual user request arrives, effectively converting cold starts into warm starts. In this study, we compared our approach with two baseline strategies: a fixed window and a variable window. Commercial platforms such as Knative, OpenFaaS, and OpenWhisk typically employ a fixed 15-minute window to keep instances warm. In contrast to these platforms, our approach demonstrated a significant reduction in cold start incidents. Specifically, when calling a function 200 times with 5, 10, and 20 invocations within one hour, our approach reduced cold starts by 83.33%, 92.13%, and 90.90%, respectively. Compared to the variable window approach, which adjusts the window based on observed cold starts, our approach prevented 82.92%, 91.66%, and 90.56% of cold starts in the same scenario. These results highlight the effectiveness of our approach in significantly reducing cold starts, thereby enhancing the performance and responsiveness of serverless functions. Our method outperformed both the fixed and variable window strategies, making it a valuable contribution to the field of serverless computing. Additionally, the pre-invocation strategy of converting cold starts into warm starts substantially reduces the execution time of functions running in lightweight virtual machines.
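The pre-warming idea described in the abstract can be illustrated with a short sketch. The paper does not publish code, so everything below is a hypothetical Python example: it assumes logistic regression as the supervised learner, a 60-second prediction slot, a 0.5 pre-warming threshold, and illustrative function names (add_to_cart, browse, checkout); the prewarm() helper merely stands in for whatever warm-up call the underlying FaaS platform exposes.

```python
# Minimal sketch of prediction-based pre-warming (assumptions noted above):
# a supervised model predicts whether a target function will be invoked in
# the next time slot from recent invocations of other functions in the same
# application, and pre-warms it so the request hits a warm instance.

import numpy as np
from sklearn.linear_model import LogisticRegression

SLOT_SECONDS = 60          # assumed prediction granularity
PREWARM_THRESHOLD = 0.5    # assumed probability threshold for pre-warming

def build_dataset(invocation_log, upstream_fns, target_fn, n_slots):
    """Turn per-slot invocation counts into (features, label) pairs.

    invocation_log[fn][t] = number of calls to `fn` during slot t.
    Features for slot t are the upstream functions' counts in slot t;
    the label is whether the target function is invoked in slot t+1.
    """
    X, y = [], []
    for t in range(n_slots - 1):
        X.append([invocation_log[fn][t] for fn in upstream_fns])
        y.append(1 if invocation_log[target_fn][t + 1] > 0 else 0)
    return np.array(X), np.array(y)

def prewarm(fn_name):
    # Placeholder: in a real deployment this would issue a warm-up request
    # (e.g. hit the function's endpoint or scale its lightweight-VM pool
    # to at least one instance) instead of printing.
    print(f"pre-warming {fn_name} to turn a cold start into a warm start")

# --- toy trace: "checkout" tends to follow a busy "add_to_cart" slot -------
rng = np.random.default_rng(0)
n_slots = 500
log = {"add_to_cart": rng.poisson(1.0, n_slots), "browse": rng.poisson(2.0, n_slots)}
log["checkout"] = np.roll((log["add_to_cart"] > 1).astype(int), 1)

X, y = build_dataset(log, ["add_to_cart", "browse"], "checkout", n_slots)
model = LogisticRegression().fit(X[:400], y[:400])

# Online loop: at the end of each slot, predict the next slot and pre-warm.
for t in range(400, n_slots - 1):
    features = [[log["add_to_cart"][t], log["browse"][t]]]
    p = model.predict_proba(features)[0][1]
    if p >= PREWARM_THRESHOLD:
        prewarm("checkout")
```

In a real deployment the prediction loop would run alongside the platform's scheduler, and the threshold would trade the cost of unnecessary warm instances against the cold starts avoided.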
Pages: 15
Related Papers
50 records in total
  • [1] Mitigating Cold-Start Delay using Warm-Start Containers in Serverless Platform
    Kumari, Anisha
    Sahoo, Bibhudatta
    Behera, Ranjan Kumar
    2022 IEEE 19TH INDIA COUNCIL INTERNATIONAL CONFERENCE, INDICON, 2022,
  • [2] Cold Start Prediction and Provisioning Optimization in Serverless Computing Using Deep Learning
    Kumar, N. Saravana
    Samy, S. Selvakumara
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2025, 37 (4-5):
  • [3] Mitigating Cold Start Problem in Serverless Computing with Function Fusion
    Lee, Seungjun
    Yoon, Daegun
    Yeo, Sangho
    Oh, Sangyoon
    SENSORS, 2021, 21 (24)
  • [4] FuncMem: Reducing Cold Start Latency in Serverless Computing Through Memory Prediction and Adaptive Task Execution
    Pandey, Manish
    Kwon, Young-Woo
    39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 131 - 138
  • [5] Cold Start in Serverless Computing: Current Trends and Mitigation Strategies
    Vahidinia, Parichehr
    Farahani, Bahar
    Aliee, Fereidoon Shams
    2020 INTERNATIONAL CONFERENCE ON OMNI-LAYER INTELLIGENT SYSTEMS (IEEE COINS 2020), 2020, : 86 - 92
  • [6] WLEC: A Not So Cold Architecture to Mitigate Cold Start Problem in Serverless Computing
    Solaiman, Khondokar
    Adnan, Muhammad Abdullah
    2020 IEEE INTERNATIONAL CONFERENCE ON CLOUD ENGINEERING (IC2E 2020), 2020, : 144 - 153
  • [7] Reducing the cost of cold start time in serverless function executions using granularity trees
    Hanaforoosh, Mahrad
    Azgomi, Mohammad Abdollahi
    Ashtiani, Mehrdad
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 164
  • [8] Mitigating Cold Start Problem in Serverless Computing: A Reinforcement Learning Approach
    Vahidinia, Parichehr
    Farahani, Bahar
    Aliee, Fereidoon Shams
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (05) : 3917 - 3927
  • [9] Taming Serverless Cold Start of Cloud Model Inference With Edge Computing
    Zhao, Kongyange
    Zhou, Zhi
    Jiao, Lei
    Cai, Shen
    Xu, Fei
    Chen, Xu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (08) : 8111 - 8128
  • [10] Tackling Cold Start in Serverless Computing with Multi-Level Container Reuse
    Zhou, Amelie Chi
    Huang, Rongzheng
    Ke, Zhoubin
    Li, Yusen
    Wang, Yi
    Mao, Rui
    PROCEEDINGS 2024 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM, IPDPS 2024, 2024, : 89 - 99