Study on coded caching with parallel transmission

Cited: 0
Authors
Lin X. [1 ]
Luo S. [1 ]
Liu N. [1 ]
Affiliations
[1] National Mobile Communications Research Laboratory, Southeast University, Nanjing
Keywords
coded caching; parallel transmission; uncoded prefetching;
DOI
10.19665/j.issn1001-2400.2023.02.002
Abstract
Caching technology emerged to improve the efficiency of network transmission and achieve low-latency performance. Unlike traditional caching, coded caching creates multicast opportunities so that a single broadcast from the server simultaneously satisfies the different demands of the users, yielding a global caching gain. This paper considers a coded caching network with parallel transmission, in which the server can broadcast messages to all users while the users can also send messages to each other. An uncoded-prefetching coded caching scheme is proposed that consists of three phases: the pre-caching phase, the allocation phase, and the delivery phase, where the optimal delivery time is obtained by pre-allocating different workloads to the server and the users. The proposed scheme with parallel transmission is shown to outperform both server-multicast transmission alone and transmission within a D2D network alone. Moreover, by accounting for the capacity gap between the two channels, the scheme achieves better performance than when channel transmission capability is ignored. Finally, the proposed caching and delivery scheme with parallel transmission under uncoded prefetching is proved to be optimal when the users' cache resources are sufficient and the server broadcast channel and the D2D network channel have the same capacity. © 2023 Science Press. All rights reserved.
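To illustrate the allocation idea described above (pre-allocating workloads across the two parallel channels), the sketch below splits a total delivery load so that the server broadcast and the D2D network finish at the same time, which minimizes the overall completion time. The function name, variables, and the equal-finish-time rule are assumptions made for this illustration; the paper's actual allocation rule is scheme-specific.

```python
def optimal_split(total_load, c_server, c_d2d):
    """Split a delivery load between two parallel channels.

    Hypothetical sketch: with parallel transmission, the completion time
    is max(l_server / c_server, l_d2d / c_d2d), which is minimized when
    both channels finish simultaneously.
    """
    # Proportional allocation equalizes the two finish times.
    l_server = total_load * c_server / (c_server + c_d2d)
    l_d2d = total_load - l_server
    time = l_server / c_server  # equals l_d2d / c_d2d
    return l_server, l_d2d, time


# Example: load 10 over channels of capacity 2 (server) and 3 (D2D).
# Parallel transmission finishes in time 2, versus 5 for the server
# alone or 10/3 for the D2D network alone.
ls, ld, t = optimal_split(10.0, 2.0, 3.0)
```

This equal-finish-time split mirrors the abstract's claim that parallel transmission beats either channel used alone: the combined effective capacity is the sum of the two channel capacities.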
Pages: 11–22
Number of pages: 11