Joint broadcast scheduling and user's cache management for efficient information delivery
MobiCom '98: Proceedings of the 4th Annual ACM/IEEE International Conference on Mobile Computing and Networking
Broadcast scheduling for information distribution. Wireless Networks.
Proceedings of the 3rd ACM International Workshop on Modeling, Analysis and Simulation of Wireless and Mobile Systems.
SAIU: an efficient cache replacement policy for wireless on-demand broadcasts. Proceedings of the Ninth International Conference on Information and Knowledge Management.
Push-based information delivery in two-stage satellite-terrestrial wireless systems. IEEE Transactions on Computers.
Adaptive power-aware prefetching schemes for mobile broadcast environments. MDM '03: Proceedings of the 4th International Conference on Mobile Data Management.
Caching and scheduling for broadcast disk systems. Journal of Experimental Algorithmics (JEA).
Performance evaluation of an optimal cache replacement policy for wireless data dissemination. IEEE Transactions on Knowledge and Data Engineering.
Broadcast program generation for webcasting. Data & Knowledge Engineering.
Web caching in broadcast mobile wireless environments. IEEE Internet Computing.
A multi-version cache replacement and prefetching policy for hybrid data delivery environments. VLDB '02: Proceedings of the 28th International Conference on Very Large Data Bases.
Information delivery through broadcasting in satellite communication networks. Automatica (Journal of IFAC).
Mathematical and Computer Modelling: An International Journal.
Data broadcasting has been considered a promising way of disseminating information to a massive number of users in a wireless communication environment. In a broadcast data delivery system, a server broadcasts data to a user community. Because there is no communication channel from the users to the server, the server cannot know what any individual user needs. To access a particular item, a user has to wait until that item appears in the broadcast, and the waiting time can be considerable if the server's broadcast schedule does not match the user's access pattern. If a user has local memory, it can reduce its access latency by selectively prefetching items from the broadcast and storing them in that memory. A good memory management strategy can therefore substantially reduce the user's access latency, which is a major concern in a broadcast data delivery system. We identify an optimal memory management policy that minimizes the expected aggregate latency, and we present optimal memory update strategies with limited look-ahead as implementable approximations of the optimal policy. We give some interesting special cases for which the limited look-ahead policies are optimal. We also show that the same formulation can be used to find the optimal memory management policy that minimizes the number of deadline misses when users generate information requests that must be satisfied within given deadlines.
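As a rough illustration of the setting described above (all names, parameters, and the simple greedy policy below are our own, not taken from the paper): a server cycles through N items in a flat broadcast schedule, and a user with a small cache waits for each requested item unless it is already cached. Holding the few most popular items is used here as a stand-in for the optimal and limited look-ahead policies the abstract discusses.

```python
import random

random.seed(0)  # deterministic demo

N = 20        # items in one broadcast cycle (flat schedule: item i airs in slot i)
CACHE_K = 4   # user cache capacity (illustrative choice)

# Zipf-like access probabilities: lower-numbered items are requested more often.
weights = [1.0 / (i + 1) for i in range(N)]
prob = [w / sum(weights) for w in weights]

def simulate(requests, cache):
    """Average number of slots a user waits per request, given fixed cache contents."""
    slot, waited = 0, 0
    for item in requests:
        if item not in cache:
            waited += (item - slot) % N   # slots until the item is next broadcast
            slot = item                   # the user listened until it arrived
        slot = (slot + 1) % N             # the broadcast keeps advancing
    return waited / len(requests)

requests = random.choices(range(N), weights=prob, k=1000)
no_cache = simulate(requests, cache=set())
greedy   = simulate(requests, cache=set(range(CACHE_K)))  # hold the K hottest items
print(f"avg wait without cache: {no_cache:.2f} slots")
print(f"avg wait, greedy cache: {greedy:.2f} slots")
```

Even this crude policy cuts the average latency noticeably, because cache hits cost nothing while every miss costs, on average, about half a broadcast cycle; the paper's point is that choosing what to keep optimally (possibly with look-ahead into the announced schedule) does strictly better.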