Parallel video servers provide highly scalable video-on-demand service to a large number of clients. The conventional stream-scheduling scheme, which dedicates one stream to each client request, uses I/O and network bandwidth inefficiently. Other schemes, such as batching and stream merging, can markedly improve server I/O and network bandwidth efficiency. However, batching incurs long startup latency and a high reneging probability, and traditional stream merging performs poorly at high client-request rates because the same video data must be retransmitted many times. In this paper, a novel stream-scheduling scheme, called Medusa, is developed to minimize server bandwidth requirements over a wide range of client-request rates. Furthermore, the startup latency introduced by Medusa is far lower than that of batching.
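The batching idea the abstract contrasts against can be illustrated with a minimal sketch. The function below groups request arrival times for the same video into batches served by a single multicast stream, assuming a simple fixed-length batching window opened by the first request in each batch; the function name and window parameter are illustrative, not taken from the paper.

```python
def batch_requests(arrivals, window):
    """Group request arrival times (seconds) into batches.

    The first request in a batch opens a service window of length
    `window`; every request arriving within that window joins the
    batch and shares one multicast stream, so the server starts one
    stream per batch instead of one per request.
    """
    batches = []
    current = []
    for t in sorted(arrivals):
        if current and t - current[0] <= window:
            current.append(t)  # joins the open batch
        else:
            if current:
                batches.append(current)
            current = [t]      # opens a new batch (and window)
    if current:
        batches.append(current)
    return batches

# Five requests for the same video; with a 5-second batching window
# the server opens two streams instead of five.
reqs = [0.0, 1.2, 3.9, 10.5, 11.0]
print(batch_requests(reqs, 5.0))  # → [[0.0, 1.2, 3.9], [10.5, 11.0]]
```

The trade-off the abstract points out is visible here: the request arriving at t = 0.0 is served immediately, but the one at t = 3.9 still waits until its stream starts, which is the source of batching's startup latency and reneging.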