Efficient processing of requests with network coding in on-demand data broadcast environments

  • Authors:
  • Jun Chen;Victor C. S. Lee;Kai Liu;G. G. M. N. Ali;Edward Chan

  • Affiliations:
School of Information Management, Wuhan University, Wuhan, Hubei, China (J. Chen); Department of Computer Science, City University of Hong Kong, Kowloon, Hong Kong (V. C. S. Lee, K. Liu, G. G. M. N. Ali, E. Chan)

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 2013

Abstract

On-demand broadcast is an effective wireless data dissemination technique for enhancing system scalability and handling dynamic user access patterns. In traditional on-demand broadcast, mobile clients can retrieve only one data item per broadcast, which limits bandwidth utilization and throughput. In this paper, we consider data broadcast with network coding in on-demand broadcast environments. We analyze the coding problem in on-demand broadcast and transform it into the maximum clique problem in graph theory. Based on our analysis, we first propose a new coding strategy called AC, which exploits information about clients' caches and the data items they request to implement a flexible coding mechanism. Then, based on AC, we propose two novel coding-assisted algorithms called ADC-1 and ADC-2, which consider data scheduling in addition to network coding. In ADC-1, data scheduling and coding are considered separately, while the two are fully integrated in ADC-2. Simulation results demonstrate the performance gain of our proposed algorithms over traditional and other coding-assisted broadcast algorithms. Our algorithms not only reduce request response time but also utilize broadcast channel bandwidth more efficiently.
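The idea of coding mutually compatible requests and its reduction to clique finding can be sketched as follows. This is a minimal illustrative sketch, not the paper's AC/ADC algorithms: it assumes two pending requests are codable together when each requesting client already caches the item the other wants, grows a clique greedily over such pairs, and serves the whole clique with a single XOR-coded packet. All function and variable names here are hypothetical.

```python
def can_code(r1, r2, cache):
    """Requests are (client, item) pairs. Two requests are compatible
    (i.e., joined by an edge in the coding graph) if each client already
    caches the item the other client wants, so both can decode."""
    c1, i1 = r1
    c2, i2 = r2
    return i2 in cache[c1] and i1 in cache[c2]

def greedy_clique(requests, cache):
    """Greedily grow a set of pairwise-compatible requests, i.e. a clique
    in the coding graph (a heuristic, since maximum clique is NP-hard)."""
    clique = []
    for r in requests:
        if all(can_code(r, q, cache) for q in clique):
            clique.append(r)
    return clique

def encode(clique, data):
    """XOR the requested items of the clique into one coded packet."""
    packet = 0
    for _, item in clique:
        packet ^= data[item]
    return packet

# Toy scenario: three clients, each caching the items the others want.
data = {"a": 0b0011, "b": 0b0101, "c": 0b1001}
cache = {1: {"b", "c"}, 2: {"a", "c"}, 3: {"a", "b"}}
requests = [(1, "a"), (2, "b"), (3, "c")]

clique = greedy_clique(requests, cache)  # all three requests are mutually codable
packet = encode(clique, data)            # one broadcast serves all three

# Client 1 recovers item "a" by XOR-ing out its cached items "b" and "c".
decoded_a = packet ^ data["b"] ^ data["c"]
assert decoded_a == data["a"]
```

In this toy case one coded broadcast replaces three uncoded ones, which is the bandwidth saving the abstract refers to; the quality of the clique found determines how many requests each broadcast can satisfy.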