A context-aware prefetching strategy for mobile computing environments

  • Authors:
  • Stylianos Drakatos; Niki Pissinou; Kia Makki; Christos Douligeris

  • Affiliations:
  • Florida International University, Miami, FL; Florida International University, Miami, FL; Florida International University, Miami, FL; University of Piraeus, Piraeus, Greece

  • Venue:
  • Proceedings of the 2006 International Conference on Wireless Communications and Mobile Computing
  • Year:
  • 2006

Abstract

In a mobile wireless environment, the latency (time delay) observed by a user before receiving up-to-date information may be high because of the limited available bandwidth. An efficient prefetching strategy must balance the competing goals of keeping latency low (which requires more prefetching) and reducing resource waste in a mobile environment, which is characterized by scarce bandwidth and resource-poor user devices. Current research is based on the tangent velocity approach, which is effective only within a short time interval and incurs a high cost of continuous geometric estimations. This paper proposes a cache management method that maintains the mobile terminal's cache content by prefetching data items with maximum benefit and evicting cache entries with minimum benefit. The benefit of a data item is evaluated based on the user's query context, defined as a set of constraints (predicates) that capture both the movement pattern and the information context requested by the mobile user. A context-aware cache is formed and maintained using a set of neighboring locations (which we call the prime list), restricted by the validity of the data fetched from the server. Simulation results show that the proposed strategy, applied at different levels of granularity, can greatly improve system performance in terms of cache hit ratio.
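The sketch below illustrates the general benefit-based admit/evict idea described in the abstract: prefetch candidates are scored against the user's query context and the prime list of neighboring locations, and a new item replaces the minimum-benefit cache entry only if its own benefit is higher. The class names, the toy `benefit` scoring function, and all parameters here are illustrative assumptions, not the paper's actual formulation.

```python
from dataclasses import dataclass


@dataclass
class DataItem:
    item_id: str    # identifier of the data item (e.g., "traffic")
    location: str   # location the item is valid for


def benefit(item: DataItem, query_predicates: set, prime_list: list) -> float:
    """Toy benefit score (assumed): items matching the query context score
    higher, and items valid for locations earlier in the prime list
    (predicted neighboring locations) get a larger proximity bonus."""
    context_match = 1.0 if item.item_id in query_predicates else 0.0
    if item.location in prime_list:
        proximity = 1.0 / (1 + prime_list.index(item.location))
    else:
        proximity = 0.0
    return context_match + proximity


class ContextAwareCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries = {}  # item_id -> (benefit score, DataItem)

    def admit(self, item: DataItem, score: float) -> None:
        """Prefetch `item` if the cache has room, or if its benefit exceeds
        that of the minimum-benefit entry, which is then evicted."""
        if item.item_id in self.entries or len(self.entries) < self.capacity:
            self.entries[item.item_id] = (score, item)
            return
        victim_id, (victim_score, _) = min(self.entries.items(),
                                           key=lambda kv: kv[1][0])
        if score > victim_score:
            del self.entries[victim_id]
            self.entries[item.item_id] = (score, item)


# Example usage with hypothetical predicates and prime-list cells.
cache = ContextAwareCache(capacity=2)
prime = ["cell_B", "cell_C"]           # predicted neighboring locations
query = {"traffic", "restaurants"}     # predicates from the query context
for it in [DataItem("traffic", "cell_B"),
           DataItem("weather", "cell_Z"),
           DataItem("restaurants", "cell_C")]:
    cache.admit(it, benefit(it, query, prime))
```

The key design point this sketch tries to capture is that admission and eviction use the same benefit measure, so the cache content converges toward the items most useful for the user's predicted movement and query context.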