Distributed cache management for context-aware services in large-scale networks

  • Authors:
  • Masaaki Takase, Takeshi Sano, Kenichi Fukuda, Akira Chugo

  • Affiliations:
  • Fujitsu Limited, Kawasaki, Kanagawa, Japan (all authors)

  • Venue:
  • APNOMS'07: Proceedings of the 10th Asia-Pacific Conference on Network Operations and Management Symposium: Managing Next Generation Networks and Services
  • Year:
  • 2007

Abstract

In recent years, the number of messages transferred through networks has grown rapidly with the rising number of network nodes; for example, servers collect and store information from many kinds of sensors. Related work has proposed cache servers to reduce network cost, but these caching schemes do not work well for short-lifetime content, because most cached entries expire before they are referenced. In this paper, we propose a method to resolve this problem for the coming ubiquitous network society: an asynchronous cache management method driven by application requirements. By managing cached content according to service and application requirements, this method reduces server CPU load even for short-lifetime content. Furthermore, we propose a load-balancing method that uses autonomous message exchange among servers instead of a central management system, allowing CPU load to be balanced across multiple servers.
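
The asynchronous cache management idea described in the abstract can be illustrated with a minimal sketch. The Python code below is not the paper's implementation; it assumes a cache that serves the stored copy on the request path while refreshing stale, short-lifetime entries in a background thread, with `freshness_sec` standing in for a per-application freshness requirement. All names (`AsyncCache`, `fetch_from_origin`) are hypothetical.

```python
import threading
import time


class AsyncCache:
    """Cache that refreshes short-lifetime entries asynchronously instead of
    letting them expire before they are referenced."""

    def __init__(self, fetch, freshness_sec):
        self.fetch = fetch              # callable that retrieves content from the origin server
        self.freshness = freshness_sec  # application-specified freshness tolerance (seconds)
        self.store = {}                 # key -> (value, fetched_at)
        self.lock = threading.Lock()

    def get(self, key):
        with self.lock:
            entry = self.store.get(key)
        if entry is None:
            return self._refresh(key)   # first reference: fetch synchronously
        value, fetched_at = entry
        if time.time() - fetched_at > self.freshness:
            # Entry is stale for this application: refresh in the background
            # and serve the cached copy, keeping the origin server off the
            # request path so its CPU load stays low.
            threading.Thread(target=self._refresh, args=(key,), daemon=True).start()
        return value

    def _refresh(self, key):
        value = self.fetch(key)
        with self.lock:
            self.store[key] = (value, time.time())
        return value


if __name__ == "__main__":
    def fetch_from_origin(key):
        # Stand-in for retrieving fresh sensor data from the origin server.
        return f"reading-for-{key}@{time.time():.0f}"

    cache = AsyncCache(fetch_from_origin, freshness_sec=2.0)
    print(cache.get("sensor-42"))  # first access: synchronous fetch
    print(cache.get("sensor-42"))  # later accesses are served from the cache
```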
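
The load-balancing proposal, in which servers exchange load information autonomously rather than relying on a central management system, might look roughly like the following sketch: each server periodically advertises its CPU load to its peers and forwards requests to the least-loaded known peer when it is itself overloaded. The class, thresholds, and message format are illustrative assumptions, not the paper's design.

```python
class CacheServer:
    """Server that balances CPU load through autonomous load reports
    exchanged with peers, without a central management system."""

    def __init__(self, name, threshold=0.8):
        self.name = name
        self.threshold = threshold  # offload requests above this CPU utilisation
        self.peers = []             # other CacheServer instances
        self.cpu_load = 0.0         # current utilisation, 0.0 - 1.0
        self.peer_loads = {}        # latest load reports received from peers

    def exchange_load(self):
        """Advertise the local load to all peers (run periodically in practice)."""
        for peer in self.peers:
            peer.peer_loads[self.name] = self.cpu_load

    def route(self, request):
        """Handle locally when lightly loaded; otherwise forward to the
        least-loaded peer known from the exchanged messages."""
        if self.cpu_load <= self.threshold or not self.peer_loads:
            return f"{self.name} handled {request}"
        target = min(self.peer_loads, key=self.peer_loads.get)
        return f"{self.name} forwarded {request} to {target}"


if __name__ == "__main__":
    a, b, c = CacheServer("srv-a"), CacheServer("srv-b"), CacheServer("srv-c")
    for s in (a, b, c):
        s.peers = [p for p in (a, b, c) if p is not s]
    a.cpu_load, b.cpu_load, c.cpu_load = 0.95, 0.30, 0.55
    for s in (a, b, c):
        s.exchange_load()
    print(a.route("GET /sensor/42"))  # srv-a is overloaded, so it forwards to srv-b
```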