Concurrent caching

  • Authors: Jay Nelson
  • Affiliations: DuoMark International, Inc.
  • Venue: Proceedings of the 2006 ACM SIGPLAN workshop on Erlang
  • Year: 2006

Abstract

A concurrent cache design is presented which allows cached data to be spread across a cluster of computers. The implementation separates persistent storage from cache storage and abstracts the cache behaviour so that the user can experiment with cache size and replacement policy to optimize performance for a given system, even if the production data store is not available. Using processes to implement cached objects allows for runtime configurability and adaptive use policies as well as parallelization to optimize resource access efficiency.
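The process-per-cached-object idea described in the abstract can be illustrated with a minimal Erlang sketch. This is only an assumption-laden illustration, not the paper's implementation: the module, function, and message names below are invented for the example. Each cached datum lives in its own process, so requests against different objects are served concurrently, and the per-process state can carry whatever replacement-policy data (access counts, timestamps) the cache behaviour needs.

```erlang
%% Illustrative sketch only: one Erlang process per cached object.
%% Module and message names are assumptions, not from the paper.
-module(cache_sketch).
-export([start_object/2, fetch/1, replace/2, stop/1]).

%% Spawn a process that holds a single cached value.
start_object(Key, Value) ->
    spawn(fun() -> object_loop(Key, Value) end).

%% Synchronously read the cached value from its owning process.
fetch(Pid) ->
    Pid ! {fetch, self()},
    receive
        {value, Value} -> {ok, Value}
    after 5000 ->
        {error, timeout}
    end.

%% Overwrite the cached value (e.g. after a write to the data store).
replace(Pid, NewValue) ->
    Pid ! {replace, NewValue},
    ok.

%% Terminate the object process (cache eviction).
stop(Pid) ->
    Pid ! stop,
    ok.

%% Each cached object is an independent process; its loop state could
%% also track access statistics to support a pluggable replacement policy.
object_loop(Key, Value) ->
    receive
        {fetch, From} ->
            From ! {value, Value},
            object_loop(Key, Value);
        {replace, NewValue} ->
            object_loop(Key, NewValue);
        stop ->
            ok
    end.
```

Because every cached object is a separate process, such a design can be reconfigured at runtime (objects started, stopped, or given new policy parameters individually) and naturally parallelizes access across a cluster, which is the property the abstract highlights.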