Application servers rely on caching and clustering to achieve high performance and scalability. While queries benefit from middle-tier caching, updates introduce a distributed cache consistency problem. The standard approaches to this problem, cache invalidation and cache replication, either do not guarantee full cache consistency or impose a performance penalty. This paper proposes a novel approach: Freshness-Aware Caching (FAC). FAC tracks the freshness of cached data and allows clients to explicitly trade freshness of data for response time. We have implemented FAC in an open-source application server and compare its performance to cache invalidation and cache replication. The evaluation shows that both cache invalidation and FAC provide better update scalability than cache replication. We also show that FAC can provide significantly better read performance than cache invalidation in the presence of frequent updates.
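The core idea of freshness-aware caching, as described above, can be illustrated with a minimal sketch. This is not the paper's implementation: the class and method names are hypothetical, and "freshness" is approximated here simply as the time since an entry was last refreshed, with each read stating how much staleness it tolerates.

```python
import time

class FreshnessAwareCache:
    """Illustrative sketch of freshness-aware caching (not the paper's code):
    each entry records when it was last refreshed, and each read specifies
    the maximum staleness (in seconds) it is willing to accept."""

    def __init__(self, backend):
        self.backend = backend    # authoritative data source, e.g. the database
        self._entries = {}        # key -> (value, refreshed_at)

    def get(self, key, max_staleness):
        """Return the cached value if it is no staler than max_staleness
        seconds; otherwise fetch from the backend and refresh the entry."""
        now = time.monotonic()
        entry = self._entries.get(key)
        if entry is not None and now - entry[1] <= max_staleness:
            return entry[0]                  # fresh enough: fast local read
        value = self.backend(key)            # too stale or missing: slow refresh
        self._entries[key] = (value, now)
        return value
```

A client that can tolerate slightly stale data passes a large `max_staleness` and gets cheap local reads; a client that needs current data passes a small tolerance and pays the backend round-trip, which is the freshness-for-response-time trade-off the abstract describes.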