A Decentralized Periodic Replication Strategy Based on Knapsack Problem
GRID '12 Proceedings of the 2012 ACM/IEEE 13th International Conference on Grid Computing
Grid computing emerges from the need to integrate a collection of distributed computing resources to offer performance unattainable by any single machine. Grid technology facilitates data sharing across many organizations in different geographical locations. Data replication is an effective technique for moving and caching data close to users: it reduces access latency and bandwidth consumption, facilitates load balancing, and improves reliability by creating multiple data copies. However, grid environments introduce significant new challenges, such as dynamic resource availability and changing network performance. As user requests vary constantly, the system needs a dynamic replication strategy that adapts to users' dynamic behavior. To address these issues, this paper presents and evaluates the performance of six dynamic replication strategies under two different kinds of access patterns. Our replication strategies are based mainly on utility and risk. Before placing a replica at a site, we calculate an expected utility and a risk index for each site by considering the current network load and user requests. A replication site is then chosen by optimizing the expected utility or risk index.
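The selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the scoring functions, the `Site` fields, and the strategy names are all hypothetical stand-ins for the utility and risk indexes the paper computes from network load and user requests.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    network_load: float   # hypothetical: current load, normalized to [0, 1]
    request_rate: float   # hypothetical: user requests per second for the file

def expected_utility(site: Site) -> float:
    # Hypothetical utility: demand discounted by current network load.
    return site.request_rate * (1.0 - site.network_load)

def risk_index(site: Site) -> float:
    # Hypothetical risk: cost grows with load and shrinks with demand.
    return site.network_load / (site.request_rate + 1e-9)

def choose_replica_site(sites: list[Site], strategy: str = "utility") -> Site:
    # Pick the site that maximizes utility or minimizes risk,
    # mirroring the "optimize expected utility or risk index" step.
    if strategy == "utility":
        return max(sites, key=expected_utility)
    return min(sites, key=risk_index)
```

Note that the two objectives can disagree: a heavily loaded but high-demand site may win on utility while losing on risk, which is why the paper evaluates both families of strategies.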