Replication of web content, through mirroring of web sites or off-line browsing of content, is one of the most widely used techniques to increase content availability, reduce network bandwidth usage, and minimize browsing delays on the World Wide Web.

The web does not support referential integrity, i.e., broken links do exist. This has long been considered one of the most serious problems of the web. It matters in various settings, e.g.: i) a user who pays for a service delivered as web pages requires those pages to be reachable at all times, and ii) archived web resources, whether scientific, legal, or historic, that are still referenced need to be preserved and remain available.

Current approaches to the broken-link problem are unable to preserve referential integrity on the web while simultaneously supporting replication and minimizing storage waste due to memory leaks. Some of them also impose specific authoring and management systems. Thus, the limitations of current systems concern three issues: transparency, completeness, and safety.

We propose RepWeb, a system comprising an application to access and manage replicated web content and an implementation of an acyclic distributed garbage collection algorithm for wide-area replicated memory, that satisfies all these requirements. It supports replication, enforces referential integrity on the web, and minimizes storage waste.
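To make the core idea concrete, the following is a toy sketch of acyclic reference counting applied to web resources: a resource may only be reclaimed when no other resource links to it, so followed links never break, and unreferenced resources do not leak storage. All names and the `Registry` API here are illustrative assumptions, not RepWeb's actual algorithm or interface (the real system is distributed and replication-aware).

```python
# Illustrative sketch only: centralized acyclic reference counting
# for link targets. Deletion is refused while incoming links exist
# (referential integrity); counts reach zero when referrers go away
# (no storage leak for unreferenced resources).

class Registry:
    def __init__(self):
        self.refcount = {}  # url -> number of incoming links
        self.links = {}     # url -> set of outgoing link targets

    def publish(self, url, targets):
        """Register a resource and count its outgoing links."""
        self.refcount.setdefault(url, 0)
        self.links[url] = set(targets)
        for t in targets:
            self.refcount[t] = self.refcount.get(t, 0) + 1

    def try_delete(self, url):
        """Delete only if unreferenced; decrement its targets' counts."""
        if self.refcount.get(url, 0) > 0:
            return False  # still referenced: deleting would break links
        for t in self.links.pop(url, set()):
            self.refcount[t] -= 1
        self.refcount.pop(url, None)
        return True

reg = Registry()
reg.publish("a.html", ["b.html"])
reg.publish("b.html", [])
print(reg.try_delete("b.html"))  # False: a.html still links to it
print(reg.try_delete("a.html"))  # True: nothing references a.html
print(reg.try_delete("b.html"))  # True: now unreferenced
```

Plain reference counting like this is acyclic by nature: mutually linked pages would keep each other alive forever, which is exactly why the paper's algorithm is characterized as acyclic distributed garbage collection.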