In a large federation of learning resource repositories, metadata can become out of date, especially when resources are moved or deleted. Broken links frustrate users, so assuring the availability of resources is critical in any learning environment. However, regularly checking all of the URLs in a federation with thousands of resources places a heavy burden on content providers, causing poor system performance, heavy network traffic and impolite crawling behaviour. This paper describes a solution that balances the number of resources against the number of checking requests performed. A heuristic algorithm selects a group of resources to verify; depending on the results, the system either stops checking or continues with a new selection of links built from URLs similar to the broken links found in the previous round. To steer the search towards a global optimum, the selection process also picks other resources at random with a small probability.
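The abstract only sketches this round-based checking loop at a high level. The following minimal Python sketch illustrates one way such a loop could work; the sample size, the exploration probability, the HEAD-request probe and the host-based similarity test are all illustrative assumptions, not the paper's actual heuristic.

```python
import random
import urllib.error
import urllib.request
from urllib.parse import urlparse

SAMPLE_SIZE = 50     # links verified per round (assumed value)
EXPLORE_PROB = 0.05  # small probability of checking an unrelated resource

def is_broken(url: str, timeout: float = 5.0) -> bool:
    """Probe a URL with a lightweight HEAD request; errors count as broken."""
    try:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status >= 400
    except (urllib.error.URLError, ValueError):
        return True

def similar(url_a: str, url_b: str) -> bool:
    """Crude URL similarity: same host. A real system would likely use a
    finer heuristic (shared path prefixes, same repository, etc.)."""
    return urlparse(url_a).netloc == urlparse(url_b).netloc

def find_broken_links(urls: list[str]) -> list[str]:
    """Check links in rounds instead of exhaustively crawling the federation."""
    unchecked = set(urls)
    broken: list[str] = []
    # Seed round: a small random sample of the federation.
    batch = random.sample(sorted(unchecked), min(SAMPLE_SIZE, len(unchecked)))
    while batch:
        unchecked.difference_update(batch)
        newly_broken = [u for u in batch if is_broken(u)]
        broken.extend(newly_broken)
        if not newly_broken:
            break  # clean round: stop instead of hammering every provider
        # Next round: URLs similar to the broken ones, plus occasional random
        # picks so the search can escape a single repository (exploration).
        batch = [u for u in unchecked
                 if any(similar(u, b) for b in newly_broken)
                 or random.random() < EXPLORE_PROB]
    return broken
```

In this sketch, a clean round ends the scan early, so the number of requests stays proportional to how many broken links actually exist, while the random picks keep the search from fixating on one corner of the federation.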