Syntactically different URLs can represent the same web page on the World Wide Web, and this duplicate representation forces web applications to process large numbers of identical pages unnecessarily. In the standards communities, there are ongoing efforts to define URL normalization methods that help eliminate duplicate URLs. In parallel, research efforts extend the standard URL normalization methods to further reduce false negatives while allowing a limited level of false positives. This paper presents a method that evaluates the effectiveness of a URL normalization method in terms of page loss, page gain, page change, and URL reduction. Over 94 million URLs were extracted from web pages for our experiment, and interesting statistical results are reported in this paper.
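To make the idea concrete, the following is a minimal sketch of standard, syntax-based URL normalization in the spirit of RFC 3986 (lowercasing the scheme and host, removing default ports, normalizing an empty path, and uppercasing percent-encoded octets). It is illustrative only and not the exact procedure evaluated in the paper; the function name and the chosen subset of normalization steps are our assumptions.

```python
import re
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Sketch of standard (syntax-based) URL normalization.

    Illustrative subset of RFC 3986 rules; not the paper's method.
    """
    parts = urlsplit(url)
    scheme = parts.scheme.lower()            # lowercase the scheme
    netloc = (parts.hostname or "").lower()  # lowercase the host
    if parts.port is not None:
        default = {"http": 80, "https": 443}.get(scheme)
        if parts.port != default:            # drop default ports only
            netloc += f":{parts.port}"
    path = parts.path or "/"                 # empty path becomes "/"
    # Uppercase hex digits in percent-encoded octets, e.g. %3f -> %3F
    path = re.sub(r"%[0-9a-fA-F]{2}", lambda m: m.group(0).upper(), path)
    return urlunsplit((scheme, netloc, path, parts.query, parts.fragment))

# Two syntactically different URLs that denote the same page
# normalize to an identical string:
print(normalize_url("HTTP://Example.COM:80/a%3fb"))
print(normalize_url("http://example.com/a%3Fb"))
```

Normalizations beyond this syntactic subset (for example, stripping index filenames or trailing slashes) are exactly the extended methods the paper discusses: they remove more false negatives but can merge URLs that actually point to different pages, causing the page loss, gain, and change that the proposed evaluation method measures.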