In an Internet search, the user describes the desired documents in a query language, and in response, a search engine locates the documents that "best match" the description. A number of search engines are available to Internet users today, and more are likely to appear in the future. These systems differ from one another in the indexing techniques they use to construct their document repositories and in the search algorithms they employ to generate responses. As a result, the results returned for the same query by different search engines vary considerably. In this work, our aim is to outline a procedure for assessing the quality of search results obtained through several popular search engines, such as Alta Vista, Direct Hit, Excite, Google, Hot Bot, Lycos, and Yahoo. We measure the "satisfaction" a user gets when presented with the search results: we observe the actions the user takes on the results returned for a query and infer the user's feedback from them. The implicit ranking thus provided by the user is compared with the original ranking given by the search engine, and the resulting correlation coefficient is averaged over a set of queries. We report results for 7 public search engines and 15 ad hoc queries. Our emphasis is on demonstrating the quality-measurement procedure rather than on an exhaustive performance comparison of these search engines.
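The per-query comparison described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not name the correlation coefficient used, so Spearman's rank correlation is assumed here, and all document identifiers and rankings are hypothetical.

```python
# Sketch of the evaluation procedure: for each query, correlate the search
# engine's ranking with the ranking implied by the user's actions, then
# average the correlation coefficient across queries.

def spearman_rho(engine_rank, user_rank):
    """Spearman rank correlation between two rankings of the same n documents.

    Both arguments map document id -> rank position (1 = best).
    Assumes no ties, so the closed form 1 - 6*sum(d^2)/(n*(n^2-1)) applies.
    """
    n = len(engine_rank)
    d_sq = sum((engine_rank[doc] - user_rank[doc]) ** 2 for doc in engine_rank)
    return 1 - (6 * d_sq) / (n * (n ** 2 - 1))

# Hypothetical data: for each query, the engine's ranking of four results and
# the implicit ranking inferred from user actions on those same results.
queries = [
    ({"d1": 1, "d2": 2, "d3": 3, "d4": 4},
     {"d1": 2, "d2": 1, "d3": 3, "d4": 4}),   # user swapped top two
    ({"a": 1, "b": 2, "c": 3, "d": 4},
     {"a": 1, "b": 3, "c": 2, "d": 4}),       # user swapped middle two
]

rhos = [spearman_rho(engine, user) for engine, user in queries]
avg_rho = sum(rhos) / len(rhos)  # the engine's score over this query set
print(round(avg_rho, 3))  # → 0.8
```

A higher average coefficient indicates that the engine's ordering agrees more closely with the ordering the user implicitly prefers, which is the "satisfaction" measure the procedure compares across engines.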