The performance and scalability of ontology tools are critical both to the growth of the Semantic Web and to the adoption of these tools in industry. In this paper, we briefly present the benchmarking methodology used to improve the performance and scalability of ontology development tools. We focus on the definition of the infrastructure for evaluating the performance of these tools' ontology management APIs in terms of their execution efficiency. We also present the results of applying the methodology to evaluate the API of the WebODE ontology engineering workbench.
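The abstract does not reproduce the benchmark code, but the core of an execution-efficiency infrastructure of this kind is a repeatable timing harness wrapped around individual API operations. Below is a minimal sketch in Java (the language WebODE is implemented in), with a warm-up phase to reduce JIT and cache noise; the `measure` helper and the placeholder workload are illustrative assumptions, not the actual WebODE benchmark suite.

```java
// Hypothetical sketch of a wall-clock micro-benchmark for a single
// ontology management API operation. In a real run, `op` would invoke
// one method of the API under test (e.g., inserting a concept).
public class ApiBenchmark {

    /** Times one operation over `iterations` runs after `warmup` untimed runs. */
    static long[] measure(Runnable operation, int warmup, int iterations) {
        for (int i = 0; i < warmup; i++) {
            operation.run();                    // warm up JIT and caches
        }
        long[] samplesNanos = new long[iterations];
        for (int i = 0; i < iterations; i++) {
            long start = System.nanoTime();
            operation.run();
            samplesNanos[i] = System.nanoTime() - start;
        }
        return samplesNanos;
    }

    public static void main(String[] args) {
        // Placeholder workload standing in for an API call under test.
        Runnable op = () -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 1_000; i++) sb.append('x');
        };
        long[] samples = measure(op, 100, 1_000);
        long sum = 0, max = Long.MIN_VALUE;
        for (long s : samples) { sum += s; max = Math.max(max, s); }
        System.out.printf("mean = %.1f us, max = %.1f us%n",
                sum / 1e3 / samples.length, max / 1e3);
    }
}
```

Reporting both mean and worst-case latencies matters here: a methodology aimed at scalability must expose occasional slow calls (e.g., from garbage collection or backend round trips) that a mean alone would hide.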