Empirically validated web page design metrics
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Comparing accessibility evaluation tools: a method for tool effectiveness
Universal Access in the Information Society
Test case management tools for accessibility testing
ICCHP'06 Proceedings of the 10th international conference on Computers Helping People with Special Needs
The BenToWeb Test Case Suites for the Web Content Accessibility Guidelines (WCAG) 2.0
ICCHP '08 Proceedings of the 11th international conference on Computers Helping People with Special Needs
UAHCI'07 Proceedings of the 4th international conference on Universal access in human-computer interaction: applications and services
Development of automatic web accessibility checking modules for advanced quality assurance tools
UAHCI'07 Proceedings of the 4th international conference on Universal access in human-computer interaction: coping with diversity
Beyond Specifications: Towards a Practical Methodology for Evaluating Web Accessibility
Journal of Usability Studies
The bentoweb XHTML 1.0 test suite for the web content accessibility guidelines 2.0
ICCHP'06 Proceedings of the 10th international conference on Computers Helping People with Special Needs
Automatic benchmarking of evaluation and repair tools (ERT) has recently been the subject of several studies, driven by growing legal and commercial interest in Web compliance with various criteria and standards. This paper addresses the development of a description language for formally representing test case metadata. This language was used to develop a WCAG 2.0 test suite that supports the benchmarking of ERT against the aforementioned W3C recommendation.
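To illustrate the idea of machine-readable test case metadata, the following is a minimal Python sketch. The field names (`test_id`, `success_criterion`, `technology`, `expected_result`) are hypothetical and chosen for illustration; the paper's actual description language defines its own vocabulary and schema.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical metadata fields for a WCAG 2.0 test case; the actual
# description language described in the abstract defines its own schema.
@dataclass
class TestCaseMetadata:
    test_id: str            # unique identifier of the test case
    success_criterion: str  # targeted WCAG 2.0 success criterion, e.g. "1.1.1"
    technology: str         # content technology under test, e.g. "XHTML 1.0"
    expected_result: str    # "pass" or "fail" for a conforming checker
    description: str = ""   # human-readable purpose of the test

def select_by_criterion(suite: List[TestCaseMetadata],
                        criterion: str) -> List[TestCaseMetadata]:
    """Return the test cases targeting a given success criterion,
    as a benchmarking harness might when scoring an ERT."""
    return [tc for tc in suite if tc.success_criterion == criterion]

suite = [
    TestCaseMetadata("sc1.1.1_img_alt_missing", "1.1.1", "XHTML 1.0", "fail",
                     "img element without an alt attribute"),
    TestCaseMetadata("sc1.1.1_img_alt_present", "1.1.1", "XHTML 1.0", "pass",
                     "img element with a descriptive alt attribute"),
    TestCaseMetadata("sc2.4.2_title_present", "2.4.2", "XHTML 1.0", "pass",
                     "document with a descriptive title element"),
]

print(len(select_by_criterion(suite, "1.1.1")))  # → 2
```

A benchmarking harness can then compare each tool's verdict on a test case against `expected_result` to compute per-criterion coverage and accuracy scores.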