The most adequate approach for benchmarking web accessibility is manual expert evaluation supplemented by automatic analysis tools. However, manual evaluation has a high cost and is impractical to apply to large web sites, so in practice there is no choice but to rely on automated tools when reviewing such sites for accessibility. The question is: to what extent can the results of automatically evaluating a web site and its individual web pages be used as an approximation of manual results? This paper presents the initial results of an investigation aimed at answering this question. We performed both manual and automatic evaluations of the accessibility of web pages from two sites and compared the results. In our data set, the automatically retrieved results could indeed be used as an approximation of the manual evaluation results.
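As a hypothetical illustration of the kind of comparison described above (not the paper's actual methodology or data), one simple way to quantify how well automatic results approximate manual ones is to correlate per-page violation counts obtained from the two evaluations; all numbers below are invented:

```python
# Hedged sketch: compare per-page accessibility violation counts from an
# automatic checker with counts from manual expert review. The page data
# and counts are invented for illustration only.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented violation counts for six pages of one site.
automatic = [12, 5, 9, 20, 3, 14]   # reported by an automatic tool
manual    = [15, 6, 8, 25, 4, 16]   # found by expert evaluation

r = pearson_r(automatic, manual)
print(f"correlation between automatic and manual counts: {r:.2f}")
```

A strong positive correlation on data like this would suggest that pages flagged as problematic by the tool tend to be the same pages experts rate as problematic, which is the sense of "approximation" the abstract refers to; a weak correlation would argue against substituting automatic results for manual review.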