Low-vision and blind bus riders often rely on known physical landmarks to locate and verify bus stop locations (e.g., by searching for a shelter, bench, or newspaper bin). However, there are currently few, if any, methods to determine this information a priori via computational tools or services. In this paper, we introduce and evaluate a new scalable method for collecting bus stop locations and landmark descriptions by combining online crowdsourcing and Google Street View (GSV). We conduct and report on three studies: (i) a formative interview study of 18 people with visual impairments to inform the design of our crowdsourcing tool; (ii) a comparative study examining differences between physical bus stop audit data and audits conducted virtually with GSV; and (iii) an online study of 153 crowd workers on Amazon Mechanical Turk to examine the feasibility of crowdsourcing bus stop audits using our custom tool with GSV. Our findings reemphasize the importance of landmarks in non-visual navigation, demonstrate that GSV is a viable data source for bus stop audits, and show that minimally trained crowd workers can find and identify bus stop landmarks with 82.5% accuracy across 150 bus stop locations (87.3% with simple quality control).
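The "simple quality control" that raises accuracy from 82.5% to 87.3% is not specified in the abstract; a common approach for redundant crowd labels is per-location majority voting. The sketch below is an assumption for illustration, not the authors' documented method, and the data structure (`labels_by_stop`) is hypothetical:

```python
from collections import Counter

def majority_vote(labels_by_stop):
    """Aggregate redundant crowd-worker labels per bus stop by majority vote.

    labels_by_stop: dict mapping a bus stop id to the list of landmark
    labels submitted by different workers for that stop (hypothetical
    format; the paper does not describe its exact scheme).
    Returns a dict mapping each stop id to the most frequent label.
    """
    return {
        stop: Counter(labels).most_common(1)[0][0]
        for stop, labels in labels_by_stop.items()
    }

# Example: three workers labeled each stop; disagreements are outvoted.
votes = {
    "stop_42": ["shelter", "shelter", "bench"],
    "stop_43": ["sign_only", "sign_only", "sign_only"],
}
print(majority_vote(votes))  # {'stop_42': 'shelter', 'stop_43': 'sign_only'}
```

Majority voting filters out individual worker errors cheaply, which is consistent with the modest accuracy gain the abstract reports.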