Slider scales and radio button scales were experimentally compared in horizontal and vertical orientations. Slider scales led to statistically significantly higher break-off rates (odds ratio = 6.9) and substantially longer response times. Problems with slider scales were especially prevalent among participants with less than average education, suggesting that the slider format demands more prior knowledge or imposes a higher cognitive load. An alternative explanation, technology-dependent sampling (Buchanan & Reips, 2001), cannot fully account for the present results. The authors advise against the use of Java-based slider scales and advocate low-tech solutions for the design of Web-based data collection. Orientation on screen had no observable effect on data quality or on the usability of the rating scales. Implications of item format for Web-based surveys are discussed.
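For readers unfamiliar with the statistic, the reported odds ratio compares the odds of survey break-off under the two conditions. A minimal sketch of the computation, using hypothetical cell counts (illustrative only, not the study's actual data):

```python
# Hypothetical 2x2 break-off table (counts are illustrative, NOT the study's data):
#                  broke off   completed
# slider scale         a=30        b=170
# radio buttons        c=5         d=195
a, b, c, d = 30, 170, 5, 195

# Odds of break-off in each condition, and their ratio.
slider_odds = a / b        # odds of breaking off with a slider scale
radio_odds = c / d         # odds of breaking off with radio buttons
odds_ratio = slider_odds / radio_odds

print(round(odds_ratio, 2))  # prints 6.88
```

An odds ratio well above 1 means respondents assigned the slider format were several times more likely to abandon the survey than those given radio buttons.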