Statistical design for research
Usability evaluation of computer-assisted survey instruments
Social Science Computer Review - Special issue on survey and statistical computing in the new millennium
What can a mouse cursor tell us more?: correlation of eye/mouse movements on web browsing
CHI '01 Extended Abstracts on Human Factors in Computing Systems
An evaluation of the effect of response formats on data quality in web surveys
Social Science Computer Review - Computer-based methods: State of the art
Technology Trends in Survey Data Collection
Social Science Computer Review
Web-Based Questionnaires and the Mode Effect
Social Science Computer Review
An evaluation of nonresponse and coverage errors in a prerecruited probability web panel survey
Social Science Computer Review
Design parameters of rating scales for web sites
ACM Transactions on Computer-Human Interaction (TOCHI)
The Length of Responses to Open-Ended Questions
Social Science Computer Review
Matching the Message to the Medium
Social Science Computer Review
Factors affecting response rates of the web survey: A systematic review
Computers in Human Behavior
Building an interaction design pattern language: A case study
Computers in Human Behavior
Experiments in Mobile Web Survey Design
Social Science Computer Review
Words, Numbers, and Visual Heuristics in Web Surveys: Is There a Hierarchy of Importance?
Social Science Computer Review
Sliders for the Smart: Type of Rating Scale on the Web Interacts With Educational Level
Social Science Computer Review
Empirical evaluation of 20 web form optimization guidelines
CHI '13 Extended Abstracts on Human Factors in Computing Systems
Data Quality in PC and Mobile Web Surveys
Social Science Computer Review
Several alternative response formats are available to the web survey designer, but the choice of format is often made with little consideration of measurement error. The authors experimentally compare three common response formats used in web surveys: a series of radio buttons, a drop box in which no options are initially displayed until the respondent clicks on the box, and a scrollable drop box in which some options are initially visible and the respondent must scroll to see the rest. For half the sample, the order of the response options was reversed. The authors find evidence of response order effects, but stronger evidence that visible response options are endorsed more frequently, suggesting that visibility may be a more powerful effect than primacy in web surveys. The results indicate that the response format used in a web survey does affect the choices respondents make.