
Crowdsourcing for usability testing
Author(s) - Liu Di, Bias Randolph G., Lease Matthew, Kuipers Rebecca
Publication year - 2012
Publication title - Proceedings of the American Society for Information Science and Technology
Language(s) - English
Resource type - Journals
eISSN - 1550-8390
pISSN - 0044-7870
DOI - 10.1002/meet.14504901100
Subject(s) - usability, crowdsourcing, usability lab, computer science, usability engineering, web usability, usability inspection, cognitive walkthrough, system usability scale, pluralistic walkthrough, heuristic evaluation, usability goals, human–computer interaction, world wide web
While usability evaluation is critical to designing usable websites, traditional usability testing can be both expensive and time consuming. The advent of crowdsourcing platforms such as Amazon Mechanical Turk and CrowdFlower offers an intriguing new avenue for performing remote usability testing with potentially many users, quick turn-around, and significant cost savings. To investigate the potential of such crowdsourced usability testing, we conducted a usability study that evaluated a graduate school's website using a crowdsourcing platform. In addition, we performed a similar but not identical traditional lab usability test on the same site. While we find that crowdsourcing exhibits some notable limitations in comparison to the traditional lab environment, its applicability and value for usability testing are clearly evidenced. We discuss both methodological differences for crowdsourced usability testing and empirical contrasts to results from more traditional, face-to-face usability testing.