Description
Crowdsourcing has become a popular method for collecting information from a large group of people at low cost. This thesis examines whether crowdsourcing is a viable way to collect data for Human-Computer Interaction (HCI) research and compares collaborative crowdsourcing with individual crowdsourcing. It was hypothesized that collaborative crowdsourcing would produce higher-quality results than individual crowdsourcing due to intrinsic motivation. The research used three instruments to measure the two groups: a list of the top 10 usability problems, a heuristic evaluation, and the WAMMI survey. Both groups used these tools to analyze the website Phoenix.Craigslist.com. The results were compared against each other and against a heuristic evaluation score given by an HCI researcher to determine their accuracy. The experiment failed to confirm the hypothesis: both groups produced accurate results that were only marginally different from each other.
Details
Title
- Individual vs. Collaborative Crowdsourcing in HCI Design
Contributors
- Gupta, Kartik (Author)
- Atkinson, Robert (Thesis director)
- Chavez-Echeagaray, Maria Elena (Committee member)
- Computer Science and Engineering Program (Contributor)
- Barrett, The Honors College (Contributor)
Date Created
2018-05