Keep it simple: reward and task design in crowdsourcing

ABSTRACT
Crowdsourcing is emerging as an effective method for performing tasks that require human abilities, such as tagging photos, transcribing handwriting, and categorising data. Crowd workers complete small chunks of larger tasks in return for a reward, which is usually monetary. The reward is one factor that can motivate workers to produce higher-quality results. Yet, as previous research has highlighted, the design of a task, in terms of its instructions and user interface, also shapes how workers perceive it, and thus affects the quality of their output. In this study we investigate both factors, reward and task design, to better understand their roles in the quality of crowdsourced work. In Experiment 1 we test a variety of reward schemas, while in Experiment 2 we measure how the complexity of the task and its interface affects workers' attention. The long-term goal is to establish guidelines for designing tasks that maximise workers' performance.