DOI: 10.1145/3308558.3313716

TurkScanner: Predicting the Hourly Wage of Microtasks

Published: 13 May 2019

ABSTRACT

Workers in crowd markets struggle to earn a living. One reason for this is that it is difficult for workers to accurately gauge the hourly wages of microtasks, so they end up performing labor with little pay. In general, workers are given little information about tasks and must rely on noisy signals, such as the textual description of the task or the rating of the requester. This study explores computational methods for predicting the working times (and thus hourly wages) required for tasks, based on data collected from other workers completing crowd work. We provide the following contributions: (i) a data collection method for gathering real-world training data on crowd-work tasks and the times required for workers to complete them; and (ii) TurkScanner, a machine learning approach that predicts the working time needed to complete a task (and can thus implicitly provide the expected hourly wage). We collected 9,155 data records using a web browser extension installed by 84 Amazon Mechanical Turk workers, and explored the challenge of accurately recording working times both automatically and by asking workers. TurkScanner was created using ~150 derived features, and was able to predict the hourly wages of 69.6% of all tested microtasks to within 75% error. Directions for future research include observing the effects of such tools on people's working practices, adapting this approach into a requester tool for better price setting, and predicting other elements of work (e.g., acceptance likelihood and worker task preferences).
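The abstract describes TurkScanner only at a high level; the paper itself details the ~150 derived features and the model. As a rough illustration of the core idea, the sketch below trains a regressor on placeholder task features to predict working time and then converts the task reward into an implied hourly wage. The synthetic data, the gradient-boosted model choice, and all variable names are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): predict working time from derived
# task features, then convert the prediction into an implied hourly wage.
# The random data, feature count, and model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_records, n_features = 9155, 150                  # mirrors the scale reported in the abstract

X = rng.random((n_records, n_features))            # placeholder for derived task/requester features
work_seconds = rng.uniform(30, 1800, n_records)    # placeholder observed working times
reward_usd = rng.uniform(0.05, 2.00, n_records)    # placeholder task rewards

X_tr, X_te, t_tr, t_te, r_tr, r_te = train_test_split(
    X, work_seconds, reward_usd, test_size=0.2, random_state=0)

# Any regressor could be used here; gradient-boosted trees are one plausible choice.
model = GradientBoostingRegressor(n_estimators=200, max_depth=4)
model.fit(X_tr, t_tr)

pred_seconds = model.predict(X_te)
implied_hourly_wage = r_te / (pred_seconds / 3600.0)   # USD per hour
```

A worker-facing tool built on such a predictor could, for example, flag tasks whose implied hourly wage falls below a threshold the worker chooses.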


• Published in

    WWW '19: The World Wide Web Conference
    May 2019, 3620 pages
    ISBN: 9781450366748
    DOI: 10.1145/3308558
    Copyright © 2019 ACM


    Publisher

    Association for Computing Machinery, New York, NY, United States


    Qualifiers

    • research-article
    • Research
    • Refereed limited

    Acceptance Rates

    Overall Acceptance Rate: 1,899 of 8,196 submissions, 23%
