ABSTRACT
Crowd feedback systems offer designers an emerging approach to improving their designs, but there is little empirical evidence of the benefit of these systems. This paper reports the results of a study using a crowd feedback system to iterate on visual designs. Students in an introductory visual design course created initial designs satisfying a design brief and received crowd feedback on them. The students then revised their designs, and the system was used to generate feedback a second time. This format enabled us to detect the changes between the initial and revised designs and to relate the feedback to those changes. We further analyzed the value of crowd feedback by comparing it with expert evaluation and with feedback generated via free-form prompts. Results showed that the crowd feedback system prompted both deep and cosmetic changes and led to improved designs, that the crowd recognized the design improvements, and that structured workflows generated more interpretative, diverse, and critical feedback than free-form prompts.
A Classroom Study of Using Crowd Feedback in the Iterative Design Process