The Sustainability Cultural Indicators Program (SCIP) at the University of Michigan is designed to measure and track the university's progress in moving the campus community toward a culture of sustainability (Callewaert and Marans 2017). SCIP gathers these data through a web survey conducted annually. Web surveys generally attain lower response rates than other modes of data collection. They are also at risk of other forms of nonresponse, such as breakoffs, which occur less frequently in other modes. Breakoffs commonly happen very early in a web survey, often on the informed consent screens required by Institutional Review Boards (IRBs), before respondents ever reach the survey content. Many methods (prenotification, incentives, etc.) are used to try to increase participation and reduce breakoffs. This paper investigates the efficacy of two experimental interventions designed to increase participation and reduce breakoffs in two SCIP surveys. The first experiment examines the effect of "celebrity endorsement": in the final email reminder, respondents were randomized to receive either a reminder with a link to the survey or a reminder that also contained a link to a video of a head coach from the U-M Department of Athletics encouraging non-respondents to participate. The second experiment investigates informed consent screen design. One group was presented a screen formatted as a traditional informed consent form; the other was presented a screen with the most important items visible and the remaining information available via a series of accordion menus.
Bosnjak, M., & Tuten, T. L. (2003). Prepaid and promised incentives in web surveys: An experiment. Social Science Computer Review, 21(2), 208–217.
Bosnjak, M., Neubarth, W., Couper, M. P., Bandilla, W., & Kaczmirek, L. (2008). Prenotification in web surveys: The influence of mobile text messaging versus e-mail on response rates and sample composition. Social Science Computer Review, 26(2), 213–223.
Callegaro, M. (2010). Do you know which device your respondent has used to take your online survey? Survey Practice, 3(6), 1–12.
Callegaro, M. (2013). Paradata in web surveys. Improving Surveys with Paradata, 259–279.
Callegaro, M., Manfreda, K. L., & Vehovar, V. (2015). Web survey methodology. Thousand Oaks: Sage.
Callewaert, J., & Marans, R. W. (2017). Measuring progress over time: The sustainability cultural indicators program at the University of Michigan. In W. Leal (Ed.), Handbook of theory and practice of sustainable development in higher education (Vol. 2). New York: Springer.
Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821–836.
Couper, M. P. (1998). Measuring survey quality in a CASIC environment. Proceedings of the survey research methods section, ASA, pp. 41–49.
Couper, M. P. (2000). Web surveys: A review of issues and approaches. Public Opinion Quarterly, 64(4), 464–494.
Couper, M. P. (2008). Designing effective web surveys. New York: Cambridge University Press.
Crawford, S. D., McCabe, S. E., Saltz, B., Boyd, C. J., Freisthler, B., & Paschall, M. J. (2004). Gaining respondent cooperation in college web-based alcohol surveys: Findings from experiments at two universities. Paper presented at the 59th annual conference of the American Association for Public Opinion Research, Phoenix, AZ, May.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys. Hoboken: Wiley.
Harmon, M. A., Westin, E. C., & Levin, K. Y. (2005). Does type of pre-notification affect web survey response rates? Paper presented at the 60th annual conference of the American Association for Public Opinion Research, Miami Beach, FL, May.
Holland, L., Couper, M. P., & Schroeder, H. (2014). Pre-notification strategies for mixed-mode data collection. Paper presented at the 69th annual conference of the American Association for Public Opinion Research, Anaheim, CA, May.
Kaplowitz, M. D., Hadlock, T. D., & Levine, R. (2004). A comparison of web and mail survey response rates. Public Opinion Quarterly, 68(1), 94–101.
Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web surveys versus other survey modes: A meta-analysis comparing response rates. International Journal of Market Research, 50(1), 79–104.
Pew Research Center. (2015, April). The smartphone difference. http://www.pewinternet.org/2015/04/01/us-smartphone-use-in-2015/.
Peytchev, A. (2009). Survey breakoff. Public Opinion Quarterly, 73(1), 74–97.
Schonlau, M., Asch, B. J., & Du, C. (2003). Web surveys as part of a mixed-mode strategy for populations that cannot be contacted by e-mail. Social Science Computer Review, 21(2), 218–222.
The American Association for Public Opinion Research. (2016). Standard definitions: Final dispositions of case codes and outcome rates for surveys (9th ed.). AAPOR.
Trouteaud, A. R. (2004). How you ask counts: A test of internet-related components of response rates to a web-based survey. Social Science Computer Review, 22(3), 385–392.
Tuten, T. L. (2005). Do reminders encourage response or affect response behaviors? Reminders in web-based surveys. Paper presented at the ESF workshop on internet survey methodology, Dubrovnik, Croatia, September.
Vehovar, V., Batagelj, Z., Lozar Manfreda, K., & Zaletel, M. (2002). Nonresponse in web surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (pp. 229–242). New York: Wiley.
Promoting Participation in a Culture of Sustainability Web Survey
Heather M. Schroeder
Andrew L. Hupp
Andrew D. Piskorowski