The Effect of Incentives on Internet Surveys: Response Rate Changes After the Introduction of Incentives

  • Kennedy, John M. (Indiana University Bloomington)
  • Ouimet, Judith A. (Indiana University Bloomington)
  • Published: 2014.02.28

Abstract

Incentives are often included in survey designs because they are known to improve response rates, at least moderately. This paper describes the changes in response rates after incentives were introduced into a longitudinal survey. The National Survey of Student Engagement was conducted annually at Indiana University Bloomington from 2000 through 2012. In 2010, incentives were introduced in an attempt to reverse declining response rates. The incentives performed as expected, raising the AAPOR Response Rate 3 from 24% in 2009 to 36% in 2010. From 2010 through 2012, different types of incentives were tried, but the response rates did not change substantially. These results can help survey practitioners decide how many incentives, and which types, can be used effectively to increase response rates.
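
For context on the response-rate metric cited above: AAPOR Response Rate 3 divides completed interviews by all known-eligible cases plus an estimated share of the cases whose eligibility is unknown. The snippet below is a minimal sketch of that standard formula using hypothetical disposition counts; the counts and the helper function name are illustrative and are not drawn from the NSSE data reported in this paper.

```python
# Sketch of the AAPOR Response Rate 3 (RR3) formula with hypothetical counts.
#
#   RR3 = I / ((I + P) + (R + NC + O) + e * (UH + UO))
#
# I  = complete interviews            P  = partial interviews
# R  = refusals and break-offs        NC = non-contacts
# O  = other eligible non-respondents
# UH = unknown if household/eligible  UO = unknown, other
# e  = estimated proportion of unknown-eligibility cases that are eligible

def aapor_rr3(I, P, R, NC, O, UH, UO, e):
    """Return AAPOR Response Rate 3 as a proportion between 0 and 1."""
    return I / ((I + P) + (R + NC + O) + e * (UH + UO))

# Hypothetical example: 360 completes in a sample of roughly 1,500 cases.
rate = aapor_rr3(I=360, P=20, R=150, NC=400, O=30, UH=500, UO=40, e=0.8)
print(f"RR3 = {rate:.1%}")  # RR3 = 25.9% with these illustrative counts
```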
