Notes

Vol. 22 No. 1 (2020)

Is respondents’ inattention in online surveys a major issue for research?

DOI
https://doi.org/10.3280/ecag1-2020oa10069
Submitted
June 22, 2020
Published
June 22, 2020

Abstract

Participant attentiveness is a major concern for all researchers using online self-report survey data, as responses from non-diligent participants add noise and can significantly reduce the reliability of results. Attention checks have therefore become a popular survey-design tool across the social sciences for detecting careless or insufficient-effort responding, thereby improving sample quality and the internal validity of the research. The aim of this note is to offer an overview and categorization of the techniques used to flag inattentive respondents, and to present the potential drawbacks of ignoring the issue in social sciences research.
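To make the flagging techniques concrete, below is a minimal illustrative sketch in Python (pandas) of three screens commonly discussed in this literature: a directed screener item, implausibly fast completion, and straightlining. All column names and thresholds here are hypothetical assumptions for the example, not the procedure described in this note.

```python
# Illustrative sketch only: three common screens for inattentive respondents.
# Column names ("screener", "seconds", "q1".."q3") and all thresholds are
# hypothetical assumptions, not taken from this note.
import pandas as pd

def flag_inattentive(df: pd.DataFrame,
                     screener_col: str = "screener",
                     screener_pass: int = 3,
                     time_col: str = "seconds",
                     min_seconds: float = 120.0) -> pd.DataFrame:
    """Return a copy of df with boolean inattention flags added."""
    out = df.copy()
    # Screener (directed question): respondents were told to pick option 3.
    out["fail_screener"] = out[screener_col] != screener_pass
    # Speeding: finished faster than a plausible reading-time threshold.
    out["too_fast"] = out[time_col] < min_seconds
    # Straightlining: identical answers across a block of Likert items.
    likert_cols = [c for c in out.columns if c.startswith("q")]
    out["straightline"] = out[likert_cols].nunique(axis=1) == 1
    out["flagged"] = out[["fail_screener", "too_fast",
                          "straightline"]].any(axis=1)
    return out

if __name__ == "__main__":
    sample = pd.DataFrame({
        "screener": [3, 1, 3],
        "seconds": [300.0, 95.0, 410.0],
        "q1": [4, 2, 5], "q2": [3, 2, 3], "q3": [4, 2, 4],
    })
    print(flag_inattentive(sample)[["fail_screener", "too_fast",
                                    "straightline", "flagged"]])
```

In practice, researchers typically report how many respondents each screen flags and test the sensitivity of their estimates to excluding them, rather than dropping flagged cases automatically.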
