PLoS ONE (Jan 2025)

Assessing construct reliability through open-ended survey response analysis.

  • Katherine E Koralesky,
  • Marina A G von Keyserlingk,
  • Daniel M Weary

DOI
https://doi.org/10.1371/journal.pone.0320570
Journal volume & issue
Vol. 20, no. 4
p. e0320570

Abstract

Online surveys often include quantitative attention checks, but inattentive participants might also be identified through their qualitative responses. We used the software Turnitin™ to assess the originality of open-ended responses in four mixed-method surveys that included validated multi-item rating scales (i.e., constructs). Across surveys, 18–35% of participants (n = 3,771) were identified as having copied responses from online sources. We assessed indicator reliability and internal consistency reliability and found that both were lower for participants identified as using copied text than for those who wrote more original responses. Participants who provided more original responses also responded more consistently to the validated scales, suggesting that they were more attentive. We conclude that this process can be used to screen open-ended responses from online surveys. We encourage future research to replicate this screening process using similar tools, to investigate strategies to reduce copying behaviour, and to explore participants' motivations for searching for information online, including what sources they find compelling.