Respondent’s Answering Time for Question as a Factor of Data Quality Estimation

Authors

  • Mykola Sydorov, Taras Shevchenko National University of Kyiv

DOI:

https://doi.org/10.29038/2306-3971-2017-01-36-43

Keywords:

median equality test, median confidence interval, paradata, vignette, online survey, timestamps

Abstract

Computer-assisted survey tools make it possible to record a large amount of additional information, such as the time a respondent spends answering each question, which can serve as an additional indicator of data quality. In this article, the author proposes an approach to assessing data quality by estimating thresholds for the temporal characteristics of responses to vignettes in an online survey whose factorial design was generated in R. The analysis draws on answers to a series of five vignettes describing complex experimental situations, each requiring at least 10 seconds of reading time, obtained in a complete (census-type) survey of students of the Faculty of Sociology at Taras Shevchenko National University of Kyiv, conducted in the LimeSurvey online environment (2015).
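To make the threshold idea concrete, here is a minimal sketch in R (the language used for the factorial design) that flags vignette answers given faster than the assumed 10-second minimal reading time; the file name and column names are hypothetical placeholders, not the actual LimeSurvey export used in the study.

    # A minimal sketch, not the author's script: flag vignette answers given
    # faster than the assumed 10-second minimal reading time.
    # "vignette_times.csv" and the column names are hypothetical placeholders
    # for a LimeSurvey timings export.
    times <- read.csv("vignette_times.csv")
    vignette_cols <- paste0("vignette", 1:5, "_time")   # answering time, seconds

    min_reading_time <- 10                              # threshold assumed in the study
    too_fast <- times[, vignette_cols] < min_reading_time

    colSums(too_fast)   # suspiciously fast answers per vignette
    rowSums(too_fast)   # number of speeded vignettes per respondent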
As the statistical method for eliminating outliers, the author uses the construction and analysis of the confidence interval for the sample median. The model built on the restricted time interval differs only slightly from the full model. A significant difference between the response-time distributions is observed only between the first vignette and all the others: the vignettes are homogeneous, so the time needed to comprehend the experimental situation is spent only on the first vignette, while the remaining ones are read «by analogy». The composition of the time intervals by gender shows that the share of women giving quick answers decreases from vignette to vignette, whereas for men it increases; possibly men pay less attention to the task and grow bored reading vignettes with similar experimental situations.
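The order-statistic construction behind such a median confidence interval can be sketched as follows; this is a generic distribution-free interval applied to invented answering times, not a reproduction of the article's computations.

    # A minimal sketch of a distribution-free confidence interval for the median,
    # built from order statistics of a vector of answering times (in seconds).
    # The data are invented; the procedure in the article may differ in detail.
    median_ci <- function(x, conf = 0.95) {
      x <- sort(x)
      n <- length(x)
      k <- qbinom((1 - conf) / 2, n, 0.5)  # order-statistic index from Binomial(n, 0.5)
      k <- max(k, 1)                       # guard against k = 0 in tiny samples
      c(lower = x[k], upper = x[n - k + 1])
    }

    t1 <- c(12, 15, 9, 22, 18, 14, 61, 11, 16, 13, 19, 17)  # hypothetical times, vignette 1
    ci <- median_ci(t1)
    outside <- t1 < ci["lower"] | t1 > ci["upper"]          # candidate outliers

    # A test of median equality across the five vignettes could be approximated,
    # for example, with kruskal.test(time ~ vignette, data = long_times),
    # where long_times is a hypothetical long-format data frame.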
Because the time needed to answer a question depends on many different factors, such as the structure and complexity of the question and the personal characteristics of the respondent, further research is required, including the use of regression analysis.

Published

31.05.2018

Issue

Section

METHODOLOGY AND METHODS OF SOCIOLOGICAL RESEARCH