Questionnaire Length and Its Impact on Response Quality
DOI: https://doi.org/10.58210/rie3737

Keywords: questionnaire length, response quality, data quality, digital surveys

Abstract
Digital surveys have become increasingly common in research. The convenience of completing a survey on a smartphone has created incentives to design lengthy questionnaires; however, evidence on how such instruments affect data quality, through factors such as respondent fatigue, remains scarce. This study explores the impact of questionnaire length on response quality in management research through a bibliometric review of 30 articles indexed in Scopus between 2008 and 2024. The findings suggest that questionnaire length and forced-response conditions contribute to a loss of response quality, and they point to dividing questionnaires into sections as a strategy for preventing respondent fatigue. The study advances the hypothesis that lengthy questionnaires suffer a loss of quality, which calls into question the results and conclusions of research employing instruments with an excessive number of items.
License
Copyright (c) 2026 Jorge Izaguirre Olmedo, Ernesto Rangel

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors retain copyright and grant Revista Inclusiones the right of first publication under the Creative Commons Attribution 4.0 International (CC BY 4.0) license, which permits use, distribution, and reproduction in any medium, provided proper attribution is given to the authors.