Questionnaire length and its impact on response quality
DOI: https://doi.org/10.58210/rie3737

Keywords: questionnaire length, response quality, data quality, digital surveys

Abstract
With the growing sophistication of online survey tools and the need for remote research during the COVID-19 pandemic, the use of digital surveys for research purposes has multiplied. The convenience of answering a digital survey has created an incentive to develop questionnaires that include a large number of questions and take several minutes to complete; however, little is known about the effect that questionnaire length may have on the quality of the data obtained. The objective of this study is to explore the impact of questionnaire length on response quality in administrative studies. The methodology employed is a bibliometric review following the PRISMA protocol, for which 30 articles published in SCOPUS between 2008 and 2024 were analyzed. The results suggest that questionnaire length and forced-response conditions contribute to a loss of response quality. The results also point to splitting the questionnaire into sections as a strategy to avoid respondent fatigue. The study hypothesizes a loss of quality in long questionnaires, which invites scrutiny of the results and conclusions of research that uses instruments with an excessive number of items.
License
Copyright (c) 2026 Jorge Izaguirre Olmedo, Ernesto Rangel

This work is licensed under a Creative Commons Attribution 4.0 International License.
The authors retain copyright and grant Revista Inclusiones the right of publication under the Creative Commons Attribution 4.0 International (CC BY 4.0) license. This permits use, distribution, and reproduction in any medium, provided proper attribution is given to the author.