Unpacking the Impact of Item Difficulty: Traditional Testing in Online Learning

Authors

Taskin, N.

DOI:

https://doi.org/10.46328/ijte.1210

Keywords:

distance education and online learning, academic achievement, higher education students, multiple-choice questions, student perceptions, assessment

Abstract

This study examines the effect of item order (random, increasingly difficult, and decreasingly difficult) on student performance, test parameters, and student perceptions in multiple-choice tests administered in paper-and-pencil format after online learning. The research followed an explanatory sequential mixed-methods design: quantitative data were analyzed first, and qualitative data were then collected to examine those findings in depth. A total of 2,131 freshman university students participated in the quantitative phase and 312 students in the qualitative phase. After 14 weeks of online foreign language education, tests with different item orders were administered to measure students' academic achievement. The findings revealed that item order did not significantly affect students' academic achievement, nor did it affect test parameters such as test difficulty and reliability. The most striking finding is that the difficulty level of individual items changes depending on the item order. Findings on student perceptions show that encountering difficult questions at the beginning of a test reduces motivation, increases anxiety, and creates a negative perception of the assessment process. Students also emphasized that the assessment process should be compatible with the pedagogical structure of online learning. These findings indicate that students' perceptions, as well as test statistics, should be taken into account in test design. In this context, conducting the assessment of online instruction in an online environment could resolve many of the debates about item order. Based on these findings, it is recommended that future research be expanded to include different courses, item ordering methods, and individual student differences.

Published

2025-09-01

Issue

Vol. 8 No. 4 (2025)

Section

Articles

How to Cite

Unpacking the Impact of Item Difficulty: Traditional Testing in Online Learning. (2025). International Journal of Technology in Education, 8(4), 998-1021. https://doi.org/10.46328/ijte.1210