Unpacking the Impact of Item Difficulty: Traditional Testing in Online Learning
DOI: https://doi.org/10.46328/ijte.1210
Keywords: distance education and online learning, academic achievement, higher education students, multiple choice questions, student perceptions, assessment
Abstract
This study examines the effect of item order (random, increasing difficulty, and decreasing difficulty) on student performance, test parameters, and student perceptions in multiple-choice tests administered in paper-and-pencil format after online learning. The research used an explanatory sequential mixed methods design: quantitative data were analyzed first, and qualitative data were then collected to examine those findings in depth. A total of 2,131 freshman university students participated in the quantitative part of the study, and 312 students participated in the qualitative part. After 14 weeks of online foreign language education, tests with different item orders were administered to measure students' academic achievement. The findings revealed that item order did not significantly affect students' academic achievement, nor did it affect test parameters such as test difficulty and reliability. The most striking finding was that the difficulty level of individual items changed depending on their position in the test. Findings on student perceptions show that encountering difficult questions at the beginning of a test reduces motivation, increases anxiety, and creates a negative perception of the assessment process. Students also emphasized that assessment should be compatible with the pedagogical structure of online learning. These findings indicate that test design should take students' perceptions into account alongside test statistics. In this context, conducting the assessment of online instruction in an online environment could resolve many debates about item order. Based on these findings, it is recommended that future research be expanded to include different courses, item ordering methods, and individual student differences.
References
Taskin, N. (2025). Unpacking the impact of item difficulty: Traditional testing in online learning. International Journal of Technology in Education (IJTE), 8(4), 998-1021. https://doi.org/10.46328/ijte.1210
License
Copyright (c) 2025 International Journal of Technology in Education

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Articles may be used for research, teaching, and private study purposes. Authors alone are responsible for the contents of their articles. The journal owns the copyright of the articles. The publisher shall not be liable for any loss, actions, claims, proceedings, demand, or costs or damages whatsoever or howsoever caused arising directly or indirectly in connection with or arising out of the use of the research material.
The author(s) of a manuscript agree that if the manuscript is accepted for publication in the International Journal of Technology in Education (IJTE), the published article will be copyrighted using a Creative Commons “Attribution 4.0 International” license. This license allows others to freely copy, distribute, and display the copyrighted work, and derivative works based upon it, under certain specified conditions.
Authors are responsible for obtaining written permission to include any images or artwork for which they do not hold copyright in their articles, or to adapt any such images or artwork for inclusion in their articles. The copyright holder must be made explicitly aware that the image(s) or artwork will be made freely available online as part of the article under a Creative Commons “Attribution 4.0 International” license.