Assessment of Difficulty Levels in Understanding Science Among Undergraduate Students Using the Rasch Analysis

First published: 01 April 2026 | https://doi.org/10.63871/unvl.jsuv1.2.22
Technical Science Section
Original Research Article

Authors

Elmira Kushta

Department of Mathematics and Physics, Faculty of Technical and Natural Sciences, University “Ismail Qemali” Vlore, Albania | ORCID ID: https://orcid.org/0000-0002-6200-4635


Florida Kadena

Department of Physics, Faculty of Natural Sciences, University of Tirana, Albania | ORCID ID: #


Dode Prenga

Department of Physics, Faculty of Natural Sciences, University of Tirana, Albania | ORCID ID: https://orcid.org/0000-0002-7211-9014


Abstract

The assessment of students' understanding of difficulty levels in physics, as provided by Rasch analysis, is carried out here using two different tests as measurement tools. We used a polytomous version of the standard FCI test, selecting items that preserve the core of this conceptual inventory. To do this, we randomly divided the 380 responses collected during a recent FCI analysis into five groups and then restructured six items into six separate evaluation-scale items. At this stage, we carefully selected items to highlight the dominance of one common-sense error in mechanics. Next, we created our own test with items designed to cover six levels of difficulty, alternating simpler calculation steps within physics-based problems. In both cases, the threshold parameters were estimated using the polytomous Rasch model, and the results were analysed and discussed.

Keywords: Concept Inventory, Physics, Knowledge, Index Theory, The Rasch Model, Sociometric Measuring Instruments.


Background

The assessment of conceptual knowledge in physics often relies on the Force Concept Inventory (FCI), a tool designed to measure the depth of understanding in mechanics beyond simple procedural calculations. While the standard FCI uses binary (correct/incorrect) scoring, there is a growing interest in using polytomous (multi-level) models to better capture students' reasoning and common-sense errors. The Rasch model, specifically its polytomous application, provides a threshold parameter that measures the difficulty of transitioning between different levels of knowledge. This study aims to evaluate undergraduate students' understanding of physics by applying Rasch analysis to restructured conceptual tests, identifying how difficulty is distributed across various levels of proficiency.
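The threshold idea can be made concrete with a short sketch. The function below evaluates category probabilities under the polytomous Rasch (partial credit) model, in which the probability of a person with ability θ scoring h on an item is governed by the cumulative sums of (θ − τ_k) over the thresholds τ_1…τ_m. This is an illustrative implementation of the standard model, not the study's own code, and the threshold values used in the example are invented for demonstration.

```python
import math

def pcm_probabilities(theta, taus):
    """Category probabilities under the polytomous Rasch
    (partial credit) model for one item.

    theta : person ability (logits)
    taus  : thresholds tau_1..tau_m; tau_h is the difficulty of
            scoring h rather than h - 1.
    Returns [P(X = 0), ..., P(X = m)].
    """
    # Cumulative sums of (theta - tau_k); the score-0 term is 0.
    exponents = [0.0]
    running = 0.0
    for tau in taus:
        running += theta - tau
        exponents.append(running)
    denom = sum(math.exp(e) for e in exponents)
    return [math.exp(e) / denom for e in exponents]

# Person at the item's centre: with thresholds symmetric about
# zero, the category distribution comes out symmetric as well.
probs = pcm_probabilities(theta=0.0, taus=[-2.0, -0.5, 0.5, 2.0])
print([round(p, 3) for p in probs])
```

The middle category is the most probable here because the ability sits between the inner thresholds; shifting θ upward shifts mass toward the higher score categories.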


Methods

The study utilized two measurement tools: a restructured, polytomous version of the standard FCI and an ad-hoc test designed to cover six levels of difficulty. For the FCI version, the researchers analyzed 380 existing student responses, grouping them into a 6-point evaluation scale based on a taxonomy of common-sense errors in mechanics. An initial ad-hoc test was submitted to 11 physics educators for validation but yielded a low Content Validity Ratio (CVR) of 0.4, owing to a lack of consensus on scoring. Consequently, the authors developed "cloned items" that maintained the philosophy of the FCI. Data analysis was conducted using the polytomous Rasch model to estimate threshold parameters (τ), which represent the difficulty of obtaining a score h relative to h−1.
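For readers unfamiliar with the CVR statistic, the sketch below shows Lawshe's widely used formula. It is an assumption that this is the CVR variant the authors applied; the panel size of 11 is taken from the text, while the vote count is illustrative (with 11 panelists the formula cannot yield exactly 0.4, so the example shows a nearby attainable value).

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR: (n_e - N/2) / (N/2), where n_e is the number
    of experts rating an item 'essential' out of N panelists.
    Ranges from -1 (no agreement) to +1 (full agreement)."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Illustrative only: 8 of 11 panelists rating an item 'essential'
# gives CVR ~ 0.45, close to the low value reported in the study.
print(round(content_validity_ratio(8, 11), 2))
```

A CVR near 0.4 falls below the critical values usually required for an 11-member panel, which is consistent with the authors' decision to replace the ad-hoc items with FCI-cloned ones.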


Results

The analysis revealed that students' measured ability typically falls into 5–6 distinct levels. A key finding was the non-uniform distribution of difficulty: the first and last threshold parameters showed significantly higher magnitudes than the intermediate levels. For example, in a sample of 280 students, the first threshold difference was as high as 3.7471, while intermediate steps dropped to 0.4317. This indicates that the "pedagogical effort" required to bring a student from a basic level to an admissible one, or from a good level to an excellent one, is much greater than the effort needed to progress through the middle range of knowledge. The threshold levels were also found to be nearly symmetrical around the zero point, reflecting the consistency of the standardized FCI-cloned test.
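The reported threshold pattern can be illustrated numerically. The vector below is invented for demonstration (it is not the study's estimated τ values), but it is built to show the two features described in the text: near symmetry about zero, and outer gaps of roughly the quoted magnitude (≈3.75) dwarfing the intermediate gaps (≈0.43).

```python
# Illustrative, nearly symmetric threshold vector (logits);
# NOT the study's actual estimates.
taus = [-4.20, -0.45, -0.03, 0.40, 4.15]

# Adjacent threshold differences: the "pedagogical effort" of
# moving a student up one score category.
gaps = [round(b - a, 4) for a, b in zip(taus, taus[1:])]
print(gaps)  # the first and last gaps dominate the middle ones
```

Large outer gaps mean that escaping the lowest category, or reaching the highest one, costs far more in ability terms than any step through the middle of the scale, which is exactly the non-uniformity the authors report.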


Conclusions

The authors conclude that combining dichotomous and polytomous scoring algorithms through Rasch analysis provides a more nuanced understanding of conceptual knowledge structures. The study highlights that difficulty thresholds are not uniformly distributed, suggesting that pedagogical investment is most "cost-efficient" when focused on students at intermediate proficiency levels. Conversely, elevating students with very low conceptual knowledge or pushing high-achievers toward total mastery requires disproportionately more resources. The study recommends that educators and researchers adopt Item Response Theory (IRT)-based techniques to inform evidence-based instructional practices and design better conceptual inventories that move beyond simple binary assessments.

CONFLICT OF INTEREST

The authors declare no conflict of interest.

REFERENCES

1. Andrich D. The Rasch model explained. In: Applied Rasch measurement: A book of exemplars: Papers in honour of John P. Keeves. 2005. p. 27–59.
2. Crooks NM, Alibali MW. Defining and measuring conceptual knowledge in mathematics. Dev Rev. 2014;34(4):344–377.
3. Ding L, Chabay R, Sherwood B, Beichner R. Evaluating an electricity and magnetism assessment tool: Brief Electricity and Magnetism Assessment. Phys Rev ST Phys Educ Res. 2006;2(1):010105.
4. Hamolli L, Prenga D. Force concept inventory analysis by using indexes and the Rasch model. J Nat Sci. 2024;36:210–230.
5. Hestenes D, Wells M, Swackhamer G. Force concept inventory. Phys Teach. 1992;30(3):141–158.
6. Jones M, Jones G. Biologjia 10–11. Tirana: Pegi; 2016.
7. Kim JS, Sunderman GL. Measuring academic proficiency under the No Child Left Behind Act: Implications for educational equity. Educ Res. 2005;34(8):3–13.
8. Klymkowsky MW, Garvin-Doxas K. Concept inventories: Design, application, uses, limitations, and next steps. In: Active learning in college science: The case for evidence-based practice. 2020. p. 775–790.
9. Kumar P, et al. Using empirical science education in schools to improve climate change literacy. Renew Sustain Energy Rev. 2023;178:113232.
10. Kushta E, Prenga D. Using tests’ indexes to improve the assessment of the conceptual knowledge: A case study. Int J Educ Learn Syst. 2023;8.
11. Planinic M, Ivanjek L, Susac A. Rasch model-based analysis of the Force Concept Inventory. Phys Rev ST Phys Educ Res. 2010;6(1).
12. Planinic M, Boone WJ, Susac A, Ivanjek L. Rasch analysis in physics education research: Why measurement matters. Phys Rev Phys Educ Res. 2019;15(2):020111.
13. Pople S. Fizika 10–11. Tirana: Erik Botime; 2022.
14. Prenga D, Kushta E, Musli F. Enhancing concept inventory analysis by using indexes, optimal histogram idea, and Likert analysis. J Hum Earth Future. 2023;4(1):103–120.
15. Prenga D. A thematic review on the combination of statistical tools and measuring instruments for analyzing knowledge and students’ achievement in science. Eur Mod Stud J. 2024;8(3):687–706.
16. Rasch G. On general laws and the meaning of measurement in psychology. In: Proc 4th Berkeley Symp Math Stat Probab. Berkeley: University of California Press; 1961. p. 321–333.
17. Savinainen A, Scott P. The Force Concept Inventory: A tool for monitoring student learning. Phys Educ. 2002;37(1):45–51.
18. Vilia PN, Candeias AA, Neto AS, Franco MDGS, Melo M. Academic achievement in physics-chemistry: The predictive effect of attitudes and reasoning abilities. Front Psychol. 2017;8:1064.
19. Wright BD. Solving measurement problems with the Rasch model. J Educ Meas. 1977;14:97–116.
20. Wright BD, Masters GN. Rating scale analysis. Chicago: MESA Press; 1982.


How to cite this article:

Kushta E, Kadena F, Prenga D. Assessment of Difficulty Levels in Understanding Science Among Undergraduate Students Using the Rasch Analysis. UniVlora Scientific Journal. 2025; vol. II, no. I. DOI: 10.63871/unvl.jsuv1.2.22