Effect of 2-PL and 3-PL Models on the Ability Estimate in Mathematics Binary Items
DOI: https://doi.org/10.7160/eriesj.2024.170308
Keywords: Items, Binary items, Models, Item response theory, Item parameters
Abstract
This study examines the influence of the 2-parameter logistic (2-PL) and 3-parameter logistic (3-PL) models on students' ability estimates in binary mathematics items, and it also estimated the item parameters under each model. The survey design was grounded in Item Response Theory (IRT), with responses from a sample of 1,015 senior secondary (SS III) students analysed under both models. The Mathematics Achievement Test instrument was adapted from the General Mathematics Paper 1 of the Senior School Certificate Examination administered by the West African Examinations Council (WAEC). Results indicated that the 2-PL model yielded lower difficulty estimates but higher discrimination indices than the 3-PL model. Statistical analysis revealed a significant effect of the two models on ability estimates in binary mathematics items among Nigerian secondary school students (F = 19.52, p < 0.05 and F = 18.52, p < 0.05, respectively). We established that the item parameters of the 2-PL and 3-PL models significantly affected the ability estimates of Nigerian secondary school students in binary mathematics items, and that the 3-PL model provided better ability estimates than the 2-PL model.
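For readers unfamiliar with the two models, the following is a brief sketch of the standard logistic item response functions they assume (conventional notation, not parameter values from this study; the scaling constant D is omitted): for an examinee with ability θ responding to item i with discrimination a_i, difficulty b_i, and pseudo-guessing parameter c_i,

\[
P_i^{\text{2PL}}(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}, \qquad
P_i^{\text{3PL}}(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-a_i(\theta - b_i)}} .
\]

The 2-PL model is the special case c_i = 0; the nonzero lower asymptote c_i in the 3-PL model allows for correct responses by guessing, which is one reason calibrating the same binary items under the two models can produce different difficulty, discrimination, and ability estimates.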
License
Copyright (c) 2024 Rukayat Oyebola Iwintolu, Oluwaseyi Aina Gbolade Opesemowo, Phebean Oluwaseyi Adetutu
This work is licensed under a Creative Commons Attribution 4.0 International License.
With this manuscript submitted for publication to the ERIES Journal, the authors declare that:
- all co-authors agree with the publication of the manuscript even after amendments arising from peer review;
- all co-authors agree with the posting of the full text of this work on the web page of ERIES Journal and to the inclusion of references in databases accessible on the internet;
- no results of other researchers were used in the submitted manuscript without their consent, proper citation, or acknowledgement of their cooperation or material provided;
- the results (or any part of them) used in the manuscript have not been sent for publication to any other journal nor have they already been published (or if so, that the relevant works are cited in this manuscript);
- submission of the manuscript for publication was completed in accordance with the publishing regulations pertaining to place of work;
- experiments performed comply with current laws and with the written consent of the Scientific Ethics Committee / National Animal Care Authority (as mentioned in the submitted manuscript);
- grant holders confirm that they have been informed of the submitted manuscript and they agree to its publication.
Authors retain copyright and grant ERIES Journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the published work with an acknowledgement of its initial publication in ERIES Journal. Moreover, authors are able to post the published work in an institutional repository with an acknowledgement of its initial publication in ERIES Journal. In addition, authors are permitted and encouraged to post the published work online (e.g. institutional repositories or on their website) as it can lead to productive exchanges, as well as earlier and greater citation of published work.