Exploring the Quality of Multiple-Choice Question Type of Test Items in Information and Communication Technology Using Item Analysis
Issue: Vol. 4 No. 1, January 2023, Article 4, pp. 50-58
DOI: https://doi.org/10.38159/ehass.2023414 | Published online 26th January, 2023
© 2023 The Author(s). This is an open access article under the CCBY license (http://creativecommons.org/licenses/by/4.0/).
The study explored the quality of multiple-choice question (MCQ) test items in Information and Communication Technology (ICT) using item analysis. The study was exploratory, with secondary data collected and analysed. The secondary data comprised ten (10) MCQ items on an ICT test instrument, and thirty learners’ completed test instruments were sampled. The item analysis procedure included the item reliability coefficient, coefficient of variation, difficulty level, discrimination index, and distractor analysis. The study revealed that the test items had low reliability and that the class was homogeneous. It was also found that ten per cent (10%) of the items were highly appropriate, fifty per cent (50%) needed revision, and forty per cent (40%) needed to be discarded. The study further found that some of the distractors were not appropriate and therefore needed review, while others needed to be replaced. It was therefore recommended that item analysis be embraced by teachers, since it will foster effectiveness and efficiency in carrying out their teaching and learning duties. Moreover, the Ministry of Education in Ghana should liaise with the various heads of educational institutions to contract measurement and evaluation experts to organize workshops for their staff on how to conduct item analysis.
Keywords: Reliability coefficient, Discrimination index, Difficulty level, Distractor analysis, Test items
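For readers unfamiliar with the mechanics of these statistics, the sketch below is a minimal, hypothetical illustration (in Python, on simulated data) of how the measures named in the abstract, the KR-20 reliability coefficient, coefficient of variation, difficulty level, and upper-lower discrimination index, are typically computed from a 0/1-scored response matrix. It is not the authors’ analysis script; the function names, the 27% upper-lower grouping, and the simulated responses are assumptions for illustration only. Distractor analysis would additionally require the option chosen by each learner on each item, so it is omitted here.

```python
# A minimal, hypothetical sketch (not the authors' analysis script) of the
# item-analysis statistics named in the abstract, assuming a 0/1-scored
# response matrix: rows = 30 learners, columns = 10 MCQ items.
import numpy as np

def kr20(scores: np.ndarray) -> float:
    """Kuder-Richardson 20 reliability coefficient for dichotomous items."""
    k = scores.shape[1]                          # number of items
    p = scores.mean(axis=0)                      # proportion correct per item
    q = 1 - p
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinees' total scores
    return (k / (k - 1)) * (1 - (p * q).sum() / total_var)

def coefficient_of_variation(scores: np.ndarray) -> float:
    """CV of total scores: standard deviation relative to the mean."""
    totals = scores.sum(axis=1)
    return totals.std(ddof=1) / totals.mean()

def difficulty(scores: np.ndarray) -> np.ndarray:
    """Difficulty index p: proportion of examinees answering each item correctly."""
    return scores.mean(axis=0)

def discrimination(scores: np.ndarray, group_frac: float = 0.27) -> np.ndarray:
    """Upper-lower discrimination index D = p_upper - p_lower for each item."""
    order = np.argsort(scores.sum(axis=1))       # rank examinees by total score
    n = max(1, int(round(group_frac * scores.shape[0])))
    lower, upper = scores[order[:n]], scores[order[-n:]]
    return upper.mean(axis=0) - lower.mean(axis=0)

# Demonstration on simulated (Rasch-like) data, not the study's real responses.
rng = np.random.default_rng(0)
ability = rng.normal(size=(30, 1))
item_params = rng.normal(size=(1, 10))
responses = (rng.random((30, 10)) < 1 / (1 + np.exp(item_params - ability))).astype(int)

print("KR-20 reliability:", round(kr20(responses), 2))
print("Coefficient of variation:", round(coefficient_of_variation(responses), 2))
print("Difficulty indices:", difficulty(responses).round(2))
print("Discrimination indices:", discrimination(responses).round(2))
```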
David Arhin is currently an MPhil student in Measurement and Evaluation awaiting Viva Voce at the Department of Education and Psychology, Faculty of Educational Foundations, University of Cape Coast, Ghana. His research focuses on Assessment, Mathematics, ICT, Teacher Education, Basic Education, Data Analysis, Monitoring, and Evaluation.
Ruth Annan-Brew is currently a Lecturer at the Department of Education and Psychology, Faculty of Educational Foundations, University of Cape Coast, Ghana. She holds a PhD in Measurement and Evaluation. Her research focuses on Assessment, Evaluation, Gender, Teacher Education and Psychology.
Ruth Owusuah is currently an MPhil student in Measurement and Evaluation at the Department of Education and Psychology, Faculty of Educational Foundations, University of Cape Coast, Ghana. Her research focuses on Assessment, Mathematics, ICT, Teacher Education and Supervision.
Winfred Bonsu Owusu is currently an MPhil student in Measurement and Evaluation awaiting Viva Voce at the Department of Education and Psychology, Faculty of Educational Foundations, University of Cape Coast, Ghana. His research focuses on Assessment, Mathematics, ICT, Teacher Education and Supervision.
Arhin D., Annan-Brew R., Owusuah R. & Opoku W.B. “Exploring the Quality of Multiple-Choice Question Type of Test Items in Information and Communication Technology Using Item Analysis.” E-Journal of Humanities, Arts and Social Sciences 4, no. 1 (2023): 50-58. https://doi.org/10.38159/ehass.2023414
© 2023 The Author(s). Published and Maintained by Noyam Publishers.