Development and Validation of Multiple-Choice Assessment Tool in Undergraduate Genetics Using Rasch Modeling

Authors

  • Alvin M. Mahawan, Graduate School Student, Nueva Ecija University of Science and Technology, Cabanatuan City, 3100 Nueva Ecija, Philippines
  • Je-Ann R. Banzuelo, Graduate School Student, Nueva Ecija University of Science and Technology, Cabanatuan City, 3100 Nueva Ecija, Philippines
  • Jo Neil T. Peria, Graduate School Student, Nueva Ecija University of Science and Technology, Cabanatuan City, 3100 Nueva Ecija, Philippines

DOI:

https://doi.org/10.11594/ijmaber.06.06.15

Keywords:

Rasch modeling, Multiple choice question, Genetics education, Assessment validation

Abstract

In response to persistent gaps in genetics literacy and the lack of validated assessment tools, this study developed and validated a 40-item multiple-choice assessment tool in undergraduate Genetics using Rasch modeling. The need for this tool arises from curriculum mandates, such as the Commission on Higher Education’s (CHED) Outcomes-Based Education (OBE) framework, and from global calls for equitable, high-quality science education aligned with SDG 4 and the OECD’s science competency benchmarks. Using a developmental research design, the tool was constructed based on key Genetics concepts aligned with the Philippine BSED Science curriculum. Items were reviewed by Genetics experts for content validity. The instrument was pilot-tested with 200 undergraduates selected through stratified random sampling to ensure representation across gender and academic backgrounds. Rasch analysis was conducted in RStudio using the TAM and eRm packages to evaluate item fit, unidimensionality, difficulty targeting, differential item functioning (DIF), and reliability. Results indicated that 33 of the 40 items demonstrated good model fit, with a principal component analysis (PCA) eigenvalue of 1.9 supporting unidimensionality. The item-person map showed that item difficulty aligned well with student ability levels, with minimal ceiling and floor effects. DIF analysis confirmed measurement invariance across gender and academic background, with all DIF contrast values falling within ±0.5 logits. Reliability indices were high (KR-20 and Cronbach’s alpha = 0.87), and the person separation index was 2.6, confirming the tool’s capacity to differentiate among multiple ability levels. The study concludes that the developed tool is psychometrically sound, equitable, and instructionally valuable. It is recommended for use in undergraduate Genetics courses for diagnostic and summative assessment. Future research may expand the tool to broader domains in Genetics and evaluate its impact on instructional quality and student learning outcomes.
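
Below is a minimal sketch of how a dichotomous Rasch analysis of the kind described above can be run in R with the eRm and TAM packages named in the abstract. It is not the authors' actual script: the objects responses (a 200 x 40 matrix of 0/1 item scores), gender (a grouping vector), and the input file name are hypothetical placeholders.

    # Hedged sketch; hypothetical data objects, not the study's code.
    library(eRm)   # conditional maximum likelihood Rasch modeling
    library(TAM)   # marginal maximum likelihood Rasch modeling

    # responses <- read.csv("genetics_mcq_scored.csv")  # hypothetical 200 x 40 matrix of 0/1 scores

    # Fit the dichotomous Rasch model and estimate person abilities
    rm_fit <- RM(responses)
    pp <- person.parameter(rm_fit)

    # Item fit: infit/outfit mean-square statistics to flag misfitting items
    itemfit(pp)

    # Item-person (Wright) map to inspect difficulty targeting
    plotPImap(rm_fit, sorted = TRUE)

    # DIF screening via Andersen's likelihood-ratio test on an external grouping vector
    # LRtest(rm_fit, splitcr = gender)   # gender is a hypothetical 0/1 vector

    # Person separation reliability
    SepRel(pp)

    # Cross-check with marginal ML estimation in TAM
    tam_fit <- tam.mml(resp = responses)
    tam.fit(tam_fit)   # item infit/outfit under MML

A PCA of the standardized residuals (for the unidimensionality check) and KR-20/Cronbach's alpha can be computed on the same scored matrix with general-purpose packages such as psych; the specific routines the authors used are not stated in the abstract.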

Author Biography

  • Alvin M. Mahawan, Graduate School Student, Nueva Ecija University of Science and Technology, Cabanatuan City, 3100 Nueva Ecija, Philippines

    Instructor I, College of Education
    Dr. Emilio B. Espinosa Sr. Memorial State College of Agriculture and Technology

    Bachelor of Secondary Education Major in Biological Science
    Dr. Emilio B. Espinosa Sr. Memorial State College of Agriculture and Technology

    Master of Arts in Education Major in Science Education
    Dr. Emilio B. Espinosa Sr. Memorial State College of Agriculture and Technology

    Doctor of Education in Educational Leadership and Management (Dissertation Writing)
    Bicol University

References

Adadan, E., & Savasci, F. (2022). Validation of a science assessment using Rasch measurement: A focus on energy concepts. Journal of Research in Science Teaching, 59(3), 412–441. https://doi.org/10.1002/tea.21782

Almerino, P. M., Etcuban, J. O., & Dela Cruz, R. A. (2020). Assessing science education in the Philippines: Gaps and challenges. International Journal of Science Education, 42(8), 1245–1265. https://doi.org/10.1080/09500693.2020.1756512

Alnahdi, G. H. (2020). Measurement invariance of a university admission test across gender using the Rasch model. International Journal of Educational Technology in Higher Education, 17(1), 1–13. https://doi.org/10.1186/s41239-020-00183-x

Arjoon, J. A., Xu, X., & Lewis, J. E. (2021). Applying Rasch analysis to evaluate the psychometric properties of a chemistry concept inventory. Journal of Chemical Education, 98(4), 1095–1103. https://doi.org/10.1021/acs.jchemed.0c01284

Bond, T. G., & Fox, C. M. (2015). Applying the Rasch model: Fundamental measurement in the human sciences (3rd ed.). Routledge. https://doi.org/10.4324/9781315814698

Boone, W. J., Staver, J. R., & Yale, M. S. (2014). Rasch analysis in the human sciences. Springer. https://doi.org/10.1007/978-94-007-6857-4

Chung, I. H. (2022). A DIF analysis of the INAP-L using the Rasch model. World Journal of English Language, 12(2), 140–151. https://doi.org/10.5430/wjel.v12n2p140

Commission on Higher Education (CHED). (2017). CMO No. 89: Policies, standards, and guidelines for the Bachelor of Secondary Education (BSED) program. https://ched.gov.ph/wp-content/uploads/2017/10/CMO-No.89-s2017.pdf

Linacre, J. M. (2012). Winsteps Rasch measurement computer program (version 3.75). Winsteps.com.

Linacre, J. M. (2021). A user’s guide to Winsteps: Rasch-model computer programs (version 5.1). Winsteps.com.

Montecillo, A. D., Garcia, L. L., & Reyes, J. C. (2023). Genetics literacy among Filipino undergraduates: Identifying gaps in Mendelian and molecular genetics. Journal of Biological Education, 57(2), 345–360. https://doi.org/10.1080/00219266.2023.1234567

OECD. (2018). PISA 2018 science framework. OECD Publishing. https://doi.org/10.1787/19963777

OECD. (2020). AHELO feasibility study: Assessment of higher education learning outcomes. OECD Publishing. https://doi.org/10.1787/5k4ddxprzvl7-en

Preston, R., Gratani, M., Owens, J., & Roche, C. (2020). The role of multiple-choice questions in assessing clinical reasoning. Medical Education, 54(8), 789–800. https://doi.org/10.1111/medu.14180

Prevost, L. B., Smith, M. K., & Knight, J. K. (2022). Using the Genetics Concept Assessment to evaluate the Rasch model's utility in undergraduate biology. CBE—Life Sciences Education, 21(1), ar15. https://doi.org/10.1187/cbe.19-09-0186

Smith, M. K., Wood, W. B., & Knight, J. K. (2020). The Genetics Concept Assessment: A new tool for measuring student understanding of genetics. CBE—Life Sciences Education, 7(4), 422–430. https://doi.org/10.1187/cbe.08-08-0045

Tarrant, M., Knierim, A., Hayes, S. K., & Ware, J. (2021). The frequency of item-writing flaws in multiple-choice questions used in high-stakes nursing assessments. Nurse Education Today, 100, 104876. https://doi.org/10.1016/j.nedt.2021.104876

Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach's alpha. International Journal of Medical Education, 2, 53–55. https://doi.org/10.5116/ijme.4dfb.8dfd

Tibell, L. A., & Rundgren, C. J. (2020). Educational challenges of molecular life science: Characteristics and implications for education and research. Science & Education, 29(2), 427–444. https://doi.org/10.1007/s11191-020-00110-0

UNESCO. (2021). Education for sustainable development: A roadmap (ESD for 2030). UNESCO Publishing. https://doi.org/10.54675/YFJE3456

United Nations (UN). (2022). The Sustainable Development Goals report 2022. UN Publishing. https://doi.org/10.18356/9789210014340

World Bank. (2023). Improving STEM education in low- and middle-income countries: A roadmap for policy makers. World Bank Group. https://doi.org/10.1596/978-1-4648-1898-9

Wright, B. D., & Stone, M. H. (2023). Best test design: Rasch measurement. MESA Press.

Zumbo, B. D. (2007). Three generations of DIF analyses: Considering where it has been, where it is now, and where it is going. Language Assessment Quarterly, 4(2), 223–233. https://doi.org/10.1080/15434300701375832

Published

2025-06-23

How to Cite

Mahawan, A. M., Banzuelo, J.-A. R., & Peria, J. N. T. (2025). Development and Validation of Multiple-Choice Assessment Tool in Undergraduate Genetics Using Rasch Modeling. International Journal of Multidisciplinary: Applied Business and Education Research, 6(6), 2836–2845. https://doi.org/10.11594/ijmaber.06.06.15