Evaluating reliability of final exam questions via Rasch model

https://doi.org/10.53730/ijhs.v6nS2.5064

Authors

  • N. Lohgheswary Department of Electrical and Electronics Engineering, Xiamen University Malaysia, Sepang, 43900, Selangor Darul Ehsan, Malaysia
  • S. Salmaliza Centre of Engineering Education Research, SEGi University, Kota Damansara, 47810 Selangor Darul Ehsan, Malaysia
  • A. Wei Lun Centre for Sustainable Process Technology, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, Bangi, 43600 Selangor Darul Ehsan, Malaysia
  • A. Jedi Centre of Engineering and Built Environment Education Research, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, Bangi, 43600 Selangor Darul Ehsan, Malaysia
  • Z. M. Nopiah Centre of Engineering and Built Environment Education Research, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, Bangi, 43600 Selangor Darul Ehsan, Malaysia

Keywords:

reliability, engineering statistics, course outcome, final exam, Rasch model

Abstract

A student's grade in a subject is determined mainly by the final examination marks. The final exam questions must therefore be reliable in order to measure students' performance in that subject. The Rasch model is able to measure the reliability of an instrument, in this case the final exam questions. A total of 114 students from the Mechanical Engineering department sat the Engineering Statistics subject. Marks were entered in Excel and exported in *.prn format, then analyzed against the Rasch model using WINSTEPS. The outputs analyzed include the summary statistics for persons, the summary statistics for items, item statistics, item dimensionality, and item correlation. One item was identified as a misfit item. The Engineering Statistics exam questions were found to be highly reliable, and the analysis gave an insightful view of the questions. As corrective action, the misfit item was reviewed and rephrased.
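The misfit detection described in the abstract rests on the dichotomous Rasch model, in which the probability of a correct response depends only on the difference between person ability and item difficulty, and an item's fit is judged by mean-square statistics computed from standardized residuals. The sketch below illustrates this idea in Python under stated assumptions: right/wrong scoring, ability and difficulty estimates already on the logit scale, and the outfit mean-square as the fit index. The function names `rasch_prob` and `outfit_msq` are illustrative, not WINSTEPS routines, and the example values are invented for demonstration.

```python
import math

def rasch_prob(theta, b):
    # Dichotomous Rasch model: P(X=1) = exp(theta - b) / (1 + exp(theta - b)),
    # where theta is person ability and b is item difficulty, both in logits.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def outfit_msq(responses, thetas, b):
    # Outfit mean-square for one item: the mean of squared standardized
    # residuals z^2 = (x - p)^2 / (p * (1 - p)) over all persons.
    # Values near 1.0 indicate good fit; items well outside a range such as
    # 0.5-1.5 are commonly flagged as misfits and reviewed.
    zsq = []
    for x, theta in zip(responses, thetas):
        p = rasch_prob(theta, b)
        zsq.append((x - p) ** 2 / (p * (1.0 - p)))
    return sum(zsq) / len(zsq)

# Illustrative data: four persons of increasing ability answering one item
# of difficulty 0.5 logits, with responses that match the expected pattern.
thetas = [-1.0, 0.0, 1.0, 2.0]
responses = [0, 0, 1, 1]
print(round(outfit_msq(responses, thetas, b=0.5), 2))  # → 0.41
```

A flagged item would show the opposite pattern (e.g. able students failing an easy item), which pushes the mean-square well above 1.0 and prompts the kind of review and rephrasing the study applied to its one misfit item.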


References

G. Rasch, “Probabilistic models for some intelligence and attainment tests”, The University of Chicago Press, 1960.

E.B. Andersen, “Sufficient statistics and latent trait models”, Psychometrika, 42, pp. 69-81, 1977.

D. Andrich, “A rating formulation for ordered response categories”, Psychometrika, 43, pp. 561-573, 1978.

N.A.C. Musa, Z. Mahmud and N. Bahrun, “Exploring students’ perceived and actual ability in solving statistical problems based on Rasch measurement tools”, Journal of Physics: Conference Series, 2017.

T.G. Bond and C.M. Fox, “Applying the Rasch model: Fundamental measurement in the human sciences”, 2nd Edition, New Jersey: Lawrence Erlbaum Associates, 2007.

A.A. Azrilah, N. Azlinah, H.A. Noor, A.G. Hamzah, Z. Sohaimi and M. Saifudin, “Application of Rasch model in validating the construct measurement instrument”, International Journal of Education and Information Technologies, 2, pp. 105-112, 2008.

A.A. Azrilah, S.M. Mohd and Z. Azami, “Asas model pengukuran Rasch” [Fundamentals of the Rasch measurement model], Bangi: Penerbit Universiti Kebangsaan Malaysia, 2013.

J.M. Linacre, A user’s guide to WINSTEPS, Chicago: winsteps.com, 2005.

J.M. Linacre, “Data variance explained by measures”, Rasch Measurement Transactions, 20(1), p. 1045, 2006.

W.P. Fisher Jr., “Rating scale instrument quality criteria”, Rasch Measurement Transactions, 21(1), p. 1095, 2007.

A.M. Eakman, “Measurement characteristics of the engagement in meaningful activities survey in an age-diverse sample”, American Journal of Occupational Therapy, 66(2), pp. 20-29, 2012.

Published

24-03-2022

How to Cite

Lohgheswary, N., Salmaliza, S., Lun, A. W., Jedi, A., & Nopiah, Z. M. (2022). Evaluating reliability of final exam questions via Rasch model. International Journal of Health Sciences, 6(S3), 1065–1074. https://doi.org/10.53730/ijhs.v6nS2.5064

Issue

Section

Peer Review Articles