Quality of multiple choice questions in undergraduate pharmacology assessments in a teaching hospital of Kerala, India: an item analysis

Manju K. Nair, Dawnji S. R.


Background: Carefully constructed, high-quality multiple choice questions can serve as effective tools to improve the standard of teaching. This item analysis was performed to determine the difficulty index, the discrimination index and the number of non-functional distractors in single best response type questions.

Methods: Forty single best response type questions with four options, each carrying one mark for the correct response, were taken for item analysis. There was no negative marking, and the maximum mark was 40. Based on the scores, the evaluated answer scripts were arranged from the highest score at the top to the lowest at the bottom, and only the upper third and lower third were included. The response to each item was entered in Microsoft Excel 2010, and the difficulty index, discrimination index and number of non-functional distractors per item were calculated.
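The indices described above follow standard item-analysis formulas. A minimal sketch of those computations is shown below; the data values and the 5% non-functional-distractor cutoff are illustrative assumptions, not figures from the study.

```python
# Standard item-analysis formulas applied to the upper- and lower-third groups.
# All numeric values here are hypothetical examples, not data from the study.

def difficulty_index(high_correct, low_correct, n_total):
    """Percentage of students in the combined high/low groups answering correctly."""
    return (high_correct + low_correct) / n_total * 100

def discrimination_index(high_correct, low_correct, n_total):
    """D = 2 * (H - L) / N, contrasting high-scoring and low-scoring groups."""
    return 2 * (high_correct - low_correct) / n_total

def is_non_functional(times_chosen, n_examinees, threshold=0.05):
    """A distractor is commonly classed as non-functional if chosen by <5% of examinees."""
    return times_chosen / n_examinees < threshold

# Hypothetical item: 60 students in the combined upper and lower thirds,
# 25 correct in the high group and 10 correct in the low group.
p = difficulty_index(25, 10, 60)      # 58.3 -> within the 30%-70% "good" band
d = discrimination_index(25, 10, 60)  # 0.5  -> excellent discrimination
```

With these conventions, a difficulty index between 30% and 70% marks an item of acceptable difficulty, and a higher (or negative) discrimination index indicates how well (or poorly) the item separates high scorers from low scorers.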

Results: 40 multiple choice questions and 120 distractors were analysed in this study. 72.5% of the items were good, with a difficulty index between 30% and 70%; 25% were difficult and 2.5% were easy. 27.5% of the items showed excellent discrimination between high-scoring and low-scoring students, while one item had a negative discrimination index (-0.1). Nine items contained non-functional distractors.

Conclusions: This study emphasises the need to improve the quality of multiple choice questions. Repeated evaluation by item analysis, together with modification of non-functional distractors, may therefore be performed to enhance the standard of teaching in Pharmacology.


Keywords: Difficulty index, Discrimination index, Distractors, High scoring, Item



