Analytical study of written examination papers of undergraduate anatomy: Focus on its content validity.

Abstract:
Background: The question paper used in written examinations is the most important instrument of assessment. Content validity refers to the extent to which a test actually measures the intended content area, so adequate coverage of the course content is necessary for the validity of assessment. The content of the First Professional M.B.B.S. written examination in Anatomy is laid down in the syllabus, but the weightage of the different subdivisions of Anatomy is not specified. The present study was conducted to examine the content validity of the different subdivisions of Anatomy in written examinations; it was a study based on question paper analysis. Results: The question paper analysis shows that the different subdivisions of Anatomy are usually not given proper weightage in the Anatomy written examinations. Some subdivisions, including genetics, general anatomy, histology and neuroanatomy, are usually covered less than required, and some remained entirely uncovered in certain question papers; for example, questions from genetics appeared in the papers of only one of the twenty examination sessions studied. Conclusion: Methods such as test blueprinting and a table of specifications should be used during the test construction process to properly validate the assessment system. By harmonizing course objectives with assessment content, educators can ensure a unified curriculum.
Pages: 1110-1116
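
To make the blueprinting idea concrete, the following is a minimal sketch, not taken from the paper, of how a table of specifications could be compared against the marks actually allotted in a question paper. All subdivision names, target weightages, and mark values shown are illustrative assumptions, not data from the study.

```python
# Minimal sketch of a "table of specifications" coverage check.
# Intended weightage per Anatomy subdivision (assumed values) is compared
# against the marks actually allotted in one hypothetical question paper,
# and under-covered subdivisions are flagged.

intended_weightage = {          # target share of total marks (assumed)
    "gross anatomy": 0.45,
    "histology": 0.15,
    "embryology": 0.15,
    "neuroanatomy": 0.10,
    "general anatomy": 0.10,
    "genetics": 0.05,
}

observed_marks = {              # marks allotted in one hypothetical paper
    "gross anatomy": 60,
    "histology": 8,
    "embryology": 15,
    "neuroanatomy": 10,
    "general anatomy": 7,
    "genetics": 0,
}

total = sum(observed_marks.values())

print(f"{'Subdivision':<16}{'Intended':>10}{'Observed':>10}{'Gap':>8}")
for topic, target in intended_weightage.items():
    observed = observed_marks.get(topic, 0) / total
    gap = observed - target
    flag = "  <-- under-covered" if gap < -0.02 else ""
    print(f"{topic:<16}{target:>9.0%}{observed:>10.0%}{gap:>+8.0%}{flag}")
```

Building such a table before paper setting, rather than after, is what allows examiners to assign questions to subdivisions in proportion to the intended weightage.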
