ORIGINAL ARTICLE
Year : 2011  |  Volume : 59  |  Issue : 3  |  Page : 211-214

Objective structured clinical examination for undergraduates: Is it a feasible approach to standardized assessment in India?


1 Department of Ophthalmology, Pad. Dr. D. Y. Patil Medical College, Dhankawadi, India
2 Medical Education Unit, Bharati Vidyapeeth University Medical College, Dhankawadi, India
3 Department of Community Medicine, Pad. Dr. D. Y. Patil Medical College, Pune, Maharashtra, India

Date of Submission: 27-May-2009
Date of Acceptance: 18-Nov-2010
Date of Web Publication: 13-May-2011

Correspondence Address:
Kavita R Bhatnagar
B4/21, Brahma Aangan, Off Salunke Vihar Road, Kondwa, Pune - 411 048, Maharashtra
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/0301-4738.81032

  Abstract 

Background: There has been growing concern among medical educators about the quality of medical graduates trained in the various medical colleges in our country. Data based on faculty and student perceptions of the undergraduate curriculum indicate a need to lay more stress on practical skills during training and assessment. The Objective Structured Clinical Examination (OSCE) is an established, reliable, and effective multistation test for assessing practical skills in an objective and transparent manner. The aim of this article is to sensitize universities, examiners, organizers, faculty, and students across India to the OSCE. Materials and Methods: We designed a 22-station OSCE integrating all the domains of learning (higher-order cognitive, psychomotor, and affective) and administered it to 67 students during their final year. Data analysis was done using SPSS version 15. Results: The OSCE was feasible to conduct and had high perceived construct validity. There was a significant correlation between the station score and the total examination score for 19 stations. The reliability of this OSCE was 0.778. Both students and faculty members expressed a high degree of satisfaction with the format. Conclusion: Integrating a range of modalities into an OSCE in ophthalmology appears to represent a valid and reliable method of examination. The biggest limitation of this format was the direct expenditure of time and energy by those organizing the OSCE; sustaining faculty motivation might therefore pose a challenge.

Keywords: Objective Structured Clinical Examination, ophthalmology, undergraduate


How to cite this article:
Bhatnagar KR, Saoji VA, Banerjee AA. Objective structured clinical examination for undergraduates: Is it a feasible approach to standardized assessment in India? Indian J Ophthalmol 2011;59:211-4

How to cite this URL:
Bhatnagar KR, Saoji VA, Banerjee AA. Objective structured clinical examination for undergraduates: Is it a feasible approach to standardized assessment in India? Indian J Ophthalmol [serial online] 2011 [cited 2019 Mar 18];59:211-4. Available from: http://www.ijo.in/text.asp?2011/59/3/211/81032

There has been growing concern among medical educators about the quality of medical graduates trained in the various medical colleges in our country. Data based on faculty and student perceptions of the undergraduate curriculum indicate a need to lay more stress on practical skills during training and assessment. [1] It is well known that the way students learn is largely determined by the way they are assessed. [2] There is a need to rationalize the examination system by giving due emphasis to "formative" or internal assessment, and by supplementing the traditional long/short case examination with more valid and reliable instruments for the assessment of clinical skills, such as the Objective Structured Clinical Examination (OSCE), introduced in 1979 by Harden and Gleeson. [3] The OSCE allows the actual demonstration of applied knowledge and skills rather than testing knowledge alone. [4],[5] The opportunity for formative as well as summative feedback makes the OSCE an excellent teaching tool as well. [6],[7]

The OSCE is an established, reliable, and effective multistation test for the assessment of practical skills in an objective and transparent manner. [8] It also provides an opportunity to test students' attitudes and communication skills. The clinical competence to be tested is broken down into specific skills, each of which can be tested separately. The examination is organized as a series of "stations" (usually 10-20; the more stations, the better the reliability) through which the candidates rotate. Each station focuses on testing a particular skill, such as taking a patient's history, examining a specific organ system, interpreting test results, or managing a patient. A checklist is prepared by breaking the skill being tested into its essential steps and the precautions to be observed. Each step done well and each precaution observed earns the student a score proportional to the importance of that step or precaution, with provision for negative scoring in the case of an important omission or mistake. Objectivity in assessment is achieved by having each component tested at one fixed station by the same examiner and having the students rotate through several such stations. [3] The stations at which performance skills are tested (procedure stations) are "manned": the examiner matches the student's performance against a checklist and assigns a score. An examination can have 6-10 such stations. Answers at the "unmanned" stations are written on an answer sheet, submitted at the end of the examination, and matched against the checklist at the time of marking. Typically, a student spends 5 min at a station, which in practice means that approximately 10 students can be assessed in a period of 2 h.
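The checklist scoring scheme described above (weighted steps, with negative marking for important omissions) can be sketched as follows. This is a minimal illustration only: the step names, weights, and penalties below are invented for the example and are not taken from the paper.

```python
# Sketch of checklist-based station scoring: each step earns a mark
# proportional to its importance, and missing an important step or
# precaution attracts a negative mark. All names, weights, and
# penalties here are hypothetical.

def score_station(checklist, steps_done):
    """checklist maps step -> (weight, penalty_if_omitted);
    steps_done is the set of steps performed correctly."""
    score = 0
    for step, (weight, penalty) in checklist.items():
        if step in steps_done:
            score += weight
        else:
            score -= penalty
    return max(score, 0)  # do not let a station score fall below zero

# Illustrative checklist for a visual acuity station
checklist = {
    "position patient 6 m from chart": (2, 0),
    "occlude the fellow eye": (2, 1),        # important precaution
    "record smallest line read": (3, 1),
    "test each eye separately": (3, 1),
}
done = {"position patient 6 m from chart",
        "occlude the fellow eye",
        "record smallest line read"}
print(score_station(checklist, done))  # 2 + 2 + 3 - 1 = 6
```

The floor at zero reflects a common convention that negative marking reduces, but cannot invert, a station score.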

We understand that the OSCE is at a very nascent stage in India. It was introduced a few years ago as the pattern of the practical qualifying examination for the DNB (Diplomate of National Board) at the postgraduate level. It is not the pattern of examination at the master's and doctorate levels in ophthalmology (namely, MS and MD in ophthalmology) at the majority of universities in India, and it has not been introduced as a qualifying examination at the undergraduate level in any subject in Indian medical colleges. Undergraduate medical education is considered a continuum leading to postgraduate training and ultimately to medical practice. [9] However, studies show a poor correlation between medical school performance and resident performance. [10] It has also been pointed out that medical students are often told that they will learn certain skills when they become residents, only to discover as residents that they were expected to have learned those skills in medical school. [11] Considering this lacuna, we attempted to introduce the OSCE in the ophthalmology department of our medical college for undergraduates as part of internal assessment, with subsequent introduction into the university examination with time and experience.


  Materials and Methods


A comprehensive 22-station OSCE was administered to 67 final year medical students as the end-of-attachment assessment in ophthalmology.

The faculty experts reviewed the standard textbook of ophthalmology and the Graduate Medical Education Regulations 1997, [12] based on which the most common tasks that students at this level of training are expected to perform were listed by faculty consensus. We then defined the expected knowledge and skills for these tasks and based the OSCE on this list. Accordingly, we modified the clinical teaching for these students, and they were informed early in the posting about this change in the examination pattern.

The stations were designed to test a variety of problem solving, technical, diagnostic, therapeutic, communication, examination, and history taking skills. [Table 1] shows the contents of each station and the domains it tested. For instance, station 1 consisted of a healthy male volunteer on whom students demonstrated the steps of distant visual acuity testing on a Snellen chart, in a fixed order. Another station consisted of a healthy male playing the role of the son of a critically ill father; students had to motivate him to consent to eye donation on his father's behalf and educate him about myths and misconceptions regarding eye donation, testing social and soft skills.
Table 1: Characteristics of individual stations




The class of 67 students was divided into 3 groups of 22, 22, and 23 students. Each group took the examination consecutively, one group immediately following the other on the same day. An examiner at each station assessed the candidate's performance with a prepared checklist. The checklist items were those deemed by the expert faculty to be critical to a competent performance; at least three experts were consulted to finalize each station. The number of checklist items ranged from 7 to 10. Some of the stations were evaluated by senior postgraduates in ophthalmology trained by the authors, which served the dual benefit of removing any faculty bias and saving faculty time. The length of each station was 5 min. Feedback was obtained from the students as well as the participating faculty.

Data analysis was done using SPSS version 15. Mean and SD scores for each station and for the examination overall were calculated. The reliability of the overall examination was examined by calculating Cronbach's alpha, an internal consistency statistic. The correlation coefficient, a measure of station discrimination, was calculated by correlating station scores with overall test scores using linear regression analysis. This added to the internal structure validity of the examination. [13]
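As a sketch, the two statistics used here can be reproduced with NumPy on synthetic data. The original analysis was done in SPSS version 15; the score matrix below is randomly generated to match the study's dimensions (67 students, 22 stations), not the actual study data.

```python
# Cronbach's alpha (internal consistency) and station-total correlations
# (station discrimination) on a synthetic students x stations score
# matrix. This mirrors only the formulas, not the real data.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.normal(7.0, 1.5, size=(67, 22))  # hypothetical 67 students, 22 stations

def cronbach_alpha(x):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()
    total_variance = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def station_total_correlations(x):
    """Pearson correlation of each station's scores with the total score."""
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total)[0, 1] for j in range(x.shape[1])])

alpha = cronbach_alpha(scores)
r = station_total_correlations(scores)
print(f"alpha = {alpha:.3f} over {len(r)} stations")
```

On real station scores, an alpha near the paper's 0.778 would indicate that the 22 stations measure a broadly consistent underlying competence; a low station-total correlation flags a station that discriminates poorly.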


  Results


A total of 57 out of 67 students (85.07%) passed the OSCE (mean grade >60%). The scoring pattern agreed upon for the undergraduates, for an overall interpretation of their performance, is shown in [Table 2]. A total of 40 of the 67 students (59.70%) scored above 70, 17 (25.37%) scored 60-69, and 10 (14.92%) scored between 40 and 59. Significant correlations between the station score and the total examination score were found for 19 of the 22 stations [Table 1]. The overall reliability of this OSCE, assessed by Cronbach's alpha, [14] was 0.778, which is very good for an end-of-attachment evaluation.
Table 2: Score interpretation and grading of the performance for competency level


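The score bands implied by these figures (70 and above, 60-69, 40-59, with 60 or more counting as a pass) can be sketched as a simple mapping. The band labels below are placeholders, since the grading labels of Table 2 itself are not reproduced in the text.

```python
# Hypothetical sketch of the score bands implied by the Results section:
# >= 70, 60-69, and 40-59, with >= 60 counting as a pass. Band labels
# are placeholders, not the actual Table 2 grading labels.

def band(score):
    if score >= 70:
        return "70 and above"
    if score >= 60:
        return "60-69"
    if score >= 40:
        return "40-59"
    return "below 40"

def pass_rate(passing, total):
    """Percentage of candidates passing, rounded to two decimals."""
    return round(100 * passing / total, 2)

print(band(72), band(65), band(45))  # 70 and above 60-69 40-59
print(pass_rate(40 + 17, 67))        # 85.07 (57 of 67 passed)
```

The pass rate reproduces the 85.07% reported above: 40 students scoring above 70 plus 17 scoring 60-69 gives 57 passes out of 67.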


Though the OSCE format was new and all were participating in an OSCE for the first time, the feedback indicated a high level of acceptance and a good correlation between self-rating and actual performance. Almost all participants agreed that the format should be used regularly for testing undergraduates.


  Discussion


This pilot study was clear in its purpose. It was meticulously planned and prepared in careful, systematic detail. The objectives were defined, blueprinting was done, and adequate attention was paid to the response process, defined here as evidence of data integrity such that all sources of error associated with test administration are controlled or eliminated to the maximum extent possible. [15]

We introduced the OSCE as a component of our end-of-attachment examination after the 1-month ophthalmology attachment for final year medical students at our medical college, in the year 2008. We found, as have others, [7],[8] that this method of examining clinical skills is superior to the previously used oral examination in several ways, and it was preferred by both faculty members and students in our institution.

The OSCE has been shown to be an objective, valid, and reliable system for evaluating the clinical skills of students. All students are examined under similar conditions with identical problems. This contributes to a high degree of standardization, the lack of which is one of the main pitfalls of other forms of examination, such as the traditional oral examination. The strength of the OSCE is the objectivity of assessment obtained through the use of structured checklists. On the other hand, the direct expenditure of time by those organizing an OSCE and the financial costs of implementing one substantially exceed those associated with a more traditional oral examination; this is a major limiting factor in introducing the format.

An important feature of any examination process is the ability to reliably differentiate between the performance levels of candidates. Reliability is an important aspect of an assessment's validity evidence. It refers to the reproducibility of scores on the assessment: high score reliability indicates that, if the test were repeated, examinees would receive about the same scores on retesting as they received the first time. Unless assessment scores are reliable and reproducible (as in an experiment), it is nearly impossible to interpret their meaning; thus, validity evidence is lacking. [16],[17] The overall reliability of the OSCE used here was 0.78, slightly below the minimum of 0.80 usually required when making pass-fail decisions. [18],[19] As this OSCE was part of an end-of-attachment examination, a Cronbach's alpha of 0.78 can be considered good.

The student feedback was well received by the faculty, and improvements were planned for stations with low validity and internal consistency scores, as well as for those found difficult or unclear by the students.

In conclusion, we believe that the OSCE we implemented contributed substantially to our ability to assess the clinical competence of students in an objective, reliable, and valid manner, with some limitations. The OSCE tests clinical competence in discrete bits and does not look at the patient in totality; to overcome this, long cases may be combined with the OSCE. The OSCE needs considerable time and effort to plan and organize, so sustaining faculty motivation may pose a challenge. It would also be an expensive proposition for batches of 100-150 students per examination, although the cost may come down with the development of OSCE station banks and a dedicated physical setting for conducting the OSCE. As the same patient is seen by a large number of students, the process may be taxing for the patient, compromising cooperation. Though we have reported certain statistical associations, we concede that these were based on small samples; further studies are required to corroborate our findings.

It requires determination and zeal on the part of faculty members to switch from the traditional method of examination to the more rational, objective, and methodical OSCE for undergraduates, as well as for postgraduates (MD/MS) in ophthalmology.


  Acknowledgments


We gratefully acknowledge Dr (Col) A Bhardwaj, Dr (Lt Col) N Ramchandran, Dr VN Kulkarni, and Dr D Muzumdar, all of whom are faculty at the Department of Ophthalmology, Bharati Vidyapeeth University Medical College, Pune, India for their wholehearted support and active participation in the project. We also thank Dr Payal K Bansal, Associate Professor, Department of Medical Education, Maharashtra University of Health Sciences, Pune, India, for her excellent help and valuable inputs in carrying out this project.

 
  References

1. Sood R, Adkoli BV. Medical education in India: Problems and prospects. J Indian Acad Clin Med 2000;1:210-2.
2. Sood R, Paul VK, Mittal S, Adkoli BV, Sahni P. Assessment in medical education: Trends and tools. New Delhi: KL Wig CMET, AIIMS; 1995.
3. Sood R. A rational approach for the assessment of clinical competence of undergraduate medical students. J Assoc Physicians India 1999;47:980-4.
4. Varkey P, Natt N. An Objective Structured Clinical Examination (OSCE) for the assessment of systems based practice and practice based learning. ACGME Competency Assessment. 04-14-2008.
5. Varkey P, Reller MK, Smith A. An experiential interdisciplinary quality improvement education initiative. Am J Med Qual 2006;21:317-22.
6. Varkey P, Natt N. The objective structured clinical examination as an educational tool in patient safety. Jt Comm J Qual Patient Saf 2007;33:48-53.
7. Kaufmann DM, Mann KV, Muijtjens AM, van der Vleuten CP. A comparison of standard setting procedures for an OSCE in undergraduate medical education. Acad Med 2000;75:267-71.
8. Agarwal A, Batra B, Sood AK, Ramakantan R, Bhargava SK, Chidambaranathan N, et al. Objective structured clinical examination in radiology. Indian J Radiol Imaging 2010;20:83-8.
9. Wilkinson TJ, Frampton CM. Assessing performance in final year medical students: Can a postgraduate measure be used in an undergraduate setting? Med Educ 2003;37:233-40.
10. Kahn MJ, Merrill WW, Anderson DS, Szerlip HM. Residency program directors' evaluations do not correlate with performance on a required 4th-year objective structured clinical exam. Teach Learn Med 2001;13:9-12.
11. Lypson ML, Frohna JG, Gruppen LD, Woolliscroft JO. Assessing residents' competencies at baseline: Identifying the gaps. Acad Med 2004;79:564-70.
12. Medical Council of India. Salient features of regulations on graduate medical education, 1997. Available from: http://www.mciindia.org/know/rules/rules_mbbs.htm. [Accessed 2008 Mar 10].
13. Downing SM. Validity: On meaningful interpretation of assessment data. Med Educ 2003;37:830-7.
14. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examinations. Br Med J 1975;1:447-54.
15. Nicol M, Freeth D. Assessment of clinical skills: A new approach to an old problem. Nurse Educ Today 1998;18:601-9.
16. Harden RM, Gleeson FA. Assessment of clinical competence using an Objective Structured Clinical Examination (OSCE). Med Educ 1979;13:41-54.
17. Cohen R, Reznick RK. Reliability and validity of the objective structured clinical examination in assessing surgical residents. Am J Surg 1990;160:302-5.
18. Bark H, Cohen R. Use of an objective structured clinical examination as a component of the final-year examination in small animal internal medicine and surgery. J Am Vet Med Assoc 2002;221:1262-5.
19. Cronbach LJ, Gleser GC, Nanda H. The dependability of behavioral measurements: Theory of generalizability for scores and profiles. New York: John Wiley and Sons; 1972.



 
 