ORIGINAL ARTICLE
Year : 2021  |  Volume : 69  |  Issue : 1  |  Page : 43-47

The International Council of Ophthalmology Ophthalmic clinical evaluation exercise


Palis AG, Barrio-Barrio J, Mayorga EP, Mili-Boussen I, Noche CD, Swaminathan M, Golnik KC

1 Department of Ophthalmology, Hospital Italiano de Buenos Aires, Buenos Aires, Argentina
2 Department of Ophthalmology, Clínica Universidad de Navarra, Navarra Institute for Health Research (IdiSNA), Pamplona, Spain
3 Department of Ophthalmology, Charles Nicolle University Hospital, University of Tunis El Manar, Tunis, Tunisia
4 Higher Institute of Health Sciences, Université des Montagnes, Bangangte, Cameroon
5 Medical Research Foundation, Chennai, India
6 Cincinnati Eye Institute and the University of Cincinnati, United States of America

Date of Submission: 24-Jan-2020
Date of Acceptance: 03-Jun-2020
Date of Web Publication: 15-Dec-2020

Correspondence Address:
Dr. Ana G Palis
Amenabar 1492, 1A (C1426AJZ) Buenos Aires
Argentina

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/ijo.IJO_154_20

  Abstract 


Purpose: Fifteen years after the publication of the Ophthalmic Clinical Evaluation Exercise (OCEX), it was deemed necessary to review and revise it, and to validate it for an international audience of ophthalmologists. This study aimed to revise the OCEX and validate it for international use.

Methods: The OCEX rubric was changed to a modified Dreyfus scale, and a behavioral descriptor was created for each category. An international panel of ophthalmic educators reviewed the international applicability and appropriateness of the tool.

Results: A tool for assessing and giving feedback on four aspects of clinical competence during the ophthalmic consultation (interview skills, examination, interpersonal and communication skills, and case presentation) was revised. The original scoring rubric was replaced with a new behavioral one, and relevant comments and suggestions from international reviewers were incorporated. The new tool has face and content validity for an international audience.

Conclusion: The OCEX is the only workplace assessment and feedback tool designed specifically for ophthalmology residents and the ophthalmic consultation. This improved and simplified version will facilitate its use and implementation in diverse programs around the world.

Keywords: Ophthalmic education, resident assessment, workplace-based assessment


How to cite this article:
Palis AG, Barrio-Barrio J, Mayorga EP, Mili-Boussen I, Noche CD, Swaminathan M, Golnik KC. The International Council of Ophthalmology Ophthalmic clinical evaluation exercise. Indian J Ophthalmol 2021;69:43-7

How to cite this URL:
Palis AG, Barrio-Barrio J, Mayorga EP, Mili-Boussen I, Noche CD, Swaminathan M, Golnik KC. The International Council of Ophthalmology Ophthalmic clinical evaluation exercise. Indian J Ophthalmol [serial online] 2021 [cited 2021 Jan 22];69:43-7. Available from: https://www.ijo.in/text.asp?2021/69/1/43/303274



In 2004, Golnik and collaborators developed the Ophthalmic Clinical Evaluation Exercise (OCEX), aiming to fulfill the mandate of the Accreditation Council for Graduate Medical Education of the United States to develop valid and reliable instruments to evaluate residents' competence.[1] The reliability of the OCEX, an adaptation of the mini-Clinical Evaluation Exercise (mini-CEX) created for internal medicine, was demonstrated in a study involving 94 academic ophthalmology teaching faculty in the United States, which reached an overall alpha coefficient of 0.81.[2],[3]
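For reference, the alpha coefficient reported above is Cronbach's alpha, a measure of internal consistency across the instrument's items. The following is a minimal sketch of its computation in Python; the score matrix is purely illustrative and is not data from the cited study:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an observations-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of rubric items
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative only: five observed encounters scored on four items
ratings = [
    [4, 3, 4, 3],
    [3, 3, 3, 2],
    [4, 4, 4, 4],
    [2, 2, 3, 2],
    [3, 3, 4, 3],
]
print(round(cronbach_alpha(ratings), 2))  # 0.93 for this toy matrix
```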

This assessment consists of the observation by an instructor of a clinical encounter between a resident and a patient. The instructor evaluates different aspects of professional competence during the ophthalmic consultation (interview, examination, interpersonal and communication skills, and case presentation) and grades the resident's performance, guided by a rubric with behavioral descriptors. Part of the process involves giving feedback (and writing recommendations on the form) and developing a brief improvement plan with the resident.

Fifteen years after the publication of the OCEX, we thought it necessary to review and revise it, as well as to validate it for an international audience of ophthalmologists. The purpose of this study is to present the modifications we made to the instrument and the process of validating its content for training programs around the world.


Methods


The design of this study was exploratory. We used the OCEX developed by Golnik and Goldenhar[3] and maintained the same set of skills as the original instrument. The first author prepared a first draft by reclassifying the original OCEX skills and behavioral descriptors into a modified Dreyfus and Dreyfus scale of stages of competence: the categories of the original scoring rubric (does not meet/meets some/meets all/exceeds expectations) were changed to novice, beginner, and competent, each with its corresponding behavioral descriptors.[4] This draft was first agreed upon with the author of the original OCEX and then sent to the rest of the authors for comment; each author received a personalized draft to reduce the bias of reading the others' opinions. Each of the authors, educators practicing in different countries around the world (Argentina, Cameroon, India, Spain, Tunisia, and the United States of America), reviewed each skill and its behavioral descriptors while answering the following questions: 1) Is any important item missing from the questionnaire or the descriptors? 2) Do you think we need to change or delete any items or descriptors; if yes, why? 3) Are the behaviors clearly and accurately defined and described? 4) Would this tool be potentially applicable to your setting or region; if not, why?

We then sent the modified OCEX to a group of 14 educators from the International Council of Ophthalmology Ophthalmic Educators Group (ICO-OEG), practicing in a variety of countries around the world (Bulgaria, Colombia, Congo, Egypt, Hungary, India, Nepal, the Philippines, the United Arab Emirates, and the United States of America), who had volunteered to review the instrument.[5] The ICO-OEG is an ICO special interest group currently consisting of 921 ophthalmologists from around the world. For the purposes of this study, a call for applications to the ICO-OCEX review panel was sent to all ICO-OEG members, who were asked to complete an online application form providing their city and country, whether they had used the OCEX before, a description of their interests and skills, and their curriculum vitae. Fifty ophthalmologists volunteered, and we selected a geographically diverse panel of 14 ophthalmologists and educators who had used the OCEX before. Five panelists did not answer and were consequently withdrawn from the panel.

Reviewers were asked to review the tool while answering the same questions listed above. We incorporated the comments and suggestions of the nine who ultimately responded.

We will describe the development of the instrument and the international validation of its content.

The study was considered free of ethical objections by the Hospital Italiano de Buenos Aires Ethics in Research Committee.


Results


The instrument

The new version of the OCEX, as shown in the Appendix, covers the evaluation of four aspects of clinical competence during the ophthalmic consultation: interview skills, examination, interpersonal and communication skills, and case presentation. Interview skills include: introduction, chief complaint, history of present illness, pertinent negatives, pain inquiry, allergies or adverse reactions to medications, review of systems, medication list, past systemic history, past ocular history, social history and hygienic habits, and family history. The examination covers hand/diagnostic instrument hygiene, visual acuity, pupils/relative afferent pupillary defect, confrontational visual fields, motility, external examination, slit lamp examination, intraocular pressure, and funduscopy. The interpersonal and communication skills to assess are: patient comfort, empathy, respectfulness (e.g., eye contact while listening), understandability, explanation of findings, explanation of the diagnosis, explanation of the plan and options, and asking whether the patient has questions. The case presentation includes conciseness, clarity, and organization; pertinent facts (positive and negative); differential diagnosis; an appropriate plan; and response to the attending.

We considered it appropriate to clarify some points in the form to facilitate its interpretation. We defined a “pertinent negative” as an element of the patient's history that aids diagnosis because the patient denies that it is present (e.g., a patient with an acute floater should be asked about photopsia to help rule out a retinal tear). We clarified that asking about pain is a requirement in several countries; that the list of medications used by the patient includes ophthalmic and systemic medications currently used, including nutritional supplements and other over-the-counter products; and that social history/hygienic habits include, for example: occupation; tobacco, alcohol, or illegal drug consumption; family and housing situation; and social security.[6]

The scoring rubric comprises three columns, and we developed it according to a modification of Dreyfus and Dreyfus stages of competence: we included only the novice, beginner, and competent stages. We created behavioral descriptors for all the skills in each stage. We also included a column “not applicable”, clarifying that it can be used when a specific item is not appropriate or necessary.
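As a concrete illustration of the structure just described, a program digitizing such a form might encode the rubric roughly as in the sketch below. This is a hypothetical encoding: all item names and descriptor texts are abbreviated placeholders, not the published form's wording.

```python
# Hypothetical sketch of the revised OCEX rubric structure: four
# competence areas, three Dreyfus-based stages per item, plus an
# explicit "not applicable" option for items that are not appropriate
# or necessary in a given encounter.
STAGES = ("novice", "beginner", "competent")

RUBRIC = {
    "interview skills": {
        "chief complaint": {
            "novice": "does not elicit the chief complaint",
            "beginner": "elicits the chief complaint only with prompting",
            "competent": "elicits and characterizes the complaint unaided",
        },
        # ...remaining interview items (history of present illness, etc.)
    },
    "examination": {},                            # items omitted for brevity
    "interpersonal and communication skills": {},
    "case presentation": {},
}

def record_rating(area, item, stage):
    """Validate and return one observed rating, allowing 'not applicable'
    when a specific item does not apply to the encounter."""
    if stage != "not applicable" and stage not in STAGES:
        raise ValueError(f"unknown stage: {stage}")
    return {"area": area, "item": item, "stage": stage}

print(record_rating("interview skills", "chief complaint", "competent"))
```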

At the end of the form, we left an open space for specific feedback comments for the resident.

Content validation

Fourteen volunteer educators, members of the ICO Ophthalmic Educators Group, were asked to review the content of the new instrument while answering a set of four questions; nine responded. We describe below the most significant comments on each of the open-ended questions.

Q1 Are the items and corresponding descriptors clearly defined? Eight reviewers answered positively; one did not answer this question.

Q2 Are we missing anything important? Three respondents answered that nothing important was missing.

One of the reviewers suggested adding an item about dealing with the family, a special issue particularly in countries where patients are always accompanied by one or more family members; we therefore added “family” to the corresponding items listed under communication skills. The same reviewer suggested adding an item about dealing with handicapped or blind patients, who may need special help during the examination; we added “consideration of patient comfort, safety, and disabilities” to the item corresponding to patient comfort.

Another reviewer suggested expanding the slit lamp section; we thought that this would make the list too long, so this suggestion was not incorporated. He also recommended adding “suggests appropriate confirmatory testing”; we considered that this was implicit in “case presentation - provides an appropriate and realistic plan”.

Q3 Do you think we need to change or delete any items? One reviewer found it difficult to assess empathy externally. We believe there are indirect signs (e.g., tone of voice, pausing, comments) through which assessors can form an impression of the resident's empathy. He also suggested combining “pertinent facts” and “pertinent positives and negatives”, so we incorporated this modification.

Regarding interpersonal skills, a reviewer suggested that: 1) specific descriptors of patient comfort (such as appropriate adjustment of height and position, not switching the slit lamp/indirect full illumination onto the patient's eye, and returning the chair unit to zero after the examination) could be added to make the assessment more objective; 2) “disrespectful” is a subjective description whose interpretation may vary culturally, so the descriptor could be made more concrete by stating the acceptable behavior; 3) examples of nonverbal communication skills (such as eye contact while listening and gestures) should be included under interpersonal skills; and 4) in “explained plan and options”, a competent resident can be expected to explain the alternative options with their possible pros and cons and to participate in informed-consent decision making. We included all these recommendations in the rubric.

Another reviewer suggested adding “including timing, duration, frequency, intensity, and aggravating and alleviating factors” to the history of present illness; we added these to the rubric.

Another reviewer suggested asking about the results of nonprescription medications, as well as provider qualifications; we considered this too detailed to include.

A reviewer suggested combining the explanation of findings, diagnosis, and plan into one item, since all of them show how the resident explains the situation to the patient/family. We considered that these are different situations requiring different skills (e.g., explaining a diagnosis of a malignant tumor requires different skills than explaining its treatment, and giving detailed instructions on how to use glaucoma medication to attain compliance differs from explaining the diagnosis of glaucoma); therefore, we did not incorporate this suggestion.

Three reviewers suggested including explaining and obtaining informed consent, so we incorporated this into the rubric.

One of the reviewers suggested adding “and treatment goals” to the resident's explanation of the plan. We considered this too detailed and, to some extent, implicit in “provides a realistic plan”.

Q4 Would this tool be potentially applicable to your setting/region? If not, why? Seven reviewers answered affirmatively; two did not respond.


Discussion


The first version of the OCEX, developed 15 years ago, has long been widely used in the United States and other countries. In a survey answered by 56 of 118 residency programs in the United States (U.S.), Paley and collaborators reported use of the instrument by 31 of the responding programs (more than 50%).[7] Informal communications with the authors indicate its use in different programs around the world, and it has been translated into Portuguese, Chinese, Mongolian, and Spanish.[8] The tool and practical instructions on how to use it (including example videos) have been disseminated by the ICO in its faculty development programs for directors and educators of residency programs around the world.

One advantage of this tool, unlike the mini-CEX that inspired its authors, is that its consultation items and behavioral descriptors are specific to ophthalmology.

The original tool has a rubric with descriptors for each item to guide assessors using the scoring scale (does not meet/meets some/meets all/exceeds expectations); however, one criticism it has received is the variable interpretation that observers give to the grading scale anchors, which makes inter-rater reliability difficult to achieve.[7] In the aforementioned study, Paley and collaborators retrospectively analyzed OCEX evaluations of 22 second- and third-year residents from two ophthalmology programs over a 3-year period. They were unable to detect clinical improvement of residents over time; among the reasons the authors list to explain this finding are the varied interpretations of the grading scale anchors, the use of a relative rather than absolute grading scale by evaluators, and a lack of clear expectations for each stage of development.[7] It therefore seemed appropriate to change the original tool's grading scale to one based on stages of behavior, such as the Dreyfus scale, which makes the year of training less relevant and puts more emphasis on actual performance and the progression of competence acquisition.

We understand that, as has been reported in other studies of observed assessments, assessors' judgment is influenced by idiosyncrasies, biases, gestalt, and conflicting contextual factors, as well as by the interpretation they give to the scale scores themselves.[9],[10]

We reduced the number of rating points to three to simplify assessors' use of the rubric. In a study comparing nine- and five-point scales for the mini-CEX, Cook and Beckman showed that, although interobserver reliability is similar for both scales, the 9-point scale seems to provide more accurate scores.[11] Other studies suggest that evaluators have different interpretations of what constitutes, for example, “superior” performance, and that when the scale is accompanied by detailed descriptions to guide the evaluation, assessors do not use them.[10] In addition, assessors tend to be reluctant to use categories that may sound pejorative, such as “unsatisfactory” or “poor”, or to assign low scores to examinees.[10],[12] For all these reasons, and given that we recommend this instrument primarily for the provision of feedback, the simplified scale will facilitate its use in training programs.

Another criticism of the instrument, raised in informal communications with the authors, is that it does not contemplate every possible situation that may arise in the clinical consultation. Although we have tried to improve some aspects, this level of granularity is beyond the purpose of the instrument. Should situations arise that are not described in the form or the rubric, the observer may add comments in the space provided for feedback to the resident. Also, not every step of the examination or interview will be necessary in every consultation (for example, a confrontational visual field will probably not be needed for a patient with a corneal foreign body), so the “Not applicable” box will be useful in these cases.

As has been published, the value of these observed assessments lies fundamentally in the feedback provided by the observer to the resident and in the possibility of developing an improvement plan with the trainee.[7],[13],[14] It is advisable to assess several of these encounters, by different examiners, to ensure a diversity of cases, situations, and contexts throughout the years of training.[10],[12],[15] Residency program directors should consider following up on these improvement plans (which should be brief but significant), so that learning and professional development can be truly verified.[16] For summative evaluations, it is recommended that this be one tool among the range of assessments used.[17],[18]

The review process by educators from a variety of regions of the world is worth noting, as is the fact that the instrument proved of interest and applicable in diverse contexts. We were able to incorporate suggestions and modifications that will expand the possibility of using the tool in programs around the world (we removed, for example, the reference to “shakes hands” from the original instrument since, as one of the reviewers noted, this practice is not accepted everywhere). We believe these recommendations increase the face and content validity of the tool, since they collect opinions from international ophthalmologists and educators beyond the US-based authors who developed the original OCEX, the 18 content experts who established its face and content validity, and the panel of 94 academic ophthalmology teaching faculty who determined its reliability and construct validity.[1],[2] Part of the mission of the ICO, and specifically of its “Teaching the Teachers” initiative, is to enhance the quality of ophthalmic training around the world. The opinions of educators from different programs around the globe on the validity of all the aspects of competency included in the OCEX are therefore crucial to ensure the feasibility of using the tool in settings other than US-based ones, especially for domains of competence, such as Professionalism and Interpersonal and Communication Skills, that may vary culturally. We have used this kind of content validation (review by a panel of international educators) for other published surgical and clinical competence assessment instruments.[19],[20],[21],[22],[23],[24],[25],[26],[27]

Among the limitations of our work are the facts that this version of the instrument has not yet been used, that the number of international reviewers is relatively small, and that no reviewers participated from countries with large numbers of ophthalmology residents, such as China and Russia.

A number of implications for medical education and future research emerge from this study: 1) testing the tool to demonstrate other aspects of its validity; 2) testing the tool in different years of residency to demonstrate residents' progression throughout their training; 3) creating more granular evaluation rubrics for procedures (direct observation of procedural skills, DOPS), especially for some complex procedures included in the OCEX, such as use of the slit lamp, gonioscopy, or funduscopy, that could be used for decisions related to the ACGME milestones or entrustable professional activities; and 4) developing a training program for evaluators, to help them interpret the rubric, provide quality feedback, and develop an improvement plan with the resident.[9],[14],[28],[29],[30],[31],[32]

To our knowledge, no studies have been published demonstrating that the OCEX improves the evaluation of candidates compared with a group that has not been evaluated. However, Al Ansari and collaborators conducted a meta-analysis of 11 studies published from 1995 to 2012 that reported the relationship between a similar clinical observation assessment, the mini-CEX, and other standardized academic and clinical performance measures. They demonstrated construct and criterion validity of that tool, supported by small to large effect-size differences between measures of trainees' achievement and clinical skills performance, indicating the importance of this kind of assessment tool for the direct observation of trainees' clinical performance.[17]


Conclusion


In conclusion, the OCEX continues to be the only workplace assessment tool designed specifically for ophthalmology residents and the ophthalmic consultation. This improved and simplified version will facilitate its use in the observed assessment of residents' competence and the delivery of feedback. The international experts' opinion of its relevance and applicability will facilitate its implementation in diverse programs around the world.

Acknowledgements

Our panel of international reviewers, members of the ICO Ophthalmic Educators Group: Sima Das (India), Mohammed El-Hanan (United Arab Emirates), Essam El-Toukhy (Egypt), Matthew Gearinger (United States of America), Jacqueline Hernandez King (Philippines), Alexis Kahindo Kahatane (Democratic Republic of the Congo), Elizabeth Palkovacs (United States of America), Carlos Restrepo (Colombia), and Miklos Schneider (Hungary).

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Golnik KC, Goldenhar LM, Gittinger JW, Lustbader JM. The ophthalmic clinical evaluation exercise (OCEX). Ophthalmology 2004;111:1271-4.
2. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The Mini-CEX (clinical evaluation exercise): A preliminary investigation. Ann Intern Med 1995;123:795-9.
3. Golnik KC, Goldenhar L. The ophthalmic clinical evaluation exercise: Reliability determination. Ophthalmology 2005;112:1649-54.
4. Dreyfus SE, Dreyfus HL. A Five-Stage Model of the Mental Activities Involved in Directed Skill Acquisition. Berkeley: University of California; 1980. Available from: https://apps.dtic.mil/dtic/tr/fulltext/u2/a084551.pdf. [Last accessed on 2019 Jan 04].
5. International Council of Ophthalmology Ophthalmic Educators Group. Available from: http://www.icoph.org/resources/410/ICO-Ophthalmic-Educators-Group-ICO-OEG.html. [Last accessed on 2019 Nov 23].
6. American Academy of Ophthalmology: Preferred Practice Pattern. Comprehensive Adult Medical Eye Evaluation. Elsevier Inc.; 2016. p. 209-36.
7. Paley GL, Shute TS, Davis GK, Culican SM. Assessing progression of resident proficiency during ophthalmology residency training: Utility of serial clinical skill evaluations. J Med Educ Train 2017;1:1-14.
8. International Council of Ophthalmology. OCEX Checklist in English, Chinese, Portuguese, and Spanish. Available from: http://www.icoph.org/resources/276/OCEX-Checklist-in-English-Chinese-Portuguese-and-Spanish.html. [Last accessed on 2019 Nov 23].
9. Lee V, Brain K, Martin J. Factors influencing mini-CEX rater judgments and their practical implications: A systematic review. Acad Med 2017;92:880-7.
10. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: Construct alignment improves the performance of workplace-based assessment scales. Med Educ 2011;45:560-9.
11. Cook DA, Beckman TJ. Does scale length matter? A comparison of nine- versus five-point rating scales for the mini-CEX. Adv Health Sci Educ 2009;14:655-64.
12. Hawkins RE, Margolis MJ, Durning SJ, Norcini JJ. Constructing a validity argument for the mini-clinical evaluation exercise: A review of the research. Acad Med 2010;85:1453-61.
13. Lee AG, Carter K. OCEX reliability. Ophthalmology 2006;113:717.
14. Lörwald AC, Lahner FM, Greif R, Berendonk C, Norcini J, Huwendiek S. Factors influencing the educational impact of Mini-CEX and DOPS: A qualitative synthesis. Med Teach 2018;40:414-20.
15. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The Mini-CEX: A method for assessing clinical skills. Ann Intern Med 2003;138:476-81.
16. Norcini J, Anderson MB, Bolella V, Burch V, Costa MJ, Duvivier R, et al. 2018 Consensus framework for good assessment. Med Teach 2018;40:1102-9.
17. Al Ansari A, Kauser Ali S, Donnon T. The construct and criterion validity of the mini-CEX: A meta-analysis of the published research. Acad Med 2013;88:413-20.
18. Van der Vleuten C, Verhoeven B. In-training assessment developments in postgraduate education in Europe. ANZ J Surg 2013;83:454-9.
19. Golnik KC, Beaver H, Gauba V, Lee AG, Mayorga E, Palis G, Saleh M. Cataract surgical skill assessment. Ophthalmology 2011;118:427.e1-5.
20. Golnik KC, Haripriya A, Beaver H, Gauba V, Lee AG, Mayorga E, et al. Cataract surgery skill assessment. Ophthalmology 2011;118:2094.
21. Golnik KC, Motley WW, Atilla H, Pilling R, Reddy A, Sharma P, et al. The ophthalmology surgical competency assessment rubric for strabismus surgery. J AAPOS 2012;16:318-21.
22. Golnik KC, Gauba V, Saleh GM, Collin R, Naik MN, Devoto M, et al. The ophthalmology surgical competency assessment rubric for lateral tarsal strip surgery. Ophthalmic Plast Reconstr Surg 2012;28:350-4.
23. Swaminathan M, Ramasubramanian S, Pilling R, Li J, Golnik K. ICO-OSCAR for pediatric cataract surgical skill assessment. J AAPOS 2016;20:364-5.
24. Golnik KC, Law JC, Ramasamy K, Mahmoud TH, Okonkwo ON, Singh J, et al. The ophthalmology surgical competency assessment rubric for vitrectomy. Retina 2017;37:1797-804.
25. Green CM, Salim S, Edward DP, Mudumbai RC, Golnik K. The ophthalmology surgical competency assessment rubric for trabeculectomy. J Glaucoma 2017;26:805-9.
26. Juniat V, Golnik KC, Bernardini FP, Cetinkaya A, Fay A, Mukherjee B, et al. The ophthalmology surgical competency assessment rubric (OSCAR) for anterior approach ptosis surgery. Orbit 2018;14:1-4.
27. Palis AG, Golnik KC, Mayorga EP, Filipe HP, Garg P. The international council of ophthalmology 360-degree assessment tool: Development and validation. Can J Ophthalmol 2018;53:145-9.
28. Hicks PJ, Margolis MJ, Carraccio CL, Donnelly K, Fromme HB, Gifford KA, et al. A novel workplace-based assessment for competency-based decisions and learner feedback. Med Teach 2018;40:1143-50.
29. Ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med 2007;82:542-7.
30. Holmboe ES, Yepes M, Williams F, Huot SJ. Feedback and the mini clinical evaluation exercise. J Gen Intern Med 2004;19:558-61.
31. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA 2009;302:1316-26.
32. Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, et al. The educational impact of mini-clinical evaluation exercise (Mini-CEX) and direct observation of procedural skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 2018;13:e0198009.




 
