Educ Res Med Sci


Commonly Used Clinical Assessment Methods in Kermanshah University of Medical Sciences: A Five-Year Analysis 2014 - 2019

Author(s):
Leeba Rezaie 1, Mehdi Zobeiri 2,*, Mansour Rezaei 3, Ghobad Ramezani 4, Ali Akbar Parvizi Fard 5, Mohammad Rasool Khazaei 6, Maryam Janatolmakan 3, Lida Memar Eftekhari 7
1Sleep Disorders Research Center, Kermanshah University of Medical Sciences, Kermanshah, Iran
2Kermanshah University of Medical Sciences, Kermanshah, Iran
3Social Development and Health Promotion Research Center, Health Institute, Kermanshah University of Medical Sciences, Kermanshah, Iran
4Education Development Center, Kermanshah University of Medical Sciences, Kermanshah, Iran
5Department of Clinical Psychology, Kermanshah University of Medical Sciences, Kermanshah, Iran
6Fertility and Infertility Research Center, Health Technology Institute, Kermanshah University of Medical Sciences, Kermanshah, Iran
7Center for Studies and Development of Medical Science Education (EDC), Kermanshah University of Medical Sciences, Kermanshah, Iran

Educational Research in Medical Sciences: Vol. 14, Issue 2; e165309
Published online: Sep 29, 2025
Article type: Research Article
Received: Aug 18, 2025
Accepted: Sep 24, 2025
How to Cite: Rezaie L, Zobeiri M, Rezaei M, Ramezani G, Parvizi Fard AA, et al. Commonly Used Clinical Assessment Methods in Kermanshah University of Medical Sciences: A Five-Year Analysis 2014 - 2019. Educ Res Med Sci. 2025;14(2):e165309. https://doi.org/10.5812/ermsj-165309.

Abstract

Background:

Clinical assessment of medical students is a critical component of the educational curriculum, and several methods have been proposed. Although each method has its own importance, some are more commonly applied, while others are rarely used.

Objectives:

This study aimed to analyze the clinical assessment methods employed at Kermanshah University of Medical Sciences (KUMS) over a five-year period.

Methods:

In this observational study, educational records from 19 departments of the KUMS School of Medicine were reviewed for the period 2014 - 2019. Data on assessment methods used to evaluate the clinical performance of medical students were collected using two checklists and analyzed with descriptive statistics in SPSS version 25.

Results:

A total of 19 educational supervisors and 195 faculty members participated in the study and completed the checklists. At the clerkship (stager) level, the most frequently used assessment methods were multiple-choice questions (MCQs, 37.9%), short answer questions (SAQs, 20%), the mini-clinical evaluation exercise (Mini-CEX, 18.5%), and the objective structured clinical examination (OSCE, 16.9%). At the internship level, MCQ (26.7%), short answer (12.3%), OSCE (11.3%), and Mini-CEX (11.3%) were most commonly reported as being used “always”.

Conclusions:

Across both clerkship and internship stages, MCQ, Mini-CEX, OSCE, and SAQs were the most commonly applied assessment methods. While each has specific advantages, none alone is sufficient to comprehensively evaluate clinical competence. Faculty members should be encouraged to learn and apply a wider variety of assessment tools to more effectively measure medical students’ clinical skills.

1. Background

Graduates of medical universities must acquire competencies beyond theoretical knowledge. Curricula are therefore based on a competency-based approach, which in turn requires shifting student assessment from an overemphasis on knowledge toward assessment in real environments (1). Assessment examines the extent to which learners achieve the intended goals of their learning activities; it is a tool to improve the quality of educational programs, motivate students to learn, and guide them toward educational goals (1, 2). Accordingly, assessment seeks to strengthen effective educational programs and methods and to weaken or eliminate ineffective or undesirable ones (3).

Clinical assessment of medical students in clinical environments is of great importance for ensuring their proper progress toward the goals of the program and for determining the extent to which these goals are achieved (4, 5). Assessment in the clinical environment is crucial and requires evidence that the student has acquired the competence necessary to function properly in the real environment (6). Medical assessment methods, especially in clinical education, often lack the efficiency needed to assess practical skills because educational goals are insufficiently specified (7). Because clinical competence is a complex construct, multiple and combined methods are needed for its assessment. These methods, such as the portfolio, logbook, mini-clinical evaluation exercise (Mini-CEX), objective structured clinical examination (OSCE), and direct observation of procedural skills (DOPS), may involve patients in hospitals and other healthcare facilities, communities, simulation and learning laboratories, and virtual environments. Nevertheless, much student assessment in medical universities is still conducted with traditional methods such as multiple-choice questions (MCQs) and global ratings by professors (8-11).

Studies indicate that the OSCE can assess students' clinical skills more effectively than common clinical assessment methods and leads to greater student satisfaction (12). Likewise, DOPS and Mini-CEX, compared with traditional methods, have improved students' clinical skills (13). Research findings show that routine student assessment is often limited to subjective information, with little attention to accurately assessing clinical skills (14). Moreover, the assessment methods in most clinical courses do not align with educational goals and lack efficiency in measuring clinical skills and student performance. Although clinical skills and practical work are central to medical education, the success of medical students in these exams largely depends on memorized knowledge (15, 16); ideally, skills and practical work should play the main role, with factual recall of secondary importance (17).

Furthermore, the implementation of traditional assessment methods has led to student dissatisfaction. One study showed that 62% of male students and 82% of female students believed that not all skills could be evaluated through conventional assessment, and this dissatisfaction can inhibit learning (18). Given the ongoing changes in clinical education approaches, the need for new assessment methods appropriate to these changes is increasingly apparent. Research in nursing schools in South America found that 45% of schools had not revised their clinical assessment methods for 5 years, 35% for 6 - 10 years, 17% for 11 - 15 years, and 3% for more than 15 years (19). Additionally, research in nursing schools in Tehran found that 62% of students believed that clinical assessment conditions and cases were not consistent and satisfactory for all students (20).

At the same time, no single method is universally used across educational departments, so some tests are applied frequently while others are rarely used. Since each test addresses a specific aspect of clinical performance, appropriate use of these tests can significantly affect the quality of assessment. It therefore seems necessary to investigate the extent of their use and the factors affecting it.

2. Objectives

This study aims to investigate the extent of use of different clinical assessment methods for medical students at Kermanshah University of Medical Sciences (KUMS).

3. Methods

3.1. Study Design

In this observational study, educational records of the exam methods used to evaluate the clinical performance of medical students in 19 educational departments of the medical school at KUMS from 2014 to 2019 were analyzed. For this purpose, we gathered data simultaneously from two sources: Educational supervisors and faculty members (attendings) of the educational departments. The inclusion criteria for faculty members were at least 1 year of experience participating in the exams and willingness to participate in the study. The inclusion criteria for educational supervisors (usually nurses with experience in educational management) were access to the exam records of the given department and willingness to participate in the study. The exclusion criterion for both groups was an incomplete checklist. Informed consent was obtained from all participants. Finally, 19 educational supervisors and 195 faculty members participated in this study.

3.2. Measures

We used two separate checklists for educational supervisors and faculty members. The checklist for faculty members included two parts. The first part assessed demographic information of the faculty member, such as their work experience. The second part dealt with questions about the methods used for the assessment of medical students in two stages: Extern (stager stage) and intern. The questions were answered dichotomously with "Yes" or "No", and "Yes" answers were further categorized into three options using a Likert scale: "Occasionally", "Usually", and "Always".
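The checklist labels ("Occasionally", "Usually", "Always") differ from the column labels reported later in Tables 2 and 3 ("Sometimes", "Most of the time", "Always"). Under the assumption that these correspond one-to-one, the response coding can be sketched as:

```python
def table_category(used, frequency=None):
    """Map one checklist response to a reporting category.

    used:      "Yes" or "No" (the dichotomous item)
    frequency: for "Yes" answers, one of "Occasionally", "Usually", "Always"

    Assumed correspondence between checklist options and table columns:
    Occasionally -> Sometimes, Usually -> Most of the time.
    """
    if used == "No":
        return "Never"
    return {"Occasionally": "Sometimes",
            "Usually": "Most of the time",
            "Always": "Always"}[frequency]
```

For example, a faculty member who answered "Yes / Usually" for OSCE would be counted in the "Most of the time" column.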

The methods used included OSCE, objective structured practical examination (OSPE), key feature (KF), objective structured lab examination (OSLE), patient management problem (PMP), Mini-CEX, script concordance (SC), DOPS, case-based discussion (CBD), multi-source feedback (MSF), and global rating form (GRF). These methods have been considered in the general practice curriculum.

The checklist for educational supervisors included questions about the methods used for the clinical assessment of medical students in the two stages of extern and intern over a five-year period. The checklist was scored similarly to the checklist used for faculty members. Checklists were distributed as links via email or through training supervisors.

This checklist was initially compiled by two members of the research team. The basis for designing this checklist was a review of past studies regarding the types of tests used in clinical assessment. In the second step, this tool was reviewed by the research team, and their corrective comments were applied. In the third stage, the finalized checklist was approved by the research team members and three medical education experts in terms of face and content validity.

The assessments included written short answer examinations, which evaluate knowledge recall and application but are not clinical tests in the strict sense. Based on Miller’s Pyramid of Clinical Competence, MCQs and short answer questions (SAQs) primarily assess the ‘Knows’ and ‘Knows How’ levels, whereas OSCE and Mini-CEX evaluate higher levels of competence (‘Shows How’ and ‘Does’).
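As an illustration only (not one of the study's instruments), the placement of the four most-used methods on Miller's Pyramid described above can be written out as:

```python
# Illustrative placement of assessment methods on Miller's Pyramid,
# following the classification given in the text.
miller_levels = {
    "MCQ": "Knows / Knows How",   # written knowledge tests
    "SAQ": "Knows / Knows How",
    "OSCE": "Shows How",          # simulated clinical performance
    "Mini-CEX": "Does",           # observed workplace performance
}

def targets_performance(method):
    """True if a method assesses the upper ('Shows How'/'Does') levels."""
    return miller_levels[method] in ("Shows How", "Does")
```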

3.3. Statistical Analysis

We used descriptive statistics to describe demographic information and the frequency of exam methods used over the five years in the two stages of extern and intern. Cross-tabulations (frequency and percent) were used to examine the association between demographic characteristics and exam methods.
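The descriptive statistics reported in Tables 2 and 3 are plain frequencies and percentages. As a sketch, using sample data shaped like the MCQ row of Table 2, the "No. (%)" values can be reproduced as:

```python
from collections import Counter

def frequency_table(responses):
    """Return {category: (count, percent)}, matching the 'No. (%)' format."""
    n = len(responses)
    return {cat: (c, round(100 * c / n, 1))
            for cat, c in Counter(responses).items()}

# Sample data shaped like the MCQ row of Table 2 (195 faculty members)
responses = (["Never"] * 127 + ["Sometimes"] * 4
             + ["Most of the time"] * 12 + ["Always"] * 52)
table = frequency_table(responses)
# table["Always"] -> (52, 26.7), as reported in Table 2
```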

4. Results

The information on methods used for the clinical assessment of medical students at KUMS over a five-year period (2014 to 2019) was gathered from 19 educational supervisors of educational departments and 195 faculty members. Table 1 shows some demographic characteristics of the faculty members who participated in this study. Of the participants, 74.4% were male. The anesthesiology department had the highest frequency of participants (9.7%), while urology and social medicine had the lowest (1.5% each). The 6 - 10 years category had the highest frequency of work experience among the participants (Table 1).

Table 1. Some Demographic Characteristics of Study Participants
Variables | No. (%)
Gender
Female | 50 (25.6)
Male | 145 (74.4)
Departments
Anesthesiology | 19 (9.7)
Dermatology | 7 (3.6)
Diagnostic radiology | 0 (0)
Emergency medicine | 11 (5.6)
Internal medicine | 17 (8.7)
Neurology | 13 (6.7)
Obstetrics and gynecology | 8 (4.1)
Ophthalmology | 13 (6.7)
Pathology | 9 (4.6)
Pediatrics | 10 (5.1)
Psychiatry | 5 (2.6)
Surgery | 18 (9.2)
Urology | 3 (1.5)
Infectious disease | 9 (4.6)
Cardiology | 9 (4.6)
Orthopedics | 9 (4.6)
Neurosurgery | 12 (6.2)
Social medicine | 3 (1.5)
ENT | 8 (4.1)
Oncology | 8 (4.1)
Work history (y)
1 - 5 | 45 (23.1)
6 - 10 | 47 (24.1)
11 - 15 | 35 (17.9)
16 - 20 | 24 (12.3)
21 - 25 | 17 (8.7)
26 - 30 | 24 (12.3)
31 - 35 | 3 (1.5)
Academic rank
Educational co-worker | 13 (6.7)
Assistant professor | 114 (58.5)
Associate professor | 49 (25.1)
Professor | 19 (9.7)

Table 2 presents the methods used for the clinical assessment of medical students in the intern stage. As seen in Table 2, MCQs, short answer, OSCE, and Mini-CEX had the highest frequency of being used "always" (26.7%, 12.3%, 11.3%, and 11.3%, respectively), while SC&KF (99.0%), portfolio and CBD (97.9% each), GRF (97.4%), 360-degree feedback (93.8%), and DOPS (93.3%) had the highest frequency of "never" being used (Table 2).

Table 2. The Frequency of Methods Used in the Intern Stage a
Methods | Never | Sometimes | Most of the Time | Always | Total
Multiple choice | 127 (65.1) | 4 (2.1) | 12 (6.2) | 52 (26.7) | 195 (100.0)
Classify | 180 (92.3) | 7 (3.6) | 3 (1.5) | 5 (2.6) | 195 (100.0)
Extended response | 174 (89.2) | 6 (3.1) | 6 (3.1) | 9 (4.6) | 195 (100.0)
Short answer | 158 (81.0) | 6 (3.1) | 7 (3.6) | 24 (12.3) | 195 (100.0)
Descriptive | 172 (88.2) | 7 (3.6) | 6 (3.1) | 10 (5.1) | 195 (100.0)
OSCE | 153 (78.5) | 7 (3.6) | 13 (6.7) | 22 (11.3) | 195 (100.0)
OSPE | 189 (96.9) | 2 (1.0) | 3 (1.5) | 1 (0.5) | 195 (100.0)
OSLE | 190 (97.4) | 3 (1.5) | 2 (1.0) | 0 (0.0) | 195 (100.0)
PMP | 177 (90.8) | 3 (1.5) | 4 (2.1) | 11 (5.6) | 195 (100.0)
SC&KF | 193 (99.0) | 1 (0.5) | 0 (0.0) | 1 (0.5) | 195 (100.0)
Long case | 185 (94.9) | 6 (3.1) | 0 (0.0) | 4 (2.1) | 195 (100.0)
Mini-CEX | 161 (82.6) | 3 (1.5) | 9 (4.6) | 22 (11.3) | 195 (100.0)
DOPS | 182 (93.3) | 4 (2.1) | 4 (2.1) | 5 (2.6) | 195 (100.0)
CBD | 191 (97.9) | 2 (1.0) | 2 (1.0) | 0 (0.0) | 195 (100.0)
Logbook | 181 (92.8) | 4 (2.1) | 2 (1.0) | 8 (4.1) | 195 (100.0)
Portfolio | 191 (97.9) | 3 (1.5) | 1 (0.5) | 0 (0.0) | 195 (100.0)
360 D (MSF) | 183 (93.8) | 5 (2.6) | 2 (1.0) | 5 (2.6) | 195 (100.0)
GRF | 190 (97.4) | 2 (1.0) | 1 (0.5) | 2 (1.0) | 195 (100.0)

Abbreviations: OSCE, objective structured clinical examination; OSPE, objective structured practical examination; OSLE, objective structured lab examination; PMP, patient management problem; SC, script concordance; KF, key feature; Mini-CEX, mini-clinical evaluation exercise; DOPS, direct observation of procedural skill; CBD, case-based discussion; 360 D, 360-degree; MSF, multi-source feedback; GRF, global rating form.

a Values are expressed as No. (%).

The methods used for clinical assessment in the stager stage are listed in Table 3. In the stager stage, MCQs (37.9%), short answer (20%), Mini-CEX (18.5%), and OSCE (16.9%) had the highest frequency of being used "always". As seen in Table 3, SC&KF and GRF (98.5% each), OSLE (97.9%), portfolio (97.4%), and CBD (96.9%) had the highest frequency of "never" being used (Table 3).

Table 3. The Frequency of Methods Used in the Stager Stage a
Methods | Never | Sometimes | Most of the Time | Always | Total
Multiple choice | 102 (52.3) | 1 (0.5) | 18 (9.2) | 74 (37.9) | 195 (100.0)
Matching | 180 (92.3) | 8 (4.1) | 2 (1.0) | 5 (2.6) | 195 (100.0)
Extended response | 167 (85.6) | 7 (3.6) | 6 (3.1) | 15 (7.7) | 195 (100.0)
Short answer | 142 (72.8) | 7 (3.6) | 7 (3.6) | 39 (20.0) | 195 (100.0)
Descriptive | 168 (86.2) | 4 (2.1) | 6 (3.1) | 17 (8.7) | 195 (100.0)
OSCE | 144 (73.8) | 9 (4.6) | 9 (4.6) | 33 (16.9) | 195 (100.0)
OSPE | 189 (96.9) | 2 (1.0) | 3 (1.5) | 1 (0.5) | 195 (100.0)
OSLE | 191 (97.9) | 2 (1.0) | 2 (1.0) | 0 (0.0) | 195 (100.0)
PMP | 179 (91.8) | 1 (0.5) | 3 (1.5) | 12 (6.2) | 195 (100.0)
SC&KF | 192 (98.5) | 2 (1.0) | 0 (0.0) | 1 (0.5) | 195 (100.0)
Long case | 188 (96.4) | 5 (2.6) | 1 (0.5) | 1 (0.5) | 195 (100.0)
Mini-CEX | 146 (74.9) | 5 (2.6) | 8 (4.1) | 36 (18.5) | 195 (100.0)
DOPS | 179 (91.8) | 1 (0.5) | 5 (2.6) | 10 (5.1) | 195 (100.0)
CBD | 189 (96.9) | 2 (1.0) | 3 (1.5) | 1 (0.5) | 195 (100.0)
Logbook | 183 (93.8) | 0 (0.0) | 5 (2.6) | 7 (3.6) | 195 (100.0)
Portfolio | 190 (97.4) | 2 (1.0) | 3 (1.5) | 0 (0.0) | 195 (100.0)
360 D (MSF) | 185 (94.9) | 2 (1.0) | 2 (1.0) | 6 (3.1) | 195 (100.0)
GRF | 192 (98.5) | 1 (0.5) | 1 (0.5) | 1 (0.5) | 195 (100.0)

Abbreviations: OSCE, objective structured clinical examination; OSPE, objective structured practical examination; OSLE, objective structured lab examination; PMP, patient management problem; SC, script concordance; KF, key feature; Mini-CEX, mini-clinical evaluation exercise; DOPS, direct observation of procedural skills; CBD, case-based discussion; 360 D, 360-degree; MSF, multi-source feedback; GRF, global rating form.

a Values are expressed as No. (%).

5. Discussion

In this study, we reviewed the methods used for the clinical assessment of medical students at the KUMS School of Medicine from 2014 to 2019. A total of 195 faculty members from 19 educational departments participated. Our results showed that in both the intern and stager stages, more comprehensive assessment methods were not always used. The most widely used forms of clinical examination were MCQ, short answer, OSCE, and Mini-CEX. These findings are consistent with the results of studies conducted by Bahraini Toosi at Mashhad University of Medical Sciences (MUMS) and by Kouhpayezadeh et al. in the medical sciences universities of Tehran (14, 21).

Although these methods may be useful for assessing some aspects of clinical performance, they cannot assess other important competencies, such as technical, analytical, communication, counseling, evidence-based practice, systems-based practice, and interdisciplinary care skills. Evaluations are essential steps in the educational process, exert a powerful steering effect on learning and the curriculum, and should be purpose-driven (22). Assessment methods should be valid, reliable, and feasible given available resources and time, and teachers should consider what is to be assessed and why. Different learning outcomes require different instruments (23-25).

If a large amount of knowledge is to be tested, MCQs should be used because of their maximum objectivity, high reliability, and relative ease of administration (17). However, MCQs have limitations regarding the taxonomic level of knowledge they can test, adherence to item-writing principles, and post-test quality indicators (26).

MCQs, essays, and oral examinations can be used to test factual recall and applied knowledge, but more sophisticated methods, such as directly observed long and short cases and OSCEs with standardized patients, are needed to assess clinical performance (27). Short answer questions are an open-ended, semi-structured question format; a structured, predetermined marking scheme improves objectivity, and the questions can incorporate clinical scenarios (28).

The Mini-CEX is used to assess six core competencies of residents: Medical interviewing skills, physical examination skills, humanistic qualities/professionalism, clinical judgment, counseling skills, and organization and efficiency (29). The OSCE has been widely adopted as a tool to assess students' or doctors' competencies in a range of subjects. It measures outcomes and allows for very specific feedback (13).

Clinical competence has a complex structure, and multiple and combined methods are needed for valid assessment. Choosing appropriate assessment tools is very important, so clinical teachers should be fully familiar with clinical measurement methods before using them (30). For an assessment method to be accepted, its validity, reliability, practicality, and the positive educational effect it creates for the trainee are very important. In addition, each method has advantages and disadvantages and can measure one or at most a few specific aspects of students' clinical competence. Therefore, the choice of method depends on the purpose of the assessment and the specific aspect of student performance and clinical competence to be evaluated. Because clinical ability has a very complex structure, it should be evaluated authentically using multiple and combined methods (31).

In international studies, the use of Mini-CEX together with OSCE has been explored to provide a more holistic assessment of clinical skills. For example, Martinsen et al. in Norway implemented a cluster-randomized trial and found that students who underwent structured Mini-CEX assessments during clerkships had modest improvements on subsequent OSCE and written exams, suggesting that Mini-CEX may enhance formative feedback and observation in clinical settings (32). In another European study, Rogausch et al. observed that Mini-CEX scores were not strongly predicted by prior OSCE performance but were influenced by contextual features such as the clinical environment and trainer characteristics, pointing to the importance of implementation factors (33). Similarly, in Portugal, the translation and adaptation of Mini-CEX showed acceptable reliability when correlated with OSCE performance in various clinical domains, supporting its validity across cultural and linguistic contexts (34).

In the context of Miller’s Pyramid, the most frequently used methods in our study (MCQ and SAQ) were concentrated on lower levels of competence, while performance-based methods such as OSCE and Mini-CEX, which target higher levels of the pyramid, were less frequently applied. This imbalance highlights the need for broader implementation of workplace-based assessments to achieve a comprehensive evaluation of clinical competence.

In conclusion, MCQ, short answer, OSCE, and Mini-CEX were the most common methods used in the clinical assessment of medical students at KUMS over the five-year period. Although each of these methods has specific advantages, none alone is sufficient to evaluate the clinical competency of medical students as future physicians, so additional methods should also be used. Therefore, all faculty members and professors should learn the various assessment methods and apply them appropriately.

5.1. Limitations

The study has some limitations. First, it was retrospective, so access to additional information was limited. Second, we did not assess the reasons for using particular assessment methods. Finally, because some methods were used infrequently, analytic statistics to assess associations between variables could not be performed. Further research to overcome these limitations is recommended.

References

1. Nazem M, Garakyaraghi M, Hosseinpour M, Khoddami A. [Interns’ Viewpoints Concerning their Readiness for Entering Internship in Isfahan Medical University]. Iran J Med Educ. 2005;5(2):157-64. FA.
2. Matthiesen V, Wilhelm C. Quality outcomes and program evaluation in nursing education: an overview of the journey. Qual Manag Health Care. 2006;15(4):279-84. [PubMed ID: 17047502]. https://doi.org/10.1097/00019514-200610000-00010.
3. Adhami A, Haghdoost AA, Darvishmoqadam S, Shakibi MR, Nouhi E. [Determining Valid Criteria for Evaluating Clinical and Theoretical Teaching of the Faculty of Kerman University of Medical Sciences]. Iran J Med Educ. 2000;1(2):24-30. FA.
4. Farahmand S, Asl Soleymani H. [How Interns' Logbook Is Completed in Emergency Ward of Imam Khomeini Hospital?]. Iran J Med Educ. 2010;10(1):55-63. FA.
5. Finch P. A system of performance intervention zones for use during student evaluation in the clinical environment. J Bodyw Mov Ther. 2007;5:295-8. https://doi.org/10.1016/j.jbmt.2007.01.004.
6. Reid WA, Duvall E, Evans P. Relationship between assessment results and approaches to learning and studying in Year Two medical students. Med Educ. 2007;41(8):754-62. [PubMed ID: 17661883]. https://doi.org/10.1111/j.1365-2923.2007.02801.x.
7. Brady AM. Assessment of learning with multiple-choice questions. Nurse Educ Pract. 2005;5(4):238-42. [PubMed ID: 19038205]. https://doi.org/10.1016/j.nepr.2004.12.005.
8. Amini NS. [Bushehr University of Medical Sciences faculty evaluation process by faculty academic members]. Proceedings of the 4th National Conference on Medical Education. Tehran, Iran: Tehran University of Medical Sciences; 2000. FA.
9. Komeili GR, Rezai GA. [Methods of Student Assessment used by Faculty Members of Basic Medical Sciences in Medical University of Zahedan]. Iran J Med Educ. 2001;1(4):52-7. FA.
10. Abbasi S, Einollahi N, Gharib M, Nabatchian F, Dashti N, Zarebavani M. [Evaluation Methods of Theoretical and Practical Courses of Paramedical Faculty Laboratory Sciences Undergraduate Students at Tehran University of Medical Sciences in the Academic Year 2009-2010]. Payavard Salamat. 2013;6(5):342-53. FA.
11. Ramezani G, Norouzi A, Arabshahi SKS, Sohrabi Z, Zazoli AZ, Saravani S, et al. Study of medical students' learning approaches and their association with academic performance and problem-solving styles. J Educ Health Promot. 2022;11:252. [PubMed ID: 36325210]. [PubMed Central ID: PMC9621383]. https://doi.org/10.4103/jehp.jehp_900_21.
12. Cheharzad M, Shafipour SZ, Mirzaei M, Kazemnejad E. [Comparison of OSCE and traditional clinical evaluation methods on nursing students' satisfaction]. J Gilan Univ Med Sci. 2007;13(50):8-13. FA.
13. Habibi H, Khaghanizade M, Mahmoodi H, Ebadi A, Seyedmazhari M. [Comparison of the Effects of Modern Assessment Methods (DOPS and Mini-CEX) with Traditional Method on Nursing Students' Clinical Skills: A Randomized Trial]. Iran J Med Educ. 2013;13(5):364-72. FA.
14. Kouhpayezadeh J, Dargahi H, Soltani Arabshahi K. [Clinical assessment methods in medical sciences universities of Tehran - clinical instructor's viewpoint]. Hormozgan Med J. 2012;16(5):395-402. FA.
15. Schoonheim-Klein M, Walmsley AD, Habets L, van der Velden U, Manogue M. An implementation strategy for introducing an OSCE into a dental school. Eur J Dent Educ. 2005;9(4):143-9. [PubMed ID: 16194245]. https://doi.org/10.1111/j.1600-0579.2005.00379.x.
16. Nouhi S, Fekri A, Foroud A. [Investigation of the problems in clinical evaluation according to clinical teachers of medicine and dentists in Kerman University of Medical Sciences]. Proceedings of the 6th Medical Education Congress. Tehran, Iran: Shahid Beheshti University of Medical Sciences; 2003. FA.
17. Rushforth HE. Objective structured clinical examination (OSCE): review of literature and implications for nursing education. Nurse Educ Today. 2007;27(5):481-90. [PubMed ID: 17070622]. https://doi.org/10.1016/j.nedt.2006.08.009.
18. Tazakori Z, Mozafari N, Movahedpour A, Mazaheri E, Karim Elahi M, Mohamadi MA, et al. [Comparison of nursing students and instructors about OSPE performance and evaluation methods in common practice]. Proceedings of the 7th National Congress on Medical Education. 2005. FA.
19. Bari V. Direct observation of procedural skills in radiology. AJR Am J Roentgenol. 2010;195(1):W14-8. [PubMed ID: 20566775]. https://doi.org/10.2214/AJR.09.4068.
20. Chehrzad M, Sohail SZ, Mirzaee M, Kazemnejad E. [Comparison of OSCE and traditional clinical evaluation methods on nursing students’ satisfaction]. J Med Faculty Guilan Univ Med Sci. 2007;13:8-12. FA.
21. Bahraini Toosi H. [Clinical evaluation of medical students in 2001]. Iran J Med Educ. 2002;7(23). FA.
22. Tabish SA. Assessment methods in medical education. Int J Health Sci (Qassim). 2008;2(2):3-7. [PubMed ID: 21475483]. [PubMed Central ID: PMC3068728].
23. Schuwirth LW, Van der Vleuten CP. Programmatic assessment: From assessment of learning to assessment for learning. Med Teach. 2011;33(6):478-85. [PubMed ID: 21609177]. https://doi.org/10.3109/0142159X.2011.565828.
24. Raupach T, Brown J, Anders S, Hasenfuss G, Harendza S. Summative assessments are more powerful drivers of student learning than resource intensive teaching formats. BMC Med. 2013;11:61. [PubMed ID: 23497243]. [PubMed Central ID: PMC3635879]. https://doi.org/10.1186/1741-7015-11-61.
25. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945-9. [PubMed ID: 11289364]. https://doi.org/10.1016/S0140-6736(00)04221-5.
26. Espey E, Nuthalapaty F, Cox S, Katz N, Ogburn T, Peskin T, et al. To the point: Medical education review of the RIME method for the evaluation of medical student clinical performance. Am J Obstet Gynecol. 2007;197(2):123-33. [PubMed ID: 17689622]. https://doi.org/10.1016/j.ajog.2007.04.006.
27. Bourbonnais FF, Langford S, Giannantonio L. Development of a clinical evaluation tool for baccalaureate nursing students. Nurse Educ Pract. 2008;8(1):62-71. [PubMed ID: 17728186]. https://doi.org/10.1016/j.nepr.2007.06.005.
28. Brown N, Doshi M. Assessing professional and clinical competence: the way forward. Adv Psychiatr Treat. 2006;12(2):81-9. https://doi.org/10.1192/apt.12.2.81.
29. Kogan JR, Hauer KE. Brief report: Use of the mini-clinical evaluation exercise in internal medicine core clerkships. J Gen Intern Med. 2006;21(5):501-2. [PubMed ID: 16704397]. [PubMed Central ID: PMC1484777]. https://doi.org/10.1111/j.1525-1497.2006.00436.x.
30. Khan KZ, Gaunt K, Ramachandran S, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med Teach. 2013;35(9):e1447-63. [PubMed ID: 23968324]. https://doi.org/10.3109/0142159X.2013.818635.
31. Maroufi SS, Moradimajd P, Jalali M, Ramezani G, Alizadeh S. Investigating the current status of the student evaluation system in Iran University of Medical Sciences: A step to improve education. J Educ Health Promot. 2021;10:231. [PubMed ID: 34395668]. [PubMed Central ID: PMC8318155]. https://doi.org/10.4103/jehp.jehp_1428_20.
32. Martinsen SSS, Espeland T, Berg EAR, Samstad E, Lillebo B, Slordahl TS. Examining the educational impact of the mini-CEX: a randomised controlled study. BMC Med Educ. 2021;21(1):228. [PubMed ID: 33882913]. [PubMed Central ID: PMC8061047]. https://doi.org/10.1186/s12909-021-02670-3.
33. Rogausch A, Beyeler C, Montagne S, Jucker-Kupper P, Berendonk C, Huwendiek S, et al. The influence of students' prior clinical skills and context characteristics on mini-CEX scores in clerkships - a multilevel analysis. BMC Med Educ. 2015;15:208. [PubMed ID: 26608836]. [PubMed Central ID: PMC4658793]. https://doi.org/10.1186/s12909-015-0490-3.
34. Sousa R, Costa P, Cerqueira J, Pêgo JM, Santa Cruz A, Oliveira e Silva A, et al. Translation, adaptation and validation of the Mini-Clinical Evaluation Exercise to the EU-Portuguese language. Revista de la Fundación Educación Médica. 2020;23(4). https://doi.org/10.33588/fem.234.1073.
