The Effect of Direct Observation of Procedural Skills Assessment Method on Clinical Performance of Radiology Students

authors:

Mohammad Rasoul Tohidnia 1, Sajad Weisi 2, *, Sodabeh Eskandari 3

1 Department of Radiology and Nuclear Medicine, School of Paramedicine, Kermanshah University of Medical Sciences, Kermanshah, Iran
2 Students’ Research Committee, Kermanshah University of Medical Sciences, Kermanshah, Iran
3 Center of Research in Environmental Factors Affecting Health, Kermanshah University of Medical Sciences, Kermanshah, Iran

how to cite: Tohidnia M R, Weisi S, Eskandari S. The Effect of Direct Observation of Procedural Skills Assessment Method on Clinical Performance of Radiology Students. Educ Res Med Sci. 2018;7(1):e80301. https://doi.org/10.5812/erms.80301.

Abstract

Background and Objectives:

Assessment is one of the most important factors in effective medical education. The direct observation of procedural skills (DOPS) assessment method involves directly observing learners while they perform clinical procedures on patients and providing them with appropriate feedback. This research was carried out to evaluate the effect of the DOPS assessment method on the clinical performance of radiology students at Kermanshah University of Medical Sciences.

Methods:

This study was performed on 30 undergraduate radiology students at the teaching hospitals of Kermanshah University of Medical Sciences in 2017. The participants were randomly divided into intervention and control groups. Data were collected via observation and completion of a researcher-made checklist before and after the apprenticeship course. The data were analyzed using independent t-test and paired t-test.

Results:

There was no significant difference between the groups regarding the demographic variables. The intervention and control groups did not differ significantly in the mean score of students’ clinical skills in the pretest (P = 0.911), but by the end of the apprenticeship course, the mean score of the intervention group, assessed by the DOPS method, had increased significantly compared with that of the control group (P = 0.001).

Conclusions:

The DOPS clinical assessment method significantly improves the practical and clinical skills and self-confidence of radiology students at clinical education centers and can be used as a more effective method than conventional clinical assessment methods.

1. Background

Clinical education is one of the most significant components of medical students’ education and constitutes a principal and vital part of training competent and professional students (1). The role of ideal clinical education in students’ personal and professional development and in the improvement of their clinical skills is undeniable. Assessment is the basis and an inseparable component of medical education, and learners’ motivation to learn the content presented to them is influenced by how efficiently they are assessed. To be acceptable, an assessment method has to be valid, reliable, replicable, and practical and has to have a positive effect on students’ learning (2). The conventional, subjective assessment methods used by teachers, especially in the realm of clinical skills, are one of the learners’ main concerns regarding educational justice, and perceived unfairness can reduce their motivation for learning (3). Selecting poor assessment methods can lead to passive, habitual, and repetitive learning, which is sometimes followed by a rapid loss of knowledge and an inability to apply it in real situations (2). Nowadays, multi-purpose, multi-faceted tests that evaluate dimensions such as knowledge, problem-solving, communication, and teamwork skills are recommended (4).

Direct observation by a clinical specialist or a faculty member is one of the most common methods of assessing students’ capabilities in dealing with patients (5). Observing and assessing learners while they perform practical procedures on patients, and providing them with appropriate feedback, helps learners acquire and improve practical skills and supports them through direct supervision of clinical care (6).

Experts have long sought valid and reliable methods to assess students’ clinical skills effectively (7). Direct observation of procedural skills (DOPS) is a common method of assessing procedural skills; it involves observing an apprentice performing a procedural skill on a patient in a real clinical situation (8). A study carried out in the British medical royal colleges showed that this method is well suited to assessing clinical procedures (9). Also, since providing feedback is one of the basic aspects of this test, it is considered to play a significant role in clinical education (10).

An important characteristic of this method is the provision of feedback to learners, as well as its structured, developmental nature. In this method, each skill is assessed repeatedly by the assessor against a checklist, and the defects are reported to the learner. Learners therefore recognize their mistakes at each observation and thereby improve and consolidate their skills. The reliability of the DOPS method in assessing radiology residents has been confirmed, and it is considered an appropriate method in this specialty owing to its provision of feedback to students and identification of their weaknesses (11).

In their study, Bagheri et al. reported DOPS to be an objective, valid, and highly reliable method for the clinical assessment of paramedical students that can promote students’ clinical skills effectively, either alone or alongside other conventional methods of clinical assessment (12).

According to the results of Kundra et al. (2014) and Delfino et al. (2013), DOPS improved the students’ scores in the clinical performance test. These studies have reported DOPS as one of the most effective educational methods by which the learners maximize their strengths and greatly minimize their weaknesses (13, 14).

Regarding the educational effect of DOPS, using this method not only motivates and encourages learners but also orients them toward learning, because the content and method of the test are directly associated with clinical performance. Given the increasing number of learners in clinical education environments and the lack of commensurate growth in staffing and educational resources, developing efficient assessment methods compatible with clinical education in each specialty is of great significance (11). Accordingly, this study aimed to determine the effect of the DOPS assessment method on the performance of radiology students in clinical settings.

2. Methods

The present trial investigated 30 sophomore radiology students taking hospital apprenticeship course 3 at the teaching hospitals of Kermanshah University of Medical Sciences in 2017. The data collection tool was a researcher-made checklist consisting of two parts: the first part covered the students’ demographic data, and the second part comprised 18 clinical procedures assessing chest imaging skills and techniques, such as patient training, selecting the appropriate radiation angle, and choosing the right cassette, based on the lesson plan of hospital apprenticeship course 3. The content validity of this checklist was confirmed by revising it according to the corrective comments of 10 expert faculty members. Its reliability was assessed by the equivalent-forms method, in which two faculty members each observed and evaluated at least five students during a radiology procedure using the DOPS method. The agreement between the trainers’ assessments was then analyzed using the kappa statistic and the intraclass correlation coefficient, which yielded a kappa coefficient of 0.6 and a correlation coefficient of 0.80.
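As a rough illustration of this agreement analysis (not the study’s actual computation, which was done in SPSS), the sketch below computes Cohen’s kappa on two raters’ categorical item ratings, with a simple correlation as a crude stand-in for the reported coefficient; all ratings are hypothetical.

```python
# Illustrative sketch: inter-rater agreement for an 18-item DOPS checklist.
# The two rating vectors are hypothetical stand-ins for two trainers who
# scored the same student (0 = poor, 1 = average, 2 = good per item).
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater_a = np.array([2, 1, 2, 0, 1, 2, 2, 1, 0, 2, 1, 1, 2, 0, 2, 1, 2, 1])
rater_b = np.array([2, 1, 1, 0, 1, 2, 2, 1, 1, 2, 1, 1, 2, 0, 2, 1, 2, 2])

# Cohen's kappa: chance-corrected agreement on the categorical item ratings.
kappa = cohen_kappa_score(rater_a, rater_b)

# Pearson correlation of the two raters' item scores, a crude proxy for the
# intraclass correlation reported in the text (which would need several
# students' total scores to estimate properly).
r = np.corrcoef(rater_a, rater_b)[0, 1]

print(f"kappa = {kappa:.2f}, correlation = {r:.2f}")
```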

All 30 students who had passed the chest X-ray course were included in the study and were divided equally into control and intervention groups by simple random sampling. Before the start of a 10-day apprenticeship course, a pretest based on the constructed checklist was given to both groups to compare the effects of the conventional and DOPS assessment methods on students’ clinical skills. The control group then underwent the apprenticeship course under the supervision of trainers, and their clinical skills were evaluated again with the checklist at the end of the course; that is, the control group was assessed before and after the apprenticeship course but did not receive the structured feedback of the DOPS method.

In the intervention group, according to the DOPS method, the students’ clinical skills were assessed in three stages during the apprenticeship course through the trainers’ direct observation of chest X-ray procedures, and purposive feedback was provided to the students after each stage to correct weak points in the radiology procedure. In the DOPS assessment method, the intervals between observations varied depending on the students’ readiness and the time needed to perform the clinical skills (Figure 1).

Figure 1. Selection and group assignment of the students in the study

Each radiology procedure was rated as good (complete procedure) with a score of 2, average (incomplete procedure) with a score of 1, and poor (no procedure) with a score of 0. The maximum final score was 36, and the minimum score was 0 for each observation. Moreover, students’ participation in the study was voluntary, and the study was conducted after taking the required permissions.
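As a minimal sketch of this scoring scheme (the function and example ratings below are illustrative, not from the study), each observation sums 18 item scores of 0, 1, or 2:

```python
# Sketch of the checklist scoring: 18 procedures, each rated
# good = 2 (complete), average = 1 (incomplete), poor = 0 (no procedure),
# giving a total between 0 and 36 per observation.
RATING_SCORES = {"good": 2, "average": 1, "poor": 0}

def dops_total(ratings):
    """Sum the item scores for one observation of the 18-item checklist."""
    assert len(ratings) == 18, "checklist has 18 clinical procedures"
    return sum(RATING_SCORES[r] for r in ratings)

# Hypothetical student: 10 complete, 5 incomplete, 3 missed procedures.
example = ["good"] * 10 + ["average"] * 5 + ["poor"] * 3
print(dops_total(example))  # 10*2 + 5*1 + 3*0 = 25
```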

Data were analyzed in SPSS (version 16) using descriptive and analytical statistics. Fisher’s exact test and the independent t-test were used to check the homogeneity of the demographic variables between the intervention and control groups. The Kolmogorov-Smirnov test was used to check the normality of the quantitative variables and DOPS scores. The independent t-test was used to compare the mean DOPS scores between the control and intervention groups, and the paired t-test was run to compare the mean DOPS scores before and after the intervention. P < 0.05 was considered significant for all tests.
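The study ran these tests in SPSS; the sketch below reproduces the same pipeline in Python with SciPy, using randomly generated scores as stand-ins for the real data, so the printed P values are meaningless.

```python
# Sketch of the analysis pipeline with SciPy; the score arrays are random
# stand-ins for the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre_interv = rng.normal(15, 4.3, 15)    # hypothetical pretest, intervention
post_interv = rng.normal(25, 5.2, 15)   # hypothetical posttest, intervention
post_control = rng.normal(17, 3.1, 15)  # hypothetical posttest, control

# Homogeneity of a categorical demographic (gender counts from Table 1):
# Fisher's exact test on the 2x2 contingency table.
odds, p_fisher = stats.fisher_exact([[4, 4], [11, 11]])

# Normality check: Kolmogorov-Smirnov against a normal fitted to the sample.
ks_stat, ks_p = stats.kstest(
    post_interv, "norm", args=(post_interv.mean(), post_interv.std(ddof=1))
)

# Between-group comparison: independent t-test.
t_ind, p_ind = stats.ttest_ind(post_interv, post_control)

# Within-group pre/post comparison: paired t-test.
t_rel, p_rel = stats.ttest_rel(pre_interv, post_interv)

print(f"Fisher p = {p_fisher:.3f}, KS p = {ks_p:.3f}, "
      f"independent t p = {p_ind:.3f}, paired t p = {p_rel:.3f}")
```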

3. Results

In this study, 30 students with a mean age of 21.9 ± 1.07 years were divided equally into the control and intervention groups; 73.3% were male. The two groups were homogeneous in terms of demographic variables such as age, gender, grade point average (GPA), and satisfaction with major (Table 1).

Table 1. Frequency Distribution and Mean Scores of Demographic Variables of Students in the Intervention and Control Groups (Values Are No. (%) or Mean ± SD)

Variable                            Intervention    Control         P Value
Gender                                                              0.999
  Female                            4 (26.7)        4 (26.7)
  Male                              11 (73.3)       11 (73.3)
Satisfaction with major                                             0.213
  Satisfied or partially satisfied  9 (60)          12 (80)
  Dissatisfied                      6 (40)          3 (20)
Age, y                              21.93 ± 1.22    21.8 ± 0.94     0.740
GPA                                 15.22 ± 1.34    15.71 ± 0.97    0.266

P values: Fisher’s exact test for gender and satisfaction with major; independent t-test for age and GPA.

The results of Shapiro-Wilk test showed a normal distribution of the DOPS scores in both the intervention and control groups in all three stages (P > 0.05).

The results of the repeated measures analysis indicated a significant trend of change in the intervention group for the mean scores of radiology students’ clinical skills in the three stages by the DOPS method (P = 0.001). These changes were not significant in the control group (P = 0.174). The comparison of the mean scores of assessment of radiology students’ clinical skills by independent t-test showed no significant difference between the control and intervention groups in the first (P = 0.125) and second stage (P = 0.879), but a significant difference was found in the third stage (P = 0.001) (Table 2).

Table 2. Comparison of the Mean Scores (Mean ± SD) of Radiology Students’ Clinical Skills Assessment Between the Intervention and Control Groups in All Three Observation Stages by the DOPS Method

Group            First Stage    Second Stage    Third Stage    P Value a
Intervention     14.9 ± 4.3     17.46 ± 5.71    25 ± 5.25      0.001
Control          17.2 ± 3.5     16.93 ± 3.6     17.2 ± 3.09    0.174
P value b        0.125          0.762           0.001

a Trend of change across the three stages within each group (repeated measures analysis).
b Comparison between the intervention and control groups at each stage (independent t-test).

By the Bonferroni post hoc test, no significant difference was seen in the DOPS mean scores of students’ clinical skills in the intervention group between the first and second stages (P = 0.786), but there were significant differences between the first and third stages (P = 0.001) and between the second and third stages (P = 0.001) (Table 3).

Table 3. Comparison of the Scores of Students’ Clinical Skills Between the Observation Stages for the Chest X-Ray Procedure in the Intervention Group

Compared stages         1 and 2         2 and 3
Number                  15              15
Mean difference ± SD    -1.13 ± 0.99    -3.90 ± 0.951
P value                 0.786           0.001
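As a hedged sketch of the pairwise stage comparisons in Table 3 (paired t-tests with a Bonferroni correction; the score arrays below are random stand-ins drawn around the means and SDs reported in Table 2, not the study’s data):

```python
# Sketch: pairwise comparisons between observation stages with a
# Bonferroni correction, as in Table 3.
from itertools import combinations
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
stages = {
    1: rng.normal(14.9, 4.3, 15),   # hypothetical first-stage scores
    2: rng.normal(17.5, 5.7, 15),   # hypothetical second-stage scores
    3: rng.normal(25.0, 5.3, 15),   # hypothetical third-stage scores
}

pairs = list(combinations(stages, 2))  # (1, 2), (1, 3), (2, 3)
for a, b in pairs:
    t, p = stats.ttest_rel(stages[a], stages[b])
    p_adj = min(p * len(pairs), 1.0)   # Bonferroni: scale by number of tests
    print(f"stages {a} vs {b}: adjusted p = {p_adj:.3f}")
```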

4. Discussion

The results of this study showed that the chest X-ray skills of the radiology students in the intervention group improved significantly after the DOPS assessment method was administered, compared with the control group students assessed by the conventional method. This could be due to the structured and timely feedback on students’ weaknesses while they performed clinical procedures in real clinical settings, which is the main characteristic of the DOPS assessment method.

The results of the study by Nooreddini et al. showed that the mean score of students assessed by the DOPS method was significantly higher than that of the control group (15). Profanter et al. reported the efficacy of rapid feedback by the examiner in increasing the students’ clinical skills, which can promote the safety and health of patients (16). Also, Shahgheibi et al. reported DOPS as a new, active, multi-faceted assessment method in clinical education that leads to significant changes in the learners’ clinical skills compared with the conventional assessment methods. Their findings also showed a significant difference between the mean scores of students’ clinical performance in the three stages of DOPS assessment method during the apprenticeship course (17).

However, in the present study, the mean scores of students’ clinical performance assessed by the DOPS method increased significantly only between the second and third stages. This may reflect the complexity and diversity of the steps of a standard radiology procedure, which requires more practice of the technique, along with appropriate feedback and informed intervention, to eliminate students’ performance weaknesses.

Moreover, Chen et al. showed that the DOPS method, with its focus on providing feedback during the student-patient encounter, promotes students’ competence and self-confidence (18). Cobb et al. found that students assessed by the DOPS method took a deeper approach to clinical skills and achieved higher scores (19). In a systematic review of 106 articles on methods of assessing clinical skills, Ahmed et al. concluded that none of the assessment methods was completely valid and reliable, each having its own advantages; hence, they suggested combining assessment methods to examine students’ clinical skills (20).

4.1. Conclusion

The findings of this study showed that direct observation and provision of structured feedback to students during clinical education by DOPS assessment method significantly improves the practical and clinical skills of radiology students at clinical centers and can be used as a more effective method than conventional clinical assessment methods.

Acknowledgements

References

  • 1.

Hoseini B, Jafarnejad F, Mazlom S, Foroghi Pour M, Karimi Mouneghi H. [Midwifery Students' Satisfaction with Logbook as a Clinical Assessment Means in Mashhad University of Medical Sciences]. Iran J Med Educ. 2012;11(8):933-41. Persian.

  • 2.

    Wilkinson JR, Crossley JG, Wragg A, Mills P, Cowan G, Wade W. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ. 2008;42(4):364-73. [PubMed ID: 18338989]. https://doi.org/10.1111/j.1365-2923.2008.03010.x.

  • 3.

    Jalalvandi M, Amirian P, Tohidnia MR, Nemati Kivenani A. [Educational justice from the perspective of Kermanshah paramedical students in 2014]. J Med Educ. 2016;11(1):51-60. Persian.

  • 4.

    Norcini JJ, McKinley DW. Assessment methods in medical education. Teach Teach Educ. 2007;23(3):239-50. https://doi.org/10.1016/j.tate.2006.12.021.

  • 5.

    Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387-96. [PubMed ID: 17251535]. https://doi.org/10.1056/NEJMra054784.

  • 6.

    Duffy FD, Gordon GH, Whelan G, Cole-Kelly K, Frankel R, Buffone N, et al. Assessing competence in communication and interpersonal skills: the Kalamazoo II report. Acad Med. 2004;79(6):495-507. [PubMed ID: 15165967]. https://doi.org/10.1097/00001888-200406000-00002.

  • 7.

    Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ. 2002;36(9):800-4. [PubMed ID: 12354241]. https://doi.org/10.1046/j.1365-2923.2002.01294.x.

  • 8.

Sahebalzamani M, Farahani H, Jahantigh M. [Validity and reliability of direct observation of procedural skills in evaluating the clinical skills of nursing students of Zahedan nursing and midwifery school]. Zahedan J Res Med Sci. 2012;14(2):76-81. Persian.

  • 9.

    Wilkinson J, Benjamin A, Wade W. Assessing the performance of doctors in training. BMJ. 2003;327(7416):s91-2. [PubMed ID: 14500458]. https://doi.org/10.1136/bmj.327.7416.s91.

  • 10.

    Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ. 2003;37(9):830-7. [PubMed ID: 14506816]. https://doi.org/10.1046/j.1365-2923.2003.01594.x.

  • 11.

    Bari V. Direct observation of procedural skills in radiology. AJR Am J Roentgenol. 2010;195(1):W14-8. [PubMed ID: 20566775]. https://doi.org/10.2214/AJR.09.4068.

  • 12.

    Bagheri M, Sadeghnezhad M, Sayyadee T, Hajiabadi F. [The effect of direct observation of procedural skills (DOPS) evaluation method on learning clinical skills among emergency medicine students]. Iran J Med Educ. 2014;13(12):1073-81. Persian.

  • 13.

    Kundra S, Singh T. Feasibility and acceptability of direct observation of procedural skills to improve procedural skills. Indian Pediatr. 2014;51(1):59-60. [PubMed ID: 24561468]. https://doi.org/10.1007/s13312-014-0327-x.

  • 14.

    Delfino AE, Altermatt F, Echevarria G. The Use of DOPS as a Self-Assessment Instrument in the Chilean Anesthetic Context. The Anesthesiologists Annual Meeting. 2013.

  • 15.

Nooreddini A, Sedaghat S, Sanagu A, Hoshyari H, Cheraghian B. [Effect of clinical skills evaluation applied by direct observation clinical skills (DOPS) on the clinical performance of junior nursing students]. J Res Dev Nurs Midwifery. 2015-2016;12(1). Persian.

  • 16.

    Profanter C, Perathoner A. DOPS (Direct Observation of Procedural Skills) in undergraduate skills-lab: Does it work? Analysis of skills-performance and curricular side effects. GMS Z Med Ausbild. 2015;32(4):Doc45. [PubMed ID: 26483858]. [PubMed Central ID: PMC4606486]. https://doi.org/10.3205/zma000987.

  • 17.

    Shah Gheibi SH, Pooladi A, Bahram Rezaie M, Farhadifar F, Khatibi R. [Evaluation of the effects of direct observation of procedural skills (DOPS) on clinical externship students' learning level in obstetrics ward of Kurdistan University of Medical Sciences]. J Med Educ. 2009;13(1,2):29-33. Persian.

  • 18.

    Chen W, Liao SC, Tsai CH, Huang CC, Lin CC, Tsai CH. Clinical skills in final-year medical students: the relationship between self-reported confidence and direct observation by faculty or residents. Ann Acad Med Singapore. 2008;37(1):3-8. [PubMed ID: 18265890].

  • 19.

    Cobb KA, Brown G, Jaarsma DA, Hammond RA. The educational impact of assessment: a comparison of DOPS and MCQs. Med Teach. 2013;35(11):e1598-607. [PubMed ID: 23808609]. [PubMed Central ID: PMC3809925]. https://doi.org/10.3109/0142159X.2013.803061.

  • 20.

    Ahmed K, Miskovic D, Darzi A, Athanasiou T, Hanna GB. Observational tools for assessment of procedural skills: a systematic review. Am J Surg. 2011;202(4):469-480 e6. [PubMed ID: 21798511]. https://doi.org/10.1016/j.amjsurg.2010.10.020.