Child Health Essential Skills Stations (CHESS): Development, Implementation, and Evaluation of an Undergraduate Child Health Clinical Skills Course

Authors: Alexander James Harper 1, 2, *; Kyriacos Gregoriou 3, 2; Caitlin Patterson 4, 2; Prashant Kumar 1, 2

1 Department of Medical Education, NHS Greater Glasgow and Clyde, University of Glasgow, UK
2 School of Medicine, Dentistry and Nursing, University of Glasgow, UK
3 Department of Medical Education, NHS Ayrshire and Arran, University of Glasgow, UK
4 Department of Medical Education, NHS Lanarkshire, University of Glasgow, UK

how to cite: Harper A J, Gregoriou K, Patterson C, Kumar P. Child Health Essential Skills Stations (CHESS): Development, Implementation, and Evaluation of an Undergraduate Child Health Clinical Skills Course. J Med Edu. 2023;22(1):e132063. https://doi.org/10.5812/jme-132063.

Abstract

Background:

Teaching undergraduate medical students clinical skills in a child health setting is a particular challenge for clinical educators. Students spend less time with pediatric patients and have fewer opportunities to practice clinical skills. The coronavirus (COVID-19) pandemic further reduced students’ opportunities to observe and practice skills in the workplace. This has necessitated a greater shift towards teaching skills in a "skills lab" setting, which allows for simulated practice in a safe environment. This study reports the design, implementation, and evaluation of a standardized course that utilizes the "skills lab" to train undergraduate medical students in five clinical skills important to child health.

Objectives:

This study aimed to implement and evaluate a standardized undergraduate clinical skills course for child health and improve students' confidence in performing child health clinical skills.

Methods:

Evaluations were carried out over approximately one academic year, with a total of 174 participants from a single medical school in the United Kingdom. Qualitative and quantitative data were collected, examining students' self-reported confidence (pre- and post-course) along with free-text course evaluations. A paired t-test was used to calculate the mean difference in students' pre- and post-course confidence scores. Qualitative evaluations were analyzed for themes using framework analysis.

Results:

The students had greater confidence in all measured learning outcomes following the course. Qualitative data, examined using framework analysis, suggested that the course was valued by students, who felt it was relevant to their future practice. Numerous written comments suggested particular content and teaching methods that were strengths of the course, including practical elements, small group teaching, and feedback from tutors.

Conclusions:

Implementing a child health clinical skills course in a skills lab setting is feasible and valued by students. The course increased the self-reported confidence of the studied cohort and might therefore support them in practicing these skills with actual patients. Further studies are required to determine whether these effects demonstrate longevity and whether they translate to increased competence in performing the taught skills.

1. Background

The advent of the patient safety movement has encouraged medical educators to find new, controlled-practice environments in which to train clinical skills (1). The evidence suggests that “skills lab” teaching for undergraduates is associated with a greater number of skills being demonstrated during placements (2) and improved long-term performance (3). Therefore, the traditional mantra of "see one, do one, teach one" should be confined to the past.

The integration of pre-clinical and clinical portions of medical school curricula has seen students enter the workplace sooner, with classroom, laboratory, and online-based teaching often spread throughout the duration of degree courses (4). While acknowledging the benefits of this approach, the fragmentation of time spent in clinical settings presents a challenge for educators, who must try to ensure undergraduates gain the necessary exposure and opportunities concerning clinical skills. Recent evidence suggests that the disruption of clinical placements by the coronavirus (COVID-19) pandemic has resulted in United Kingdom (UK) medical students feeling less prepared to undertake their roles as new doctors (5) and has necessitated a revamping of the way clinical skills are taught in the health professions (6). A recent review demonstrated that UK medical students feel less prepared for certain aspects of practice, in particular prescribing and certain procedural skills (7). This makes the work performed to prepare medical students for the clinical workplace especially important at this critical juncture.

Professional bodies mandate that UK undergraduate medical students show competence in practical skills relating to child health (8, 9). However, final-year placements in child health might be the only clinical exposure students have in this specialty before graduation (10). This puts an onus on the child health placement to provide students with the necessary opportunities to learn and practice core skills before graduating as new doctors. With the knowledge that exposure to teaching and skills is highly variable (11), there is a move to standardize child health teaching across the UK (12), and it is our role as educators to ensure students all gain opportunities to practice relevant clinical skills.

Historically, the opportunities to attempt necessary procedural skills on placement have been lacking (2, 11, 13), producing graduates who did not feel prepared for the clinical environment (7). However, bespoke clinical skills training in the laboratory setting might improve students' confidence in performing skills (2, 11) and improve objective competence beyond what traditional teaching methods achieve (14). One retrospective study reported that new doctors who had undergone a procedural skills course as undergraduates had significantly higher self-reported competence than those who had not (11).

At this UK medical school, final-year medical students rotate through a variety of five-week specialty placements, one of which is child health. During this placement, students attend both clinical (outpatient clinic, inpatient ward, and acute admissions) and non-clinical (lectures, seminars, online learning, and personal study) activities. The students are expected to shadow clinical staff, take medical histories, examine patients, and be observed demonstrating specific clinical skills that form part of the medical school curriculum. These skills include counseling on inhaler technique, counseling on the use of peak flow meters, urinalysis, and the measurement of vital signs. The students have a responsibility to ensure skills logbooks are completed and signed off by an appropriately skilled healthcare worker, signifying that they have been observed completing each skill competently.

There is evidence that a skills lab approach provides students with more confidence to apply skills in relevant clinical environments (2). Therefore, this study developed, implemented, and evaluated the Child Health Essential Skills Stations (CHESS) course for medical students to attend during their child health clinical placements. The course was piloted from October 2020 to April 2021 and continues to this day. Students’ self-reported confidence in achieving the course learning outcomes was measured before and after the course, alongside the qualitative free-text evaluations of the course.

2. Objectives

This study aimed to implement and evaluate a standardized undergraduate clinical skills course for child health and improve students’ confidence in performing child health-related clinical skills.

3. Methods

3.1. Course Design

The General Medical Council (GMC) 'Outcomes for Graduates: Practical Skills and Procedures' document (8) and the Royal College of Paediatrics and Child Health (RCPCH) 'Undergraduate Curriculum for Child Health' (9) set out the procedures that UK medical graduates are expected to perform competently. Based on these standards, and after careful consideration of relevance to the medical school curriculum, resources, and feasibility, five skills were chosen for inclusion in the course (Table 1). These topics were refined through consultation with the university’s sub-dean for child health and adapted iteratively in response to student feedback.

Table 1.

List of five skills selected for inclusion in the CHESS course, mapped to the two UK national curricula for medical students/graduates (8, 9)

| Skills Station Included in the CHESS Course | On GMC Outcomes for Graduates | On RCPCH Undergraduate Curriculum for Child Health |
| --- | --- | --- |
| 1. Neonatal hip examination | No | Yes |
| 2. Counseling on the inhaler and peak flow meter technique | Yes | Yes |
| 3. Basic pediatric prescribing | No | Yes |
| 4. Pediatric fluid management and prescription of fluids | No | Yes |
| 5. Pediatric Early Warning Score (PEWS) use and urine collection | Yes | Yes |

During the course, each skill is taught in small groups (of two to five students) using standardized teaching materials and equipment, with each station lasting 30 minutes. Facilitators initially guide students through relevant concepts concerning the performance of a particular skill. Students can then practice the skill in a simulated manner, with feedback and support from facilitators. Stations were facilitated by either one or two pediatric clinicians of varying seniority.

The CHESS course is run twice per child health rotation (i.e., twice every 5 weeks). This allows each student on their child health rotation to participate once. By integrating the course within the child health rotation, students can practice these learned skills concurrently within the clinical environment. The integration of clinical placements with relevant practical skills has been shown to be feasible and effective at other institutions (15).

3.2. Participants

The target population identified for the study were medical students undertaking clinical placements in child health. Therefore, a convenience sample was chosen within the local region where the course was piloted. There were no specific inclusion or exclusion criteria. At this institution, medical students have a single child health rotation during their final year of study. Every final-year medical student was given a place on the CHESS clinical skills course as part of their timetabled activities during their child health placement. Each student was invited to attend the course only once. Between 16 and 18 students attended each course. All the students attending the first 10 consecutive courses between October 2020 and April 2021 were invited to participate in the course evaluation.

3.3. Course Evaluation

Evaluations consisted of two parts as follows:

1. A self-reported confidence questionnaire was administered before and after the course.

2. Free-text qualitative feedback questionnaires were administered after the course in order to explore students’ perceptions of the course and their recommendations for future improvements to the course.

Each student was asked to score their confidence in performing the skills immediately before and after the course using a five-point Likert-type scale ranging from “not at all confident” to “very confident”. Selections were coded as numerical values from 1 to 5. The items were constructed directly from the intended learning outcomes (ILOs). For example, self-reported confidence in the ILO “practice counseling on the correct use of an inhaler + spacer with a colleague” was measured using the item “How confident do you feel counseling patients on the correct use of an inhaler?”. Therefore, the questionnaires demonstrated face validity.
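To make the coding step concrete, the snippet below sketches how such Likert selections could be mapped to numerical codes. Only the two endpoint anchors are taken from the questionnaire described above; the intermediate labels are hypothetical.

```python
# Hypothetical Likert coding. Only the endpoint anchors ("not at all
# confident", "very confident") come from the questionnaire described above;
# the three intermediate labels are illustrative assumptions.
LIKERT_CODES = {
    "not at all confident": 1,
    "slightly confident": 2,    # assumed label
    "moderately confident": 3,  # assumed label
    "fairly confident": 4,      # assumed label
    "very confident": 5,
}

responses = ["not at all confident", "fairly confident", "very confident"]
coded = [LIKERT_CODES[r] for r in responses]
print(coded)  # [1, 4, 5]
```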

Following each course, attending students were also invited to complete a written, free-text evaluation under four headings (Appendix 1).

3.4. Data Collection

The data were collected using anonymous paper questionnaires given to every student attending the CHESS course between October 2020 and April 2021. Pre-course confidence scores were obtained immediately before commencing the course. Post-course confidence scores and written qualitative feedback were obtained immediately following the course before leaving the course venue. The students were not interviewed as part of this study, and qualitative data were extracted from students’ written free-text responses (Appendix 1). The questionnaires were handed out and collected from the students by administrative staff.

3.5. Data Analysis

Raw data were entered into Microsoft Excel (version 2301). The internal consistency of the pre- and post-course confidence questionnaires from the first two pilot courses was assessed using Cronbach’s alpha, calculated manually in Excel. Mean differences between pre- and post-course confidence scores were calculated using a paired t-test. This quasi-experimental comparison of pre- and post-course scores was felt to improve the internal validity of the study.
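As an illustration of these two calculations, the following is a minimal sketch in Python (rather than the Excel workflow actually used), applied to entirely hypothetical score data: Cronbach's alpha computed from its standard formula, and a paired t-test with a 95% confidence interval for the mean pre/post difference.

```python
# Minimal sketch of the two analyses described above, in Python rather than
# the Excel workflow actually used. All score data here are hypothetical.
import numpy as np
from scipy import stats

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of 1-5 codes."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical questionnaire matrix: 6 respondents x 4 items, coded 1-5.
questionnaire = np.array([
    [2, 3, 2, 2],
    [4, 4, 3, 4],
    [3, 3, 3, 2],
    [5, 4, 4, 5],
    [2, 2, 1, 2],
    [4, 5, 4, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(questionnaire):.3f}")

# Hypothetical paired pre-/post-course confidence scores for one learning
# outcome, one pair per student.
pre = np.array([2, 1, 3, 2, 2, 1, 3, 2])
post = np.array([4, 3, 5, 4, 4, 3, 4, 4])

t_stat, p_value = stats.ttest_rel(post, pre)  # paired t-test
diff = post - pre
ci_low, ci_high = stats.t.interval(           # 95% CI for the mean difference
    0.95, len(diff) - 1, loc=diff.mean(), scale=stats.sem(diff)
)
print(f"mean difference = {diff.mean():+.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f}), P = {p_value:.4f}")
```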

Qualitative data collected from students’ free-text responses were analyzed using framework analysis, as described by Ward et al. (16). One of the authors (AH) formatted and analyzed the data using Microsoft Word (version 2108). Analysis was carried out through the lens of a realist epistemology, aligning with the chosen method for analysis (16). This study explored students’ perceptions regarding the quality and effectiveness of the course, the aspects which they felt could be improved upon, and the features that might be translatable to other courses. Framework analysis is commonly used to answer questions that have practical applications, such as the aforementioned ones (16). It also employs a clearly defined, structured approach, resulting in a series of notes and tables generated during the analytic process (16), thereby demonstrating dependability in the qualitative analysis (17).

Given that these data have been analyzed by a single researcher (AH), a short reflexive statement on the researcher’s background is relevant to the confirmability of the data (17). AH is a UK medical school graduate and currently works as a pediatric clinician with experience in undergraduate medical education. As someone with significant lived experience of the institution, curriculum, and student culture, AH is well-placed to interpret the students’ qualitative data on the CHESS course.

3.6. Ethical Approval

Formal ethical approval was deemed unnecessary for this study by the Institutional Ethics Committee (University of Glasgow, Scotland). Course evaluations were collected anonymously, and the data were processed with participant anonymity protected.

4. Results

A total of 184 medical students were invited to attend and evaluate the CHESS course between October 2020 and April 2021. Of these, 174 students participated in the course evaluation, completing part or all of the written evaluation (94.6% response rate). Items with missing responses (for pre- and/or post-course confidence scores) were excluded from the analysis of that particular item. The total number of complete responses for each item is shown in Table 2.

Table 2.

Mean Difference in Students' Self-reported Confidence Pre- and Post-course for Each of the Five Skills Stations, Divided Into Their Constituent Learning Outcomes

| Station and Learning Outcome | Mean Difference in Self-reported Confidence, Pre- vs. Post-course (95% CI) | Number of Complete Pre- and Post-responses | P-value |
| --- | --- | --- | --- |
| Drug prescription | | | |
| Prescribing analgesia and antibiotics | +1.98 (1.82 - 2.14) | 140 | < 0.0001 |
| Fluid prescription | | | |
| Recognizing signs of dehydration | +1.16 (1.02 - 1.29) | 140 | < 0.0001 |
| Prescribing fluids intravenously | +1.36 (1.21 - 1.52) | 140 | < 0.0001 |
| PEWS and urine collection | | | |
| Plotting PEWS score using the chart | +1.25 (1.05 - 1.46) | 75 a | < 0.0001 |
| Interpreting PEWS chart | +1.15 (0.93 - 1.36) | 75 a | < 0.0001 |
| Identifying indications for urinalysis | +1.50 (1.35 - 1.65) | 145 | < 0.0001 |
| Interpreting urinalysis | +1.24 (1.09 - 1.39) | 140 | < 0.0001 |
| Undertaking urine collection | +1.69 (1.53 - 1.86) | 140 | < 0.0001 |
| Inhaler and peak flow meter | | | |
| Counseling on the use of peak flow meter technique | +1.19 (1.03 - 1.35) | 140 | < 0.0001 |
| Counseling on the use of an inhaler (with spacer) | +1.32 (1.17 - 1.47) | 139 | < 0.0001 |
| Neonatal hip examination | | | |
| Performing newborn hip examination | +1.96 (1.79 - 2.13) | 126 | < 0.0001 |
| Identifying abnormal findings | +1.87 (1.72 - 2.02) | 145 | < 0.0001 |

Quantitative data from the first two CHESS courses were analyzed using Cronbach’s alpha to verify the internal consistency of the questionnaires. The two courses yielded 36 responses to each questionnaire, with alpha values of 0.706 and 0.785, respectively.

A significant increase in students’ self-reported confidence was observed for all ILOs measured after attendance at the course (P < 0.0001) (Table 2). This finding was particularly evident for outcomes relating to prescribing and neonatal hip examinations.

The framework analysis of written student evaluations identified four themes, namely applicability, practice, facilitation, and timing. Table 3 shows the four themes and their subthemes, derived from the framework analysis of student free-text evaluations (see questions in Appendix 1), alongside supporting exemplar comments. Changes made to the course based upon this feedback included splitting fluids and prescribing into separate stations, introducing the Pediatric Early Warning Score (PEWS) into the course, and removing the practice of urinalysis.

Table 3.

Four Themes and Relevant Subthemes Derived from Framework Analysis of Student Free-text Evaluations with Supporting Exemplar Comments a

| Theme | Subtheme | Quotation |
| --- | --- | --- |
| Applicability | Useful and relevant to practice | “Prescribing station - great teaching on a very important subject” |
| | | “I found the newborn hip examination good practice ahead of my baby check day” |
| | Confidence building | “The fluid station was very helpful and made me feel a lot more confident…” |
| | | “More clear in giving instructions and techniques. Good session. Built confidence.” |
| | Case-based | “Going over fluid and medication prescribing with practical examples”* |
| | | “Clinical scenarios is very helpful in learning common hospital scenarios” |
| | | “Good to do a case based teaching session” |
| | Relevant to exams and revision | “The course covered core skills that are relevant to OSCEs” |
| | | “very relevant to exams” |
| | Completed skill before | “urinalysis (we’ve had so much teaching on this already)”# |
| | | “Urinalysis and peak flow - I was confident in these prior to the course”# |
| Practice | Hip model | “was good to feel what an abnormal hip was like” |
| | | “Hip examination – hard to get technique right from online resources online”* |
| | Using a drug card (Kardex) and British National Formulary (BNF) | “It was useful to practice prescribing” |
| | | “Going over fluid and medication prescribing with practical examples”* |
| | | “Also learning how to prescribe fluids and medications in real kardexes”* |
| | | “Using a BNF to calculate medication dose”* |
| | Lack of prior prescribing opportunity | “Prescribing as don’t get much practice on wards”* |
| | | “Really good, particularly prescribing as this is something we don’t get much of” |
| | High levels of interactivity | “Having each skill demonstrated and then a go was helpful.” |
| | | “really interactive.” |
| Facilitation | High-quality facilitators | “Brilliant quality of teaching, all tutors very engaging.” |
| | | “helpful facilitators who share clinical tips” |
| | | “It was useful to practice prescribing and get immediate feedback.” |
| | Small group work | “It is good in being in small groups” |
| | | “Small groups, good to ask and answer questions.”* |
| | Opportunity to ask questions | “Lots of opportunities to ask questions.” |
| Timing | Rushed or more time needed | “Not enough time at clinical stations – very helpful but felt rushed” |
| | | “Difficult to fit everything into the fluids and prescribing stations”# |
| | Suggestion of prior reading | “Going through the lectures during stations – would have preferred to have read these prior to attending” |
| | | “I didn’t have enough theory knowledge on DDH before coming in so I struggled with the hip examination station”# |

5. Discussion

Regulatory bodies require graduating doctors in the UK to meet set criteria, including the ability to perform clinical skills relevant to child health (8, 9). Procedural skill competency has been identified multiple times as a weakness in curricula for training doctors (2, 7, 11). Inconsistent opportunities to observe or attempt procedures (2, 11), minimal clinical exposure in child health settings (10, 13), and a dearth of evidence on how to transfer skills training into the clinical environment (3, 14) all act as barriers to students obtaining procedural competencies relating to child health. Disruption to attachments during the COVID-19 pandemic has exacerbated these existing deficiencies and created new challenges for educators training medical students in procedural skills (6). Some clinicians in training feel relatively unprepared to perform the skills expected of them (18). Moreover, some literature suggests that junior doctors are objectively under-skilled in certain procedures (7). There is therefore a rationale to move away from traditional skills teaching and build the evidence base for effective skills training.

The literature describes various interventions that have targeted the training of clinical skills, including logbooks (19), skills labs (20, 21), deliberate practice (14), integration within curricula (15), and the use of realistic clinical scenarios to train skills (22). The level and type of evidence available for such interventions varies. Some interventions proved popular with participants (20), while few demonstrated an objective improvement in skill performance (14).

Perhaps more important than simply demonstrating better skill performance within the teaching environment is showing that learning transfers to the clinical environment. Miller’s seminal pyramid model of competency (23) puts “does” at its peak; ideally, therefore, training will see students able to apply their skills training in clinical environments. The original intervention for enabling this is the skills logbook. A logbook acts as a guide for medical students, mandating that an experienced practitioner signs them off as competently demonstrating a list of key skills on clinical placements. However, data collected in the Netherlands suggest that simply providing a checklist of skills for students to tick off is unlikely to expose them to all of the necessary skills (2). Another study reviewed the logbook element of an undergraduate medical placement, finding that few students could complete all of the suggested skills, while many completed extracurricular skills not in the logbook (19). The authors argue that completed logbooks do not necessarily confer competence and that clinical placements should therefore supplement practical experiences with other forms of training (19). The medical students in the present study had a logbook of procedural skills to complete during their child health placement. Given the finding that a logbook by itself is insufficient to train students to competence, the rationale of the current intervention was to complement this existing logbook. Clinical skills teaching in labs has been linked to more skills being performed by students whilst observed in the clinical environment (2). This phenomenon might be related to the confidence gained through formal practice and feedback from tutors in a skills lab.

A 2018 study by O’Donoghue et al. (13) investigated 85 undergraduate students performing child health-related clinical skills in a skills lab. It demonstrated that self-reported confidence before performing a skill did not correlate with students' objective performance (13). Focus groups allowed for discussion of why this had occurred; participating students suggested that they had few opportunities for skills practice with actual patients in child health and that these tasks were particularly complex (13). That study overlaps with our own in that it examined pediatric prescribing, including intravenous fluid calculations. In addition, the present study included other procedural skills that were patient-facing rather than simply paper-based.

O’Donoghue et al. (13) concluded that standardizing teaching and providing formative feedback on skills would be the best way to train competence in these undergraduate clinical skills. Our study complements these findings by demonstrating a standardized course, delivered to all students in the region, wherein formative feedback was provided on each skill. While our findings do not measure competence, qualitative feedback suggests that receiving immediate, individual feedback from a trained clinician is a strength of the CHESS course. O’Donoghue et al. (13) also highlighted the dangers of inconsistent teaching, with particular relevance to pediatric fluid prescribing. The present study tackled this issue by standardizing the teaching of this topic across the entire region within the CHESS course. Therefore, although the current study did not measure competence, it is hoped that the course improved students’ competence in these skills and gave them the confidence to demonstrate these skills with actual patients under appropriate supervision.

In this cohort of final-year medical students, a statistically significant increase in self-reported confidence was observed for all course learning outcomes. Increased confidence is the first step towards enabling students to grasp real opportunities to demonstrate these skills on placement. The largest differences in self-reported confidence were observed for the prescribing and neonatal hip examination outcomes. A theme within the qualitative data was the high value of teaching in these domains, where students might have limited prior experience or practice. The students commented that the course was useful, "... particularly prescribing as this is something we don't get much of" and "Hip examination - hard to get the technique right from online resources online".

Triangulating these data with the self-reported confidence scores suggests that the large difference in confidence may be attributable to a particular lack of confidence in these areas to begin with. One study on skills logbooks for undergraduates explains that, despite providing a checklist of skills, student exposure inevitably varies (19); those designing medical curricula should take this into account. The current study’s data have further implications for educators, suggesting a greater focus on areas where medical students have few opportunities to practice and therefore lack confidence.

The patient safety movement has necessitated safe training environments in which procedural skills can be practiced via simulation (18). Several authors have described the process of transitioning medical student teaching to skills labs (20, 21, 24) and emphasized the opportunity for repeated practice and the ability to make mistakes in this environment (1, 21). Issenberg et al. (25) summarized the evidence for making the best use of high-fidelity simulation training, much of which applies to teaching procedural skills. The CHESS course operationalized several of the recommendations described by Issenberg et al. (25) as making high-fidelity simulation exercises successful. For example, there is a strong evidence base for feedback in simulated exercises. This was a strong theme in the qualitative feedback from the students in this study, who noted that “It was useful to practice prescribing and get immediate feedback”. Beyond this, the students specified that they valued small group sizes and time to ask questions; these aspects of the CHESS course may further enhance opportunities for immediate, individual feedback. Another undergraduate clinical skills course similarly showed that students particularly valued group learning, clinically based scenarios, and specific feedback (26). For other educators planning simulated skills training, these techniques for enabling feedback are assets to a course.

The present study demonstrated the feasibility of integrating a clinical skills session into a clinical attachment for child health. Curricular integration is another technique that Issenberg et al. (25) note improves the outcomes of simulation. One article discusses the reform of a clinical skills curriculum to integrate fully with the rest of the taught curriculum (15). This offers a theoretical advantage for learners in terms of cognitive load, as fewer new ideas are presented each semester and the skills taught align with the physiology being taught at the same time (27). Moreover, as discussed above, by providing simulated learning of skills within a clinical placement, students might then have more confidence to perform such procedures on the wards and reach the “does” stage of Miller’s (23) model of competency (2).

It is known from previous studies that a level of competence is not always maintained after a skill is learned (28). Offiah et al. (28) describe “skills decay” in a prospective cohort of medical students following a skills course, noting that this is closely related to how many times a skill was performed after the course. They suggest using a logbook, along with an increased curriculum focus on providing students with opportunities to perform skills (28). Meanwhile, another group teaching their course across 4 weeks encouraged students to practice taught skills on patients in between the sessions, likening this to a “spiral learning” model wherein concepts are revisited in increasing detail (26). With undergraduates increasingly under-exposed and under-confident in practical skills, simulated practice can act as a springboard to encourage students to seek and attempt procedures on patients (29). Similarly, the CHESS course complements the current child health curriculum and logbook of procedures that students complete during their placement.

This study has important strengths to note. Firstly, while a convenience sample was used, the large sample (174 of 184 invited students) and high response rate (94.6%) indicate that the study population closely matches the target population, thereby reducing the risk of selection bias. Secondly, the study used a mixed-methods approach: the ability to analyze both quantitative and qualitative data provides some concurrent validity across measures, and the qualitative analysis deepens our understanding of the quantitative results. Thirdly, using Cronbach’s alpha, the questionnaires demonstrated a high level of internal consistency, thereby suggesting reliability. Finally, this study reported the various methods used to ensure the trustworthiness of its qualitative data analysis, including measures to support honest responses from students, triangulation with quantitative data, a reflexive statement on the researcher’s background, and explicit detailing of the method of qualitative analysis employed (16, 17). These methods correspond to the four characteristics associated with trustworthiness in qualitative research, namely credibility, dependability, confirmability, and transferability (17).

This study also has several limitations that are important to acknowledge. Firstly, improved self-reported confidence is not analogous to competence (13). Although prior meta-analyses in the undergraduate population suggest a correlation between self-reported confidence and competence, self-assessment often demonstrates poor accuracy (30). Nevertheless, self-reported confidence, discrete from competence, might have an intrinsic value for new medical graduates, who must have a sense of their limitations to seek help appropriately (31). Secondly, the self-reported confidence scores and evaluations were collected directly following the course; it is therefore not possible to comment on any longitudinal difference in students’ confidence in demonstrating these skills. Promisingly, however, another undergraduate study that used similar training methods for teaching cannulation and nasogastric tube insertion skills to medical students demonstrated a positive effect on competence up to 6 months following the intervention (3). In the same study, the medical students who were trained in the skills lab, with tutors providing feedback, performed significantly better than the control group taught with a “see one, do one” methodology, both initially and at 6 months of follow-up (3). Thirdly, this cohort represents a single academic-year group from one UK medical school, and their baseline demographic details were not collected; the results might therefore not be generalizable to other cohorts of students. Finally, it is recognized that conducting interviews or focus groups may have yielded richer qualitative data.

5.1. Conclusions

This particular cohort of medical students experienced unprecedented disruption to clinical attachments during the COVID-19 pandemic. The pandemic has crystallized the importance of facilitating medical students’ attainment of competence in required clinical skills, for which opportunities to practice have traditionally been variable. Therefore, providing the opportunity to practice skills in a safe environment, with immediate, focused feedback, is especially valuable. The CHESS course was developed to fill the gap for a standardized clinical skills course in pediatrics. There was a statistically significant increase in self-reported confidence across all of the skills taught, and the course was highly valued by students, who particularly appreciated the small group learning, opportunities to practice, and immediate feedback on their practical skills. Alongside prior evidence on undergraduate skills teaching, this study suggests that these aspects of the course are distinct strengths that positively affect students’ confidence following course attendance. This might provide a good model for skills teaching during undergraduate courses as a foundation that supplements learning skills through practice with actual patients. Further studies should focus on using an objective measure of competence following this intervention and determining whether it results in long-term improvements.
