The influence of confidence and experience on the competency of junior medical students in performing basic procedural skills
Studies, mostly conducted with final-year medical students and doctors, agree that the confidence with which a clinical skill is performed is not a reliable benchmark of actual clinical competence. This inaccurate self-evaluation of proficiency has far-reaching implications, e.g. the inability to identify learning deficiencies and, consequently, to manage learning – both essential components of self-directed learning programmes.
Why was the idea necessary?
The purpose of this study, in comparing self-reported competence with actual competence, was threefold: to discover students’ perceptions of their competence in specific procedural skills; to establish the actual competence level of junior medical students with regard to these skills; and to raise student awareness of the value of accurate self-evaluation.
What was done?
Third-year medical students at the Faculty of Health Sciences (FHS) of Stellenbosch University (SU) attended a training session in the Clinical Skills Centre at the beginning of the year. Supervised by clinical tutors, they practiced three basic procedural skills on part-task trainers/bench-top manikins: commencing an intravenous infusion, performing simple wound closure (suturing), and administering an intramuscular injection. During the remainder of the year, they returned in smaller groups during their Family Medicine rotation for formative assessment of these skills by means of an OSCE. Prior to performing the clinical procedures, students had to rate their perceived competence. Clinical tutors then used checklists to rate actual student competence in performing these three skills.
Evaluation of results and impact
In accordance with similar studies, there was poor correlation between self-reported and actual competence in the performance of procedural skills. There were, however, significant correlations between self-reported competence and clinical experience (r = 0.49), as well as between experience and actual competence (r = 0.36). It seems that junior students, not unlike their more senior counterparts, lack the critical self-assessment skills needed to accurately evaluate their performance of certain basic procedural skills. However, frequently performing these skills in the clinical setting (or elsewhere) increased both self-reported and actual competence in these students.
Prior to this study, junior medical students had limited formal clinical skills teaching in the Clinical Skills Centre (CSC) and, owing to the already overloaded curriculum, were not assessed on such skills. As a result, the onus rested on the student to acquire these and other, often ill-defined, skills in the clinical setting. Since the completion of this study, a logbook system has been introduced to encourage students to make the most of opportunities in the clinical setting to practice the skills taught in the CSC. Furthermore, a core clinical skills curriculum was compiled, indicating which skills should be taught in simulation and which in the clinical setting, as well as the competency levels (based on Miller’s framework for clinical assessment) at which these skills should be performed. From 2011, students will take a summative OSCE to assess their clinical skills competency.
Adele De Villiers,
Date published: 2011-06-17