Many FiY1s would have liked more direct communication from their medical schools and, in particular, from governing bodies. Despite this, most students felt their roles and responsibilities were made clear before starting work (71.1%, n = 32), and most respondents felt prepared for their FiY1 post (68.9%, n = 31), as shown in Figure 1. Communication with students can be optimised through timely distribution of surveys, responses to common concerns in weekly emails, and regular opportunities to meet with medical school faculty members, as in one American medical school model initiated during the pandemic [12]. Maintaining communication channels with graduates in this way would have alleviated concerns that information was not being relayed promptly.
Graduates perceived the academic content tested in final year examinations as less useful for work than knowledge of procedural tasks; however, passing finals, even in a modified format, gave them a feeling of “legitimacy” and “confidence”. As many medical schools increase their focus on programmatic assessment through placement-based assessment and small-group tutor feedback, final examinations are considered to be of relative rather than absolute importance [13, 14]. Nevertheless, we suggest that final examinations continue to play an important role in providing external validation for new doctors, which should be taken into consideration when planning future curricular revisions.
Graduates reported that practical examinations contributed more to their feelings of preparedness than written examinations, with 100% (n = 17) of those who undertook modified OSCEs agreeing that these contributed to feelings of preparedness for work. This is classically illustrated by Miller’s ‘Framework for clinical assessment’ [15], which favours ‘shows how’ over ‘knows how’ as a mode of testing. OSCEs are also often considered superior to written examinations in terms of their range of assessment and versatility [16–18]. We suggest that OSCEs are valuable not only for the quality of their assessment but also in terms of student preference.
A large proportion of graduates who felt the need to supplement their learning did so by watching online webinars organised by private enterprises (58.8%, n = 10). Whilst these organisations are usually led by doctors and provide reliable information, medical schools and governing bodies retain ultimate responsibility for preparing their students for work.
Graduates found teaching on procedural tasks such as TTOs and discharge summaries to be lacking. The ‘buddy’ system, initially recommended by the UKFPO, was not discussed by graduates and has elsewhere been reported as difficult to implement [8, 19]; better organisation of such a scheme may have aided learning. These skills are, however, often gained through shadowing on ‘assistantship’ programmes, a part of the curriculum that these graduates missed and one that respondents largely agreed would have made them feel more prepared for work (91.1%, n = 41). Assistantships are known to improve self-reported preparedness for work, and in lieu of these, sufficient resources should have been provided to supplement learning [20–22].
At no point did graduates report anything to suggest that FiY1 was superior to assistantship programmes. The superiority of FiY1 has, however, been argued by Butt and Umaskanth [23], who feel that the extra responsibility of the FiY1 role better prepares doctors for work and who recommend that the FiY1 programme become the norm after final examinations. More focused studies are needed to answer this question definitively. It is nevertheless interesting to note that FiY1, which confers all of the responsibilities of a doctor under maximal supervision, may fulfil the ‘Does’ level of Miller’s pyramid of clinical competence [15, 24]. By integrating robust feedback systems into the role, FiY1 may yet prove useful as a fine-tuning tool for graduates before independent clinical practice begins.
This study was limited to two UK medical schools. Small sample sizes and low response rates meant that variation across schools was difficult to detect, and recommendations specific to teaching styles and aspects of curricula were therefore harder to make. A further limitation was the external factor of the COVID-19 pandemic itself: one survey of final-year medical students found that the pandemic had affected students’ perceived preparedness for foundation training independently of any changes made to curricula [25]. Given these confounders, we decided not to compare our results directly with, for example, the National Training Survey from previous years. Additionally, self-reported preparedness does not necessarily correlate with actual preparedness and so should not be mistaken for competence [26, 27]. However, it can provide an opportunity for reflection and self-assessment, important attributes for a doctor [28]. Furthermore, medical schools and governing bodies should take an interest in the wellbeing of their new doctors and see this measure as a way to gauge the mindsets of graduates entering their first posts.