This study obtained consensus-based statements on the essential elements for assessing podiatry students’ competency during WIL, as informed by podiatry academics, Clinical Educators, students and end-users. It represents the necessary first step in the development of a valid WIL assessment tool specific to podiatry students, which will ultimately assist in achieving consistency in clinical assessment across providers of entry-level podiatry programs in Australia and New Zealand.
Based on our findings, the essential elements identified by the Delphi technique align with existing documentation. The primary elements from this study focus on competent clinical skills, communication and professional behaviour. These are consistent with several elements of the Professional Capabilities for Podiatrists document (2022), developed by the Podiatry Accreditation Committee of the Podiatry Board of Australia (2). The Professional Capabilities document covers five domains of expected competence for registered podiatrists: knowledge and skills; communication and collaboration; professional and ethical practice; lifelong learning; and quality and risk management. Arguably, the only domain not essential to the WIL experience of students is that of ‘lifelong learner’, due to its focus on continued learning and mentorship of peers/other health professionals, which is outside the needs or ‘capabilities’ as they relate to students. Encouragingly, even though our findings do not specify a ‘quality and risk’ component, elements relevant to student expectations are covered in ‘Professional behaviour’ (such as, demonstrates and acts in accordance with relevant legislation, professional standards and guidelines, e.g. consent, infection control, confidentiality, workplace health, safety and welfare). Similarly, many of our essential elements reflect those used within the COMPASS tool for speech pathology students (5), the Assessment for Physiotherapy Practice [APP] (4), and the Student Practice Evaluation Form – Revised (Second Edition) [SPEF-R2] (for occupational therapists) (6).
As one example, our findings indicate students should have the “Ability to communicate appropriately with people involved in client care”, whereas the COMPASS requires students to “Communicate effectively with work teams”, the APP requires students to “Communicate effectively and appropriately – verbal/non-verbal”, and the SPEF-R2 has “communicates effectively with service users and significant others” as a core objective.
Notably, there is existing evidence to support the main essential elements accepted by our panel. Reynolds and McLean (13), when investigating Clinical Educator perceptions of podiatry students’ placement practice, identified that deficiencies in practical clinical skills and communication abilities contributed to a lack of preparedness. It is potentially this perceived importance of professional and communication skills in clinical performance, which are not mutually exclusive, that led to several essential elements being identified across categories. For example, ‘Demonstrates clear and appropriate history taking’ in the Performance/Clinical Skills section is similar to ‘Demonstrates note taking abilities’ identified in the Communication section. This speaks to the integrated nature of clinical practice, where it is acknowledged that no singular skill or task in isolation makes a good practitioner.
Of interest, many of the outcomes accepted by respondents relating to clinical skills were highly specific. For example, one consensus statement relates to ‘safe and effective scalpel skills’, whilst statements focused on biomechanical assessment, orthotic manufacture, and wound and nail care were also accepted. While these skills are irrefutably important, there were notably some podiatry-related tasks that were not identified or accepted as essential elements for the assessment of WIL activities (e.g., assessment and management of paediatric clients, serial casting for musculoskeletal concerns). Ideally, a universal assessment tool needs to be adaptable to a broad range of WIL experiences, able to be applied across cohorts with different levels of experience, adaptable to different levels of competence, and responsive to changing technology and practice scope (such as evolving methods of orthoses manufacture) to maintain relevance. The respondents identified and incorporated this into the key elements: statements requiring students to maintain knowledge and to identify client-focused, evidence-based, appropriately informed management strategies that demonstrate clear clinical reasoning reached 100% agreement. These elements are essential to ensure the tool remains relevant, ‘future-proof’ and able to be nuanced to individual institutions.
With regards to grading scales, this study found a preference for a clear pass grade to determine baseline competency. This can be determined by giving a pass/fail grade, or by requiring a mark above the midpoint of a Likert scale. This is similar to the APP (4), which uses a 5-point Likert scale to grade students’ competencies, with the midpoint being the base requirement for success. It must be noted that WIL activities occur at different points depending on the university program structure. Clear guidelines need to be developed to assist Clinical Educators to rate students’ competency according to their progress within the program.
The consensus statements developed in this study represent the initial step to inform the development of a standardised WIL assessment tool. However, the statements may require amalgamation or refinement with the aim of improving brevity and clarity. Further work is required to develop clear assessment criteria, with explanatory notes and examples. Once developed, this tool may offer entry-level program providers and students greater validity and consistency in the assessment of WIL, provide Clinical Educators with more guidance on what is expected of students, and allow accrediting and registration bodies greater confidence that graduating students from different programs have been assessed against the same criteria. Ultimately, this has the potential to help ensure consistency in the clinical capabilities of graduates entering the workforce, resulting in improved patient experiences. Any subsequently developed tool may also prove to have international implications where podiatrists train in structures similar to those of Australia and New Zealand.
There are limitations of this study that need to be considered. All statements required consensus or agreement from the respondents but, in the context of evidence-based practice, this represents low-level evidence and expert opinion only. Additionally, despite transparently supplying respondents with a copy of their comments prior to each round to ensure they were satisfied with our management of them, there is potential that the authorship team could have introduced bias during the theming of statements. Furthermore, we acknowledge that the Round 1 questions, as created by the authorship group, may have introduced bias given our clear understanding of the current Australian ‘Professional capabilities for podiatrists’ (2) and our experience in the assessment of students undertaking WIL. Finally, the strengths of a Delphi technique are enhanced by the anonymity of participants and by maintaining confidentiality of responses/respondents. Whilst respondents were asked to maintain anonymity throughout the process, podiatry is a small profession and the possibility of intentional or unintentional collusion between respondents cannot be ruled out.