Design Thinking is a human-centered process of evaluation and innovation that fosters collaboration between stakeholders and the integration of multiple perspectives to understand issues and develop solutions that are appropriate for all users [21]. Because Design Thinking is relatively open-ended, the methodology has been applied across a variety of disciplines [22]. In this study, we employed the Deitte and Omary version, which divides the methodology into five stages: (1) empathizing, in which user experiences are explored to build an understanding of users' needs and preferences; (2) defining, in which the most pressing and meaningful challenges and/or opportunities are described; (3) ideating, which involves contemplating solutions that may be impactful in addressing the described challenges; (4) prototyping, in which the most impactful (and feasible) ideas are developed; and (5) testing, in which prototypes are piloted and results are shared with all stakeholders for their input, feedback, and suggestions for improvement [23, 24]. A review by Altman and colleagues found that interventions developed using Design Thinking were more effective, easier to use, and better accepted than traditional, expert-driven solutions [25].
Accordingly, in this study, we operationalize the Empathy stage of the Design Thinking approach in an investigation that aims to gain a deeper understanding of the challenges that health professional learners face in an asynchronous, online graduate education program; this understanding is well positioned to support the development of novel solutions to address these challenges.
Study Design. This two-part, sequential explanatory qualitative study constitutes the Empathy stage of a Design Thinking project aimed at improving and enhancing the educational quality of the Health Science Education (HSED) graduate program at McMaster University. The first part comprised an empathy-focused mapping exercise with program stakeholders; the second part involved a series of semi-structured interviews with program learners. The design was sequential insofar as the empathy mapping phase informed the development of the interview guide used in the second phase.
Study Context. This study was conducted in the context of the Health Science Education (HSED) Graduate Program, which is an online, primarily asynchronous, MSc-level degree training program in the Faculty of Health Sciences at McMaster University in Hamilton, Ontario, Canada. Students in the program are typically working health professionals from a wide variety of disciplines who teach in health sciences education, research, and academic clinical care, and who lead in healthcare settings (see hsed.mcmaster.ca for details about the graduate program).
The majority of learning activities in the program are administered online and asynchronously, allowing active professionals to pace their academic work alongside their professional roles [26]. However, students are required to engage in two mandatory in-person events during their time in the program. Before beginning their studies in the online environment, incoming students attend mandatory, in-person sessions in the summer months preceding their first term. Likewise, students nearing completion of their degree also engage with in-person sessions surrounding final curricular requirements in the summer months preceding their final term [26].
Ethics Approval. This study was approved as an education quality improvement activity by the Hamilton Integrated Research Ethics Board.
2.1 Part 1 – Empathy Mapping
Participants. The focus group included 14 participants: 8 faculty members, 3 student representatives, 1 educational developer, 1 graduate officer, and the HSED Assistant Dean.
Data Collection. Focus group participants were assigned to smaller “user groups” according to their respective roles: one group of students, two groups of faculty (four members per group), and one administration group (comprising the educational developer, graduate officer, and Assistant Dean). Each user group generated “empathy maps” on physical poster charts divided into four quadrants: “Think”, “Feel”, “Say”, and “Do”. Using the maps, participants were asked to categorize their experiences in terms of their thoughts, observations, feelings, and actions [27]. In this context, the empathy maps illustrated what each user group did, said, thought, and felt in relation to their experience in the graduate program (Fig. 1). This process was mediated by a group facilitator (IB).
Data Analysis. Empathy maps were subjected to thematic analysis to capture the important elements of the user experience [28]. Analysis began with a close reading of the empathy maps to ensure familiarity with the data. Two researchers (IB, AE) then analyzed the maps using open and axial coding techniques to inductively generate categories describing the learner experience in the online program [29]. Coding was performed independently, and differences in coding outcomes were reconciled during full research team meetings. The findings derived from the empathy maps informed the development of the semi-structured interview guide used in Part 2 of the study.
2.2 Part 2 – Semi-structured Interviews
Participants. The interview portion of the study included 11 participants, all of whom were active HSED students or recent graduates of the program (Table 1). We ceased recruitment in consideration of information power, the concept that data collection should continue until sufficient information is present to meet the research objective [30].
Table 1
Participant demographic data

Characteristic            | Participants (n = 11)
Sex                       |
  Male                    | 3
  Female                  | 8
Student Academic Load     |
  Full-time               | 5
  Part-time               | 6
Student Program Pathway   |
  Course-based            | 5
  Thesis-based            | 6
Interview Format          |
  In-person               | 4
  Zoom (video conference) | 7
Data Collection. Participants were recruited through email requests using a purposive sampling technique to maximize diversity in learners' academic load (i.e., full-time, part-time) and program pathway (i.e., course-based, thesis-based) [31]. Interviews were conducted in person, via an online video-conferencing platform, or by telephone, according to each participant's preference. The interviews were facilitated with an interview guide consisting of questions contemplating the structure of the program as well as its perceived strengths, weaknesses, gaps, and opportunities. Each interview was conducted by one of two researchers (IB, AE) and lasted approximately 30 minutes. All interviews were recorded digitally, transcribed verbatim, and de-identified.
Data Analysis. The interview data were analyzed using an unconstrained approach to descriptive content analysis, informed by the analysis performed in Part 1 of the study [32]. This again began with a close reading of the interview transcripts and progressed through deductive, line-by-line coding to identify resonance with the analysis completed in Part 1. The transcripts were also coded inductively to identify other aspects of the learner experience in an online program [29]. Two research team members (AE, MN) independently coded the first three transcripts and then met to review their codes. The full research team (AE, MN, IB, and LG) then met to reconcile coding differences and reach consensus on the final coding framework. Once the coding framework was finalized, one member of the research team (MN) uploaded it into the qualitative analysis software NVivo (QSR International, Doncaster, Australia) and systematically applied it to all transcripts. Codes with conceptual similarities and characteristics were grouped to form broader, higher-level categories, which were ultimately expressed as one overarching theme with two subthemes that provided a focused explanation of the learner experience in an online program. To establish credibility, member checking was conducted: the findings were shared with three program stakeholders to ensure they resonated with participants' experiences and the nature of the overall program [33]. Neither the interview transcripts nor the findings were returned to the participants for commentary.