Study design and participants
This prospective study, conducted from January 2016 to November 2017, included two cohorts of fourth-year medical students at the University of Cape Town, South Africa. Medical students were recruited during their Internal Medicine clinical clerkship, during which Electrocardiography is traditionally taught. Students recruited in 2016 formed the conventional teaching cohort, whereas students recruited in 2017 formed the blended learning cohort.
Method of ECG instruction
At the University of Cape Town, medical students are introduced to the basic principles of Electrocardiography during their third year of study, when they attend a series of lectures introducing rhythm and waveform abnormalities. Therefore, all participants in this study had prior exposure to ECG teaching in the preceding academic year. Training in Electrocardiography continues during the fourth-year Internal Medicine clinical clerkship in the form of lectures. In addition to the lectures, the clinical clerkship requires students to acquire and analyse ECGs from the patients they see. They present patients and their ECGs to senior clinicians on ward rounds. Although a year apart, both cohorts completed the same Internal Medicine clinical clerkship, with the same clinicians and the same learning requirements and opportunities, both during lectures and on ward rounds. There is no formal ECG training after the Internal Medicine clinical clerkship. The other fourth-year clerkships at our institution comprise Obstetrics, Neonatology, Psychiatry and Public Health.
During the study period, participants attended two lectures (of 120 minutes each) at the end of the second and fourth week of the Internal Medicine Clerkship respectively. The lectures revisit the basic principles of ECG analysis (including calculation of the heart rate, measuring the different intervals and calculating the QRS axis), and then predominantly focus on waveform abnormalities (e.g. left and right atrial enlargement, left and right ventricular hypertrophy [LVH, RVH], left and right bundle branch block [LBBB, RBBB], left anterior fascicular block, Wolff-Parkinson-White [WPW] pattern, ST elevation myocardial infarction [STEMI], pericarditis, hyperkalaemia, long QT syndrome) and rhythm abnormalities (e.g. sinus arrhythmia, sinus arrest with escape rhythm, first degree atrioventricular [AV] block, Mobitz I and II second degree AV block, third degree AV block, fast and slow atrial fibrillation [AF], atrial flutter, AV node re-entry tachycardia [AVNRT], ventricular tachycardia [VT] and ventricular fibrillation [VF]). The topics included in the syllabus are considered core knowledge for undergraduate ECG training at our Institution.22 All ECGs were 12-lead ECGs, with the exception of VF for which a chest lead rhythm strip was shown. The lectures were interactive, i.e. students were asked to analyse and interpret all the ECGs shown during the Microsoft® PowerPoint® presentation, and they were encouraged to ask questions. All lectures were facilitated by the same lecturer, who used the same Microsoft® PowerPoint® presentations (demonstrating the same ECG examples and illustrations) throughout the study. Depending on the student allocations to clinical clerkships, the group sizes varied between 32 and 42 students.
Both cohorts received the same lectures. In addition, the blended learning cohort had access to a web application (ECG ONLINE, accessed at ecgonline.uct.ac.za), which facilitated ECG analysis and interpretation with feedback. Access was free but restricted to University staff and registered students at the time of the study. Use of the web application was voluntary and was not a compulsory learning activity during the clerkship. Once signed into the web application, users had access to five online modules, each containing four to six ECGs to analyse. The ECGs used in the web application were different from those used during the lectures but depicted the same diagnoses discussed in class. For each of the 24 ECGs contained in the web application, the user was provided with a standardised template for online analysis, as shown in Supplementary Material 1. The template contained checkboxes for normal and abnormal parameters, as well as textboxes for interval measurements and axis calculations. For each ECG, the user selected the checkboxes relevant to the ECG analysis and entered the values of the interval measurements and axis calculations. The web application required users to analyse the rate and rhythm before proceeding to the detailed waveform analysis and before providing their interpretation (diagnosis) of the ECG.23 Once the process of ECG analysis and interpretation was complete and submitted, users were shown the correct answers on the same page to facilitate comparison with the answers they had provided. For each analysed ECG, the user could also download a document with a take-home message (with text and annotated ECGs), as shown in Supplementary Material 2. There was no limit to the number of times that participants could analyse the ECGs. Students could also review their previous analyses, along with the answers, for all of their previously submitted ECGs. The web application monitored the number of ECGs analysed by each user.
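To illustrate the structure of the online analysis workflow, a minimal sketch of an analysis template with answer feedback is shown below. The field names, data types and comparison logic are hypothetical and are not drawn from the actual ECG ONLINE implementation.

```python
# Illustrative sketch only: a simplified data model for an online ECG analysis
# template with answer feedback. Field names, diagnoses and comparison logic
# are hypothetical; they do not reproduce the actual ECG ONLINE implementation.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class EcgAnalysis:
    rate_bpm: int                       # textbox: calculated heart rate
    rhythm: str                         # checkbox selection, e.g. "sinus rhythm"
    pr_interval_ms: int                 # textbox: interval measurement
    qrs_width_ms: int                   # textbox: interval measurement
    qrs_axis_degrees: int               # textbox: axis calculation
    waveform_findings: Dict[str, bool] = field(default_factory=dict)  # checkboxes
    interpretation: str = ""            # final diagnosis entered by the user

def feedback(submitted: EcgAnalysis, model_answer: EcgAnalysis) -> Dict[str, bool]:
    """Return, per field, whether the submitted value matches the model answer,
    so the user can compare their analysis with the correct one."""
    return {
        name: getattr(submitted, name) == getattr(model_answer, name)
        for name in vars(model_answer)
    }
```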
The features offered by the web application used in this study, as well as the ECG curriculum taught online, are summarised in Table 1. These aspects were compared with undergraduate ECG teaching software previously described in the literature,18, 24-33 and assessed using the modified Kirkpatrick framework.34
Assessment of ECG competence
The study flow of participants and competence tests is outlined in Figure 1. During the study, participants were asked to complete three 30-minute competence tests. Each comprised 28 single-best-answer multiple-choice questions (MCQs). The first MCQ test (pre-intervention test) was written on enrolment, i.e. in the first week of the Internal Medicine clinical clerkship, prior to any ECG teaching in that academic year, to determine baseline (pre-existing) ECG competence. The second MCQ test (immediate post-intervention test) was written at the end of the six-week clinical clerkship, after ECG tuition (with or without access to e-learning in the blended learning and conventional teaching cohorts respectively), to assess the participants’ acquisition of ECG competence. The third MCQ test (delayed post-intervention test) was written six months later, without any further ECG training or access to the web application, to assess the participants’ retention of ECG competence. Each of the three MCQ tests was the same for both cohorts, i.e. the two cohorts underwent the same assessments of baseline ECG competence, as well as of acquisition and retention of ECG competence.
The three tests examined the same topics, using the same multiple-choice questions and answers, but with different exemplar ECGs in each test. Each test included three questions regarding basic ECG analysis (i.e. calculating the rate, measuring the QRS width and determining the QRS axis), as well as 25 ECG diagnoses. Of these, 12 were rhythm abnormalities (sinus arrhythmia, sinus arrest with escape rhythm, first degree AV block, Mobitz I and II second degree AV block, third degree AV block, AF with normal and uncontrolled rate, atrial flutter, AVNRT, VT and VF), and 13 were waveform abnormalities (left and right atrial enlargement, LVH, RVH, LBBB, RBBB, left anterior fascicular block, WPW pattern, anterior and inferior STEMI, pericarditis, hyperkalaemia, long QT syndrome). The conditions included in the MCQ tests are considered core knowledge for undergraduate ECG training at our Institution.22 The ECGs used in the tests were not the same as those used in class or on the web application. Two cardiologists and two specialist physicians, with a special interest in Electrocardiography, agreed that the ECGs used in the tests were unequivocal examples of the conditions and that the multiple-choice options were fair for the given ECG.
The MCQ tests were administered electronically at the University computer laboratories. They were invigilated, password-protected and could only be accessed on the day of the test. The order in which the questions were asked was randomised. For each question, there were five answer options: four possible diagnoses (of which only one was correct) and a fifth option, “I am not sure what the answer is”. Each correct answer was awarded one mark and negative marking was not applied. The answers to the questions were made available to the students only at the end of the study. The results of the MCQ tests in this study did not contribute to the participants’ course mark.
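As an illustration of the marking rule described above (one mark per correct answer, no negative marking, and no mark for the “I am not sure” option), a minimal sketch follows; the question identifiers and answer keys are hypothetical examples.

```python
# Illustrative sketch of the marking rule: one mark per correct answer, no
# negative marking, and the "I am not sure" option simply earns no mark.
# Question identifiers and answer keys are hypothetical examples.
NOT_SURE = "I am not sure what the answer is"

def score_test(responses: dict, answer_key: dict) -> int:
    """responses and answer_key map question id -> chosen/correct option."""
    return sum(
        1
        for qid, correct in answer_key.items()
        if responses.get(qid) not in (None, NOT_SURE) and responses[qid] == correct
    )

# Example: two of three questions answered correctly -> score of 2.
key = {"Q1": "Atrial flutter", "Q2": "LBBB", "Q3": "Pericarditis"}
answers = {"Q1": "Atrial flutter", "Q2": NOT_SURE, "Q3": "Pericarditis"}
assert score_test(answers, key) == 2
```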
Survey of confidence in ECG interpretation
After the immediate post-intervention test, participants completed a survey in which they were asked to rate their confidence in ECG analysis and interpretation using 5-point Likert-type questions (with response options of strongly agree, agree, neutral, disagree or strongly disagree).35, 36 For the purposes of analysis, the responses were clustered into three categories (agree, neutral and disagree).
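A minimal sketch of how the 5-point responses collapse into the three analysis categories is shown below; the response list is a hypothetical example.

```python
# Illustrative sketch of collapsing 5-point Likert responses into the three
# categories used for analysis (agree, neutral, disagree): "strongly agree"
# is grouped with "agree", and "strongly disagree" with "disagree".
COLLAPSE = {
    "strongly agree": "agree",
    "agree": "agree",
    "neutral": "neutral",
    "disagree": "disagree",
    "strongly disagree": "disagree",
}

responses = ["strongly agree", "agree", "neutral", "disagree", "strongly disagree"]
collapsed = [COLLAPSE[r] for r in responses]
print(collapsed)  # ['agree', 'agree', 'neutral', 'disagree', 'disagree']
```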
Determining other learning materials used during study period
After the immediate post-intervention test, participants were also asked to declare which learning materials (e.g. textbooks, class notes) they used during the study period.
Students’ perception of lectures and web-based learning
Participants were asked to comment on what they liked and disliked about the lectures (both the conventional teaching and blended learning cohorts) and the web application (blended learning cohort only). The feedback was received in free-text form. Two investigators (CAV, VCB) performed qualitative content analysis of the feedback from the participants. An inductive approach was used to identify themes and subthemes from the participants’ free-text comments regarding the lectures and web application.37, 38 The themes and subthemes were refined through an iterative process of reviewing the participants’ responses.39 Disagreements were resolved through discussion with a third investigator (RSM). A deductive approach was used to quantify the frequency with which the themes and subthemes emerged from the feedback on the lectures and web application.40
Estimated sample size needed for an adequately powered study
We estimated that a minimum sample size of 36 participants in each group would provide 80% power to detect a 10% difference in mean post-intervention test scores between the two groups, at an α (type I error) of 5%. This calculation was based on the results of previous studies that compared blended learning (lectures complemented by CAI) with lectures alone for teaching Electrocardiography.18, 26, 30, 32
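For illustration, the sketch below reproduces a standard two-sample sample-size calculation consistent with the parameters stated above (80% power, two-sided α of 5%, 10% mean difference). The common standard deviation of roughly 15 percentage points is an assumed value, chosen because it yields approximately 36 participants per group; the actual variance estimate was derived from the cited prior studies and is not restated here.

```python
# Illustrative two-sample t-test power calculation (80% power, two-sided
# alpha of 5%, 10% mean difference). The standard deviation of ~15 percentage
# points is an ASSUMPTION for illustration, not a value reported in the study.
from statsmodels.stats.power import TTestIndPower

mean_difference = 10.0   # percentage points
assumed_sd = 15.0        # assumed common SD; yields roughly the reported n
effect_size = mean_difference / assumed_sd   # Cohen's d

n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, alternative="two-sided"
)
print(round(n_per_group))  # approximately 36 participants per group
```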
Eligibility to be included in the study
All fourth-year medical students were invited to take part in the study; participation was voluntary. Participants were only included if they completed all three MCQ tests and the survey on ECG confidence during the study period.
Statistical analysis
Statistical analyses were performed on anonymised data using Stata (Version 14.2, StataCorp, College Station, TX, USA). The Shapiro-Wilk test was used to assess the normality of data distributions.41 Parametric data were summarised as means with standard deviations (SD), whereas medians with interquartile ranges (IQR) were used for non-parametric data. Paired and unpaired t-tests were used to assess within-group and between-group differences in test scores respectively. Cohen’s d was used to determine the effect size (practical significance) of the differences in test scores, with values of 0.2, 0.5 and 0.8 indicating small, moderate and large effect sizes respectively. Categorical variables were expressed as frequencies and percentages. Chi-squared or Fisher’s exact tests were used, where applicable, to compare categorical variables. A p value of < 0.05 was considered statistically significant.
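As an illustration of the between-group comparison and effect-size calculation described above, a minimal sketch using an unpaired t-test and Cohen’s d with a pooled standard deviation is shown below; the score arrays are hypothetical, and the study analyses themselves were performed in Stata, not Python.

```python
# Illustrative sketch of the between-group comparison and effect size:
# an unpaired t-test with Cohen's d computed from the pooled standard
# deviation. The score arrays are hypothetical examples, not study data.
import numpy as np
from scipy import stats

conventional = np.array([55.0, 60.0, 62.0, 58.0, 65.0])
blended      = np.array([68.0, 72.0, 70.0, 66.0, 75.0])

t, p = stats.ttest_ind(blended, conventional)   # unpaired (two-sample) t-test

# Cohen's d with pooled SD: d = (mean1 - mean2) / s_pooled
n1, n2 = len(blended), len(conventional)
s_pooled = np.sqrt(((n1 - 1) * blended.var(ddof=1) +
                    (n2 - 1) * conventional.var(ddof=1)) / (n1 + n2 - 2))
d = (blended.mean() - conventional.mean()) / s_pooled

print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {d:.2f}")
# d values of 0.2, 0.5 and 0.8 indicate small, moderate and large effects.
```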