In total, 81 of 100 invited staff members (81%) entered the training module and completed the online survey. Fifty (62%), 16 (20%) and 15 (18%) staff members responded from China, Spain and Denmark, respectively. Of the 81 responders, 41 were doctors (51%) and 40 were nurses (49%). Previous experience with NIRS monitoring was reported by 46 of the 81 responders (57%), including 26 doctors (57%) and 20 nurses (43%) (table 4). In Denmark, six of the 15 responders had previous experience (40%), in China 25 of 50 (50%), and in Spain 15 of 16 (94%).
Overall, responders spent a median of 15 minutes (range 1 to 420 minutes) and answered a median of seven questions (range 4 to 50 questions) to complete the NIRS module. Spanish responders were faster than both Danish and Chinese responders (median 10, 14 and 20 minutes, respectively) and needed fewer questions to pass (median 4, 7 and 8, respectively) (table 2). Doctors were faster than nurses (median 13.5 versus 20 minutes) and needed fewer questions to pass (median 6 versus 9 questions) (table 3). Responders with NIRS experience were faster than non-experienced responders (median 13.5 versus 20 minutes) and needed fewer questions to pass (median 5.5 versus 8 questions) (table 4).
Overall, 69 of 81 (85%) responders found the academic level of the learning material appropriate, and none found it too easy. Of the 12 responders who found the learning material too advanced, ten were from China, one was from Denmark, and one was from Spain (table 2). Eight of 40 (20%) nurses found the learning material too advanced compared to four of 41 (10%) doctors (table 3). No relevant difference was seen between responders experienced in NIRS monitoring and those without experience (seven of 46 (15%) experienced versus five of 35 (14%) non-experienced) (table 4). When asked if the introduction material was sufficient to answer the quiz questions, 23 of 78 (29%) responders disagreed or strongly disagreed: nine from Denmark, 11 from China and three from Spain (table 2). More nurses (13 of 41 (32%)) than doctors (10 of 38 (26%)) disagreed or strongly disagreed with this statement (table 3). Among those with NIRS experience, 12 of 44 (27%) disagreed or strongly disagreed compared to 11 of 34 (32%) responders with no previous experience (table 4).
Seventy-five of 81 (93%) responders agreed or strongly agreed that the academic level of the questions was appropriate. Among those who disagreed or strongly disagreed, no relevant difference was found between countries (table 2) or clinical positions (table 3). Of the six who disagreed or strongly disagreed with the statement, five were experienced in NIRS monitoring (table 4). Thirty-two of 81 (40%) responders thought there were too many answer possibilities for each question, primarily nurses (20 of 40 (50%) nurses compared to 12 of 41 (29%) doctors) (table 3) and Danish responders (nine of 15 (60%) compared to seven of 16 (44%) Spanish and 16 of 50 (32%) Chinese responders) (table 2). Among those with NIRS experience, 19 of 46 (41%) thought there were too many answer possibilities compared to 13 of 35 (37%) of those with no experience (table 4). When asked if the quiz questions were clinically relevant and up-to-date, 77 of 80 (96%) responders agreed or strongly agreed.
Almost one-third of all responders (23 of 80 (29%)) reported that the NIRS module crashed once or multiple times. The problem appeared greatest in Denmark and Spain, where nine of 15 (60%) and six of 15 (40%) reported experiencing a crash, compared to only eight of 50 (16%) in China (table 2). Among doctors and nurses, 12 of 41 (29%) and 11 of 39 (28%) experienced a crash, respectively (table 3). Eleven of 45 (24%) responders experienced with NIRS and 12 of 35 (34%) non-experienced responders reported a crash (table 4).
Preparation for using NIRS
When asked if the module was relevant for preparing staff members to use the NIRS device, 72 of 80 (90%) agreed or strongly agreed (13 of 15 (87%) Danish, 12 of 15 (80%) Spanish and 47 of 50 (94%) Chinese responders) (table 2). No relevant difference was seen between clinical positions (35 of 40 (88%) doctors and 37 of 40 (93%) nurses) or between experience levels (41 of 45 (91%) experienced and 31 of 35 (86%) non-experienced) (tables 3 and 4).
The thematic analysis resulted in four essential themes, each accompanied by sub-themes (figure 4). The themes were 1) Learning material-quiz discrepancies, 2) Lack of clarity within the course, 3) Technical issues, and 4) Unsolicited positive comments. These four themes capture key concepts that recur throughout the data.
Learning material-quiz discrepancies
Some responders (n=18) described a discrepancy between the learning material and the quiz, with several stating that the learning material was insufficient to adequately answer the questions in the quiz:
“For someone who know[s] little or nothing about the topic, the introduction material is not sufficient enough to answer the quiz questions” Doctor
One responder stated that despite being committed and working hard to understand the learning material, they struggled to answer the questions correctly and finish the course:
“Put in a great effort to understand the intro material and I was surprised that I could not answer questions correctly. I did not feel that there was a connection between theory in the introduction material and questions” Nurse
A few responders (n=6) mentioned that the learning material was too simple, not detailed enough, or lacking comprehensiveness:
“Additional knowledge is needed in the principles and concepts section” Nurse
As a possible consequence of this discrepancy, some responders (n=10) also found the content of the course too hard:
“The content is too hard to understand“ Nurse
“The questions are difficult, and the basic courses are few” Doctor
Some responders (n=23) also stated that specific clinical content was missing from the learning material, which made it difficult to complete the quiz. A specific concern (n=14) was the absence of practical information on the usage and handling of the NIRS device:
“Risk of skin marks and side-effects is not described sufficiently in the introduction material” Doctor
The lack of clinical content left a few responders (n=4) feeling unequipped to answer related questions:
“No introduction to how you prepared for NIRS monitoring, so it was pure guessing – you have no idea whether you need to calibrate/shave/wash or something else (prior monitoring), if you have not been told forehand” Nurse
Lack of clarity within the course
Language issues were mentioned (n=6), including that the language was not precise and clear, which made the content of the course hard to understand. This was voiced by Spanish (n=1) and Chinese (n=5) participants.
“… The language is not enough concise and clear” Doctor
The transparency of the module’s structure was also criticized, with a few responders (n=7) stating that the feedback mechanism was hard to figure out:
“[The module] did not tell me what my wrong answers I had, and therefore I didn’t know what the correct answers were and I couldn’t find it in the introduction material” Nurse
Relatedly, one responder mentioned that it was hard to learn anything from answering incorrectly:
“It would be nice if one could learn something by answering wrong, hence that you could use the box that pops up after you answer incorrectly to see what was the correct answer.” Nurse
Also regarding the module's lack of clarity, responders described an insufficient explanation of the quiz set-up. One responder expressed that the structure of the module should be stated more explicitly:
“Very good, but I was not prepared for a case-setup – and many answer possibilities were not mentioned in the introduction material” Nurse
A few responders (n=4) stated that having multiple answer possibilities was an issue:
“I think the quality of learning is increased if there are more questions with fewer answer possibilities. The purpose is learning and I think this could be heightened if one is presented with more questions with lesser answer possibilities….” Doctor
Technical issues
Technical issues seemed to be a source of frustration in this course. Responders reported that the module entered a loop of incorrect questions (n=4), that it crashed (n=8), that it was slow (n=5), and that the screen froze (n=12), with one responder describing how it froze three to four times in a row, forcing them to restart and begin all over again:
”If you do it, you will be stuck, you can not finish it, what the hell” Doctor
“The page hangs on some occasions and does not allow to advance. When there is an incorrect answer, it loops in and you must restart the questionnaire to get out of there” Doctor
Accessibility also seemed to be a problem. A few responders (n=4) found that they could not navigate between the quiz and the learning material without losing answers or experiencing a module crash, which in some cases prevented them from finalizing the quiz:
“Problems when some question is incorrect: it does not allow one to advance, in spite of reviewing the material and you must leave the page” Doctor
Unsolicited positive comments
Although the open-ended questions focused on eliciting critique of the module and potential improvements, some responders (n=12) also commented on its positive aspects. Some applauded the clinical relevance and fitness for clinical use:
“Suitable for clinical application” Doctor
Others were positive towards the method of learning:
“I really like the methodology in this e-learning course…” Doctor
Others were generally positive:
“Just right, very good“ Doctor
“Very helpful” Nurse
“Relatively friendly” Nurse