A total of 382 activities were evaluated with the REAP tool, of which 40 were workshops (10.5%), 280 were courses (73.3%), and 62 were web-based resources (16.2%). Table 2 further describes the characteristics of the included learning activities. The majority of learning activities were offered online (60% of workshops, 71.4% of courses, and 100% of web-based resources), and the majority of activities (85.1%) fulfilled 1-2 competencies. Most learning activities were based in the PAHO (84.8%) or EURO (9.2%) region, though some courses were also based in the AFRO region (3.4%). Cost varied widely, with a large set of free courses (47.1%) but also a substantial set of courses costing over $1,000 (22.5%).
Each learning activity was further assessed across the four domains of the REAP tool. Results are displayed in Tables 3a-3d, with key variables for each domain broken down by competency category (core, content, or skill-based). We chose these competency categories as a consistent way to analyze the REAP variables by domain because these competencies are central to the design of STAR’s learning program and because we anticipated that some of the REAP variables would differ based on the kind of content on which a learning activity focused. Some activities fulfilled multiple competencies and thus may appear more than once in these tables; an illustrative sketch of this counting approach follows.
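To make the denominators in Tables 3a-3d concrete, the minimal sketch below shows how counting one row per activity-competency pair produces per-category totals that exceed the number of unique activities. The column names ("activity_id", "competency_category") and the handful of rows are hypothetical illustrations, not the study's actual REAP database or analysis code.

```python
# Illustrative sketch only: hypothetical rows and column names,
# not the study's actual REAP database or analysis code.
import pandas as pd

# One row per activity-competency pair; an activity that fulfills two
# competencies contributes two rows.
pairs = pd.DataFrame(
    {
        "activity_id": ["A1", "A1", "A2", "A3", "A3"],
        "competency_category": ["core", "skill", "content", "core", "skill"],
    }
)

# Number of unique activities (382 in the study).
unique_activities = pairs["activity_id"].nunique()

# Per-category pair counts (195 core, 95 content, 241 skill in the study).
category_totals = pairs["competency_category"].value_counts()

print(unique_activities)
print(category_totals)
```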
Relevance
The relevance findings, summarized in Table 3a, describe how the learning activities in the database were distributed across the three categories of STAR competencies (core, content, and skill) and by milestone level. Because an activity is counted once for each competency it fulfills, the milestone-level results are reported out of 531 activity-competency pairs (195 core, 95 content, and 241 skill) rather than 382 unique activities. Most activities across all competencies were at either the understanding (195/531, 36.7%) or practicing level (291/531, 54.8%). Very few were at the inquiring (20/531, 3.8%) or leading level (2/531, 0.4%).
Engagement
Engagement was a REAP domain that STAR participants prioritized, with general preferences for higher levels of engagement with faculty and peers (Table 3b). Among the core competency-related activities, the largest share had a high level of faculty engagement (83/195, 42.6%), but about as many combined had either no direct faculty engagement (36/195, 18.5%) or limited engagement (41/195, 21.0%). Faculty engagement levels also varied for the content and skill competencies, though skill competencies had a substantial number of activities with average (53/241, 22.0%) and high (50/241, 20.7%) levels of engagement.
For peer engagement, many activities across all competency categories had no engagement (core: 60/195, 30.8%; content: 45/95, 47.4%; skill: 97/241, 40.2%). Among activities covering core competencies, however, the largest proportion was characterized as having high levels of peer engagement (66/195, 33.8%). For content competency-related activities, the second largest group had an average amount of peer engagement (20/95, 21.1%). Finally, the skill competency activities were fairly evenly distributed across limited (41/241, 17.0%), average (44/241, 18.3%), and high (45/241, 18.7%) levels of engagement.
Access
Access variables include the in-person requirements of an activity and the flexibility in start time and pace (Table 3c). Across all categories of competencies, the largest proportion of activities was offered online only (core: 110/195, 56.4%; content: 76/95, 80.0%; skill: 173/241, 71.8%). Core competency-related activities also had a large proportion (63/195, 32.3%) offered in workshop or short training formats, while the content and skill competency-related activities had smaller, but still noteworthy, numbers of these workshop-format activities (content: 14/95, 14.7%; skill: 53/241, 22.0%).
For core competency activities, flexibility in start time varied and was spread fairly evenly across infrequent starts (once or twice a year) (47/195, 24.1%), frequent starts (53/195, 27.2%), and on-demand courses (available whenever learners signed up) (58/195, 28.7%). Skill competencies followed a similar distribution, but with more courses (106/241, 44.0%) offered on demand. Content-related activities were most often available on demand (63/95, 66.3%).
Across competencies, the most common level of flexibility in the pace at which a participant could complete an activity was “highly flexible” (core: 68/195, 34.9%; content: 66/95, 69.5%; skill: 106/241, 44.0%).
Pedagogy
Table 3d includes a set of variables related to activity pedagogy. For this domain, we evaluated the availability of a syllabus, the credibility of the instructor, applied learning aspects, and the use of assessment as a learning tool. Activities were split between those that did and did not provide a formal syllabus: core competency-related activities provided a syllabus (93/195, 47.7%) more often than not (81/195, 41.5%), and more skill-related activities had a syllabus (133/241, 55.2%) than did not (80/241, 33.2%). The vast majority of instructors across all categories of activities were considered credible (core: 164/195, 84.1%; content: 83/95, 87.4%; skill: 203/241, 84.2%).
The extent to which an activity required the application of knowledge and concepts was also a priority for STAR, with many participants valuing activities that applied content to real-world examples and to their own work. The degree of application found within activities varied, but for many activities there was not enough information available to assess this aspect (core: 45/195, 23.1%; content: 22/95, 23.2%; skill: 53/241, 22.0%).
In terms of the inclusion and alignment of learner assessment within activities, both the core (88/195, 45.1%) and skill (105/241, 43.6%) competency-related activities had substantial proportions that included assessments highly aligned with activity learning objectives and topics. However, for all categories of competencies, whether any assessment was included was unknown for almost one third of activities (core: 56/195, 28.7%; content: 31/95, 32.6%; skill: 73/241, 30.3%).