Online Practical Assessment (OPrA) in Biochemistry Designed on Bloom’s Taxonomy to Assess Student Competency: Prior Exposure of Students to Online Tools Is Beneficial



Background: Online assessments are the need of the hour in the prevailing pandemic situation, allowing educational activities to continue while ensuring the safety of students and faculty. All Higher Educational Institutions must take the necessary steps to design effective assessment models that assess the various levels of student competency. Here, we analyzed the responses of students after the conduct of an online practical assessment (OPrA) in Biochemistry.

Methods: The blueprint of the OPrA was prepared by the faculty with reference to the various levels and domains of Bloom’s taxonomy. Four components were chosen for the online assessment: digital spotters, enumerating the steps of an objective structured practical examination, interpretation of a quantitative estimation, and case discussion. Detailed instructions were provided to the students, and the breakout room option of the Zoom platform was chosen for the assessment. Each faculty member assessed 12-13 students in a separate breakout room over 15-20 minutes on all four components. Feedback on the conduct of the examination was collected anonymously from the students and faculty and analyzed.

Results: Of the 200 students who submitted their responses, 20% had previous experience of attending an OPrA. They differed in opinion from the rest of the students on five aspects: the time allotted for the assessment (P=0.02, χ2=5.07), students using unfair means during the online viva (P=0.02, χ2=5.57), their computing skills (P=0.001, χ2=19.82), their performance (P=0.001, χ2=8.84), and the overall conduct of the examination (P=0.001, χ2=15.55).

Conclusion: OPrA tools may be designed with reference to Bloom’s taxonomy, and prior exposure to online tools may be beneficial for the students.


In the modern era, the advancement of scientific technology and digitalization is of utmost importance for enhancing the efficiency and productivity of any field, considering its tremendous impact on output. All academic institutions, including Higher Educational Institutions, are now swiftly shifting to e-learning and e-assessment platforms in response to the present COVID-19 pandemic. The use of advanced technologies in medical education and the role of blended and hybrid course models have also been highlighted in the recent Horizon Teaching and Learning reports (1,2). Several barriers to the implementation of online learning in medical education have been identified, including poor technical skills, time constraints, absence of institutional strategies and support, inadequate infrastructure, and negative attitudes of the people involved (3). However, with virtual learning being the only safe option to ensure student involvement during the COVID-19 pandemic, there was enhanced adaptation to online tools, overcoming most of the listed barriers (4). Since then, online learning has become the new norm and has emerged as an accepted method of training and of sustaining learning (3).

Assessment is an integral part of any training program: it gauges the effectiveness of the teaching-learning strategy employed and measures, tracks, and evaluates the students’ capabilities and progress (5). Assessment methods have evolved over the years from formal, structured summative assessment to ongoing, flexible formative assessment. The choice of assessment method depends on what is assessed and why, while the choice of assessment tool depends on how valid, reliable, and flexible it is (6). With education being digitalized and teaching-learning happening in virtual mode, the need for online assessment has been felt across institutions. Online assessments have ensured continuity of educational activities by giving students the option of taking exams remotely in a secure environment using the latest technology. Assessment is purpose-driven and should be designed to measure the core knowledge and competencies attained by the student during the learning process. However, studies assessing the effectiveness of online assessment by digital means have reported conflicting perceptions, some describing it as a beneficial strategy (7) and others referring to it as imperfect (8).

The traditional assessment system includes assessing both the theoretical knowledge and the practical skills of the students, taking into consideration all three domains of Bloom’s taxonomy: cognitive, psychomotor, and affective. During the practical assessment in Biochemistry, medical undergraduate students are assessed in the cognitive domain by their ability to recall, to explain the principle of the test being performed, to justify the choice of test based on the clinical scenario provided, to analyze and interpret the report, and to provide biochemical reasoning for the clinical manifestations of the hypothetical patient. They are assessed in the psychomotor domain at objective structured practical examination (OSPE) stations, where they are observed for how skillfully they perform clinically relevant tests. The affective domain is assessed by the empathy they show towards the hypothetical patient in explaining the clinical condition, counseling them to undergo the confirmatory test, and interpreting the test results so that the patient is convinced of the further course of treatment.

Replicating this and assessing all the domains through the online mode is currently a daunting task. The presently available online assessment tools are suitable for assessing the cognitive domain but fail to assess the psychomotor and affective domains. Hence, while using digital web-based tools to assess students’ academic knowledge and understanding of the curriculum, instructors need to use their creativity and devise new ideas and solutions to achieve a fair assessment. Online assessments presently have several pros and cons (9,10), one of which is the students’ previous experience of using online tools for practical assessment. In our attempt at a fair online practical assessment in Biochemistry for medical undergraduates, we used the Zoom breakout and waiting room options to individually evaluate each student’s knowledge of biochemistry and their ability to understand, apply, analyze, and evaluate the clinical scenario presented to them. Here we present the students’ perception of our attempt to provide a fair online practical assessment as an interim solution during the pandemic. We also compared the responses of students who had prior experience of attending online practical assessments with those who did not, in order to arrive at effective solutions for the smooth conduct of such assessments.


Study subjects and their training: 

The study was conducted in the Department of Biochemistry of a private Indian medical school with 200 admissions per year to the Bachelor of Medicine and Bachelor of Surgery (MBBS) program. The JSS Medical College ethical committee clearance was obtained via letter no. JSSMC/IEC/110621/12NCT/21-22, dated 11.06.21. The students were admitted to the program in February and continued their onsite training until April, enabling them to complete hands-on training on the estimation of a few biochemical parameters in the practical sessions of Biochemistry: the quantitative estimation of alkaline phosphatase, aspartate aminotransferase, and alanine aminotransferase. They were also trained at the OSPE station to detect the presence of proteins and reducing substances in a given sample by the heat coagulation test and Benedict’s test, respectively. Post lockdown, demonstrations of a few practical sessions were conducted online: the quantitative estimation of total protein, albumin, and globulin and calculation of the A/G ratio; qualitative analysis of normal urine; and OSPE stations for calculating the specific gravity of a given urine sample using a urinometer and detecting the presence of protein and glucose in given urine samples by the dipstick method. Videos of these sessions were uploaded to the University’s online portal, which the students could access whenever they desired.

Structuring the online practical assessment: 

After the completion of the module 1 syllabus, the students’ theoretical knowledge and practical skills are assessed before proceeding to module 2. Considering the pandemic and the mandatory assessment due for the students, an online assessment was planned, and the students were informed accordingly, giving them sufficient time to prepare. The faculty involved in training these students formed a committee and held focused group discussions to prepare the blueprint of the online practical assessment. Based on the blueprint, the online practical assessment in Biochemistry was structured around four components:

1. Digital spotters: the students identify the displayed item and mention its clinical utility. This helped us assess their “recall” and “understanding” abilities.

2. OSPE: the students had to enumerate the steps required to perform a particular test. In the traditional assessment this helps us assess the psychomotor skills of the student; it was retained in the online mode so that students would have a feel of experimenting.

3. Quantitative estimation: the students were given a short case scenario based on which they had to select a confirmatory test to arrive at the diagnosis. This helped us assess the “apply” and “analyze” levels of Bloom’s taxonomy. The students also had to calculate the final result from the raw data provided to them and then interpret the results, which helped us assess the “analyze” and “evaluate” levels.

4. Case discussion: the students were provided with a case scenario, after going through which they were asked to link its various aspects and arrive at a probable diagnosis. They also had to justify the biochemical mechanisms responsible for the patient’s clinical condition. This too helped us assess the “analyze” and “evaluate” levels of Bloom’s taxonomy.

Conduct of the online practical examination: 

A document containing detailed information and instructions regarding the online practical assessment was posted in the students’ WhatsApp group. The online assessment was conducted in two batches of 100 students each, on two different days. The Zoom platform was chosen for the online practical assessment, and nine breakout rooms were created with one faculty member in charge of each room. The students were divided into nine groups of 12-13 students each and were asked to switch on their video and audio during the online assessment. Each group was allotted a different breakout room, and students were admitted one at a time to their respective rooms. Each student was given the option to choose a number between 1 and 13. Clicking the chosen number opened the link to a set of questions based on the four components mentioned above, and the student was assessed for 15-20 minutes by the faculty member on all four components to gauge the level of competency attained.

Preparation and validation of feedback forms: 

A questionnaire was designed on a five-point Likert scale using a Google Form to collect post-assessment feedback from the students as well as the faculty. The content validity index (CVI) and content validity ratio (CVR) were calculated to assess the pertinence and precision of both the student and faculty feedback forms. A ten-member expert panel was briefed on the study in an online meeting and given the content validation form, with detailed instructions, to review the questionnaire. Each question was given a score from 1 to 4 based on its relevance to the study, and the scores were then recoded as 0 (for scores of 1 and 2) or 1 (for scores of 3 and 4). CVR was calculated using the formula CVR = (E - N/2) / (N/2), where E is the number of experts who rated the question as relevant and N is the total number of experts assessing the questions. CVI was calculated as the number of agreed-on questions divided by the number of experts.
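The CVR and CVI arithmetic described above can be sketched in a few lines of Python. The expert ratings in the example are hypothetical placeholders, not the study's actual panel scores:

```python
# Content validity metrics for one questionnaire item.
# Ratings are 1-4 relevance scores from an expert panel; scores of
# 3 or 4 are recoded as "relevant" (1), scores of 1 or 2 as 0,
# as described in the Methods.

def cvr(ratings):
    """Lawshe's content validity ratio: (E - N/2) / (N/2)."""
    n = len(ratings)
    e = sum(1 for r in ratings if r >= 3)  # experts rating the item relevant
    return (e - n / 2) / (n / 2)

def cvi(ratings):
    """Item-level content validity index: proportion of experts
    who rated the item as relevant (score 3 or 4)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# Hypothetical example: 9 of 10 experts rate the item 3 or 4.
scores = [4, 3, 4, 4, 3, 4, 2, 3, 4, 3]
print(round(cvr(scores), 2))  # 0.8
print(round(cvi(scores), 2))  # 0.9
```

CVR ranges from -1 (all experts reject the item) to +1 (all accept it); with a ten-member panel, a positive CVR means more than five experts rated the item relevant.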

Collection of feedback: 

Following the online practical assessment, the link to the Google Form was posted in the student and faculty WhatsApp groups. The questionnaire consisted of questions related to their experience of the conduct of the online practical assessment. Students were informed that participation in the feedback was voluntary and anonymous, with no impact on their academic record. The responses were reported in summative form.

Statistical analysis: 

Data were tabulated in MS Excel, and the responses of the students and the faculty to the feedback questionnaire were expressed as percentages. The reliability of the questions administered was assessed by calculating Cronbach's alpha (α). A chi-square test was performed to compare the responses of students with and without previous experience of online practical assessment, to check whether there was a significant difference in opinion. A P-value of <0.05 was considered statistically significant.
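The chi-square comparison of the two student groups can be sketched as follows. The counts in the example are hypothetical, chosen only to illustrate an experienced-vs-inexperienced contingency table; they are not the study's data:

```python
# Pearson's chi-square statistic for a contingency table of observed
# counts, in pure Python (equivalent to scipy.stats.chi2_contingency
# without the Yates correction).

def chi_square(table):
    """table: list of rows of observed counts, e.g. [[a, b], [c, d]].
    Returns the chi-square statistic sum((O - E)^2 / E)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical agree/disagree counts for 40 experienced students
# (row 1) vs 160 students without previous experience (row 2).
observed = [[38, 2],
            [139, 21]]
print(round(chi_square(observed), 2))
```

For a 2x2 table the statistic is compared against the chi-square distribution with 1 degree of freedom, where the 0.05 significance threshold is 3.84.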


The questionnaire was filled in by all 200 students who attended the online practical assessment. The questions were considered apt for the study, with CVI and CVR scores of 0.9 and 0.97, respectively. Cronbach's alpha for the questions administered was 0.806, showing good internal consistency.
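The internal-consistency calculation can be sketched as below. The Likert responses in the example are hypothetical, used only to show how Cronbach's alpha is derived from item and total-score variances:

```python
# Cronbach's alpha: alpha = (k / (k - 1)) * (1 - sum(item variances) /
# variance of total scores), where k is the number of items.
# Rows = respondents, columns = questionnaire items (hypothetical data).

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(data):
    """data: list of respondent rows, each a list of item scores."""
    k = len(data[0])  # number of items
    item_vars = [variance(col) for col in zip(*data)]
    total_var = variance([sum(row) for row in data])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Five hypothetical respondents answering four Likert items.
responses = [
    [5, 4, 4, 5],
    [3, 3, 2, 3],
    [4, 4, 5, 4],
    [2, 3, 3, 2],
    [5, 5, 4, 5],
]
print(round(cronbach_alpha(responses), 3))  # 0.922
```

Values of 0.7-0.9 are conventionally read as acceptable-to-good internal consistency, which is the interpretation applied to the 0.806 reported above.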

Student feedback

Of the 200 students, 40 (20%) had previous experience of attending an online practical assessment. A significant difference in opinion between these students and the rest of the class was observed on the following questions: 1. Was the time allotted for each student sufficient during the conduct of the online practical assessment? 95% vs 86.8% of the students agreed, while 0% vs 4.3% disagreed, respectively (P=0.02, χ2=5.07). 2. Did the process allow the students to use unfair means during the online viva? 12.5% vs 6.8% agreed, while 72.5% vs 80.5% disagreed, respectively (P=0.02, χ2=5.57). 3. How would you rate your computing skills to attempt the online practical assessment? 85% vs 59.9% rated themselves very good, while 0% vs 9.3% rated themselves poor, respectively (P=0.001, χ2=19.82). 4. How do you feel about your performance in the online practical assessment? 57.5% vs 41.2% felt very good, while 7.5% vs 13.1% did not feel good about their performance, respectively (P=0.001, χ2=8.84). 5. How would you rate the overall conduct of the examination? 60% vs 40.6% rated it excellent, while 40% vs 59.3% remained neutral, respectively (P=0.001, χ2=15.55). None of the students rated it poor. For the rest of the questions in the questionnaire, there was no significant difference between the two groups of students (Table 1).

Faculty feedback

A total of ten faculty members involved in the online practical assessment filled the questionnaire. The CVI and CVR scores were 0.85 and 0.95 respectively, indicating that the questions were suitable for the study. The questions administered had an internal consistency of 0.705, which was acceptable.

About 70% of the faculty had previous experience in conducting the online practical assessment. However, all of them had sound computer knowledge to conduct the online practical assessment and agreed that the online platform chosen for the practical assessment was comfortable and easy to use. About 70% of them agreed that online assessment helped them to understand the student’s preparedness in Biochemistry, 60% of the faculty felt good about students’ performance in the assessment and 50% agreed that the student's performance in the online practical assessment is a good indicator of how well they know/understand the subject, however, 40% of them were neutral about it. Nearly 70% of the faculty agreed that the time allotted for each student was sufficient during the conduct of the online practical assessment. The majority of the faculty (60%) were not sure if the students had used unfair means during the assessment and felt that students were less anxious than the offline assessment. Despite this, about 80% of them felt that the assessment was conducted in a fair manner. Nevertheless, 50% of the faculty voted for case discussion and the other 50% for the spotters to be the best choice of topic for the online practical assessment. Nearly 60% of them preferred offline examination however, they felt that online assessment can be employed as an alternative to offline assessment during pandemics as it has a positive effect on the learning progress. Finally, all of them rated excellent for the overall conduct of the online practical assessment.

Table 1. Comparing the feedback of the students with and without previous experience of online practical examination.

Columns: Students having previous experience (20%): Agree (%), Disagree (%); Students having no previous experience (80%): Agree (%), Disagree (%); Chi-square test.

Questions rated on the Agree/Disagree scale:
1. Were the instructions provided before the online assessment clear and easy to follow?
2. Was the online platform (Zoom) chosen for the practical assessment comfortable and easy to use?
3. Did the online practical assessment help you to understand your preparedness in Biochemistry?
4. Do you think your performance in the online practical assessment is a good indicator of how well you know/understand this subject?
5. Was the assessment conducted in a fair manner?
6. Was the time allotted for each student sufficient during the conduct of the online practical assessment?
7. Were the faculty competent to conduct the online practical assessment?
8. Did the process allow the students to use unfair means during the online viva?
9. Did you feel less anxious during the online practical assessment?
10. Was the online assessment better than the offline examination?
11. Was the online assessment consistent with the topics taught in the regular classes?
12. Do you feel online assessment can be applied as an alternative to offline assessment during pandemics?
13. Does the online assessment system have a positive effect on the learning progress?

Questions rated on the Very good/Poor scale (both groups; Chi-square test):
14. How would you rate your computing skills to attempt the online practical assessment?
15. How do you feel about your performance in the online practical assessment?

Question rated on the Excellent/Neutral scale (both groups; Chi-square test):
16. How would you rate the conduct of the examination?

Conducting assessment in some form to evaluate students’ knowledge and practical skills is essential to certify that they have attained the competencies prescribed by the training program. The COVID-19 pandemic has propelled almost all educational institutions, including medical institutions, onto online platforms (11,12). Higher Educational Institutions have employed several innovative and novel methods to deliver education to their students while ensuring their safety. Online platforms such as Zoom are being used extensively to create engagement with medical students (13). This approach proved beneficial to students who had returned to their native places during the COVID-19 pandemic: they could access lectures and study material from the comfort of their homes, ensuring they stayed up to date with the curriculum. This mode of learning was well received by the medical students, who could discuss clinical conditions, interesting case studies, and exam-related questions, allowing them to stay connected throughout these unprecedented times.

This transition of medical education to the online mode has brought significant changes in assessment methods. Imperial College London was the first to conduct an online examination for its medical undergraduates (8). This approach was followed by several medical schools in the interest of their students’ future, with various universities opting for open-book examinations. Written assessments can be comfortably carried out on an online platform; the challenge, however, lies in the conduct of the online practical assessment. Here, we have tried to create an online practical assessment model for Biochemistry considering the domains and levels of Bloom’s taxonomy, and we have analyzed the feedback of the students and the faculty on the design and conduct of the online assessment. The conduct of the online practical assessment was well received by both students and faculty. We employed a face-to-face interactive approach using the breakout room option of Zoom. Each student was individually interviewed by the faculty member in charge for 15 to 20 minutes, assessing mainly the cognitive and affective domains as well as the “knowledge”, “understanding”, “apply”, “analyze”, and “evaluate” levels. The highest level, “create”, and the psychomotor domain could not be assessed.

Among the 200 students, 63.7% had good computer skills, 88.7% had no difficulty following the instructions provided before the assessment, and 93.1% were comfortable with the Zoom platform chosen for the assessment, as the same platform had been used for the online classes. About 89.2% of the students agreed that the online assessment helped them understand their preparedness in Biochemistry; however, only 43.6% felt good about their performance, with an equal percentage (43.1%) neutral about it, and 72% felt it was a good indicator of how well they understood the topics. Nearly 95.1% of the students opined that the assessment was conducted in a fair manner, 88.7% agreed that the time allotted to them was sufficient, and 95.1% felt that the faculty were competent to conduct the online practical assessment. About 79.5% of the students stated that they did not use unfair means, 48.1% were less anxious during the assessment, and 86.3% agreed that the assessment was consistent with the topics taught in the regular classes. A plurality (40.7%) were neutral about their preference for online or offline assessment, while 34.8% preferred online and 24.5% preferred offline assessment, suggesting a mixed opinion. However, 77.9% opined that online assessment was a good alternative to offline assessment during the pandemic, as it has a positive effect on the student’s learning process. Overall, 94.1% of the students rated the conduct of the online practical assessment as very good or excellent.

A prospective cohort study of undergraduate students in pharmacology showed a good correlation between online assessment parameters and summative examination performance, indicating that regular online assessments may help identify students who would benefit from remedial measures (14). We observed a similar aspect in our study: as the faculty had one-to-one interaction with each student in the virtual model, students were at ease and opened up easily in the comfort of their homes, enabling the faculty to give them critical feedback on the areas they could improve. We report a significant difference in the opinions of students with and without previous experience of online assessment; this can be addressed by providing a few training sessions so that inexperienced students can perform better and be at par with their peers. An exploratory case study of third-year medical students in Saudi Arabia reported that formative online assessment on Blackboard improved the students’ performance in the final exam and that the blended learning method was liked by a majority of the students (15). A study of first-year undergraduate medical students’ perceptions of the reliability, usefulness, and practical challenges of online tests reported that online formative assessments were helpful despite practical challenges such as network connectivity issues; viva-voce by video conferencing was considered the most reliable assessment method, and multiple-choice-question-based assessment the most practically feasible (16). Network connectivity was also one of the challenges mentioned by our students in the descriptive section of the questionnaire, and a few of them found it difficult to write the exam with the screen in front of them. However, the face-to-face viva-voce was appreciated by the students, as it can curb malpractice.
Most of the students liked the comfort of writing the exam from home, felt less anxious in the stress-free environment, liked the face-to-face interaction, and could manage their time better. Using earphones helped them concentrate, and they felt the assessment was systematic and organized, helping them focus on the exam without being distracted by their peers. On the whole, the students and the faculty had a new and positive experience.

Several advantages of conducting examinations online have been listed: they are environment friendly, minimizing the wastage of paper; economical, considering the reduced human, logistic, and administrative costs; easy to use for both the examiner and the examinee; the assessment may be taken from the comfort of the student’s home; faculty can monitor the students throughout the course in real time; and unique alternatives to traditional assessment patterns may be discovered and applied to spark student interest and motivation. Several disadvantages have also been listed: challenges in adopting new technology may create minor disruption, although this can be overcome by a brief period of familiarization; infrastructural barriers exist where access to electricity and stable internet connectivity is difficult to achieve; long subjective answers and practical skills are difficult to assess and may require upgraded technology; and the format is susceptible to unfair means, as students may resort to impersonation or external help, which may require artificial-intelligence-based proctoring features or a transition to open-book exams, since it is difficult to prevent students from using external help during online assessment (9,10). Educational institutions may opt to adopt online teaching-learning and assessment strategies in their regular academic programs, which will help them benefit from the advantages listed above and also familiarize students and faculty with these tools, so that they are better prepared for any such situation in the future.

A limitation of the study is that we could not correlate the marks of the students with and without previous experience of online assessment, as the feedback was collected anonymously. The study was carried out in a single private medical school and may serve as the basis for multicentric studies on larger cohorts for confirmation. Further studies may also investigate whether regular online assessments improve the overall academic performance of medical undergraduate students.


The current study reports that the online practical assessment was well received by the undergraduate medical students and that the Zoom breakout room option was comfortable for both the students and the faculty. However, previous experience with online practical assessment would benefit the students; hence, a trial session should be conducted before the final assessment. The online practical assessment designed in this study was able to assess the cognitive and affective domains as well as the “knowledge”, “understanding”, “apply”, “analyze”, and “evaluate” levels of Bloom’s taxonomy. The face-to-face viva-voce was considered the most reliable method of assessment, and the students found it less stressful to attend the exam from the comfort of their homes. However, network connectivity was the major challenge faced by the students.


Ethics approval and consent to participate: 

The JSS Medical College Ethical Committee clearance was obtained via letter no. JSSMC/IEC/110621/12NCT/21-22, dated 11.06.21, before submitting the results for publication.

Consent for publication: 

Not applicable

Availability of data and materials: 

The raw data of the current study will be made available by the corresponding author (Dr. Akila Prashant, [email protected]) on reasonable request.

Competing interests: 

The authors declare that they have no competing interests.



Authors' contributions: 

SNK, SCR, and BA, designed the study, executed it, and carried out the literature survey; AD, KKS, SKR, and SN, executed the study and drafted the manuscript; ABS, SP, and DD, executed the study and analysed the results; PV, SMN, and AP executed the study, carried out the literature survey and critically evaluated the manuscript. All the authors reviewed the manuscript.


We acknowledge the leadership of JSSAHER for providing us with all the necessary infrastructure to carry out the teaching-learning and assessment activities in the online mode. 



  1. Brown M, McCormack M, Reeves J, Brooks DC, Grajek S, Alexander B, et al. 2020 EDUCAUSE Horizon Report: Teaching and Learning Edition [Internet]. 2020 [cited 2021 Jun 7]. Available from:
  2. Pelletier K, Brown M, Brooks DC, McCormack M, Reeves J, Arbino N, et al. 2021 EDUCAUSE Horizon Report: Teaching and Learning Edition [Internet]. 2021 [cited 2021 Jun 7]. Available from:
  3. O’Doherty D, Dromey M, Lougheed J, Hannigan A, Last J, McGrath D. Barriers and solutions to online learning in medical education - An integrative review. BMC Med Educ. 2018;18(1):1–11.
  4. Alkhowailed MS, Rasheed Z, Shariq A, Elzainy A, El Sadik A, Alkhamiss A, et al. Digitalization plan in medical education during COVID-19 lockdown. Informatics Med Unlocked. 2020;20. Available from:
  5. Husain FN. Use of Digital Assessments: How to Utilize Digital Bloom to Accommodate Online Learning and Assessments? Asian J Educ Train. 2021;7(1).
  6. Norcini JJ, McKinley DW. Assessment methods in medical education. Teach Teach Educ. 2007;23(3):239–50.
  7. Moving practical assessment online - Learning and Teaching [Internet]. [cited 2021 Jun 7]. Available from:
  8. Medical students take final exams online for first time, despite student concern | Imperial College London | The Guardian [Internet]. 2020 [cited 2021 Jun 7]. Available from:
  9. What are the pros and cons of online assessment? | LinkedIn [Internet]. [cited 2021 Jun 8]. Available from:
  10. The Advantages and Disadvantages of an Online Examination System [Internet]. [cited 2021 Jun 8]. Available from:
  11. Chen BY, Kern DE, Kearns RM, Thomas PA, Hughes MT, Tackett S. From modules to MOOCs: Application of the six-step approach to online curriculum development for medical education. Acad Med. 2019;94(5):678–85.
  12. Sandhu P, de Wolf M. The impact of COVID-19 on the undergraduate medical curriculum. Med Educ Online. 2020;25(1):1764740.
  13. Kay D, Pasarica M. Using technology to increase student (and faculty satisfaction with) engagement in medical education. Adv Physiol Educ. 2019;43(3):408–13.
  14. Kühbeck F, Berberat PO, Engelhardt S, Sarikas A. Correlation of online assessment parameters with summative exam performance in undergraduate medical education of pharmacology: a prospective cohort study. BMC Med Educ. 2019;19(1):412.
  15. Baig M, Gazzaz ZJ, Farouq M. Blended learning: The impact of blackboard formative assessment on the final marks and students’ perception of its effectiveness. Pakistan J Med Sci. 2020;36(3):327–32.
  16. Snekalatha S, Marzuk M, Meshram SA, Uma Maheswari K, Sugapriya G, Sivasharan K. Medical students’ perception of the reliability, usefulness and feasibility of unproctored online formative assessment tests. Adv Physiol Educ. 2021;45(1):84–8.