Optimizing Quality of Electronic Medical Records as a Platform for Resident Education

Background: Electronic medical records (EMRs) are becoming a predominant feature of health care and an important tool in residents' medical education. Few studies have examined and compared the EMR performance of residents from different departments. This study explores the quality of EMRs in residency and offers pedagogical implications.

Method: The authors developed an EMR checklist that examined fundamental EMR requirements through six dimensions and 20 items. A total of 149 records created by residents from 32 departments/stations were randomly chosen. Seven senior physicians rated the EMRs using the checklist. Medical records were grouped as General Medicine, Surgery, Pediatrics, Obstetrics and Gynecology, and Other Departments. Overall and grouped performances were analyzed by ANOVA.

Results: Overall performance was rated as fair to good. Among the six dimensions, the discharge note (0.81) gained the highest score, followed by the admission note (0.79), problem list (0.73), overall performance (0.73), progress note (0.71), and weekly summary (0.66). Among the five groups, Other Departments (80.20) gained the highest total score, followed by Obstetrics and Gynecology (78.02), Pediatrics (77.47), General Medicine (75.58), and Surgery (73.92).

Conclusions: The quality of EMRs, core competencies, and clinical reasoning skills are deeply interconnected in resident education. This study suggests that the use of duplication in EMRs and the communication abilities of residents are associated with the variable quality of medical records in different departments. Further study is required to apply the insights obtained here to improve the quality of EMRs, and thereby the effectiveness of resident training. This study extends previous research to demonstrate the need for improving the dimensions and items in residents' EMRs.
Our results suggest that duplication and unqualified clinical communication skills present hidden problems in the use of the EMR system in complex clinical scenarios. In addition, the varied results across dimensions and groups offer potential insights into the various cultures and needs of EMR training for residents in different specialties. From the examination of authentic resident-created EMRs, we provide pedagogical suggestions which thoroughly encompass the spectrum of resident education.


Introduction
The electronic medical record (EMR) is becoming a predominant instrument for improving the delivery of quality health care [1,2]. Additionally, the EMR is becoming an important tool in medical education for both medical students and residents. A previous study has linked key EMR issues to the Accreditation Council for Graduate Medical Education Core Competencies [3]. Bowen proposed the EMR as a potential platform to assess residents' performance and train their competency in clinical reasoning [4]. She presented key elements of medical education on clinical processes, which were in line with elements of the EMR. Collectively, investment in EMRs could help them function as a bridge to the six core competencies [5], which are closely related to the clinical reasoning process.
There were, however, challenges for residents using EMRs in daily clinical practice. EMRs could overwhelm residents with massive amounts of data, stunt critical clinical thinking, restrict doctor-patient communication, and thereby weaken support from interprofessional collaboration [3]. For residents under training, the EMR system had the unintended consequence of limiting patient contact. This restriction might result in a dilemma in which residents are equipped with medical knowledge but inexperienced in the core competencies of residency training [6,7,8]. Moreover, a resident's performance in a clinical encounter is based upon the various affordances of the clinical scenario [9]. Residents need to recall fundamental medical knowledge from school, and this cognitive process is slow when confronted with a complex clinical context. There is, therefore, a gap between medical knowledge and specific encounters, which leads to incomplete information in medical records [4,10,11]. In this regard, doctors' clinical reasoning during residency may be inconsistent and variable. There is an inferential distance between clinical reasoning and the medical record writing which residents are required to perform while equipped with insufficient medical knowledge [12]. In such situations, note templates and copy-and-paste are frequent strategies, but these dramatically decrease the quality of medical records as a platform for communication among residents and other medical team members [13,14]. Hence, it requires effort and a transition period to move from clinical observation to medical writing, and then to the EMR system. These challenges are demanding for young residents who are inexperienced in clinical encounters and reasoning processes.
The EMR includes admission notes, progress notes, a weekly summary, and discharge notes. The EMR system affects how medical records are evaluated and how pedagogical suggestions are provided to learners [5]. Medical educators should develop an organized, uniform, and efficient method to train EMR writing skills [15,16]. However, the question remains whether customized educational training programs for EMR improvement would contribute to clinical documentation, medical education, and patient safety. The literature on EMR training discusses its application for medical students or for clinicians in a single department [6,17,18,19]. Few studies have investigated the differences in EMR quality between residents from different departments. Residents might be confronted with learning challenges due, in part, to the lack of proper feedback from senior physicians regarding EMRs. The feedback content also varies between residents from different specialties or departments. In other words, qualified EMRs from certain residents might imply that these residents have obtained principal clinical abilities during residency training. The information, however, is lacking as a guide for residents to achieve quality writing in medical records. An investigation of residents' EMR performance is required to reveal the main issues that need to be improved.
The Quality Center of our hospital carried out a competition to survey the current state of EMR use. The aim of this study is to identify the critical issues in EMRs written by residents from various departments. We focused on inpatient records as the medical record format or template. We also grouped departments that shared similar workloads to examine the quality of their EMRs. The research questions are: (1) What is the overall quality of EMRs among residents from different specialties? (2) Does EMR quality vary between residents from different departments/specialties?

Method

Design
During 2018-2019, the Quality Center held a competition for improving the quality of medical records. The competition aimed to raise healthcare providers' awareness of medical records and investigate medical record training in departments/stations so that clinical teachers could construct a systematic medical record training program.
Competition participants were hospital staff including attending physicians, residents, interns, nurse practitioners, and medical students. The competition included medical records examination, self-assessment reports from departments, and oral presentations. This program was reviewed and approved by the institutional review board at National Cheng Kung University Hospital.

Data collection
This study collected inpatient records created by residents from the competition. The reason was that inpatient records were summary documents containing admission notes, treatment courses, and patient instructions after discharge.
Additionally, residents created inpatient records and attending physicians offered follow-up feedback as inspectors. For these reasons, we believed that an inpatient record should be a comprehensive document for examining residents' performance in creating medical records. As Table 1 showed, 32 out of 37 departments in our hospital participated in the competition, a participation rate of 87%. Three to five inpatient records were randomly selected from each station. The staff who chose the random sample could not participate in the rating process. In the end, 149 inpatient records were rated.

Assessment
To examine the quality of inpatient records, the Quality Center first produced a draft checklist based on our hospital's clinical needs and on a literature review [9,14]. Next, the Quality Center held an interactive workshop involving multiple departments at our hospital. Medical records committee members and senior doctors reviewed a revised checklist based on the workshop feedback.
The inpatient record checklist contained six dimensions and 20 items (Table 2). The six dimensions were: (1) admission note (25%), (2) problem list (8%), (3) progress note (25%), (4) weekly summary (4%), (5) discharge note (25%), and (6) overall performance (13%). Additionally, raters could check "outstanding performance" for an extra 9 points. Each dimension carried a different percentage of points based on its function in an inpatient record. Each dimension and item could be rated as excellent, good, fair, or poor. Before the rating process, the seven raters explored three inpatient records, and inter-rater reliability was calculated at .86 using Cronbach's alpha.
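The inter-rater reliability check described above can be illustrated with a short sketch: Cronbach's alpha computed by treating the raters as "items" and each pilot record as a case. The scores below are hypothetical, invented for illustration, and are not the study's pilot data.

```python
# Minimal sketch of the inter-rater reliability check: Cronbach's alpha,
# treating raters as "items" and each pilot record as a case.
# The scores below are hypothetical, not the study's pilot data.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(ratings):
    """ratings: one row per record, one column (score) per rater."""
    k = len(ratings[0])                      # number of raters
    rater_columns = list(zip(*ratings))      # scores grouped by rater
    item_variance = sum(variance(col) for col in rater_columns)
    total_variance = variance([sum(row) for row in ratings])
    return k / (k - 1) * (1 - item_variance / total_variance)

# Three hypothetical pilot records, each scored by three raters.
pilot = [
    [80, 82, 78],
    [65, 70, 66],
    [90, 88, 91],
]
print(round(cronbach_alpha(pilot), 2))  # → 0.99
```

In the study, seven raters scored three pilot records; the same function applies with seven columns per row.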

Data analysis
This study used scores obtained for six dimensions and 20 items. Data were categorized into five groups, namely General Medicine, Surgery, Pediatrics, Obstetrics and Gynecology, and Other Departments. Scores were analyzed using descriptive statistics, and ANOVA was used to examine the average differences between and within groups. Due to the unequal variance of the five groups, we used generalized least squares, instead of ordinary least squares, to calculate the differences.
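The study handled unequal group variances with generalized least squares. As a hedged stand-in, the sketch below implements Welch's one-way ANOVA, a common alternative that likewise down-weights high-variance groups; both the technique choice and the group scores are illustrative assumptions, not the study's actual procedure or data.

```python
# Hedged sketch: Welch's one-way ANOVA for groups with unequal variances.
# The department scores below are hypothetical, not the study data.

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def welch_anova(groups):
    """Return (F, df1, df2) for a list of per-group score lists."""
    k = len(groups)
    w = [len(g) / variance(g) for g in groups]   # precision weights n_i / s_i^2
    W = sum(w)
    grand = sum(wi * mean(g) for wi, g in zip(w, groups)) / W
    numer = sum(wi * (mean(g) - grand) ** 2
                for wi, g in zip(w, groups)) / (k - 1)
    h = sum((1 - wi / W) ** 2 / (len(g) - 1) for wi, g in zip(w, groups))
    denom = 1 + 2 * (k - 2) / (k * k - 1) * h
    return numer / denom, k - 1, (k * k - 1) / (3 * h)

# Hypothetical total scores for three department groups.
groups = [[74, 72, 75, 73], [78, 80, 77, 79], [80, 81, 79, 83]]
F, df1, df2 = welch_anova(groups)
```

A large F relative to an F(df1, df2) reference distribution would indicate that the group means differ beyond what the unequal variances alone can explain.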

Results
Overall performance
The overall quality of medical records was rated "fair" to "good" (Table 3).

Inter-comparison among dimensions and the five groups
To investigate each dimension's performance, we calculated standardized scores (group mean / total score). As Table 4 showed, the weekly summary needed the most improvement (0.66), followed by the progress note (0.71), problem list (0.73), overall performance (0.73), admission note (0.79), and discharge note (0.81). To understand the performance of each item among the five groups, an ANOVA analysis was conducted with the grand means as a reference group. As Table 5 showed, residents of Other Departments gained significantly higher scores than the grand mean across several dimensions. Among the 20 items, surgical residents scored higher than the grand mean on only one item (Item 12 in the progress note, 0.17) and significantly lower on seven items (Table 5).

Discussion
The literature on EMRs was mostly about the use of the system; only a few studies have discussed EMRs in terms of resident education [1,21]. This study explored the dimensions and items within the EMRs created by residents from various departments. The checklist in this study could be used as a statement of fundamental requirements in residency. The results indicated that residents showed diverse performance. Overall performance was rated as fair to good. This result was consistent with previous studies which indicated that residents demonstrated inconsistencies between their medical knowledge and their clinical encounters, leading to limited performance in EMRs [4,9,12].
Of the six dimensions, the weakest was the weekly summary, followed by the progress note, problem list, overall performance, admission note, and discharge note. Returning to the checklist, the assessment for the weekly summary and progress note emphasized that residents should avoid overuse of copy-and-paste, document patients' daily changes, and summarize daily information logically. For example, the format of the progress note was subjective-objective-assessment-plan (SOAP), but we found that duplication often occurred in the subjective section, which led to low scores in this dimension. Instead of renewing daily examination results, some residents would copy admission notes into progress notes.
From an educational perspective, residents tried to apply medical knowledge to practice-based encounters to identify problems, establish diagnoses, and summarize the report in narrative or chronological order. However, in the current EMR system, information is often presented in segments, with data split across multiple screens and modules in various formats. Given the massive amount of data in EMR systems, residents needed to put more effort into refining everyday changes and selecting useful data to complete a precise and concise medical record. Discharge notes in the inpatient record, consisting of a summary with clearly written orders and feedback from attending physicians, were well developed. Conversely, the weaker-performing dimensions, the weekly summary and progress notes, required careful clinical reasoning, including collecting the patient's story and screening data to judge critical writing points.
With judgment based on medical knowledge and patient-care professionalism, a resident could then reach an accurate diagnosis and create an illness script, thus producing high-quality EMRs [4]. However, the massive amount of data in the EMR system may hinder the clinical reasoning process of constructing the EMR de novo or receiving essential feedback from residents' advisors or colleagues. This situation indicated that residents might ignore the interconnection among dimensions within EMRs, which leads to long but pointless records. In the United States, only 24% of healthcare organizations establish regulations to limit the use of duplication in EMRs. Additionally, more than 60% of residents duplicate data without confirming its accuracy, and this leads to 2-3% of diagnoses being inappropriate [22]. Moreover, when residents relied solely on copy-and-paste from previous records or templates in the EMR system, the feedback was likely not meaningful because the records were not created through doctor thinking but through computer-program thinking [1].
We proposed that in such clinical reasoning processes, the competencies of medical knowledge, interpersonal communication skills, and systems-based practice might be required [3]. This finding was also in line with a previous study which demonstrated that the effective use of EMRs requires narrative elements, data elements, and system elements combined in the context of patient care [5]. In other words, the quality of EMRs, the core competencies, and the clinical reasoning skills are closely interconnected in resident education.
We also found differences among groups. Performance varied depending on the unique affordances and workload of each department. Surgery gained the lowest scores among the five groups. Surgical residents performed well solely on Item 12, the item specific to the operation record. This could be a result of the specific culture of Surgery; for example, heavy workload, time pressure, insufficient information from interviews, lack of feedback, and the focus on surgical skills training. As such, residents in Surgery require extra opportunities to learn clinical reasoning processes and chronological description in the EMR system, rather than overly relying on templates and the copy-and-paste strategy.
Conclusively, meaningful experiences using the EMR system should be consistently implemented in clinical training, and the integration of the EMR system and core competencies should be rigorously designed and assessed. Enhancing the EMR as a meaningful tool could benefit not only residents' clinical skills but also their attitudes toward clinical reasoning and professionalism [6,17].

Pedagogical suggestion
A previous study suggested that EMRs can affect how residents develop clinical reasoning skills and documentation strategies [5]. Stephens, Gimbel, and Pangaro (2011) proposed a specific educational approach to clinical documentation called the Reporter-Interpreter-Manager-Educator scheme, which structures the introduction, expectations, and assessment of medical record writing skills throughout the medical education process. We further suggest two key points for EMR teaching in resident education. First, in spite of the convenience of the EMR system, residents should avoid duplication and offer additional, concise information through careful judgment of past findings and future plans. The avoidance of duplication should be a key point in resident education [23].
The overuse of duplication also indicates a gap in clinical communication. For example, the review of systems (ROS) provides a comprehensive list beyond the chief complaint and present illness, and helps doctors make clinical decisions [24]. In this study, the findings show that our residents may lack a comprehensive review of patients. It seems that the EMR has not demonstrated a positive impact on communication [23,25]. In the era of specialized divisions in medicine, a comprehensive review is particularly necessary because it may affect more than 10% of final clinical diagnoses [26]. An EMR system may enhance physicians' ability to complete information tasks but can also make it more difficult to focus attention on other aspects of patient communication.
Some suggestions are given to mitigate EMRs' negative educational impact [23]. First, a meeting of residents, attending physicians, and other healthcare providers can be held for real, interprofessional communication. After that, doctor-patient communication can follow, so that patients' courses are documented through doctors' narratives. Second, residents can be trained to interact with patients before directly reviewing EMRs [14]. Third, a systematic coaching program may be helpful for integrating important elements and strategies as a whole-package course, so that clinical teachers can train residents effectively.

Limitations
This study contributes to resident education in the era of EMR, but there are some limitations. First, although the sample in this study contained residents from various departments, it only included 7-10% of the residents in our hospital.
Additionally, the inpatient record checklist was created for clinical assessment; its design process relied on medical experts' discussions and the literature review. In future studies, we will expand the sample to other hospitals and examine reliability and validity to promote generalization. Second, departments had created medical record templates based on their unique cultures and needs long before this study. Although the Quality Center announced the assessment standards before the competition, it was not easy for every department to transform its templates in such a short time. This might lead to a bias whereby departments that carefully followed the checklist achieved better performance. Third, residents from different departments experience unique work requirements and clinical cultures, and thus this study compared the five groups instead of comparing group members with each other. Conclusively, despite the limitations, this study provides new insights into resident education and offers directions for future studies on EMRs.

Conclusion
This study extends previous research to demonstrate the need for improving the dimensions and items in residents' EMRs. Our results suggest that duplication and unqualified clinical communication skills present hidden problems in the use of the EMR system in complex clinical scenarios. In addition, the varied results across dimensions and groups offer potential insights into the various cultures and needs of EMR training for residents in different specialties. From the examination of authentic resident-created EMRs, we provide pedagogical suggestions which thoroughly encompass the spectrum of resident education.

Declarations
Ethical Approval and consent to participate
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
The institutional review board at National Cheng Kung University Hospital approved a waiver of the requirement for obtaining informed consent (Approval No. B-EX-108-048).

Consent for Publication
Not applicable

Competing interest
The authors declare that they have no competing interests.

Funding
This work was supported by grants from the National Science Council of Taiwan [MOST 108-2314-B-006-097 to JNR].
Authors' contributions
HH and JNR led on conception, design, interpretation of data, writing the article, and critically appraising the content, and are the guarantors of the paper. LLK contributed to the literature review and data analysis. CCL contributed to organizing the competition. CCT, HWH, SYW, YNH, PYL, JLW, and PFC were responsible for data collection. All authors have approved the final version of the article submitted.