Evaluating the quality, utility, and reliability of the information in uveitis videos shared on YouTube

Abstract

This study aimed to analyze the quality of videos on YouTube as educational resources about uveitis. An online YouTube search was performed using the keyword "uveitis". Total view counts, video durations, publishing dates, likes and dislikes, numbers of comments, and video sources were recorded. The quality and accuracy of each video's educational content were evaluated using the DISCERN score, the Global Quality Score (GQS), and the Journal of the American Medical Association (JAMA) score. The video power index (VPI) was used to evaluate both the view and like ratios of the videos. All videos were classified by publisher and content category. Of the 200 videos screened, 94 were included. The mean DISCERN score was 38.5 ± 13.2 (poor), the mean JAMA score was 1.8 ± 0.6 (fair), and the mean GQS was 2.5 ± 0.9 (fair). There were positive correlations among the three checklists (p < .001). VPI was not correlated with any of the scores (p > .05). The most common upload sources were ophthalmologists (24.4%) and YouTube channels about health (20.2%). Regarding content, we identified 47 (50%) medical education, 26 (27.6%) patient education, 16 (17%) patient experience, and five (5.3%) surgical procedure videos involving patients with uveitis. While the most popular videos were uploaded by doctors other than ophthalmologists, videos uploaded by academic institutions and associations of healthcare professionals had higher educational quality and reliability scores. Uveitis videos on YouTube are of poor quality and reliability and are not adequately educational for patients. Therefore, physicians must be aware of the limitations of YouTube and ensure that correct medical information reaches patients.


Introduction
YouTube is one of the most popular websites: it is the most widely used free video-sharing platform worldwide, the second largest search engine, and the third most visited website after Google and Facebook [1,2]. With the increase in smartphone use and the ease of internet access, YouTube has become even more widespread. With 500 h of new video content uploaded per minute and nearly 2 billion visitors per month, the platform is a rapidly growing visual library [1]. Because of this vast capacity, YouTube has become a frequent choice for both patients and healthcare professionals to gain knowledge and share their experiences [3]. However, alongside its popularity, easy accessibility, and free access, YouTube can carry false and potentially misleading health-related information because uploaded videos undergo no peer review or content control. It is therefore important for providers of health-related videos to be aware of the quality and accuracy of the medical information that patients access online [4]. Ophthalmologists have previously published studies on the quality of YouTube videos regarding retinitis pigmentosa, cataract surgery, refractive surgery, strabismus, keratoconus, multifocal intraocular lenses, keratoplasty, and soft contact lenses [4-9].
To our knowledge, no previous study has assessed YouTube videos addressing uveitis. Thus, our study aimed to evaluate the quality of YouTube videos addressing uveitis as educational resources for patients.

Materials and methods
An online search of YouTube was performed using the keyword "uveitis", with results sorted by view count, and the first 200 videos were recorded. All searches were performed after clearing the entire search history and without logging in. Irrelevant, duplicated, soundless, poorly accessible, and non-English videos, as well as videos shorter than 60 s, were excluded; the remaining 94 videos were included in the study. The following data were recorded for each video: title, views, duration (s), time since upload (days), daily views, numbers of likes and dislikes, number of comments, like ratio (likes × 100/[likes + dislikes]), DISCERN score, Global Quality Score (GQS), Journal of the American Medical Association (JAMA) score, and video power index (VPI). Each video's source was categorized as follows: ophthalmologist, YouTube channel about health, academic institution, association of healthcare professionals, private hospital or trading company advertising, patient, or doctor other than an ophthalmologist. The videos were also grouped by content as medical education, patient education, patient experience, or surgical treatment of a patient with uveitis.
DISCERN is a scoring system developed at Oxford University that evaluates the reliability and educational quality of information, especially regarding treatment. It consists of 16 questions in three sections, each question scored from 1 to 5. The first section (questions 1-8) evaluates the reliability of a publication; the second section (questions 9-15) focuses on treatment-related information; and the third section consists of a single question (question 16) addressing the overall quality of the content. This final question is excluded from the total, so scoring is based on the first 15 questions. The DISCERN total therefore ranges from 15 to 75 points and is classified as excellent (63-75 points), good (51-62 points), fair (39-50 points), poor (27-38 points), or very poor (15-26 points) [10,11]. The JAMA scoring system is a well-known quality evaluation tool for assessing the transparency and publication information of a video's source. It involves four criteria: authorship, attribution, disclosure, and currency, each scored from 0 to 1; four points indicate the highest quality [12].
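The DISCERN bands described above can be expressed as a small lookup. This is an illustrative sketch only (the study scored videos manually; the function name is ours):

```python
def discern_band(total: float) -> str:
    """Map a DISCERN total (15-75) to the quality band used in the text."""
    if not 15 <= total <= 75:
        raise ValueError("DISCERN totals range from 15 to 75")
    if total >= 63:
        return "excellent"   # 63-75 points
    if total >= 51:
        return "good"        # 51-62 points
    if total >= 39:
        return "fair"        # 39-50 points
    if total >= 27:
        return "poor"        # 27-38 points
    return "very poor"       # 15-26 points
```

For example, the study's mean DISCERN score of 38.5 falls in the "poor" band.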
The Global Quality Score (GQS) is a 5-point scale used to evaluate the overall quality of a video's content, from 1 (poor quality) to 5 (excellent quality) [13].
The video power index (VPI), defined by Erdem and Karaca, evaluates the popularity of a video. The like ratio (likes × 100/[likes + dislikes]) and the view ratio (number of views/days since upload) were calculated first; the VPI was then computed as like ratio × view ratio/100 [14].
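The VPI calculation can be sketched in a few lines (illustrative only; the function and variable names are ours, not from [14]):

```python
def like_ratio(likes: int, dislikes: int) -> float:
    """Like ratio = likes * 100 / (likes + dislikes)."""
    return likes * 100 / (likes + dislikes)

def view_ratio(views: int, days_since_upload: int) -> float:
    """View ratio = number of views / days since upload."""
    return views / days_since_upload

def video_power_index(likes: int, dislikes: int,
                      views: int, days_since_upload: int) -> float:
    """VPI = like ratio * view ratio / 100, per Erdem and Karaca [14]."""
    return like_ratio(likes, dislikes) * view_ratio(views, days_since_upload) / 100
```

For example, a video with 90 likes, 10 dislikes, and 1,000 views over 10 days has a like ratio of 90, a view ratio of 100, and a VPI of 90.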
Each video was independently scored by two experienced uveitis specialists (B.T. and M.O.), and the mean values of the DISCERN, GQS, and JAMA scores were evaluated.

Statistical analysis
Statistical analyses were performed with SPSS software, version 21.0 (SPSS Inc., Chicago, IL, USA). Descriptive statistics for continuous variables are presented as means and standard deviations; categorical variables are presented as frequencies and percentages. The Kolmogorov-Smirnov test was first used to determine whether the data were normally distributed. Continuous variables were compared between groups with the Kruskal-Wallis test, and the Mann-Whitney U test with Bonferroni correction was used to test the significance of pairwise differences. Correlations were assessed with the Spearman test. P-values less than 0.05 were considered statistically significant.
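The Bonferroni step amounts to simple arithmetic: with k groups there are k(k-1)/2 pairwise Mann-Whitney comparisons, so the per-comparison significance threshold becomes 0.05 divided by that count. The following sketch illustrates this (the paper does not report the adjusted thresholds it used; the group counts below are taken from the study's four content categories and seven publisher types for illustration):

```python
from itertools import combinations

def bonferroni_alpha(n_groups: int, alpha: float = 0.05) -> float:
    """Per-comparison significance threshold when all pairwise tests are run."""
    n_comparisons = len(list(combinations(range(n_groups), 2)))  # k*(k-1)/2
    return alpha / n_comparisons
```

For the four content categories this gives 6 comparisons and a threshold of about 0.0083; for the seven publisher types, 21 comparisons and about 0.0024.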

Results
A total of 94 videos met the inclusion criteria. Videos in any language other than English (n = 80), soundless videos (n = 16), videos of animal eye diseases (n = 7), duplicate video (n = 1), video shorter than 60 s (n = 1), and videos unrelated to uveitis (n = 1) were excluded. Table 1 summarizes the descriptive statistics of the 94 included videos.
Of the 94 videos, 47 (50%) were medical education, 26 (27.6%) were patient education, 16 (17%) were patient experience, and five (5.3%) were surgical procedures involving a patient with uveitis. Regarding publishers, 23 videos (24.4%) were uploaded by ophthalmologists, 19 (20.2%) by YouTube channels about health, 18 (19.1%) by academic institutions, 12 (12.7%) by associations of healthcare professionals, nine (9.5%) by private hospitals or trading companies, nine (9.5%) by patients, and four (4.2%) by doctors other than ophthalmologists. By country of origin, most videos had been uploaded from the United States and India (38.3% and 36.2%, respectively).
The DISCERN, GQS, JAMA, and VPI scores were compared among categories and publishers (Tables 2 and 3). While the DISCERN, GQS, and JAMA scores differed significantly among categories and publishers (p < 0.01), there was no significant difference between the groups in mean VPI score (video popularity) (p > 0.05). In the evaluation by video category, medical and patient education videos had significantly higher quality of information (DISCERN score) than patient experience videos (patient experience vs patient education, p < 0.05; patient experience vs medical education, p < 0.001); there was no significant difference between the other categories (p > 0.05). The mean GQS (educational value) and JAMA score (transparency) of patient experience videos were significantly lower than those of patient and medical education videos (p < 0.01), with no significant differences among the other groups (p > 0.05). When videos were classified by publisher, the DISCERN scores of academic institution-sourced videos were significantly higher than those of patient- and advertising-sourced videos, and their JAMA scores were significantly higher than those of videos from YouTube channels about health and patient-sourced videos (p < 0.01 for both).

Table 4 presents correlations between the DISCERN score, GQS, JAMA score, VPI, video duration, time since upload, and number of comments. The DISCERN score was significantly correlated with the GQS and JAMA scores (p < 0.001). The VPI was significantly correlated with time since upload and number of comments (r = − 0.401, p < 0.001; r = 0.416, p < 0.001, respectively). Furthermore, video duration was positively correlated with the DISCERN score, GQS, and JAMA score (r = 0.565, p < 0.001; r = 0.376, p < 0.001; r = 0.29, p < 0.01, respectively).

Discussion
YouTube videos about uveitis are watched frequently, and some receive roughly 30,000 views per year. Recent studies have reported that 75-80% of patients use the Internet to obtain medical information [15]. However, because it is easily accessible and free, YouTube may contain inaccurate and potentially misleading health-related information, given the lack of peer review and content control for uploaded videos.
In our study, we used the DISCERN, GQS, and JAMA scores to investigate the accuracy and reliability of uveitis videos available online; the mean values were 38.5 ± 13.2 (poor), 2.5 ± 0.9 (fair), and 1.8 ± 0.6 (fair), respectively. These results demonstrate that although patients turn to YouTube videos to learn about uveitis, the videos do not adequately educate them about the disease. When the videos were grouped by category and their popularity evaluated with the objective VPI scoring system, the highest VPI score was found in surgical videos of patients with uveitis and the lowest in patient experience and medical education videos.
Although the mean DISCERN and GQS scores of medical education videos were higher than those of the other categories, their VPI score was lower. Similar results have been demonstrated in previous studies [4,11,16]. These findings indicate an inverse relationship between the DISCERN and VPI scores in our study, suggesting that popularity decreases as educational quality increases. The high VPI of surgical videos may reflect clinicians learning about surgical procedures or patients being curious about them. The low quality and reliability of patient experience videos may stem from incorrectly obtained information being presented directly to YouTube users without any filtering. When the uveitis videos were divided by category, nearly 50% had been uploaded for medical staff training (medical education), whereas about 25% had been uploaded for patient education. Patient education videos had the highest GQS but a lower DISCERN score than medical education videos. This may be because patient education videos focus on information aimed at patients, whereas medical education videos cover more academic subjects.
In our study, videos were also classified by publisher. The highest DISCERN score was seen in videos published by academic institutions, whose DISCERN and VPI scores were 47.1 ± 15.1 and 5.6 ± 4.5, respectively; that is, academic institution videos had high DISCERN scores despite low VPI scores. Many uveitis videos were course recordings that academic institutions and ophthalmologists had transferred directly to YouTube to train medical staff. Such videos have high educational quality but low popularity, and patients may have difficulty understanding these lectures, which contain medical terminology. Indeed, when the first 200 videos in our study were screened against the study criteria, most did not contain phrases such as "for patients" or "for healthcare professionals" in their titles. Accordingly, physicians who upload videos to train healthcare personnel should be aware that patients can also watch them. To resolve this ambiguity, it may be helpful to create an upload option that identifies medical videos as targeting patients or healthcare professionals. Kucuk et al. obtained similar results and emphasized the importance of specifying the target audience in the video title [11].
Recently, there has been an increase in studies evaluating YouTube videos about ocular conditions. In one of the first such studies in ophthalmology, Guthrie et al. investigated YouTube videos about retinitis pigmentosa and observed that 50% of the videos were misleading and only one-third were useful [5]. In another study, Bae and Baxter examined 72 YouTube videos about cataract surgery and found their educational quality inadequate for patients [7]. In studies evaluating the quality of YouTube videos on refractive surgery, soft contact lenses, multifocal lenses, and keratoplasty, video quality was found to be poor or fair [9,11,17,18]. Our study obtained similar results: although some YouTube videos contain useful information for uveitis patients, most are of poor quality and reliability and provide inadequate information.
Another main finding of the present study was that the main scoring checklists (DISCERN, JAMA, and GQS) were positively correlated with one another, whereas the VPI did not correlate with any of them. A statistically significant positive correlation was found between video duration and the main scoring systems, but not between the VPI and video duration. These findings indicate that the main scoring systems generally give parallel results in reflecting a video's educational quality and reliability. Similar results were found by Kucuk et al. and Yildiz et al. [9,11], and other studies have generally shown an inverse relationship between video popularity (VPI) and the main scoring systems [9,11,16].

The relationship between video duration and the main scoring systems has not been evaluated in previous studies. In our study, the main scoring systems increased with video duration; short videos may receive low scores because their educational content is not adequately conveyed to YouTube users. Conversely, Gill et al. reported that videos with longer-term popularity tended to have durations well below 10 min [19]. Videos should therefore be prepared to deliver sufficient information within a reasonable time so that viewers do not get bored. The average video duration in our study exceeded 20 min, considerably longer than in other studies [4,11,16], likely because most of our videos were medical education videos published directly on YouTube without editing. To reach a wider audience, attention should be paid to both the quality and the duration of videos.
There are a few limitations to this study. First, we evaluated the videos at a single point in time; given the dynamic nature of YouTube, its content may change over time. Second, we evaluated only English-language videos.
In conclusion, we found that YouTube is an inadequate source of information about uveitis. Although most of the videos scored as moderate in educational quality, YouTube videos should not be considered a fully reliable source of information about uveitis. Therefore, greater participation is needed, especially from academic institutions and universities, to contribute useful videos.