Are We Accurately Teaching the Online Public about Appendicitis? A Novel Evaluation of YouTube Videos' Content

Corbin Walters, B.S., Oklahoma State University Center for Health Sciences
Amanda Hale, D.O., Durant Family Medicine Clinic
Stormy Walkup, D.O., Durant Family Medicine Clinic
Matt Vassar, Ph.D., Oklahoma State University Center for Health Sciences




Abstract

Introduction

Appendicitis affects approximately 250,000 people each year in the United States. With rising healthcare costs, patients are turning to online platforms such as YouTube to evaluate whether their symptoms warrant urgent medical attention. The goal of this investigation is to evaluate the quality of the most viewed appendicitis videos on YouTube using a novel scoring system developed by physicians.

Methods

We searched YouTube for videos related to appendicitis. These videos were scored in a blinded and independent fashion using a pilot-tested Google Form. Scores for each video could range from -8 to 21 points, with one point deducted for each misleading claim. For each video, we extracted the number of views, likes and dislikes, and the presenter type.

Results

Of the 98 videos scored, 92 were included in the final analysis. The mean total score was 6.93, with a median score of 4.34. The range of scores was -7 to 21. There was a significant difference in total scores among YouTube videos from healthcare professionals compared to individuals with unknown credentials (P = 0.05). No significant difference was noted for number of likes or views.

Discussion

Medical education on YouTube presents unique challenges for physicians and other healthcare professionals. One way to increase public exposure of high-quality medical information is through videos created by research and medical institutions. More high-quality YouTube videos from these institutions may better facilitate conversations between patients and providers, while minimizing the number of harmful or misleading statements.

Introduction

Appendicitis is one of the most common causes of emergency surgery in the United States, with approximately 250,000 cases occurring annually and accounting for an estimated 1 million hospital days.1 Given the high prevalence and emergent nature of appendicitis, it is vital that patients have access to accurate information about the symptoms associated with this condition, as well as other possible diagnoses related to their symptoms. Many patients therefore seek medical information on the Internet to inform the choices they make about their medical care. A recent study found that 72% of Internet users conduct online searches for information related to their healthcare.2 Although finding health information online has become popular, patients should be made aware of the risks of trusting medical advice that is not given directly by a qualified healthcare professional.

For many patients, the decision to trust online health information may be a financial one. The costs associated with seeing a medical professional, including time off work, copays, and health insurance, prohibit some from seeking medical attention. In these instances, the public may seek medical information online related to their symptoms to evaluate whether home remedies exist or whether professional medical attention is required. One online platform - YouTube, which reports over 1 billion hours of video streaming daily3 - is commonly used by patients to find information about a wide variety of medical topics. Several studies have examined the quality of the information available on YouTube regarding various health conditions.4-8 However, these studies do not inform healthcare professionals or the public how to critique the accuracy of medical information on YouTube using an evidence-based approach. The goal of the present study is to use a novel scoring system developed by physicians to assign quality ratings to the most viewed YouTube videos related to appendicitis.

Methods

Search and Inclusion Criteria

To determine the wording of our YouTube search, we compared the popularity of Google searches for a series of keywords - appendicitis, inflamed appendix, ruptured appendix - using Google Trends. Google Trends provides data on search volume indices and geographical information for user-searched topics.9 Of the three terms tested, appendicitis was the most popular search, and our group reached a consensus to use "appendicitis" as the keyword in our YouTube search. On September 7, 2018, we searched YouTube using the keyword "appendicitis" and applied the "most viewed" filter to display the videos in descending order of view count, with the overall most viewed video first. We applied the following inclusion criteria: each video must have at least 10,000 views, include English subtitles or narration, have a video quality of at least 240p, be 20 minutes or less in length, and be strictly related to appendicitis as a health condition.
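As an illustration only, the keyword comparison described above can be approximated programmatically with pytrends, an unofficial third-party client for Google Trends; this sketch was not part of the study, and the timeframe and region parameters shown are assumptions.

    # Illustrative sketch (not part of the original study): comparing relative
    # Google search interest for the three candidate keywords with pytrends.
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=360)
    keywords = ["appendicitis", "inflamed appendix", "ruptured appendix"]
    # The 12-month, US-only window below is an assumed setting for illustration.
    pytrends.build_payload(kw_list=keywords, timeframe="today 12-m", geo="US")

    # interest_over_time() returns weekly search-volume indices scaled 0-100.
    interest = pytrends.interest_over_time()
    print(interest[keywords].mean().sort_values(ascending=False))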

Appendicitis Video Scoring

Two family medicine resident physician authors (A.H. and S.W.) created a novel scoring system to evaluate best practices related to diagnosing and treating appendicitis. The scoring system was verified by a board-certified family medicine physician before data extraction began. The scoring tool for appendicitis videos is outlined in Table 1. Scores for each video could range from -8 to 21 points, where 21 indicated that all scoring criteria were met in an accurate manner. One point was deducted for each misleading piece of information; thus, a total score could be as low as -8. The number of views, along with the number of likes and dislikes, was recorded for each video. After applying the inclusion criteria, two of the researchers (A.H. and S.W.) watched and scored each video in an independent and blinded manner. Data were extracted in duplicate using a pilot-tested Google Form. Scoring discrepancies were resolved by consensus.
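A minimal sketch of the scoring arithmetic is shown below. The function and parameter names are ours, the actual criterion items are those in Table 1, and capping deductions at 8 (rather than clipping the final total) is an assumption about how the -8 floor is enforced.

    # Minimal sketch of the scoring arithmetic; the criterion items are in Table 1.
    def score_video(criteria_met_accurately: int, misleading_claims: int) -> int:
        """Return a total score in the range -8 to 21.

        criteria_met_accurately: number of Table 1 criteria covered accurately
        (maximum 21 points). misleading_claims: each deducts one point; the cap
        of 8 deductions is an assumption that reproduces the -8 floor.
        """
        deductions = min(misleading_claims, 8)
        return min(criteria_met_accurately, 21) - deductions

    # Example: a video covering 12 criteria accurately with 3 misleading
    # claims would score 12 - 3 = 9.
    print(score_video(12, 3))  # -> 9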




Statistical Analyses

We report means (with standard deviations) and medians (with interquartile ranges) to summarize the total scores, number of views, and number of likes and dislikes for all videos. We conducted t-tests to evaluate differences in the number of likes, number of views, and total score between videos posted by healthcare professionals and videos posted by individuals with unknown or no credentials.
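For readers who wish to reproduce this style of comparison, a minimal sketch using scipy is shown below; the sample values are placeholders, not the study's data.

    # Illustrative sketch of the group comparison; the score values below are
    # hypothetical, not the study's dataset.
    import numpy as np
    from scipy import stats

    professional_scores = np.array([12, 8, 15, 3, 10, 7])  # placeholder values
    unknown_scores = np.array([4, 6, 2, 9, 5, 1])          # placeholder values

    # Summary statistics: mean (SD) and median (IQR), as reported in the paper.
    print(professional_scores.mean(), professional_scores.std(ddof=1))
    print(np.median(professional_scores),
          np.percentile(professional_scores, [25, 75]))

    # Independent-samples t-test comparing the two presenter groups.
    t_stat, p_value = stats.ttest_ind(professional_scores, unknown_scores)
    print(f"t = {t_stat:.2f}, P = {p_value:.2f}")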

Results

Ninety-eight videos met all inclusion criteria necessary to be viewed and scored. Interrater agreement across all components of the scoring process was 820/1170 (70.1%). After scoring, one video was excluded for not including written text or audio in English, and 5 videos were excluded because they were removed from YouTube during the writing of this study. Characteristics of the 92 included YouTube videos are provided in Figure 1.
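As a worked check (assuming the figure above represents simple percent agreement across all rated scoring items), the reported fraction corresponds to:

    \[ \text{Percent agreement} = \frac{\text{items rated identically by both raters}}{\text{total items rated}} = \frac{820}{1170} \approx 0.701 = 70.1\% \]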





The mean total score was 6.93 (SD = 6.57), with a median score of 4.34 (IQR: 1.75-11.63). The range of scores was -7 to 21. We examined the effect of provider source (healthcare professional vs. unknown credentials) on total score, views, and likes. There was a significant difference in total scores between YouTube videos from healthcare professionals (M = 8.77, SD = 6.91) and those from individuals with unknown credentials (M = 5.97, SD = 6.24); t(92) = -1.98, P = 0.05. No significant difference was noted in the number of likes between healthcare professionals (M = 297.44, SD = 498.32) and unknown credentials (M = 854.90, SD = 2,329.25); t(88) = 1.33, P = 0.19. Similarly, no significant difference was seen in the number of views between healthcare professionals (M = 82,773.67, SD = 115,863.40) and unknown credentials (M = 134,202.80, SD = 233,369.70); t(88) = 1.17, P = 0.25.

The mean duration of the included videos was 335 seconds (SD = 287). A male voice narrated 51% (47/92) of the videos, and 28% (26/92) were narrated using a female voice. Narration was not used in 21% (19/92) of the videos. Of the narrated videos (n=73), 12% (9/73) used a computer-generated voice. Computer-generated narration (n=9) had a mean score of 10.19 (SD = 4.0) and a median score of 10.5 (IQR: 9.92-15.00), whereas narration by a healthcare professional (n=31) had a mean score of 8.47 (SD = 6.81) and a median score of 6.67 (IQR: 3.00-12.33). Narration by individuals with unknown credentials (n=32) had an average score of 6.65 (SD = 6.63) and a median score of 4.17 (IQR: 2.00-11.63). Approximately 32% (30/92) of the videos had at least 1 misleading or harmful statement related to appendicitis, while 24% (22/92) of videos had 2 misleading or harmful statements, and 13% (12/92) had 5 or more.

Discussion

Our analysis found that fewer than half of the included videos scored more than a quarter of the possible points. Concerningly, only 2 videos achieved a perfect score using our novel scoring system, and 30 videos contained at least one misleading or outright harmful statement. These findings suggest a lack of high-quality medical information on appendicitis on YouTube, and they mirror the results of other YouTube studies in which few medical education videos satisfied all quality requirements.10,11 However, our analysis differs from previous studies through its use of a scoring sheet for grading YouTube medical information that was developed and screened by family medicine residents. For patients searching YouTube for a diagnosis related to appendicitis symptoms, it is crucial that the videos be high-quality resources from a reliable research or medical institution and free of false or harmful medical information.

A Google search for YouTube videos using the keyword "appendicitis" returned 95,000 results. With a life-threatening medical condition such as appendicitis, timely access to accurate, high-quality information is of the utmost importance. As of September 2020, YouTube does not have a monitoring system to screen or validate the quality of posted videos, leaving the responsibility of sifting through YouTube's vast catalog to the patient. YouTube medical education videos presenting unverified information of dubious origin have the potential to cause severe harm. For example, a 2015 study examining skin cancer videos on YouTube found that the most viewed videos recommended black salve as a home remedy for treating skin cancer, despite the lack of evidence to support its use.12 Black salve's toxic effects may cause severe scarring and diffuse spread of infection, and may even hinder attempts by medical professionals to treat the underlying cancer.13 Despite these valid concerns, YouTube continues to be a useful platform for the dissemination of medical information for patients and healthcare professionals.14

One way to increase public exposure to high-quality medical information is through videos created by research and medical institutions. Previous studies have shown that videos created by researchers, medical institutions, and healthcare professionals receive higher quality scores than videos created by individuals without an advanced medical degree,10,15-18 further supporting the findings of this study. Recent studies have also noted that videos created by individuals or institutions in the research and medical communities devote a larger proportion of their content to scientific material.19,20 Roughly one third of the videos scored in our study were narrated by a healthcare professional, yet none were produced by the American College of Surgeons, a scientific and educational society that represents surgical fellows. Appendectomy is a procedure commonly performed by general surgeons, whose expertise in the signs, symptoms, and interventions for appendicitis would be welcomed by the public and other healthcare professionals.

Medical education on YouTube presents unique challenges for physicians and other healthcare professionals. First, providers may be unaware of the keywords their patients use to search for a video. Second, the source producing the video is a concern, whether it is a medical institution, an uncredentialed organization, or an individual offering medical advice (regardless of whether that advice is medically sound). Finally, healthcare professionals must be able to validate the accuracy and applicability of the medical education provided. A common finding reported in YouTube studies is the presence of misleading health information.10,18 Misleading statements in videos that discuss the signs, symptoms, and treatment of appendicitis can be dangerous for patients, especially if the videos suggest that appropriate medical and surgical interventions are not required to resolve the condition. More high-quality YouTube videos from reliable sources may better facilitate conversations between patients and providers, thus allowing for better patient care.

Strengths and Limitations

A strength of this study is the use of two family medicine residents who graded each video in a blinded and independent manner using evidence-based criteria to score the quality of videos on appendicitis. These family medicine residents, along with a family medicine attending physician, frequently diagnose appendicitis and differentiate it from other causes of gastrointestinal pain. By using Google Trends to identify the most searched term for appendicitis, we ensured that our selection of videos closely matched what the public would find when searching. Although we believe we have applied a strong, evidence-based approach, this study is not without limitations. First, several other studies that examined the quality of YouTube videos used the 16-question DISCERN tool, which gives videos an overall score ranging from 1 (low quality) to 5 (high quality),21-23 and a separate study by Sunderland et al.24 created its own scoring tool based on the DISCERN criteria. The DISCERN scoring system relies on subjective criteria, whereas our study relied on objective, evidence-based criteria. Although our scoring sheet has not been validated by other studies, we are confident in its design: it was devised by two family medicine residents and a family medicine attending physician based on information on appendicitis found in UpToDate.25 Future studies in all fields of medicine are warranted to properly evaluate the quality of medical information available to patients via the Internet and social media.






References

1. Prystowsky JB, Pugh CM, Nagle AP. Current problems in surgery. Appendicitis. Curr Probl Surg. 2005;42(10):688-742.

2. The social life of health information. Pew Research Center. Accessed May 29, 2019. https://www.pewresearch.org/fact-tank/2014/01/15/the-social-life-of-health-information/

3. Nicas J. YouTube Tops 1 Billion Hours of Video a Day, on Pace to Eclipse TV. WSJ Online. https://www.wsj.com/articles/youtube-tops-1-billion-hours-of-video-a-day-on-paceto-eclipse-tv-1488220851. Published February 27, 2017. Accessed May 31, 2019.

4. Tang W, Olscamp K, Choi SK, Friedman DB. Alzheimer’s Disease in Social Media: Content Analysis of YouTube Videos. Interact J Med Res. 2017;6(2):e19.

5. Stellefson M, Chaney B, Ochipa K, et al. YouTube as a source of chronic obstructive pulmonary disease patient education: a social media content analysis. Chron Respir Dis. 2014;11(2):61-71.

6. Lee JS, Seo HS, Hong TH. YouTube as a source of patient information on gallstone disease. World J Gastroenterol. 2014;20(14):4066-4070.

7. Benway BM. YouTube as a Source of Information on Kidney Stone Disease. Yearbook of Urology. 2011;2011:8-9. doi:10.1016/j.yuro.2011.06.062

8. Adorisio O, Silveri M, De Peppo F, Ceriati E, Marchetti P, De Goyet JDV. YouTube and pediatric surgery. What is the danger for parents? Eur J Pediatr Surg. 2015;3(02):203-205.

9. Rogers S. What is Google Trends data - and what does it mean? Medium. Published July 1, 2016. Accessed May 31, 2019. https://medium.com/google-news-lab/what-is-googletrends-data-and-what-does-it-mean-b48f07342ee8

10. Sahin AN, Sahin AS, Schwenter F, Sebajang H. YouTube Videos as a Source of Information on Colorectal Cancer: What Do Our Patients Learn? J Cancer Educ. Published online September 21, 2018. doi:10.1007/s13187-018-1422-9

11. Camm CF, Russell E, Ji Xu A, Rajappan K. Does YouTube provide high-quality resources for patient education on atrial fibrillation ablation? Int J Cardiol. 2018;272:189-193.

12. Basch CH, Basch CE, Hillyer GC, Reeves R. YouTube Videos Related to Skin Cancer: A Missed Opportunity for Cancer Prevention and Control. JMIR Cancer. 2015;1(1):e1.

13. Croaker A, King GJ, Pyne JH, Anoopkumar-Dukie S, Liu L. A Review of Black Salve: Cancer Specificity, Cure, and Cosmesis. Evid Based Complement Alternat Med. 2017;2017:9184034.

14. Wakefield MA, Loken B, Hornik RC. Use of mass media campaigns to change health behaviour. Lancet. 2010;376(9748):1261-1271.

15. Gonzalez-Estrada A, Cuervo-Pardo L, Ghosh B, et al. Popular on YouTube: a critical appraisal of the educational quality of information regarding asthma. Allergy Asthma Proc. 2015;36(6):e121-e126.

16. Ward B, Ward M, Nicheporuck A, Alaeddin I, Paskhover B. Assessment of YouTube as an Informative Resource on Facial Plastic Surgery Procedures. JAMA Facial Plast Surg. Published online August 16, 2018. doi:10.1001/jamafacial.2018.0822

17. Loeb S, Sengupta S, Butaney M, et al. Dissemination of Misinformative and Biased Information about Prostate Cancer on YouTube. Eur Urol. Published online November 27, 2018. doi:10.1016/j.eururo.2018.10.056

18. Azer SA, Bokhari RA, AlSaleh GS, et al. Experience of parents of children with autism on YouTube: are there educationally useful videos? Inform Health Soc Care. 2018;43(3):219-233.

19. ReFaey K, Tripathi S, Yoon JW, et al. The reliability of YouTube videos in patients education for Glioblastoma Treatment. J Clin Neurosci. 2018;55:1-4.

20. Rayi A, Borad SJ, Kemper SE, Malhotra K. What information about sudden unexpected death in epilepsy (SUDEP) is available on YouTube? Epilepsy Behav. Published online December 6, 2018. doi:10.1016/j.yebeh.2018.10.017

21. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53(2):105-111.

22. Goobie GC, Guler SA, Johannson KA, Fisher JH, Ryerson CJ. YouTube Videos as a Source of Misinformation on Idiopathic Pulmonary Fibrosis. Ann Am Thorac Soc. 2019;16(5):572-579.

23. ReFaey K, Tripathi S, Bohnen AM, et al. The Reliability of YouTube Videos Describing Stereotactic Radiosurgery: A Call for Action. World Neurosurg. Published online January 28, 2019. doi:10.1016/j.wneu.2019.01.086

24. Sunderland N, Camm CF, Glover K, Watts A, Warwick G. A quality assessment of respiratory auscultation material on YouTube. Clin Med. 2014;14(4):391-395.

25. Martin RF. Acute appendicitis in adults: Clinical manifestations and differential diagnosis. In: Weiser M, Chen W, eds. UpToDate. UpToDate; 2018.