Published in Vol 9 (2025)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/69103.
Didactic and Content Quality of Basic Life Support Videos on YouTube: Cross-Sectional Study


1Medical Faculty, Institute for Medical Education and Clinical Simulation, Goethe University, Theodor Stern Kai 7, Frankfurt am Main, Germany

2Department of Anaesthesiology, University Hospital Frankfurt, Intensive Care Medicine and Pain Therapy, Frankfurt am Main, Germany

3Agaplesion Markus Krankenhaus, Frankfurt am Main, Germany

4Department of Trauma Hand and Reconstructive Surgery, University Hospital Frankfurt, Frankfurt am Main, Germany

5Nephrologische Praxis und Dialyse im Klinikum Peine, Peine, Germany

6Department of Anaesthesiology, Intensive Care Medicine and Pain Therapy, Sana Clinic Offenbach GmbH, Offenbach/Main, Germany

7Department for Children and Adolescents, Division for Stem Cell Transplantation, Immunology and Intensive Care Medicine, University Hospital Frankfurt, Frankfurt am Main, Germany

8Department of Anaesthesiology, BG Unfallklinik Frankfurt am Main, Frankfurt am Main, Germany

9VOSS – doctor's office, Aschaffenburg, Germany

Corresponding Author:

Miriam Rüsseler, Prof Dr med


Background: Cardiopulmonary resuscitation (CPR) is vital for improving patient outcomes in medical emergencies. Both laypersons and health care professionals often seek guidance on performing CPR. In today’s digital age, many turn to easily accessible platforms such as YouTube for practical skills.

Objective: This study evaluates the didactic and content quality of CPR videos on YouTube using comprehensive checklists and investigates the association between the assigned quality scores and type of publisher, view count, and video rankings.

Methods: Videos were included based on defined search terms and exclusion criteria. Two emergency physicians rated each video independently using validated checklists concerning content and didactic quality. Linear regression analysis was performed to assess the relationships between video quality scores and view counts, as well as video rankings.

Results: Of the 250 videos identified, 74 (29.6%) met the inclusion criteria. On the content checklist, videos scored an average of 56.5% (SD 19.2%), and on the didactic checklist, they scored 66.6% (SD 14.3%); none achieved the maximum score. Videos from official medical institutions scored significantly higher in content quality compared to nonofficial sources (P=.04). Video quality scores were not associated with video rankings or view counts.

Conclusions: The study highlights substantial variability in the didactic and content quality of CPR-related videos on YouTube. For medical educators, this underlines the need to curate and recommend reliable online resources or to develop new high-quality content aligned with established checklists. For the general public, the findings caution against relying on popularity metrics as indicators of accuracy and emphasize the importance of guidance from trusted institutions.

JMIR Form Res 2025;9:e69103

doi:10.2196/69103



Ischemic heart disease, the leading global cause of death, presents a substantial challenge to health care systems worldwide [1,2]. In Europe, the incidence of out-of-hospital cardiac arrest ranges from 67 to 179 cases per 100,000 inhabitants [3]. Alarmingly, data from the German resuscitation register show that only 11.1% of individuals who experience out-of-hospital cardiac arrest survive to hospital discharge [4].

Prompt and proficient cardiopulmonary resuscitation (CPR) substantially improves patient outcomes [5-8]. However, gaps in knowledge and performance persist among both laypersons and health care professionals, particularly in essential aspects such as compression depth, hand placement, and compression rate [9-12]. These deficits are further compounded by delays between arrest recognition, CPR initiation, and the arrival of professional medical support [4].

To address these challenges, widespread CPR education is essential, not only in formal training environments but also through accessible, scalable resources that support independent learning. In recent years, especially during the COVID-19 pandemic, online video platforms such as YouTube have become increasingly popular sources for acquiring or enhancing practical medical skills, including CPR [13]. While this accessibility offers great potential, concerns remain regarding the quality, accuracy, and didactic effectiveness of CPR videos freely available online [14,15].

Multiple studies have identified quality deficiencies in YouTube videos on resuscitation [16-27]. For instance, Yaylaci et al [28] reported that only 11.5% of resuscitation videos correctly demonstrated the necessary steps. Similarly, Elicabuk et al [29] reported that 75.3% of Turkish-language CPR videos failed to adhere to current guidelines. However, most previous evaluations relied on limited or nonvalidated assessment criteria [30,31]. This highlights the central problem: learners are widely exposed to CPR videos of uncertain quality but lack reliable indicators of which resources provide accurate and pedagogically sound instruction.

To address this gap, 2 validated checklists were developed: the didactic quality checklist by Rüsseler et al [32] and the content quality checklist by Sterz et al [33], the latter aligned with American Heart Association guidelines [31]. These instruments offer a structured way to evaluate both the educational effectiveness and the technical accuracy of resuscitation videos [32,33].

This study is the first to apply these validated checklists to a sample of English- and German-language YouTube CPR videos, while also examining associations between video quality and publisher type, view count, and search ranking. By integrating didactic and content analysis with visibility metrics, it provides a systematic evaluation of freely available CPR videos and highlights implications both for educators and for the general public.


Study Design and Setting

A retrospective, cross-sectional study design was used. The dataset includes videos uploaded between February 2010 and October 2018 and reflects the state of available content during that period. The video assessment is designed to address a broad target audience, including medical laypersons, professionals, and students.

The evaluation focuses on didactic quality, defined as the appropriateness of educational styles used in the videos, and content quality, which encompasses the procedural correctness of depicted resuscitation techniques and the accuracy of explanations regarding the resuscitation algorithm. Differences in assigned didactic and content quality scores are examined between videos published by official medical institutions or organizations and those from nonofficial sources. In addition, the analysis investigates potential associations between video quality scores and platform metrics such as view counts and search rankings, providing insights into the relationship between video quality and audience engagement within that historical context.

Video Selection Process

The video selection followed a multistage procedure. In October 2018, a search was conducted on YouTube using a German IP address and 10 relevant keywords in both English and German, including chest compression (“Thoraxkompression” and “Herzdruckmassage”), CPR, basic life support, cardiac arrest first aid (“Herzstillstand erste Hilfe”), heart first aid (“Herz erste Hilfe”), and resuscitation (“Wiederbelebung”). These terms were chosen to ensure thematic relevance and to capture common linguistic variations used by both laypersons and professionals.

To reflect typical user behavior, where users predominantly engage with results from the first search page, only the first 25 videos returned for each term were screened. Videos uploaded between 2010 and 2018 were included, acknowledging the ongoing visibility and ranking of older content within YouTube’s algorithm. For each video, metadata such as view count, number of likes and dislikes (noting that public dislike counts were removed in 2021), uploader identity, upload date, video duration, and channel subscriber count were documented.

Inclusion and Exclusion Criteria

Videos were included regardless of their explicitly stated target audience, as this information is often missing from video titles or descriptions. Instructional videos were also considered, even if not explicitly labeled as such.

Exclusion criteria comprised videos related to pediatric or animal resuscitation, content in languages other than English or German, videos lacking visual content (eg, audio only), demonstrations of mechanical compression devices, real-life or intraoperative resuscitation footage, and content focusing solely on cardiac arrest prevention without instructional guidance. Additionally, parody, satire, comedy, entertainment content, and promotional material for CPR training courses were excluded. Duplicate videos were removed based on identical URLs. An overview of the selection process is provided in the Results section.

Reviewer Selection

In total, 16 experienced emergency physicians participated as reviewers in the evaluation process, selected on the basis of their extensive expertise and experience in the medical field. The panel included 10 (63%) male and 6 (37%) female physicians, a gender distribution slightly skewed toward male physicians. The reviewers were emergency physicians specialized in various fields, including pediatrics, internal medicine, oral and maxillofacial surgery, trauma care, and anesthesiology, and all actively practice within emergency department settings. In addition, emergency physicians from different German cities, from both university-affiliated and nonuniversity settings, were intentionally included to align with training regulations. This ensures that individuals responsible for resuscitation have relevant experience and knowledge, regardless of their practice location or institutional affiliation.

Didactic and Content Checklists

Two separate reviewers assessed each video using 2 distinct checklists: the didactic checklist (Checklist 1), developed by Rüsseler et al [32], and the content checklist (Checklist 2), developed by Sterz et al [33]. The mean of both ratings was used for subsequent analyses. In the event of markedly divergent ratings, cases would have been reexamined to ensure plausibility.

Scoring Methodology

The didactic checklist consisted of 21 items, each rated on a 5-point Likert scale (1=strongly disagree to 5=strongly agree), yielding a maximum possible score of 105 points. Didactic quality was evaluated based on various checklist aspects, including the title; learning goals; content and technique; content; text, graphics, and images; logical sequencing; aspects of hygiene; target audience; video length; readability of text, graphics, and images displayed; camera perspective; and the quality of auditory and visual elements.

The content checklist comprised 25 items across 4 domains: initial measures (7 items), chest compressions (8 items), automated external defibrillator (AED) use (6 items), and ventilation (4 items). Each item was scored on a 3-point scale (0=not mentioned, 1=incomplete or incorrect, and 2=correct), resulting in a maximum possible score of 50 points.

Applicability and Score Normalization

Not all checklist sections were applicable to every video. For example, many videos intended for layperson training did not include AED use or ventilation. In such cases, nonapplicable items were excluded entirely from both the maximum possible points and the score. Each video was thus assessed only on relevant items.

To allow for comparison across videos with varying scopes, scores were normalized. The achieved score was divided by the maximum applicable score for that specific video and multiplied by 100 to yield a percentage. For instance, a layperson-focused video with 20 applicable items (maximum score=40) that achieved all 40 points would receive 100%, just like a more comprehensive video with 25 applicable items and a maximum score of 50 that also achieved full marks.
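The normalization described above can be expressed as a small Python helper (an illustrative sketch, not the authors' actual analysis code; the function name and interface are hypothetical):

```python
def normalized_score(item_scores, max_per_item):
    """Percentage score computed over applicable checklist items only.

    item_scores: ratings for the items applicable to a given video
                 (nonapplicable items are simply omitted beforehand).
    max_per_item: 2 for the content checklist (0/1/2 per item),
                  5 for the didactic checklist (1-5 Likert per item).
    """
    if not item_scores:
        raise ValueError("at least one applicable item is required")
    max_possible = len(item_scores) * max_per_item
    return sum(item_scores) / max_possible * 100

# Layperson-focused video: 20 applicable content items, all rated correct (2 points)
print(normalized_score([2] * 20, max_per_item=2))  # 100.0
# Comprehensive video: all 25 content items applicable and correct
print(normalized_score([2] * 25, max_per_item=2))  # 100.0
```

Both videos receive 100% despite differing maximum raw scores (40 vs 50 points), which is exactly what makes scores comparable across videos with different scopes.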

Data Analysis

The data were recorded using Microsoft Excel. Statistical analysis was conducted using Minitab (Minitab Inc) and SPSS (version 26; IBM Corp). Videos were categorized by language, duration, view count, and publisher type (official vs nonofficial sources). Content and didactic quality scores were analyzed descriptively. For continuous variables, results are reported as means and SDs; categorical variables are presented as frequencies and percentages. Missing values were excluded. The 5 highest-rated, 5 lowest-rated, and 5 most-watched videos were identified to illustrate score extremes.

For group comparisons, Student t tests were applied to examine differences in quality scores (content and didactic) between official and nonofficial publishers. Assumptions of independence, normality, and equality of variances were considered: independence was given by design, normality was assessed using the Shapiro-Wilk and Kolmogorov-Smirnov tests, and equality of variances was evaluated with the Levene test. Cohen d was calculated as a measure of effect size.
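The group comparison can be sketched as follows (Python with SciPy; the helper name is our own, and any arrays passed in would be the per-video percentage scores, which are not reproduced here):

```python
import numpy as np
from scipy import stats

def compare_quality_scores(official, nonofficial, alpha=0.05):
    """Two-sample t test between publisher groups plus Cohen's d.

    The Levene test decides between the pooled-variance and Welch t test;
    Cohen's d is computed with the pooled standard deviation.
    """
    a = np.asarray(official, dtype=float)
    b = np.asarray(nonofficial, dtype=float)
    # Equality of variances: fall back to Welch's t test if Levene rejects
    equal_var = stats.levene(a, b).pvalue > alpha
    t_stat, p_value = stats.ttest_ind(a, b, equal_var=equal_var)
    n1, n2 = len(a), len(b)
    pooled_sd = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1))
                        / (n1 + n2 - 2))
    cohen_d = (a.mean() - b.mean()) / pooled_sd
    return t_stat, p_value, cohen_d
```

By the usual convention, |d| near 0.5 indicates a moderate and |d| near 0.2 a small effect, matching how the observed d=0.71 and d=0.19 are interpreted in the Results.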

Associations between video metrics (view count, YouTube ranking, and publisher type) and quality scores were examined using regression analyses. Linear regression was selected because the outcome variables (checklist scores) were continuous. Model assumptions were systematically evaluated using the diagnostic output from Minitab: scatterplots of predictors against outcomes (linearity), residuals-versus-fitted plots (homoscedasticity), and Q-Q plots of residuals (normality). Outliers and influential observations were identified through scatterplots and regression diagnostics, and sensitivity analyses were conducted with and without these data points to assess robustness. R2 values were reported as measures of explained variance (effect size). Regression analyses were conducted in Minitab using a 95% confidence level. A P value of .05 or less was considered indicative of statistical significance.
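A minimal version of such a regression in Python (illustrative only; `scipy.stats.linregress` reports the slope, its p value, and r, from which R² follows; the simulated data below are hypothetical, not the study's):

```python
import numpy as np
from scipy import stats

def quality_regression(metric, scores):
    """Regress a continuous checklist score on one video metric.

    Returns the slope, its p value, and R^2 (explained variance).
    """
    result = stats.linregress(metric, scores)
    return result.slope, result.pvalue, result.rvalue ** 2

# Hypothetical illustration: 74 videos whose quality scores are unrelated
# to view count, mirroring a low-R^2 outcome.
rng = np.random.default_rng(42)
views = rng.integers(59, 3_202_821, size=74).astype(float)
scores = rng.uniform(30, 95, size=74)
slope, p_value, r_squared = quality_regression(views, scores)

# Sensitivity analysis in the spirit of the study: refit after dropping
# high-view outliers flagged in the diagnostics (threshold is illustrative).
mask = views < 500_000
slope_sens, p_sens, r2_sens = quality_regression(views[mask], scores[mask])
```

Comparing the two fits shows whether a significant slope survives outlier removal, which is how the view-count association was judged non-robust.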

Ethical Considerations

The Ethics Committee of the Faculty of Medicine, Goethe University Frankfurt, confirmed that no formal ethics approval was required for this type of educational research, as it does not constitute a biomedical research project in the sense of the Declaration of Helsinki and the requirements of §15 of the Professional Code of Conduct for Physicians in Hesse. The study was conducted in accordance with the Declaration of Helsinki.


Overview

Initially, a total of 250 videos were identified based on the search terms. Of these, 120 were excluded because they met at least one exclusion criterion. Subsequently, 51 duplicate videos were identified and excluded, and a further 5 videos had been removed from the platform before the rating process. The final analysis thus comprised 74 videos (Figure 1).

Figure 1. Flowchart of YouTube video selection (retrospective cross-sectional study; YouTube videos accessed via German IP: 2010‐2018; N=74). A total of 250 YouTube videos were screened. After applying predefined exclusion criteria and removing duplicates, 74 unique videos were included in the final analysis.

Video Characteristics

The timeline of video uploads spanned from February 1, 2010, to October 26, 2018. Among the included videos, 38 were in English, whereas 36 were in German. Video durations ranged from 13 seconds to 60 minutes and 49 seconds. View counts varied considerably, ranging from 59 to 3,202,821 views as of November 30, 2018; notably, English-language videos generally garnered more views than their German counterparts. Table 1 provides a summary of the key characteristics of the videos.

Table 1. Key characteristics of the YouTube cardiopulmonary resuscitation videos (retrospective cross-sectional study; YouTube videos accessed via German IP: February 1, 2010-October 26, 2018; N=74).
Characteristics | Videos
Language, n (%)
  English | 36 (49)
  German | 38 (51)
View count, mean (range; SD) | 126,002 (59-3,202,821; 395,913)
  ≤60, n (%) | 1 (1)
  61-60,000, n (%) | 50 (68)
  60,001-120,000, n (%) | 7 (10)
  120,001-300,000, n (%) | 11 (15)
  300,001-3,000,000, n (%) | 4 (5)
  >3,000,000, n (%) | 1 (1)
Duration (minutes:seconds), mean (range) | 7:05 (0:13-60:49)
  ≤1:00, n (%) | 1 (1)
  1:01-5:00, n (%) | 45 (61)
  5:01-10:00, n (%) | 17 (23)
  10:01-20:00, n (%) | 6 (8)
  >20:00, n (%) | 5 (7)
Publisher, n (%)
  Official medical institution | 13 (18)
  Other sources | 61 (82)

Didactic Quality Evaluation

None of the videos achieved the maximum score of 100% on the didactic checklist. The average didactic score was 66.6% (SD 14.3%), with the highest observed score being 94.7% and the lowest 33.6% (Table 2). The item-level evaluation revealed marked disparities: visual (62.3% strongly agree) and audio (52.3% strongly agree) aspects were rated favorably, indicating effective presentation, whereas hygiene (57.7% strongly disagree) and in-depth reading (64.6% strongly disagree) showed critical deficiencies, underscoring areas for improvement in the didactic approach. Table 3 presents the 5 highest-rated videos, the 5 lowest-rated videos, and the 5 most-watched videos, along with their specific didactic quality scores and characteristics.

Table 2. Distribution of didactic quality scores (retrospective cross-sectional study; YouTube videos accessed via German IP: 2010‐2018; N=74).
Scores | Videos^a
≤50.00, n (%) | 11 (15)
50.01-60.00, n (%) | 16 (22)
60.01-70.00, n (%) | 14 (19)
70.01-80.00, n (%) | 16 (22)
80.01-90.00, n (%) | 14 (19)
≥90.01, n (%) | 3 (4)

^a Mean 66.6% (SD 14.4%); range 33.6%-94.7%.

Table 3. Top 5, bottom 5, and most-viewed YouTube videos by didactic quality score (retrospective cross-sectional study; YouTube videos accessed via German IP, 2010‐2018; N=74).
Category and rank | Reference | Channel | Didactic score | Content score | Language | Year | Views, n | Duration (h:min:sec)
Top 5 videos by didactic quality score
1 | [34] | DRK Rettungsdienst Mittelhessen | 94.7 | 92.5 | German | 2018 | 18,119 | 0:09:26
2 | [35] | SIKANA English | 92.6 | 77.5 | English | 2016 | 5240 | 0:02:55
3 | [36] | Arbeitsgemeinschaft für Notfallmedizin | 92.2 | 51.4 | German | 2015 | 4911 | 0:20:16
4 | [37] | Das Weltrettungsforum im Namen der Wahrheit | 88 | 55.2 | German | 2017 | 59 | 0:43:17
5 | [38] | Saxe Healthcare Communications | 86.7 | 11.9 | English | 2017 | 122,466 | 1:00:49
Bottom 5 videos by didactic quality score
1 | [39] | nordbayern.de | 33.6 | 48.1 | German | 2016 | 5522 | 0:00:13
2 | [40] | LearnEngg | 33.8 | 61 | English | 2016 | 314,273 | 0:03:25
3 | [41] | American Heart Assoc. | 38.7 | 56.3 | English | 2013 | 82,275 | 0:03:32
4 | [42] | Conny X | 44.3 | 25.8 | German | 2013 | 2406 | 0:04:19
5 | [43] | H1 Fernsehen | 45.3 | 34 | German | 2013 | 38,117 | 0:04:24
5 most-watched videos
1 | [44] | CPRCertified | 74.9 | 82.7 | English | 2014 | 3,202,821 | 0:04:58
2 | [45] | Weisbrod Imaging | 86.3 | 93.6 | English | 2012 | 1,204,871 | 0:08:59
3 | [46] | ProCPR | 78.7 | 64.6 | English | 2011 | 600,031 | 0:06:30
4 | [47] | tracy | 76.2 | 59.6 | English | 2013 | 378,706 | 0:01:44
5 | [40] | LearnEngg | 33.8 | 61 | English | 2016 | 314,273 | 0:03:25

Content Quality Evaluation

Overview

None of the videos achieved the maximum score of 100% on the content-related checklist; the average total score was 56.5%, approximately 10 percentage points lower than the average didactic score, indicating that videos demonstrated higher didactic than content quality. The highest content checklist score observed was 95.2%, while the lowest was 6.7% (Table 4). A detailed analysis of specific procedures revealed that "Set pain stimulus" was mentioned in only 14.9% of the videos, "Shortest possible hands-off time" in 22.3%, and "Allowing complete chest recoil" in only 29.7%. Conversely, "Make an emergency call," "Frequency of 100-120/min," and "Correct pressure point" were demonstrated correctly in 70.3%, 68.9%, and 66.9% of videos, respectively. Table 5 presents the top 5 highest-rated and lowest-rated videos, along with the 5 most-watched videos and their content quality scores.

Table 4. Distribution of content quality scores (retrospective cross-sectional study; YouTube videos accessed via German IP: 2010‐2018; N=74).
Score range | Videos^a
<10.00, n (%) | 1 (1)
10.01-30.00, n (%) | 5 (7)
30.01-50.00, n (%) | 20 (27)
50.01-70.00, n (%) | 34 (46)
70.01-90.00, n (%) | 9 (12)
≥90.01, n (%) | 5 (7)

^a Mean 56.5% (SD 19.3%); range 6.7%-95.2%.

Table 5. Top 5, bottom 5, and most-viewed YouTube videos by content quality score in percent (retrospective cross-sectional study; YouTube videos accessed via German IP: 2010‐2018; N=74).
Category and rank | Video reference | Channel | Didactic score | Content score | Language | Year | Views, n | Duration (h:min:sec)
Top 5 videos by content quality score
1 | [48] | LearningInn | 73.6 | 95.2 | English | 2013 | 151,401 | 00:04:53
2 | [49] | heartcom UG | 86.3 | 93.6 | German | 2015 | 52,113 | 00:06:25
3 | [34] | DRK Rettungsdienst Mittelhessen | 94.7 | 92.5 | German | 2018 | 18,119 | 00:09:26
4 | [50] | ercEuroResusCouncil | 77.1 | 92.4 | English | 2013 | 42,339 | 00:13:31
5 | [51] | Thieme | 80.5 | 90.5 | German | 2015 | 52,937 | 00:03:32
Bottom 5 videos by content quality score
1 | [52] | Dr. Heart | 59.5 | 6.7 | German | 2016 | 4330 | 00:02:29
2 | [38] | Saxe Healthcare Communications | 86.7 | 11.9 | English | 2017 | 122,466 | 01:00:49
3 | [53] | Liverpool John Moores University | 56.0 | 19.6 | English | 2013 | 7407 | 00:01:19
4 | [54] | CPR Council | 54.3 | 22.4 | English | 2015 | 20,391 | 00:03:24
5 | [55] | SAT.1 Regional | 52.1 | 24.3 | German | 2015 | 265 | 00:02:05
5 most-watched videos
1 | [44] | CPRCertified | 74.9 | 82.7 | English | 2014 | 3,202,821 | 00:04:58
2 | [45] | Weisbrod Imaging | 86.3 | 93.6 | English | 2012 | 1,204,871 | 00:08:59
3 | [46] | ProCPR | 78.7 | 64.6 | English | 2011 | 600,031 | 00:06:30
4 | [47] | tracy | 76.2 | 59.6 | English | 2013 | 378,706 | 00:01:44
5 | [40] | LearnEngg | 33.8 | 61.0 | English | 2016 | 314,273 | 00:03:25
Differences in Quality Score by Publisher Type

Among the 74 videos analyzed, 13 (18%) were published by official medical institutions or organizations, while the remaining 61 (82%) originated from nonofficial sources. Videos published by official medical institutions achieved a higher mean content score (67.5%, SD 20.4%) than those from nonofficial sources (mean 54.1%, SD 18.4%). This difference was statistically significant (t16=−2.18, P=.04; 95% CI −26.4% to −0.4%), with a moderate effect size (Cohen d=0.71). For didactic quality, official videos scored slightly higher on average (mean 68.9%, SD 15.5%) than nonofficial videos (mean 66.1%, SD 14.3%), but this difference was not statistically significant (t14=−0.23, P=.82; 95% CI −12.7% to 7.1%), and the effect size was small (Cohen d=0.19).

Regression Analysis: Views, YouTube Ranking, and Quality Score

Regression analyses showed that videos with more views appeared to have slightly higher content quality scores (β=.2, P=.02, R2=7.2%). However, this association was influenced by 3 outlier videos with view counts of 600,031; 1,204,871; and 3,202,821. After excluding these outliers, the association was no longer significant (β=.2, P=.08, R2=4.3%). No association was found between view count and didactic quality (β=.2, P=.25, R2=1.9%). Similarly, YouTube ranking was not associated with either content quality (β=.2, P=.59, R2=0.4%) or didactic quality (β=.2, P=.08, R2=4.3%).


Deficiencies in Didactic and Content Quality of YouTube CPR Videos

This retrospective analysis of CPR-related YouTube videos revealed notable deficiencies in both didactic structure and content accuracy. The average didactic score was 66.6% (SD 14.3%), while the average content score was even lower at 56.5% (SD 19.2%). Importantly, many of the most viewed videos contained inaccurate information or omitted essential steps of resuscitation, such as correct compression technique or emergency call initiation. These results highlight a critical gap in the educational value of widely accessed CPR content and underscore the need for improved quality control in this domain.

Popularity Versus Quality: The Algorithmic Mismatch

Among the videos examined, popularity, as measured by view counts, did not consistently align with didactic or content quality. For example, the video with the highest didactic score had only 5240 views, while the video with the highest content score had 151,401 views; neither was among the most viewed videos overall. These observations raise questions about the role of algorithms in promoting content to a broader audience, potentially exposing viewers to inadequate or misleading information. Regression analyses confirmed this mismatch: although some associations between popularity and quality reached statistical significance, all models yielded very low R2 values (<10%), indicating that views and rankings explain little of the variation in video quality and are therefore not reliable indicators of educational accuracy. Better curation and dissemination of accurate, high-quality resuscitation material is essential to ensure that the public has access to reliable information in this crucial field.

Incomplete Demonstration of Critical CPR Components

The findings align with previous studies by Katipoglu et al [15] and Ferhatoglu and Kudsioglu [14], which also analyzed CPR videos on YouTube and emphasized their poor quality. However, this study differs in its approach, as the checklists encompass a wider range of resuscitation aspects, including the often-overlooked complete chest recoil, revealing significant variations in video quality. While some critical actions, such as making an emergency call, were generally performed correctly, others, such as ensuring complete chest recoil, were frequently inadequately depicted or omitted. Favorable outcomes and high-quality CPR depend on the correct execution of all actions [56]. For instance, even with adequate chest compression depth and frequency, a positive outcome becomes less likely in the absence of chest recoil, which eliminates the diastolic filling phase [57]. In addition, only 22.3% of the videos adequately explained the shortest possible hands-off time, despite its pivotal role in maintaining continuous chest compression, a critical factor for a favorable outcome [58,59].

Instructional Gaps and the Concept of Conscious Competence

These gaps in video content may be attributed to the concept of “conscious competence” in teaching. Experienced educators often possess unconscious competence [60], automatically performing numerous details correctly without being able to explicitly articulate them. Creators of medical educational videos may also internalize critical aspects, such as chest recoil and minimal hands-off time, treating them as self-evident. To address this challenge, the study used checklists that were developed and validated by medical education experts to meticulously assess didactic and content-related aspects in detail [32,33].

Interestingly, when the outlier videos were included, the results showed a statistically significant association between view count and content quality score. After these outliers were removed, however, no statistically significant association remained between video ranking or view count and the assigned didactic or content quality scores. This again implies that viewers should exercise caution when relying solely on top-ranked videos to acquire practical medical skills, as these videos may not consistently offer the most accurate or comprehensive information. YouTube's removal of the public dislike count in 2021 further complicates viewers' ability to assess video quality.

While videos produced by official medical organizations received higher-quality ratings compared to nonofficial sources, it is vital to acknowledge that their videos still achieved an average content checklist score of only 67.5% (SD 20.4%). Therefore, while they may provide valuable insights, they should be approached with discernment rather than being unquestionably recommended for medical education purposes.

Comparison With Previous Work

The findings echo those of previous research on resuscitation videos and extend to studies on the quality of other instructional videos across a range of medical topics found on YouTube [22,24,28]. For example, Yoo et al [27] in 2020 found no difference in the quality of videos related to knee examinations from professional and nonprofessional organizations. Similarly, Flinspach et al [23] demonstrated a lack of association between the content parameters endorsed by YouTube and the overall quality of videos pertaining to epidural catheterization in obstetrics.

Implications for Future Practice and Educational Strategy

This study highlights the urgent need for more rigorous standards in the development and dissemination of medical educational videos. Validated checklists, such as those used in this analysis, offer a practical and evidence-based framework for both guiding content creation and evaluating instructional quality. Their integration into the production process can help ensure that videos are pedagogically sound, clinically accurate, and aligned with defined learning objectives. In light of the tendency among experienced practitioners to omit critical explanatory detail due to unconscious competence, content creators, particularly those without formal training in medical education, may benefit from targeted support in didactic design and instructional clarity. Moreover, given that video popularity does not reliably reflect content quality, professional organizations and academic institutions have a responsibility to curate, recommend, or endorse high-quality resources. Such measures are essential to help learners navigate an oversaturated digital landscape and access trustworthy materials.

Study Limitations

This study has limitations that warrant discussion. First, the retrospective design and focus on videos from 2010 to 2018 narrow the applicability of the findings to the current YouTube landscape. However, many of these older videos remain widely accessible and frequently viewed, and the core CPR principles remain unchanged, supporting the continued relevance of the dataset. Second, the study focused solely on freely available content from YouTube, justified by its widespread use as a source of resuscitation information [61]. This choice, while providing a comprehensive dataset, excludes potential insights from other platforms or sources. Additionally, the study included videos without explicit classification as instructional, recognizing that viewers often prioritize popularity metrics over instructional labels; this approach, however, introduces variability in viewer interpretation. The restriction to English- and German-language videos reduces the transferability of the findings to other linguistic and cultural contexts. Furthermore, reliance on YouTube's search and ranking system, combined with the decision to analyze only the first 25 results per term to reflect typical user behavior, may have introduced a visibility bias by favoring popularity over educational merit and potentially omitting higher-quality videos ranked less prominently. Interrater reliability was not formally calculated; however, the use of independent dual ratings and validated checklists provided safeguards against individual reviewer bias. Finally, the variation in the applicability of checklist items across videos, especially regarding AED use and ventilation, represents an inherent limitation. Given the hands-only approach recommended for laypersons, certain video segments did not pertain to these categories, leading to uneven scoring across checklist items. To address this, scores were normalized by excluding nonapplicable items from the total possible points, allowing for a more accurate and comparable assessment.

Future Research

Future research should build on these findings in 2 ways. First, newer YouTube videos should be analyzed to determine whether content and didactic quality have improved in response to evolving CPR guidelines and changes in the platform’s algorithms. This would provide an updated picture of the educational value currently available to learners. Second, research should move beyond descriptive analyses to test practical interventions aimed at improving video quality and visibility. Examples include integrating validated checklists into the production process, evaluating whether institutional peer review or endorsement increases viewer trust, and examining whether algorithmic adjustments can direct users toward higher-quality content. Together, these approaches would not only monitor the current state of online CPR education but also help identify strategies to actively enhance its reliability and reach.

Conclusions

This research reveals deficiencies in both the didactic quality and content accuracy of the CPR-related videos available on YouTube. Despite the potential for these videos to disseminate life-saving knowledge, many of them failed to meet fundamental criteria for effective CPR guidance. This educational gap is particularly worrisome, given the crucial roles that laypeople, health care professionals, and medical students play in emergencies. In the rapidly evolving landscape of online education, it is imperative to prioritize the widespread availability of high-quality, accurate, and accessible instructional materials, especially in critical domains such as CPR.

Moving forward, medical educators, content creators, and professional organizations should take an active role in ensuring that online CPR education adheres to pedagogical and scientific standards. This may include the broader adoption of validated quality checklists, which inherently support structured video creation processes, as well as institutional endorsement of high-quality content. These measures are essential to empower viewers to make informed decisions about the accuracy of online content and, most importantly, to strengthen CPR training effectiveness and improve patient survival in real emergencies [62].

Disclaimer

ChatGPT (OpenAI) was used to assist with language editing. The authors take the ultimate responsibility for the content of this publication.

Data Availability

The datasets generated and analyzed in this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

None declared.

Checklist 1

Didactic quality checklist.

DOCX File, 23 KB

Checklist 2

Content quality checklist.

DOCX File, 18 KB

  1. Kiguchi T, Okubo M, Nishiyama C, et al. Out-of-hospital cardiac arrest across the world: first report from the International Liaison Committee on Resuscitation (ILCOR). Resuscitation. Jul 2020;152:39-49. [CrossRef] [Medline]
  2. The top 10 causes of death. World Health Organization. 2024. URL: https://www.who.int/news-room/fact-sheets/detail/the-top-10-causes-of-death [Accessed 2025-10-15]
  3. Gräsner JT, Bein B. [Resuscitation - adult advanced life support]. Anästhesiol Intensivmed Notfallmed Schmerzther. Mar 2016;51(3):188-195. [CrossRef] [Medline]
  4. Fischer M, Wnent J, Gräsner JT, et al. Jahresbericht des Deutschen reanimationsregisters: außerklinische reanimation 2021 [Article in German]. Anasthesiol Intensivmed. 2022;63(6):V116-V122. [CrossRef]
  5. Rajan S, Wissenberg M, Folke F, et al. Association of bystander cardiopulmonary resuscitation and survival according to ambulance response times after out-of-hospital cardiac arrest. Circulation. Dec 20, 2016;134(25):2095-2104. [CrossRef] [Medline]
  6. Hasselqvist-Ax I, Riva G, Herlitz J, et al. Early cardiopulmonary resuscitation in out-of-hospital cardiac arrest. N Engl J Med. Jun 11, 2015;372(24):2307-2315. [CrossRef] [Medline]
  7. Abella BS, Aufderheide TP, Eigel B, et al. Reducing barriers for implementation of bystander-initiated cardiopulmonary resuscitation: a scientific statement from the American Heart Association for healthcare providers, policymakers, and community leaders regarding the effectiveness of cardiopulmonary resuscitation. Circulation. Feb 5, 2008;117(5):704-709. [CrossRef] [Medline]
  8. Yan S, Gan Y, Jiang N, et al. The global survival rate among adult out-of-hospital cardiac arrest patients who received cardiopulmonary resuscitation: a systematic review and meta-analysis. Crit Care. Feb 22, 2020;24(1):61. [CrossRef] [Medline]
  9. Chaudhary A, Parikh H, Dave V. Current scenario: knowledge of basic life support in medical college. Natl J Med Res. 2011;1(2):80-82. URL: https://www.researchgate.net/publication/266168638_Current_scenario_Knowledge_of_basic_life_support_in_medical_college [Accessed 2025-10-15]
  10. Baldi E, Contri E, Bailoni A, et al. Final-year medical students’ knowledge of cardiac arrest and CPR: we must do more! Int J Cardiol. Dec 1, 2019;296:76-80. [CrossRef] [Medline]
  11. Roshana S, Kh B, Rm P, Mw S. Basic life support: knowledge and attitude of medical/paramedical professionals. World J Emerg Med. 2012;3(2):141-145. [CrossRef] [Medline]
  12. Swor R, Khan I, Domeier R, Honeycutt L, Chu K, Compton S. CPR training and CPR performance: do CPR-trained bystanders perform CPR? Acad Emerg Med. Jun 2006;13(6):596-601. [CrossRef] [Medline]
  13. Bashir A, Bashir S, Rana K, Lambert P, Vernallis A. Post-COVID-19 adaptations; the shifts towards online learning, hybrid course delivery and the implications for biosciences courses in the higher education setting. Front Educ. Aug 12, 2021;6. [CrossRef]
  14. Yilmaz Ferhatoglu S, Kudsioglu T. Evaluation of the reliability, utility, and quality of the information in cardiopulmonary resuscitation videos shared on open access video sharing platform YouTube. Australas Emerg Care. Sep 2020;23(3):211-216. [CrossRef] [Medline]
  15. Katipoğlu B, Akbaş İ, Koçak AO, Erbay MF, Turan E, Kasali K. Assessment of the accuracy of cardiopulmonary resuscitation videos in English on YouTube according to the 2015 AHA resuscitation guidelines. Emerg Med Int. May 2, 2019;2019:1272897. [CrossRef] [Medline]
  16. Spence AD, Derbyshire S, Walsh IK, Murray JM. Does video feedback analysis improve CPR performance in phase 5 medical students? BMC Med Educ. Aug 12, 2016;16(1):203. [CrossRef] [Medline]
  17. Lehmann R, Lutz T, Helling-Bakki A, Kummer S, Huwendiek S, Bosse HM. Animation and interactivity facilitate acquisition of pediatric life support skills: a randomized controlled trial using virtual patients versus video instruction. BMC Med Educ. Jan 5, 2019;19(1):7. [CrossRef] [Medline]
  18. Nomura O, Irie J, Park Y, Nonogi H, Hanada H. Evaluating effectiveness of YouTube videos for teaching medical students CPR: solution to optimizing clinician educator workload during the COVID-19 pandemic. Int J Environ Res Public Health. Jul 2, 2021;18(13):7113. [CrossRef] [Medline]
  19. Duncan I, Yarwood-Ross L, Haigh C. YouTube as a source of clinical skills education. Nurse Educ Today. Dec 2013;33(12):1576-1580. [CrossRef] [Medline]
  20. ReFaey K, Tripathi S, Yoon JW, et al. The reliability of YouTube videos in patients education for Glioblastoma treatment. J Clin Neurosci. Sep 2018;55:1-4. [CrossRef] [Medline]
  21. Fischer J, Geurts J, Valderrabano V, Hügle T. Educational quality of YouTube videos on knee arthrocentesis. J Clin Rheumatol. Oct 2013;19(7):373-376. [CrossRef] [Medline]
  22. Helming AG, Adler DS, Keltner C, Igelman AD, Woodworth GE. The content quality of YouTube videos for professional medical education: a systematic review. Acad Med. Oct 1, 2021;96(10):1484-1493. [CrossRef] [Medline]
  23. Flinspach AN, Raimann FJ, Schalk R, et al. Epidural catheterization in obstetrics: a checklist-based video assessment of free available video material. J Clin Med. Mar 20, 2022;11(6):1726. [CrossRef] [Medline]
  24. Murugiah K, Vallakati A, Rajput K, Sood A, Challa NR. YouTube as a source of information on cardiopulmonary resuscitation. Resuscitation. Mar 2011;82(3):332-334. [CrossRef] [Medline]
  25. Hassounah MM, AlOwaini HS, Diab CN, Khamis NN. YouTube videos teaching Arabic speaking population how to perform cardiopulmonary resuscitation: the gap between the need and quality! Resuscitation. Oct 2018;131:e13-e14. [CrossRef] [Medline]
  26. Şaşmaz MI, Akça AH. Reliability of trauma management videos on YouTube and their compliance with ATLS® (9th edition) guideline. Eur J Trauma Emerg Surg. Oct 2018;44(5):753-757. [CrossRef] [Medline]
  27. Yoo M, Hong J, Jang CW. Suitability of YouTube videos for learning knee stability tests: a cross-sectional review. Arch Phys Med Rehabil. Dec 2020;101(12):2087-2092. [CrossRef] [Medline]
  28. Yaylaci S, Serinken M, Eken C, et al. Are YouTube videos accurate and reliable on basic life support and cardiopulmonary resuscitation? Emerg Med Australas. Oct 2014;26(5):474-477. [CrossRef] [Medline]
  29. Elicabuk H, Yaylacı S, Yilmaz A, Hatipoglu C, Kaya FG, Serinken M. The reliability of Turkish “Basic Life Support” and “Cardiac Massage” videos uploaded to websites. Eurasian J Med. Feb 2016;48(1):15-19. [CrossRef] [Medline]
  30. Kleinman ME, Goldberger ZD, Rea T, et al. 2017 American Heart Association focused update on adult basic life support and cardiopulmonary resuscitation quality: an update to the American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation. Jan 2, 2018;137(1):e7-e13. [CrossRef] [Medline]
  31. Leitlinien-aktualisierung 2015 für HLW und kardiovaskuläre notfallmedizin: zusammenfassung der American Heart Association [Report in German]. American Heart Association; 2015. URL: https://www.crit.cloud/uploads/2/7/6/1/27612891/2015-aha-guidelines-highlights-deutsch.pdf [Accessed 2025-08-19]
  32. Rüsseler M, Sterz J, Kalozoumi-Paisi P, et al. Qualitätssicherung in der lehre – entwicklung und analyse von checklisten zur beurteilung von lehrvideos zum erlernen praktischer fertigkeiten [Article in German]. Zentralbl Chir. 2017;142(1):32-38. [CrossRef]
  33. Sterz J, Tückmantel PR, Bepler L, et al. [Development and validation of a checklist for evaluating videos for learning resuscitation measures]. Med Klin Intensivmed Notfmed. Oct 2022;117(7):525-530. [CrossRef] [Medline]
  34. Leitstellensymposium 2018 - reanimation. YouTube. 2018. URL: https://www.youtube.com/watch?v=soh9s4LD4Gc [Accessed 2024-08-19]
  35. Learn first aid gestures: adult cardiopulmonary resuscitation. YouTube. 2016. URL: https://www.youtube.com/watch?v=ZJVGPcd_BIM [Accessed 2024-08-19]
  36. Basic life support - Stefan Heschl. YouTube. 2015. URL: https://www.youtube.com/watch?v=bSQkXXd8Dxc [Accessed 2024-08-19]
  37. 4:56 Erste hilfe: was tun bei atem- und/oder herzstillstand? (ARD-buffet 28.09.2016). YouTube. 2017. URL: https://www.youtube.com/watch?v=nWBvvsR8Mn0 [Accessed 2024-08-19]
  38. Whats new in cardiac resuscitation AHA guidelines for ACLS and BLS. YouTube. 2017. URL: https://www.youtube.com/watch?v=6lCztv_3sMM [Accessed 2024-08-19]
  39. How to operate an automated external defibrillator (AED). YouTube. 2013. URL: https://www.youtube.com/watch?v=lKEL7rONx5c [Accessed 2024-08-20]
  40. How to do CPR | First aid treatments #learnengg #medicalemergency. YouTube. 2016. URL: https://www.youtube.com/watch?v=soq9CiuX67w [Accessed 2024-08-19]
  41. High-quality CPR and in-hospital adult resuscitation. YouTube. 2013. URL: https://www.youtube.com/watch?v=eXmAzsRQi9I [Accessed 2024-08-20]
  42. Herzdruckmassage flashmob. YouTube. 2013. URL: https://www.youtube.com/watch?v=78xSD7qyrx8 [Accessed 2024-08-20]
  43. Anleitung zur wiederbelebung: herz-lungen-massage. YouTube. 2013. URL: https://www.youtube.com/watch?v=M_05VPtRb8Q [Accessed 2024-08-20]
  44. How to perform CPR video. YouTube. 2014. URL: https://www.youtube.com/watch?v=cosVBV96E2g [Accessed 2024-08-19]
  45. CPR/AED refresher course (2012). YouTube. 2012. URL: https://www.youtube.com/watch?v=CuUXdQI5LLs [Accessed 2024-08-19]
  46. Adult CPR. YouTube. 2011. URL: https://www.youtube.com/watch?v=OaSovqEimyA [Accessed 2024-08-19]
  47. Vinnie-jones-hard-and-fast-hands-only-CPR. YouTube. 2013. URL: https://www.youtube.com/watch?v=tD2qTmDsiHk [Accessed 2024-08-19]
  48. Basic life support in 5 minutes. YouTube. 2013. URL: https://www.youtube.com/watch?v=_zQpiDhTc-0 [Accessed 2024-08-19]
  49. Herz-lungen-wiederbelebung und der einsatz eines AED retten leben. YouTube. 2015. URL: https://www.youtube.com/watch?v=Boj82HsDD60 [Accessed 2024-08-19]
  50. Cardiopulmonary resuscitation (CPR) and automated external defibrillation (AED) on an Ambu mannequin. YouTube. URL: https://www.youtube.com/watch?v=eNr9x3VJZyM [Accessed 2024-08-19]
  51. Thieme. Sofortmassnahmen reanimation. YouTube. 2018. URL: https://www.youtube.com/watch?v=D_C3hFhXXyY [Accessed 2018-11-22]
  52. Herzdruckmassage und wiederbelebung - allgemeines. YouTube. 2016. URL: https://www.youtube.com/watch?v=Eklew0tJb_o [Accessed 2024-08-19]
  53. How to perform CPR (cardiopulmonary resuscitation). YouTube. 2013. URL: https://www.youtube.com/watch?v=EGMSH7uz8kM [Accessed 2024-08-19]
  54. RC (UK) guidelines 2015 - professor Gavin Perkins. YouTube. 2015. URL: https://www.youtube.com/watch?v=D7qbWeEqd1E [Accessed 2024-08-19]
  55. “Hamburg shocks”: first aid for sudden cardiac arrest. YouTube. 2016. URL: https://www.youtube.com/watch?v=nUawgUynDsM [Accessed 2024-08-19]
  56. Harris AW, Kudenchuk PJ. Cardiopulmonary resuscitation: the science behind the hands. Heart. Jul 2018;104(13):1056-1061. [CrossRef] [Medline]
  57. Yannopoulos D, McKnite S, Aufderheide TP, et al. Effects of incomplete chest wall decompression during cardiopulmonary resuscitation on coronary and cerebral perfusion pressures in a porcine model of cardiac arrest. Resuscitation. Mar 2005;64(3):363-372. [CrossRef] [Medline]
  58. Brouwer TF, Walker RG, Chapman FW, Koster RW. Association between chest compression interruptions and clinical outcomes of ventricular fibrillation out-of-hospital cardiac arrest. Circulation. Sep 15, 2015;132(11):1030-1037. [CrossRef] [Medline]
  59. Cheskes S, Common MR, Byers PA, Zhan C, Morrison LJ. Compressions during defibrillator charging shortens shock pause duration and improves chest compression fraction during shockable out of hospital cardiac arrest. Resuscitation. Aug 2014;85(8):1007-1011. [CrossRef] [Medline]
  60. Broadwell MM. Teaching for learning (XVI). Gospel Guard. Feb 20, 1969;20(41):1a-3a. URL: https://web.archive.org/web/20130902013921/https://www.wordsfitlyspoken.org/gospel_guardian/v20/v20n41p1-3a.html [Accessed 2025-11-03]
  61. Stollak MJ, Vandenberg A, Burklund A, Weiss S. Getting social: the impact of social networking usage on grades amongst college students. Presented at: Proceedings from ASBBS Annual Conference; Feb 22-27, 2011; Las Vegas, NV. URL: https://www.researchgate.net/publication/303818592_Getting_Social_The_Impact_of_Social_Networking_Usage_on_Grades_Amongst_College_Students [Accessed 2025-10-16]
  62. Marino PL, Galvagno SM, Sing RF. Marino’s the Little ICU Book. Lippincott Williams & Wilkins; 2016.


AED: automated external defibrillator
CPR: cardiopulmonary resuscitation


Edited by Amaryllis Mavragani; submitted 28.Nov.2024; peer-reviewed by Elif Tarihci Cakmak, Riccardo D'Ambrosi, Vitor Reis; final revised version received 15.Sep.2025; accepted 16.Sep.2025; published 05.Nov.2025.

Copyright

© Jasmina Sterz, Yvonne Beaugé, Pia Tueckmantel, Lena Bepler, Armin N Flinspach, Yves Gramlich, René Verboket, Philip Bintaro, Maren Janko, Mairen H Flinspach, Michael Merker, Sven Bepler, Jan T Vollrath, Sebastian H Voß, Miriam Rüsseler. Originally published in JMIR Formative Research (https://formative.jmir.org), 5.Nov.2025.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.