
Published in Vol 10 (2026)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/81743.
Recruiting a National Sample of First Response Agencies to Participate in an Overdose Prevention Research Project: Randomized Controlled Trial and Feasibility Study


1Prevention Insights, School of Public Health-Bloomington, Indiana University Bloomington, 809 E. 9th St., Bloomington, IN, United States

2Department of Applied Health Science, School of Public Health-Bloomington, Indiana University Bloomington, Bloomington, IN, United States

3Department of Epidemiology and Biostatistics, School of Public Health-Bloomington, Indiana University Bloomington, Bloomington, IN, United States

4Biostatistics Consulting Center, School of Public Health-Bloomington, Indiana University Bloomington, Bloomington, IN, United States

Corresponding Author:

Jon Agley, MPH, PhD


Background: US overdose deaths continue to exceed 77,000 per year, the majority of which involve opioids. One evidence-based response to this crisis is overdose education and naloxone distribution (OEND). A large national (US) network of citizens and first response agencies, connected through an app called PulsePoint Respond, is engaged in facilitating rapid layperson administration of cardiopulmonary resuscitation in public emergencies. Our goal is to recruit these first response agencies to provide targeted messaging about OEND to this large subpopulation of motivated layperson responders. This study focuses on the first step: the feasibility of our national efforts to recruit first response agencies to participate in our project.

Objective: This study aimed to determine whether more first response agencies were successfully recruited using materials that included preemptive correction of misperceptions about overdose and naloxone than with standard recruitment materials and to investigate the recruitment parameters observed when agencies were successfully recruited.

Methods: The overall study was a randomized controlled trial in which we randomly sampled 180 first response agencies from the total set of agencies subscribing to PulsePoint (n=773). Agencies were randomly allocated to 3 study arms (1:1:1) with stratification for rural status. Arm 1 received standard recruitment materials, arm 2 received similar materials that directly addressed common misperceptions about overdose and naloxone, and arm 3 was recruited to serve as a control arm for later parts of the study. The primary analysis of recruitment approaches used logistic regression, contrasting arms 1 and 2. Exploratory analyses included descriptive statistics and other logistic regression models.

Results: A total of 40 agencies signed memoranda of understanding to participate in the project (40/176, 22.7% of contacted agencies; 40/151, 26.5% of the agencies where a point of contact had been established). We did not find evidence that the messaging contained in arm 2 significantly affected recruitment success (odds ratio 0.754, 95% CI 0.298‐1.904; P=.55). Likewise, arm assignment (3-way comparison) did not significantly affect the likelihood of an agency agreeing to participate. The recruitment process took a mean of 159.08 (SD 104.74) days per agency and involved a mean of 8.38 emails, 1.98 voicemails, 0.83 phone calls, and 1.23 video calls.

Conclusions: Recruiting first response agencies that subscribe to PulsePoint for participation in a national-level OEND project appears feasible, with an anticipated participation rate between 23% and 27% of agencies solicited. Successful recruitment timelines can be lengthy and involve extensive correspondence. Since the language used in our different study arms did not have a significant effect on agency recruitment, other factors (such as individual citizen responses to messaging) could reasonably be used to select the overall language used in subsequent project recruitment materials.

Trial Registration: OSF Registries osf.io/egn3z; https://osf.io/egn3z

International Registered Report Identifier (IRRID): RR2-10.2196/57280

JMIR Form Res 2026;10:e81743

doi:10.2196/81743




Background

Although US overdose death rates have decreased substantially in recent years, including a 26.9% reduction from 2023 to 2024 [1], the most recent 12-month-ending overdose death count (77,677 in January 2025) still exceeded every other overdose death count reported on or before April 2020 [2]. The majority of those overdose deaths involved opioids [1,2].

The national response to this overdose crisis is complex and multifaceted; one key component is the use of overdose reversal medications such as naloxone (eg, Narcan), which can safely reverse an opioid overdose [3]. Evidence suggests that overdose education and naloxone distribution (OEND) programs of various kinds are effective preventive mechanisms [4-9]. Questions remain, however, about the optimal ways to implement such programming on a national scale.

In our protocol for this project, we argue that an ideal opioid overdose reversal infrastructure is one in which “naloxone is present at or near the scene” of an overdose and “used as quickly as possible,” by which we imply a saturation of (1) individuals ready, willing, and able to respond to an overdose and (2) accessible overdose reversal medication [10]. This paper outlines the first step in the systematic approach by which we are working to support such saturation.

Our Overall OEND Project

The objective of our overarching project is to partner with the “large, highly engaged system for cardiopulmonary resuscitation (CPR) that already exists nationally in the US (PulsePoint Respond)” to support proliferation of OEND by sending messages to PulsePoint Respond app users that provide information about opioid overdose, direct them to online overdose and naloxone administration training opportunities, and remind them of the importance of carrying naloxone [10]. The project involves a 2-stage process, of which the first stage is the focus of this study.

For this first phase, we randomly sampled 180 first response agencies from the total set of agencies partnered with the PulsePoint Respond app (n=773). We attempted to recruit those 180 agencies to participate in the second phase of our project (to send out messaging to their PulsePoint Respond users). The second phase, which is ongoing in conjunction with participating agencies, involves providing messaging to citizen responders subscribed to those agencies’ feeds in the PulsePoint Respond app. We briefly describe elements of that phase to contextualize our study, but it is not the focus of this paper.

PulsePoint Respond

At the time of our proposal, the PulsePoint Respond app, developed by the nonprofit PulsePoint Foundation, had more than 1 million monthly active users receiving geotargeted alerts on their mobile devices in around 4800 communities in the United States represented by more than 700 agencies [10]. As of July 2025, the number of monthly active users had surpassed 1.37 million, and there are now more than 5500 connected communities [11].

PulsePoint Respond was originally launched in 2011. Now, the hundreds of agencies working with PulsePoint are located across North America [12]. “PulsePoint implementations are typically championed and led by local Fire/EMS (Emergency Medical Services) agencies” who can reach out to PulsePoint to initiate discussions about joining as a community [13]. Those agencies then use a variety of methods to encourage individual-level (eg, user) participation via community outreach.

While first response agencies can directly recruit individuals with specialized skills or professions and enroll them as “Registered CPR Responders” or “Professional Responders,” our primary interest for this project is “Public CPR Responders,” who compose the majority of users. These individuals “are typically community members trained in CPR and automated external defibrillator use and willing to assist if an incident occurs near them” [14]. Any individual with the PulsePoint Respond app is able to sign up as a Public CPR Responder. Public CPR responders receive “notification of nearby cardiac arrest events occurring in public places” any time they are in a PulsePoint-connected community. They can also sign up for community alerts in multiple communities and are “shown a filtered list of emergencies occurring in the community and offered notifications of public interest events such as traffic collisions and wildland fires” [14].

PulsePoint’s Public CPR Responders

We speculate that the individuals who opt in as Public CPR Responders for the PulsePoint Respond app are systematically different from the general public in the sense that they have already displayed a willingness to be alerted about, and potentially respond to, emergency incidents occurring in nearby public spaces (such as an unconscious and unresponsive person). Therefore, these individuals represent promising candidates for OEND outreach. As we have noted, our overall project, of which this study is the first component, is to provide informative messaging to PulsePoint Respond users, alerting them to online overdose and naloxone administration training opportunities and reminding them of the importance of carrying naloxone [10]. However, the app does not have a feature by which messages can be sent to all users in the system. Instead, the highest “level” of messaging permission within the app occurs at the “agency level” (eg, subscribing to a 911 dispatch center or firehouse), which can send messages to all app users who have subscribed to that agency’s alerts.

The Need to Recruit First Response Agencies to Accomplish Our Study

Therefore, in order to study whether our targeted OEND messaging approach results in higher percentages of community-level responders reporting that they carry naloxone or have been trained in opioid overdose response, we needed to recruit first response agencies as partners willing to send messages to their users. Unfortunately, such a recruitment process does not appear to have precedent in the literature.

There are many examples where individuals at first response agencies have been recruited to participate in research related to overdose and naloxone, particularly interviews and surveys. For example, studies have done so using email surveys to agency leaders [15,16], surveys at in-service training [17], convenience and snowball sampling for interviews [18], and email and telephone recruitment from multiple divergent sources [19]. However, we were unable to locate any systematic descriptions of studies where first response agencies were recruited to partner with researchers to conduct subsequent research tasks (as was our goal here). Given this lack of information, we developed a study of agency recruitment as part of the first phase of our overall study.

The Value of Understanding First Response Agency Recruitment

We believe it is important to fill the gap in the literature on recruiting first response agencies as research partners for several reasons. First, clear information about agency participation rates will provide useful grounding for future studies. For example, the participation rate that one might reasonably expect from such an effort (eg, the number of agencies X required to actually achieve a participating sample Y) was unknown. While such estimates can vary by context, knowing an approximate participation rate can assist with statistical and practical study planning. Second, the overall effort required to recruit such agencies had not previously been studied or documented; obtaining a precise sense of the types of procedures and the length of time required to secure a participation agreement is critical for researchers planning a study timeline and allocating staffing time and costs.

In addition, this study provided us with an opportunity to test whether different message types were more likely to result in successful agency recruitment. As we describe in our protocol [10], “misperceptions about overdose and naloxone, as well as stigmatizing beliefs about people who use drugs, may affect both layperson and first responder willingness and interest in carrying and using naloxone,” a topic about which much has been written [19-28]. Research also suggests that inaccurate ideas about naloxone and overdose are relatively common among US adults [29]. Thus, we speculated that recruitment messages that proactively addressed concerns about OEND that were predicated on inaccurate information would be more effective than recruitment efforts that did not use this approach.

Objectives

Our goal was for agencies to agree to participate in a structured program to send push messages (within the PulsePoint app) to citizen responders to provide information about opioid overdose, naloxone, and online overdose training opportunities. This study aimed to examine the feasibility of recruiting first response agencies that already subscribe to the PulsePoint Respond program [13], to test the efficacy of different messaging strategies, and to estimate the parameters (eg, length of time, contact quantity, and type) involved in successfully recruiting an agency. Except for one exploratory analysis, these aims were preregistered [10].

The primary hypothesis was:

  • More first response agencies will be successfully recruited by study arm 2 (which included preemptive correction of inaccurate information about overdose and naloxone) than by study arm 1 (which included standard recruitment messaging).

Exploratory analyses were also conducted to answer the following questions:

  • Were there significant differences in recruitment success between any of the study arms (including arm 3, which was a control arm)? This exploratory analysis was not preregistered.
  • What was the mean length of time between establishing an initial point of contact (POC) and successfully establishing a memorandum of understanding (MOU) to participate in the project?
  • What were the mean numbers and types of correspondence associated with successfully establishing an MOU to participate in the project?

In our protocol, we also explicitly noted that “regardless of the results of null hypothesis significance testing, [we were also interested in obtaining] an overall sense of recruitment feasibility” [10].

Methods

Design, Setting, and Eligibility

This study was a randomized controlled trial using a parallel-group design and a 1:1:1 allocation ratio with postsampling allocation by rurality. Eligible participants were all first response agencies in the United States that subscribed to the PulsePoint Respond app as of January 30, 2024 (n=773). At the sampling stage, agencies were considered ineligible and replaced if they had previously worked with our study team on a similar project.

Sampling, Allocation, and Blinding

A sample of 180 agencies was drawn based on the primary hypothesis contrasting 2 of the 3 study arms. With 60 agencies per arm, 80% power, and a 2-tailed α of .05, we had the ability to detect a proportional difference of at least 0.25 (ie, a 15-agency difference between arms) [10].
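As a rough check on this sample size calculation, the sketch below approximates the power of a 2-tailed, 2-sample test of proportions using Cohen's arcsine effect size. The recruitment proportions (0.25 vs 0.50, ie, a 0.25 difference centered near 0.5) are illustrative assumptions on our part, not values stated in the protocol, which reports only the minimum detectable difference.

```python
# Approximate power for a two-sided, two-sample comparison of proportions
# with 60 agencies per arm, via the arcsine (Cohen's h) approximation.
# The proportions p1 = 0.25 and p2 = 0.50 are illustrative assumptions.
import math

def norm_cdf(x):
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_proportions(p1, p2, n_per_arm, z_alpha=1.959964):
    """Power of a two-sided two-sample proportions test (normal approx.)."""
    h = abs(2 * math.asin(math.sqrt(p2)) - 2 * math.asin(math.sqrt(p1)))
    return norm_cdf(h * math.sqrt(n_per_arm / 2.0) - z_alpha)

print(round(power_two_proportions(0.25, 0.50, 60), 2))  # ≈ 0.82
```

Under these assumed proportions, 60 agencies per arm yields roughly 80% power, consistent with the reported design.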

A study statistician not involved with the recruitment process (LGA) produced a random computer-generated sequence of numbers and sent it to JA, who then overlaid it on the population of eligible agencies, which had been provided to our study team by the PulsePoint Foundation in no particular order. As prespecified, the 180 agencies with the lowest random numbers were considered to be the initial sample.

Of those agencies, 2 were excluded for active or past overdose training work with the study team on different projects, 2 were invalid entries (eg, test cases that had been left in the dataset), and one was located outside the United States. The 5 agencies with the next lowest numbers in the original sequence were then included in the sample.

Then, the rural status of each agency was determined using the procedure described in Supplemental File: Rurality Determination Method [30]. As planned, allocation to study arms was accomplished by allocating the lowest third of rural agencies and the lowest third of nonrural agencies to arm 1, then the middle thirds to arm 2, and the remainder to arm 3. Because allocation was directly determined by the initial random number sequence, one can infer that allocation was concealed until assignment.
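The sampling and stratified allocation steps above can be sketched as follows. This is a minimal illustration with toy data: the agency roster, rural flags, and field names are our own assumptions, not the actual PulsePoint dataset, and the study's sequence was generated by the statistician rather than seeded here.

```python
# Sketch of the sampling and stratified allocation procedure:
# overlay a random sequence on the eligible agencies, take the 180 lowest
# as the sample, then within each rurality stratum assign the lowest third
# to arm 1, the middle third to arm 2, and the remainder to arm 3.
import random
from collections import Counter

random.seed(1)  # illustrative only; the study used a separate sequence

# Toy population of 773 agencies with a rough rural flag (assumed data).
population = [{"agency_id": i, "rural": i % 4 == 0} for i in range(773)]

for agency in population:
    agency["rand"] = random.random()
sample = sorted(population, key=lambda a: a["rand"])[:180]

for rural in (True, False):
    stratum = sorted((a for a in sample if a["rural"] == rural),
                     key=lambda a: a["rand"])
    third = len(stratum) // 3
    for i, agency in enumerate(stratum):
        agency["arm"] = 1 if i < third else 2 if i < 2 * third else 3

print(Counter(a["arm"] for a in sample))
```

Because allocation is fully determined by the initial random sequence, arm membership cannot be influenced after the sequence is generated, which is the basis for the allocation concealment noted above.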

Agencies were blinded to arm assignment; they were informed about the overall project and the total number of agencies but not that some agencies received different materials or were asked to complete slightly different tasks.

Intervention and Comparator Conditions

General Recruitment Procedures and Content Design

Outreach to all agencies as part of the recruitment process involved combinations of phone calls, emails, and voicemails. The initial points of contact were obtained through a search for the agencies’ publicly available data. The initial goal was to establish a POC (eg, a person at the agency familiar with PulsePoint and able to discuss the project with our team) [10]. Once the POC had been identified for an agency, we requested a videoconference (~10 min) with them to discuss the project using an arm-specific slide deck. We also provided arm-specific flyer sets in PDF format (2 variants targeted at the agencies and 2 variants that the agencies could prospectively use for outreach to citizens in their coverage areas). Except for ad hoc communication about specific topics, our team used a suite of template emails and voicemails depending on the state of recruitment (see Supplemental File: Email and Voicemail Templates [30]).

All materials (including push message examples) provided to agencies were codeveloped with marketing professionals at IU University Communications and Marketing Creative & Web Team (CWT). Our core project team developed initial materials that were then iteratively refined by personnel at the CWT. The CWT personnel were asked to document the key gray and academic literature used to inform their decision processes around push/SMS text messaging best practices [31-33], community-based overdose and naloxone-related communication [34-37], general marketing language [38], visuals to address harm reduction [39,40], and strategies to write about harm reduction [41-45].

Study Arm 1

Arm 1 was designated the “standard” recruitment arm, which focused on general messaging to encourage engagement with our program, such as (Figure 1):

  • “Your citizen responders can prevent overdose fatalities.”
  • “As a participating PulsePoint agency, you already have a network of volunteers responding to unconscious and unresponsive citizens. If those volunteers also carry naloxone and are trained to use it, they will be able to act when they encounter an opioid overdose. With an overdose, just like with cardiac arrest, every minute counts!”
  • “The US Surgeon General recommends: ‘Be Prepared. Get Naloxone. Save a Life.’”
Figure 1. Sample agency flyer for study arm 1.

Agencies were asked to commit to sending a “push” message through the PulsePoint Respond app once per month for 1 year. They were also asked to send 3 sets of 2 data collection messages (baseline, 6 months, and 12 months) asking participants to indicate whether they had been trained and carry naloxone. When agencies verbally committed to the project, they were provided with an MOU outlining these procedures. Upon signing the MOU, agencies were provided with instructions and push message templates (to be shared once the citizen portion of the study is completed in 2026). All recruitment materials for arm 1 can be found in the Supplemental File: Arm 1 Recruitment Materials [30].

Study Arm 2

Arm 2 was designed to use most of the recruitment principles from arm 1 while preemptively providing brief statements reflecting the current state of scientific knowledge to counteract common misconceptions about opioid overdose and naloxone [29]. Examples of such statements include (Figure 2):

  • “Studies show that opioid use in a community does not increase when naloxone is widely available” [46-50].
  • “Studies show that most people who are revived after an overdose do not overdose again” [51-54].
Figure 2. Sample agency flyer for study arm 2.

Except for messaging changes described above, other recruitment procedures were similar to arm 1 (see Supplemental File: Arm 2 Recruitment Materials [30]).

Study Arm 3

Arm 3 was designed as a control arm for subsequent studies within this project. It used the standard messaging from arm 1 except that agencies were only recruited to send the sets of data collection messages. The modified slide deck for this arm is available in Supplemental File: Arm 3 Recruitment Slide Deck [30].

Outcomes and Covariates

The following variables were used in our analysis:

  • Agency recruitment: The “ratio of recruited agencies (MOU) to total agencies with which active communication has been established,” [10] (meaning that a POC was reached). This was the primary preregistered outcome.
  • Raw MOU completion: The ratio of agencies with a completed MOU to the total denominator within a study arm. This was a secondary exploratory outcome.
  • Recruitment timeline: The “length of time between initial contact and agreement to participate (MOU),” [10] measured in days. This was a secondary preregistered outcome.
  • Dose-response metric: The “mean number and types of correspondence associated with agreement to participate and completion of an MOU” [10]. This was a secondary preregistered outcome.

Changes to Trial Protocol After Preregistration

We made 3 specific decisions about how to analyze and report this trial following preregistration, which we detail subsequently. We also documented all other events that might plausibly have affected the results of the study but that we considered to be minor or ecological issues. That documentation was timestamped at the time it was produced and is available in the Supplemental File: Other Considerations for Study Interpretation [30].

First, we originally planned to include agency rurality (and vote share derived from ZIP codes) as covariates in our primary outcome analysis. As we learned throughout the data collection process, many of the agencies in the sample covered broad mixtures of rural and nonrural spaces, sometimes with agencies overlapping in the same ZIP code for different service types. As described in Supplemental File: Rurality Determination Method [30], we were able to calculate a rough estimate of rural status to accomplish stratified allocation. However, after learning more from agencies about their operational procedures and collaborative norms, it was not clear to us that including these geography-based covariates would accomplish the goal of reducing error variance [55]. Particularly since agencies had already undergone stratified allocation to study arms by rural status, we chose to exclude these covariates from further analysis.

Second, we originally proposed to treat “agreement to participate” (informal) and “signing an MOU to participate” as having the same meaning in our primary outcome variable (agency recruitment). After observing that a high number of agencies expressed interest in participating but ultimately transitioned to refusals (eg, appeared to indicate that they would participate, but ultimately did not sign an MOU), we determined that such a designation would be misleading. To more conservatively estimate recruitment feasibility, we chose to adopt the more stringent requirement (a signed MOU) as the recruitment success metric.

Third, we originally stated that we would conduct separate analyses at 6 and 12 months postcorrespondence. After conducting the project, it became clear that 6-month analyses would not yield useful information, particularly since recruitment of specific groups of agencies was temporarily paused for various ecological reasons within the 12-month period (eg, fires, hurricanes, and floods). Further, agencies were only ever informed of a final deadline (12 mo). We therefore report only the 12-month analyses.

Statistical Analysis

Analyses of agency recruitment used unadjusted logistic regression, both on the full sample and, separately, on the subsample of agencies for which correspondence had been achieved (adjustment was not necessary due to randomization). There were no missing data to address for those analyses. Those analyses, as well as measures of central tendency and distribution for all variables, were calculated using SPSS (version 29; IBM Corp).

Ethical Considerations

The agency recruitment processes within our study were reviewed by the Indiana University Institutional Review Board and determined not to meet the federal definition of research involving human subjects (review board record number 20218). Results are reported in a manner consistent with the CONSORT (Consolidated Standards of Reporting Trials) 2025 guidelines [56] (Checklist 1).

Results

Participants

An initial sample of 180 agencies was randomly drawn from the list of 773 agency subscribers. As described in the Methods section, 5 of those agencies were removed from the sample due to ineligibility criteria and then replaced. Each agency’s rural status was determined to the best degree possible, and then agencies were allocated with stratification for rural status to study arms (60:60:60). During the rolling 12-month recruitment period (May 2024 to May 2025), agencies were replaced and resampled if they:

  • Confirmed that they were no longer a PulsePoint subscriber (n=22)
  • Were completely inactive in the PulsePoint Respond app and did not respond to any inquiries, signifying likely nonsubscription (n=5)
  • Were part of another agency that was also listed in PulsePoint (eg, a single agency had 2 different “listings” in the app, but they were not actually separate) (n=9)

Each of these instances was documented in real time (see Supplemental File: Sampling Documentation [Redacted Names] [30]). In one instance, a case of “merged” agencies was identified very near to the end of recruitment, and so those final cases were not resampled, meaning that the final denominators by arm were 60:58:58 (Figure 3).

Figure 3. CONSORT (Consolidated Standards of Reporting Trials) (Sankey) study diagram.

Recruitment

We were able to establish POCs for the majority of agencies in study arm 1 (49/60, 81.7%), study arm 2 (56/58, 96.6%), and study arm 3 (46/58, 79.3%) (see Table 1). In most cases, failure to establish a POC reflected the inability to connect with any individual at the agency after the prescribed number of attempts; however, in some cases, the agency was also unable to determine who the correct POC would be.

Table 1. Recruitment statistics.
Study arm | Point of contact established, n | MOUa signed, n
1 (n=60) | 49 | 12
2 (n=58) | 56 | 11
3 (n=58) | 46 | 17

aMOU: memorandum of understanding.

A total of 40 agencies signed MOUs to participate in the project (40/176, 22.7% of total agencies; 40/151, 26.5% of the agencies where a POC had been established). We did not find evidence that the nature of the messaging contained in study arms 1 or 2 significantly affected recruitment success (odds ratio 0.754, 95% CI 0.298‐1.904; P=.55). We conducted exploratory analyses to examine 3-way comparisons (including the control arm) between agencies with established POCs and, separately, between all agencies, and likewise found that arm assignment did not significantly affect the likelihood of an agency signing an MOU to participate (see Table 2).

Table 2. Results of primary and exploratory analyses.
Comparison | ORa | 95% CI | P value
Point of contact established (2-way comparison)b
Arm 2 vs arm 1 | 0.754 | 0.298‐1.904 | .55
Point of contact established (3-way comparison)
Arm 2 vs arm 1 | 0.754 | 0.298‐1.904 | .55
Arm 3 vs arm 1 | 1.807 | 0.746‐4.377 | .19
Arm 3 vs arm 2 | 2.398 | 0.984‐5.843 | .054
All agencies in sample (3-way comparison)
Arm 2 vs arm 1 | 0.936 | 0.376‐2.330 | .89
Arm 3 vs arm 1 | 1.659 | 0.710‐3.874 | .24
Arm 3 vs arm 2 | 1.772 | 0.745‐4.213 | .20

aOR: odds ratio.

bPrimary analysis for this study.
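Because the primary model was an unadjusted logistic regression with a single binary arm indicator, its odds ratio is equivalent to the simple 2×2 cross-product odds ratio. As a minimal sketch (the function name and Wald-CI approach are ours; the study itself used SPSS), the primary result can be reproduced from the Table 1 counts: 11 MOUs of 56 POC-established agencies in arm 2 vs 12 of 49 in arm 1.

```python
# Reproduce the primary odds ratio from aggregate Table 1 counts.
# An unadjusted logistic regression with one binary predictor yields the
# same OR as the 2x2 cross-product ratio; the CI here is a Wald interval.
import math

def odds_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    """Odds ratio (treatment vs control) with a Wald 95% CI."""
    a, b = events_tx, n_tx - events_tx          # arm 2: MOU yes / no
    c, d = events_ctrl, n_ctrl - events_ctrl    # arm 1: MOU yes / no
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

print(tuple(round(x, 3) for x in odds_ratio(11, 56, 12, 49)))
# → (0.754, 0.298, 1.904)
```

This matches the reported primary result (OR 0.754, 95% CI 0.298‐1.904).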

The mean length of the recruitment period, defined as the number of days between the initial contact attempt and our receipt of the signed MOU, was 159.08 (SD 104.74; range 18‐365) days (see Table 3).

Table 3. Length of recruitment period.
Study arm | Number of days, mean (SD) | Range of days
1 (n=12) | 164.75 (113.47) | 18‐341
2 (n=11) | 209.27 (130.82) | 27‐365
3 (n=17) | 122.59 (63.42) | 38‐249
Combined (n=40) | 159.08 (104.74) | 18‐365

Between the initial contact attempt (to establish a POC) and the time when a signed MOU was received for each agency, our study team sent an average of 8.38 emails, left 1.98 voicemails, and conducted 0.83 phone calls, in addition to holding 1.23 video calls during which we shared the recruitment slide deck for the project. These figures represent outgoing messages only and do not include messages received by the team.

Three agencies in the final sample were single agencies resulting from “merges” of multiple agencies that were listed separately in the PulsePoint app (see Figure 3). Because some of the correspondence attributable to recruiting these agencies may have been tagged under other agency IDs, Table 4 also provides, for transparency, descriptive statistics that exclude those agencies (sensitivity analysis).

Table 4. Instances of outgoing recruitment correspondence.
Study arm | Number of emails, mean (SD) | Number of voicemails, mean (SD) | Number of phone calls, mean (SD) | Number of video callsa, mean (SD)
1 (n=12) | 8.17 (4.09) | 2.42 (1.73) | 1.17 (1.11) | 1.50 (0.67)
2 (n=11) | 10.09 (3.94) | 2.55 (1.57) | 0.91 (1.04) | 1.18 (0.40)
3 (n=17) | 7.41 (3.08) | 1.29 (1.40) | 0.53 (0.62) | 1.06 (0.24)
Combined (n=40) | 8.38 (3.72) | 1.98 (1.62) | 0.83 (0.93) | 1.23 (0.48)
Study arm (sensitivity)
1 (n=12) | 8.17 (4.09) | 2.42 (1.73) | 1.17 (1.11) | 1.50 (0.67)
2 (n=9)b | 10.22 (4.32) | 2.78 (1.48) | 1.00 (1.12) | 1.22 (0.44)
3 (n=16)b | 7.69 (2.96) | 1.38 (1.41) | 0.56 (0.63) | 1.06 (0.25)
Combined (n=37)b | 8.46 (3.74) | 2.05 (1.61) | 0.86 (0.95) | 1.24 (0.49)

aEach agency was required to have at least one video call to discuss study materials, so the lowest possible mean value was 1.

bData from agencies that formed as part of “merges” were excluded.

Discussion

Summary of Results

This study was designed to examine the feasibility of recruiting first response agencies that already subscribe to the PulsePoint Respond program to participate in a project focused on increasing OEND on a national scale in the United States. The randomized controlled trial design also allowed us to test whether proactively addressing common misperceptions about overdose and naloxone increased the number of agencies successfully recruited compared with standard recruitment messaging.

We did not find any evidence that the type of messaging included in recruitment materials (standard vs addressing misperceptions) had an effect on whether an agency was successfully recruited to participate in our project. In exploratory analyses, we likewise did not observe any pairwise differences between any of the study arms in terms of recruitment success. However, recruiting agencies to participate in an OEND project proved feasible: we successfully recruited 40 of the 176 agencies that we attempted to recruit from the national PulsePoint subscriber database.

Other Useful Findings

Even though our team understood from the outset that we would need to establish POCs for each agency, we were surprised by the average amount of time it took to recruit agencies for our project. While there were outlier agencies in each study arm that were recruited quickly (see range data in Table 3), the average recruitment process lasted a little longer than 5 months per agency. Based on our records, the factors contributing to this range of time seemed to vary. In some cases, it took a long time to identify the person at an agency who worked with PulsePoint Respond, but recruitment proceeded quickly thereafter. In other cases, we identified a POC quickly, but other individuals within the agency (eg, supervisors) or outside the agency (eg, legal counsel) needed to review issues around OEND, the nature of the project, and data privacy at greater length. In yet other cases, the recruitment process generally proceeded as planned, but each step took extra time due to first responders’ need to prioritize ongoing emergencies such as wildfires.

Thus, we note the importance of allocating time and effort commensurate with staffing size and capacity at the front end of large-scale agency recruitment efforts to allow this process to unfold. In many cases, higher numbers of contact attempts reflected greater difficulty in determining which person in a given agency worked directly with the PulsePoint interface.

We also expressed concern at the outset that agencies randomized to the control arm (study arm 3) might not participate in the project because that arm had fewer direct benefits to the community (eg, sending out push messages linking citizens to resources and training) [10]. However, agencies were just as likely to be recruited into study arm 3, and many expressed interest in the data its data collection would yield regarding current levels of overdose training and education and the percentage of citizens regularly carrying naloxone.

Limitations and Alternative Interpretations

Multiple agencies were removed from and replaced in the sample over the course of the project. In some cases, this resulted from agency “merges,” in which more than one agency was listed separately in the sampling frame, and we only learned after establishing communication that the agencies shared the same controlling agreement related to PulsePoint. In other cases, participating agencies in different study arms were in close proximity to each other. Both types of situations may have resulted in some cross-exposure between study arms, but the numbers are insufficient to have affected the analytic results.

There were also multiple ecological events, particularly major natural disasters, that occurred throughout the study period. These were equally likely to have affected any given study arm but may have reduced the overall recruitment rate for the study as a whole. It is also possible that the overall recruitment rate was affected by other overdose prevention efforts, such as naloxone leave-behind programs [57]. Like ecological events, such effects would have been randomly distributed between study arms, but recent informal discussions between a first response coalition from a separate project and one study author (DCS; see more about his other project here [58,59]) suggested that multiple overdose response programs may be perceived as competing priorities.

Still, this study is likely to have a high level of generalizability to first response agencies that subscribe to PulsePoint, but whether those agencies, in turn, are systematically different from other first response agencies in the United States or elsewhere is unclear. Caution should therefore be used in inferring these findings outside of the study frame.

The lack of a significant finding for the recruitment messaging is limited to effects on agency recruitment. In subsequent studies within this overall project, we will determine whether the type of messaging available through each study arm affects individual layperson responders, but this component of the study does not address those differences.

Finally, this study was not designed to compare recruitment in arm 3 against the active study arms (the control arm was designed to facilitate individual-level analyses in subsequent parts of the project). Exploratory analyses suggested that none of the study arms was significantly different in terms of agency recruitment. However, the raw count for the control arm was descriptively higher than for the other arms. Based on statistics alone, we cannot infer that this difference is meaningful, but we note that our recruitment discussions often touched on the low levels of staffing and high workload in first response agencies. Thus, we think it is plausible, though not established by this study, that there may be a small recruitment advantage for the control arm (on the basis of a reduced “ask” for participation) that we were underpowered to detect.

Conclusions

This study demonstrated that it is feasible to recruit first response agencies that subscribe to PulsePoint to participate in an OEND project, with an anticipated recruitment success rate between 23% and 27%. The recruitment process was relatively involved, lasting slightly more than 5 months per agency on average and requiring multiple correspondence efforts across several communication platforms. Incorporating content designed to address inaccurate information about overdose and naloxone into recruitment materials did not affect recruitment success. Supplemental qualitative analyses of recruitment materials and interviews with agency personnel about the recruitment process are underway and will provide additional clarifying information about how to optimize recruitment.

Acknowledgments

We would like to thank Tracy Zollinger and Gail Godwin of the IU University Communications and Marketing Creative & Web Team for their design and marketing work on the recruitment materials and procedures. We would also like to thank the PulsePoint Foundation in general, and Kraig Erickson in particular, for their encouragement of this project. Finally, we would like to thank Sam Whalen for his work in assisting our project team with identifying contact information for the agencies and determining their rurality. We expect that some portions of this work, including limited amounts of verbatim text, will be used to facilitate the academic dissemination of these findings, including via abstracts, poster presentations, or panel talks at conferences, among other possibilities. We confirm that no generative artificial intelligence tool was used to prepare, format, or inform any portion of this manuscript.

Funding

Research reported in this publication was supported by the National Institute on Drug Abuse of the National Institutes of Health under award R34DA058162 (total award for the full project, which includes this study: US $713,250) to JA. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Data Availability

Data used in the preparation of this manuscript are available through the Open Science Framework [30]. This link also provides access to the analytic code as well as the supplemental files referenced throughout the paper.

Authors' Contributions

Conceptualization: JA, CH, DCS

Data curation: JA, CH, MN, DT

Formal analysis: JA

Funding acquisition: JA, CH, DCS, MP, DT

Investigation: JA, CH, MN

Methodology: JA, CH, DCS, MP, DT, MN, LG-A, SD

Project administration: JA, CH

Resources: JA, CH, DCS, MP, DT

Software: DT

Supervision: JA

Validation: CH, MN, LG-A, SD

Visualization: JA, CH, MN

Writing – original draft: JA, CH, MN

Writing – review & editing: JA, CH, DCS, MP, DT, MN, LG-A, SD

Conflicts of Interest

JA, CH, MP, and DCS have received funding through their employer (Indiana University) to conduct various research projects pertaining to opioids, harm reduction, and naloxone from federal, state, and local government entities and nonprofit foundations. CH was co–principal investigator of a National Institute on Drug Abuse Small Business Innovation Research grant that facilitated the development of a web-based naloxone training (Citizen Opioid Responders) being provided as an option for layperson responders in this study, but their participation was as an Indiana University employee; therefore, they do not have any mechanisms in place (eg, trademarks, patents, and access agreements) that would result in any personal profit or financial benefits accruing from the use of the training.

Checklist 1

CONSORT 2025 checklist.

PDF File, 337 KB

  1. US overdose deaths decrease almost 27% in 2024. CDC. 2025. URL: https://www.cdc.gov/nchs/pressroom/releases/20250514.html [Accessed 2026-03-10]
  2. Provisional drug overdose death counts. CDC. 2025. URL: https://www.cdc.gov/nchs/nvss/vsrr/drug-overdose-data.htm [Accessed 2026-03-10]
  3. Overdose reversal medications. National Institute on Drug Abuse. 2023. URL: https://nida.nih.gov/research-topics/overdose-reversal-medications [Accessed 2026-03-10]
  4. Ballreich J, Mansour O, Hu E, et al. Modeling mitigation strategies to reduce opioid-related morbidity and mortality in the US. JAMA Netw Open. Nov 2, 2020;3(11):e2023677. [CrossRef] [Medline]
  5. Beletsky L, Rich JD, Walley AY. Prevention of fatal opioid overdose. JAMA. Nov 14, 2012;308(18):1863-1864. [CrossRef] [Medline]
  6. Chimbar L, Moleta Y. Naloxone effectiveness: a systematic review. J Addict Nurs. 2018;29(3):167-171. [CrossRef] [Medline]
  7. Razaghizad A, Windle SB, Filion KB, et al. The effect of overdose education and naloxone distribution: an umbrella review of systematic reviews. Am J Public Health. Aug 2021;111(8):e1-e12. [CrossRef] [Medline]
  8. Smart R, Davis CS. Reducing opioid overdose deaths by expanding naloxone distribution and addressing structural barriers to care. Am J Public Health. Aug 2021;111(8):1382-1384. [CrossRef] [Medline]
  9. Walley AY, Xuan Z, Hackman HH, et al. Opioid overdose rates and implementation of overdose education and nasal naloxone distribution in Massachusetts: interrupted time series analysis. BMJ. Jan 30, 2013;346:f174. [CrossRef] [Medline]
  10. Agley J, Henderson C, Seo DC, et al. The feasibility of using the national PulsePoint cardiopulmonary resuscitation responder network to facilitate overdose education and naloxone distribution: protocol for a randomized controlled trial. JMIR Res Protoc. Mar 29, 2024;13:e57280. [CrossRef] [Medline]
  11. By the numbers: stats. PulsePoint. 2025. URL: https://www.pulsepoint.org/stats [Accessed 2026-03-10]
  12. PulsePoint. 2026. URL: https://www.pulsepoint.org [Accessed 2026-03-10]
  13. Community first responders. PulsePoint. 2026. URL: https://www.pulsepoint.org/pulsepoint-respond [Accessed 2026-03-10]
  14. App features by responder type. PulsePoint. 2026. URL: https://www.pulsepoint.org/responder-types-and-features#app-features-by-responder-type [Accessed 2026-03-10]
  15. Lofaro RJ, Lungu M, Witkowski K, Sapat A. The opioid crisis, policy deservingness, and humanization: a nationwide survey of first responders’ viewpoints on clients of color. Risk Hazards Crisis Public Policy. Mar 2025;16(1):e70006. [CrossRef]
  16. Remington CL, Witkowski K, Ganapati NE, Headley AM, Contreras SL. First responders and the COVID-19 pandemic: how organizational strategies can promote workforce retention. Am Rev Public Adm. Jan 2024;54(1):33-56. [CrossRef]
  17. La Manna A, Siddiqui S, Gerber G, et al. Overdose and overwork: first responder burnout and mental health help-seeking in Missouri’s overdose crisis. Drug Alcohol Depend. Jun 1, 2025;271:112590. [CrossRef] [Medline]
  18. Elswick Fockele C, Frohe T, McBride O, et al. Harm reduction in the field: first responders’ perceptions of opioid overdose interventions. West J Emerg Med. Jul 2024;25(4):490-499. [CrossRef] [Medline]
  19. Filteau MR, Green B, Kim F, McBride KA. “It’s the same thing as giving them CPR training”: rural first responders’ perspectives on naloxone. Harm Reduct J. Oct 3, 2022;19(1):111. [CrossRef] [Medline]
  20. Koester S, Mueller SR, Raville L, Langegger S, Binswanger IA. Why are some people who have received overdose education and naloxone reticent to call Emergency Medical Services in the event of overdose? Int J Drug Policy. Oct 2017;48:115-124. [CrossRef] [Medline]
  21. Murphy J, Russell B. Police officers’ views of naloxone and drug treatment: does greater overdose response lead to more negativity? J Drug Issues. Oct 2020;50(4):455-471. [CrossRef]
  22. Tobin K, Clyde C, Davey-Rothwell M, Latkin C. Awareness and access to naloxone necessary but not sufficient: examining gaps in the naloxone cascade. Int J Drug Policy. Sep 2018;59:94-97. [CrossRef] [Medline]
  23. Baumgart-McFarland M, Chiarello E, Slay T. Reluctant saviors: professional ambivalence, cultural imaginaries, and deservingness construction in naloxone provision. Soc Sci Med. Sep 2022;309:115230. [CrossRef] [Medline]
  24. Dahlem CHG, Granner J, Boyd CJ. Law enforcement perceptions about naloxone training and its effects post-overdose reversal. J Addict Nurs. 2022;33(2):80-85. [CrossRef] [Medline]
  25. Earnshaw VA. Stigma and substance use disorders: a clinical, research, and advocacy agenda. Am Psychol. Dec 2020;75(9):1300-1311. [CrossRef] [Medline]
  26. Marcu G, Aizen R, Roth AM, Lankenau S, Schwartz DG. Acceptability of smartphone applications for facilitating layperson naloxone administration during opioid overdoses. JAMIA Open. Apr 2020;3(1):44-52. [CrossRef] [Medline]
  27. Miller NM, Waterhouse-Bradley B, Campbell C, Shorter GW. How do naloxone-based interventions work to reduce overdose deaths: a realist review. Harm Reduct J. Feb 23, 2022;19(1):18. [CrossRef] [Medline]
  28. Tsai AC, Kiang MV, Barnett ML, et al. Stigma as a fundamental hindrance to the United States opioid overdose crisis response. PLoS Med. Nov 2019;16(11):e1002969. [CrossRef] [Medline]
  29. Agley J, Xiao Y, Eldridge L, Meyerson B, Golzarri-Arroyo L. Beliefs and misperceptions about naloxone and overdose among U.S. laypersons: a cross-sectional study. BMC Public Health. May 10, 2022;22(1):924. [CrossRef] [Medline]
  30. Agency recruitment paper files. Open Science Framework. URL: https://osf.io/k5ng9/?view_only=1543ac18aef64a258c9a9610199208a2 [Accessed 2026-03-13]
  31. SMS marketing: the ultimate guide for 2024. GatherUp. Jun 26, 2024. URL: https://gatherup.com/blog/sms-marketing-the-ultimate-guide-for-2024 [Accessed 2026-03-10]
  32. The best times to send SMS marketing and email in 2025. Attentive. 2025. URL: https://www.attentive.com/blog/best-time-to-send-sms-marketing#toc-0 [Accessed 2026-03-10]
  33. 6 SMS marketing tips for copywriting (with examples). Campaign Monitor. 2022. URL: https://www.campaignmonitor.com/blog/featured/sms-copywriting-best-practices [Accessed 2026-03-10]
  34. Overdose cluster response messaging: a guide for public health and prevention organizations. MN Department of Health. 2025. URL: https://www.health.state.mn.us/communities/opioids/documents/clusterresponsemessagingguide2.pdf [Accessed 2026-03-10]
  35. Cherrier N, Kearon J, Tetreault R, Garasia S, Guindon E. Community distribution of naloxone: a systematic review of economic evaluations. Pharmacoecon Open. May 2022;6(3):329-342. [CrossRef] [Medline]
  36. Wenger LD, Doe-Simkins M, Wheeler E, et al. Best practices for community-based overdose education and naloxone distribution programs: results from using the Delphi approach. Harm Reduct J. May 28, 2022;19(1):55. [CrossRef] [Medline]
  37. HEALing Communities Study. 2025. URL: https://hcs.rti.org [Accessed 2026-03-10]
  38. Kronrod A. Language research in marketing. Found Trends Mark. Apr 11, 2022;16(3):308-421. [CrossRef]
  39. New study on stigmatizing imagery for substance use disorders released. Addiction Policy Forum. 2023. URL: https:/​/www.​addictionpolicy.org/​post/​new-study-on-stigmatizing-imagery-for-substance-use-disorders-released [Accessed 2026-03-10]
  40. Hulsey J, Zawislak K, Sawyer-Morris G, Earnshaw V. Stigmatizing imagery for substance use disorders: a qualitative exploration. Health Justice. Jul 4, 2023;11(1):28. [CrossRef] [Medline]
  41. White SA, Lee R, Kennedy-Hendricks A, Sherman SG, McGinty EE. Perspectives of U.S. harm reduction advocates on persuasive message strategies. Harm Reduct J. Aug 18, 2023;20(1):112. [CrossRef] [Medline]
  42. McGinty EE, White SA, Sherman SG, Lee R, Kennedy-Hendricks A. Framing harm reduction as part of an integrated approach to reduce drug overdose: a randomized message testing experiment in a nationally representative sample of U.S. adults, 2022. Int J Drug Policy. Aug 2023;118:104101. [CrossRef] [Medline]
  43. Plain language for public health. Public Health Communications Collaborative; 2023. URL: https:/​/publichealthcollaborative.​org/​wp-content/​uploads/​2023/​02/​PHCC_Plain-Language-for-Public-Health.​pdf [Accessed 2026-03-10]
  44. Hoffman R, Ostby R, Rausch P. Use this research in your communication and social media campaigns. Presented at: National Rx Drug Abuse & Heroin Summit; Aug 18-21, 2021. URL: https:/​/www.​naccho.org/​uploads/​downloadable-resources/​24-R_3RazES9CRROCrfgResearch_Social_Media_Campaigns-ICF.​pdf [Accessed 2026-03-10]
  45. McGinty E, Pescosolido B, Kennedy-Hendricks A, Barry CL. Communication strategies to counter stigma and improve mental illness and substance use disorder policy. Psychiatr Serv. Feb 1, 2018;69(2):136-146. [CrossRef] [Medline]
  46. Coffin PO, Behar E, Rowe C, et al. Nonrandomized intervention study of naloxone coprescription for primary care patients receiving long-term opioid therapy for pain. Ann Intern Med. Aug 16, 2016;165(4):245-252. [CrossRef] [Medline]
  47. Doe-Simkins M, Quinn E, Xuan Z, et al. Overdose rescues by trained and untrained participants and change in opioid use among substance-using participants in overdose education and naloxone distribution programs: a retrospective cohort study. BMC Public Health. Apr 1, 2014;14:297. [CrossRef] [Medline]
  48. Jones JD, Campbell A, Metz VE, Comer SD. No evidence of compensatory drug use risk behavior among heroin users after receiving take-home naloxone. Addict Behav. Aug 2017;71:104-106. [CrossRef] [Medline]
  49. Kelly BC, Vuolo M. Do naloxone access laws affect perceived risk of heroin use? Evidence from national US data. Addiction. Mar 2022;117(3):666-676. [CrossRef] [Medline]
  50. Tse WC, Djordjevic F, Borja V, et al. Does naloxone provision lead to increased substance use? A systematic review to assess if there is evidence of a “moral hazard” associated with naloxone supply. Int J Drug Policy. Feb 2022;100:103513. [CrossRef] [Medline]
  51. Larochelle MR, Liebschutz JM, Zhang F, Ross-Degnan D, Wharam JF. Opioid prescribing after nonfatal overdose and association with repeated overdose: a cohort study. Ann Intern Med. Jan 5, 2016;164(1):1-9. [CrossRef] [Medline]
  52. Lowder EM, Amlung J, Ray BR. Individual and county-level variation in outcomes following non-fatal opioid-involved overdose. J Epidemiol Community Health. Apr 2020;74(4):369-376. [CrossRef] [Medline]
  53. Olfson M, Crystal S, Wall M, Wang S, Liu SM, Blanco C. Causes of death after nonfatal opioid overdose. JAMA Psychiatry. Aug 1, 2018;75(8):820-827. [CrossRef] [Medline]
  54. Suffoletto B, Zeigler A. Risk and protective factors for repeated overdose after opioid overdose survival. Drug Alcohol Depend. Apr 1, 2020;209:107890. [CrossRef] [Medline]
  55. Segerstrom SC. Statistical guideline #3: designate and justify covariates a priori, and report results with and without covariates. Int J Behav Med. Dec 2019;26(6):577-579. [CrossRef]
  56. Hopewell S, Chan AW, Collins GS, et al. CONSORT 2025 statement: updated guideline for reporting randomised trials. BMJ. Apr 14, 2025;389:e081123. [CrossRef] [Medline]
  57. Scharf BM, Sabat DJ, Brothers JM, Margolis AM, Levy MJ. Best practices for a novel EMS-based naloxone leave behind program. Prehosp Emerg Care. 2021;25(3):418-426. [CrossRef] [Medline]
  58. Lee SH, Agley J, Sharma V, Williamson F, Zhang P, Seo DC. Opioid overdose and naloxone administration knowledge and perceived competency in a probability sample of Indiana urban communities with large Black populations. PLoS ONE. 2025;20(7):e0328444. [CrossRef] [Medline]
  59. Seo DC, Alba-Lopez L, Satterfield N, Lee SH, Crabtree C, Williamson F. “There’s no real urgency when it comes to us”: critical discourse analysis of Black communities’ lived experience with opioid overdose response in Indianapolis area. Soc Sci Med. May 2025;373:118039. [CrossRef]


CONSORT: Consolidated Standards of Reporting Trials
CPR: cardiopulmonary resuscitation
MOU: memorandum of understanding
OEND: overdose education and naloxone distribution
POC: point of contact


Edited by Amaryllis Mavragani; submitted 02.Aug.2025; peer-reviewed by Kitty Gelberg, Sage R Feltus; final revised version received 18.Feb.2026; accepted 27.Feb.2026; published 02.Apr.2026.

Copyright

© Jon Agley, Cris Henderson, Monica Nair, Dong-Chul Seo, Maria Parker, Lilian Golzarri-Arroyo, Stephanie Dickinson, David Tidd. Originally published in JMIR Formative Research (https://formative.jmir.org), 2.Apr.2026.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.