Original Paper
Abstract
Background: Text messages offer the potential to better evaluate HIV behavioral interventions using repeated longitudinal measures at a lower cost and research burden. However, they have been underused in US minority settings.
Objective: This study aims to examine the feasibility of assessing economic and sexual risk behaviors using text message surveys.
Methods: We conducted a single-group study with 17 African American young adults, aged 18-24 years, who were economically disadvantaged and reported prior unprotected sex. Participants received a text message survey once each week for 5 weeks. The survey contained 14 questions with yes-no and numeric responses on sexual risk behaviors (ie, condomless sex, sex while high or drunk, and sex exchange) and economic behaviors (ie, income, employment, and money spent on HIV services or products). Feasibility measures were the number of participants who responded to the survey in a given week, the number of questions to which a participant responded in each survey, and the number of hours between sending a survey and receiving a participant's response in a given week. One discussion group was used to obtain feedback.
Results: Overall, 65% (n=11/17) of the participants responded to at least one text message survey, whereas 35% (n=6/17) never responded. The majority (n=7/11, 64%) of the responders were women. The majority (n=4/6, 67%) of nonresponders were men. An average of 7.6 participants (69%) responded in a given week. Response rates among ever responders ranged from 64% to 82% across the study period. The mean number of questions answered each week was 12.6 (SD 2.7; 90% of all questions), ranging from 72% to 100%. An average of 6.4 participants (84%) answered all 14 text message questions in a given week, ranging from 57% to 100%. Participants responded approximately 8.7 hours (SD 10.3) after receiving the survey. Participants were more likely to answer questions related to employment, condomless sex, and discussions with sex partners. Nonresponse or skip was used more often for questions at the end of the survey, which related to sex exchange and money spent on HIV prevention services or products. Strengths of the text message survey were convenience, readability, short completion time, repeated measures over time, and incentives.
Conclusions: Longitudinal text message surveys may be a valuable tool for assessing HIV-related economic and sexual risk behaviors.
Trial Registration: ClinicalTrials.gov NCT03237871; https://clinicaltrials.gov/ct2/show/NCT03237871
doi:10.2196/14833
Keywords
Introduction
Prior research has found that text messaging may be a promising strategy for involving young adults in research [
- ], as young adults are among the largest consumers of digital communication technologies [ - ]. More than 8 billion text messages are sent in the United States each day [ , ], and 97% of US young adults, aged 18 to 29 years, report using text messages at least once a day [ , ]. According to smartphone user data in the United States, young adults send and receive as many as 75 text messages per day, regardless of socioeconomic status [ ].

In recent years, two-way text messages in the form of text message questionnaires (or surveys) have been used to obtain real-time data in health research settings [
, , , - ]. Text messaging as an assessment tool has been valued given that it can be easily integrated into the lives of study participants, who often carry cell phones throughout the day, without being intrusive or requiring additional travel or study visitation time [ ]. For researchers, data collection has a rapid turnaround time, is scalable to large groups, and is relatively inexpensive [ , , , ]. Participants may also be more responsive to the convenience of text messaging [ ], and there is an additional benefit of anonymity when reporting sensitive behaviors, such as sexual activity, drug use, or housing instability [ ]. In fact, one study found that text message responses from participants were more candid than responses from voice interviews [ ]. Text message surveys yield data that are comparable with other paper and online assessment tools [ , ], while overcoming many of the limitations of these traditional approaches (ie, interviews, computer-assisted surveys, and school-based assessments) [ , , , ]. For example, real-time text message data can reduce recall biases inherent in costly assessments that may be several months or years apart [ , ]. More frequent text message surveys, which are administered daily or weekly over the life of a study, may also provide a more detailed picture of how behaviors change over time [ , , ]. Obtaining data in real time can also enable researchers to address any issues related to measurement or nonresponse promptly [ , ]. Text message surveys may also result in more representative research data by better engaging out-of-school individuals or individuals living in lower-income and underserved communities, who might otherwise be missed when using school- and clinic-based assessments [ , ].

Text message surveys have been used in many health areas, including diet and obesity [
, ], asthma [ ], teen pregnancy [ ], and depression [ ]. However, with the exception of measuring medication adherence [ , , , , ], two-way text message surveys have rarely been used in HIV prevention research. Sexual risk behaviors, such as unprotected sex, sex while intoxicated, or sex with multiple concurrent partners, are known to contribute to the spread of HIV [ - ]. As a result, reducing sexual risk-taking is a hallmark of many HIV behavioral prevention strategies, particularly among African American young adults who are disproportionately impacted by HIV [ - ]. Yet, despite the high rates of cell phone usage and the alarmingly high rates of HIV in African American young adults, few studies have used text messages to collect data on sexual behaviors [ ]. Commonly used methods to collect data on sexual behaviors, such as those mentioned earlier (ie, interviews, computer-assisted surveys, and clinic visits), are less likely to measure behaviors in the most recent hours or days [ ]. In addition, the economic drivers of HIV are rarely assessed using repeated measures. Prompting young adults to provide a text message reply regarding the frequency and type of sex they engaged in, in addition to other socioeconomic factors, may be a viable means of data collection, provided it is feasible, acceptable, and reliable.

The aim of this study was to examine the feasibility of assessing sexual and economic behaviors using text message surveys in African American young adults who were out-of-school and experiencing homelessness and unemployment in Baltimore, Maryland. The majority (82%) of HIV diagnoses in Baltimore are found among African Americans, with young adults, aged 20 to 29 years, representing the highest proportion [
]. Young adults in the city make up an increasing proportion of the homeless and unemployed [ , ]. Young adults experiencing homelessness are 6-12 times more likely to become infected with HIV than housed young adults, with prevalence rates as high as 12% [ - ]. HIV prevalence among African Americans in Baltimore is 3.1%, which is more than 10 times the national HIV prevalence in the United States (0.3%) and which exceeds the Joint United Nations Programme on HIV and AIDS's definition of a generalized epidemic (HIV prevalence >1%) [ , , ]. Specifically, this manuscript describes the process, challenges, and solutions regarding text message survey responsiveness and utility. In addition, it discusses the implications of using text message surveys in future HIV behavioral intervention trials.

Methods
Design
A single-group cohort study was used to examine the feasibility of assessing economic and sexual risk behaviors using weekly text message surveys. Participants were invited to respond to a text message survey sent to their cell phone every Monday at 9 AM for 5 weeks.
Recruitment and Enrollment
Potential participants were recruited onsite from 2 community-based organizations (CBOs) providing emergency and supportive residential services to young adults in Baltimore, Maryland. A recruitment flyer was posted in the main building of both CBOs. Designated CBO staff introduced potential participants to the study team on scheduled visit days. Study eligibility was determined using a paper-based screening questionnaire that was administered by a trained research assistant. Individuals were eligible to participate if, at the time of enrollment, they were African American, aged 18-24 years, living in Baltimore, experiencing homelessness within the last 12 months (ie, defined as reporting any episode in which a person lacked a regular or adequate nighttime residence, such as staying in a hotel/motel, vehicle, shelter, or friend's home, and was living primarily on their own, apart from a parent or guardian), unemployed or underemployed (≤10 hours per week), out-of-school, reporting one or more episodes of unprotected sex in the last 12 months, and having a cell phone that could send and receive text messages. Eligible participants were then introduced to the study, and informed consent was obtained.
As part of the enrollment process, we invited participants to register their cell phone number with the text message survey app. Participants sent a text message with the word join to the study phone number to register. Each person then received a brief orientation regarding the survey’s content, timing, and payment incentive (US $20 in cash for answering 4 out of 5 weekly surveys). We also provided snacks and beverages. In the presence of a trained research assistant, the participant also completed a mock but identical version of the 14-question text message survey on his or her cell phone. This was done to confirm readability and understanding of the text message questions and prompts and to clarify any points of confusion. Participants were also advised on how to opt out of the survey by sending a text with the word leave at any time. As a final orientation step, participants were provided an informational sheet and advised on how to increase privacy during the study period and avoid unintended loss of confidentiality, such as activating cell phone passwords, deleting all text message surveys, responding only to the study’s phone number, and answering in a quiet and private space. Participants were also informed of the study’s security protocol that included separating cell phone numbers from identifying information; selecting a platform, such as TextIt.In, that did not require handing over participants’ names, addresses, or other identifying information to a mobile database software company; anonymizing phone numbers with a random code at the end of the study to render numbers invalid for future use; and using encrypted and compliant channels of TextIt.In.
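To illustrate the join and leave keywords described above, the following is a minimal sketch in Python, assuming a generic inbound-SMS webhook rather than the TextIt.In platform used in the study; the function names and the hashing step are illustrative only and echo the study's practice of separating phone numbers from identifying information.

```python
import hashlib

# Enrolled participants, keyed by a one-way hash so no raw phone numbers are stored
registered = set()


def pseudonymize(phone_number: str) -> str:
    """Replace the raw phone number with a short one-way hash."""
    return hashlib.sha256(phone_number.encode()).hexdigest()[:12]


def handle_inbound_sms(phone_number: str, body: str) -> str:
    """Return the reply text for the 'join'/'leave' registration keywords."""
    keyword = body.strip().lower()
    pid = pseudonymize(phone_number)
    if keyword == "join":
        registered.add(pid)
        return "Thanks for joining the study. Surveys arrive Mondays at 9 AM."
    if keyword == "leave":
        registered.discard(pid)
        return "You have been removed from the weekly surveys."
    return "Reply JOIN to register or LEAVE to opt out."


print(handle_inbound_sms("+14105550123", "join"))
```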
Text Survey Design
We used TextIt.In to create, send, and receive text messages from participants. TextIt.In is an online service for building text messaging apps using a visual and interactive flow. The text message survey was powered by Twilio, a cloud communications platform, using a study-sponsored phone number. We then developed an online logic tree to order how survey questions would be texted to the participants.
An excerpt of the branch logic used in the question tree is shown in the accompanying figure. Each of the 14 questions was sent as a single text message, sequentially and in the same order, after the prior question had been answered. To facilitate responsiveness and data quality, the text messaging app included automated reminders and quality check prompts. Participants had 24 hours to complete each weekly text message survey. One automated text message reminder was sent to participants who did not initiate responding to the survey or to those who started but did not complete the survey within the first 24 hours. Reminder text messages included the name of the study, the payment incentive, and a reminder to respond to the survey within the next 24 hours. In addition, participants who responded with ineligible words or numbers outside of preset ranges received a text message query asking them to re-enter a valid response. All completed surveys generated 1 automatic text message that thanked the participant for his or her time.
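As a rough illustration of this flow (not the study's actual TextIt.In/Twilio configuration), the sketch below implements a sequential question tree in Python with response validation, a re-query on ineligible input, and a closing thank-you message. The question wording is abbreviated and the numeric range check is an assumed placeholder.

```python
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str
    kind: str  # "yes_no", "number", or "free_text"


# Abbreviated question list; the full survey contained 14 questions in fixed order
SURVEY = [
    Question("In the last 7 days, did you perform any activity for pay? "
             "Reply 01 Yes, 02 No, or skip.", "yes_no"),
    Question("In the last 7 days, how much did you earn in total from working "
             "for someone else? Reply with US $ or skip.", "number"),
    # ... remaining questions follow the same pattern
]


@dataclass
class Session:
    index: int = 0                      # which question the participant is on
    answers: dict = field(default_factory=dict)


def validate(reply: str, kind: str):
    """Return a cleaned answer, or None if the reply is ineligible."""
    reply = reply.strip().lower()
    if reply == "skip":
        return "skip"
    if kind == "yes_no" and reply in {"01", "02", "yes", "no"}:
        return reply
    if kind == "number":
        try:
            value = float(reply.lstrip("$"))
        except ValueError:
            return None
        return value if 0 <= value <= 10000 else None  # assumed preset range
    if kind == "free_text" and reply:
        return reply
    return None


def next_message(session: Session, reply: str) -> str:
    """Store a valid reply and return the next text to send to the participant."""
    question = SURVEY[session.index]
    value = validate(reply, question.kind)
    if value is None:
        # Quality-check prompt: ask the participant to re-enter a valid response
        return "Sorry, that response was not recognized. " + question.text
    session.answers[session.index] = value
    session.index += 1
    if session.index == len(SURVEY):
        return "Thank you for completing this week's survey!"
    return SURVEY[session.index].text


session = Session()
print(SURVEY[0].text)                   # first question sent to the participant
print(next_message(session, "01"))      # valid reply advances to question 2
print(next_message(session, "maybe"))   # ineligible reply triggers a re-query
```

In the study itself, this logic was configured visually in TextIt.In rather than coded by hand.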
Measures
Data were collected in August and September 2017. Participants received the same 14 questions as text messages each week, regardless of their responses to the prior week's text message survey.
The questions used in the survey of this formative study are listed below. The set of questions was adapted from previous studies of economic empowerment and costs associated with HIV preventive and treatment services [ - ] and included questions developed by the study team, specifically for African American young adults experiencing homelessness and unemployment [ - ]. All questions were in English and reviewed by the study team, piloted with young adults, and revised as necessary prior to utilization. The questions referred to the last 7 days, equivalent to the prior week. Eligible responses were yes/no or number of units (ie, dollars, episodes, and people). There were 7 economic questions relating to involvement in any type of paid work, the amount of cash earned from a job, the amount of cash earned from one's own business, the amount of cash deposited into a savings account, the occurrence of loss of housing, the occurrence of asking for cash to meet living expenses, and the amount of cash spent on any HIV preventive services or products (ie, condom or lubricant purchases, insurance copays for HIV testing or antiviral medications, and travel expenses to HIV educational sessions). An additional 7 sexual behavioral questions inquired about the number of sex partners, engagement in sex while high or drunk, frequency of condomless sex, utilization of other noncondom HIV preventive methods, frequency of sex exchange, discussion of HIV testing with sex partners, and receipt of HIV testing.

The primary feasibility measures of this study were the number of participants who responded to the survey in a given week, the number of questions to which a participant responded in a given week, and the number of hours between sending a survey and receiving a participant's response in a given week. We calculated the number, mean, and proportion of participants who responded to each question in each of the weekly surveys over the 5-week study period. A participant was categorized as responding to the question if he or she provided a valid response such as yes/no, a numerical response, a free-form text, or a skip response to proceed to the next question. A participant was categorized as responding to the survey if he or she provided a valid response to at least one question of the 14-question survey. Ever responders were defined as enrolled participants who responded to at least one text message survey over the course of the study period. Nonresponders were defined as enrolled participants who did not return a text message response to any of the text message surveys over the course of the study period. We considered the study to be feasible if we were able to identify and recruit >15 eligible participants within the study period. The study was also considered feasible if a mean question response rate of 70% or more among ever responders was achieved and if the mean response time was 24 hours or less. Feasibility was based on ever responders to account for any initial drop-off of participants who signed up for the study but did not participate once the text message survey was initiated.
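For illustration, the three feasibility measures could be computed from a long-format response log roughly as follows. This is a sketch assuming hypothetical column names (participant_id, week, question, sent_at, replied_at) and example values, not the study's actual data structure.

```python
import pandas as pd

# Hypothetical long-format response log: one row per answered question
log = pd.DataFrame({
    "participant_id": ["p01", "p01", "p02", "p01"],
    "week": [1, 1, 1, 2],
    "question": [1, 2, 1, 1],
    "sent_at": pd.to_datetime(["2017-08-07 09:00"] * 3 + ["2017-08-14 09:00"]),
    "replied_at": pd.to_datetime(["2017-08-07 10:15", "2017-08-07 10:16",
                                  "2017-08-07 21:40", "2017-08-14 11:05"]),
})

# 1. Number of participants who responded to the survey in a given week
responders_per_week = log.groupby("week")["participant_id"].nunique()

# 2. Number of questions a participant responded to in a given week
questions_per_participant = (log.groupby(["week", "participant_id"])["question"]
                             .nunique())

# 3. Hours from sending the survey to a participant's first reply in a given week
first_reply = log.groupby(["week", "participant_id"]).agg(
    sent=("sent_at", "min"), replied=("replied_at", "min"))
hours_to_response = (first_reply["replied"] - first_reply["sent"]).dt.total_seconds() / 3600

print(responders_per_week)
print(questions_per_participant)
print(hours_to_response.round(1))
```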
As part of the study’s process evaluation, 1 discussion group with 5 responders was used to obtain feedback. A recruitment flyer was posted in the main building of both CBOs, and all participants, including nonresponders, received a text message regarding the day, time, and location of the focus group discussion. The group discussion was moderated by the study PI who used a focus group discussion guide. Participants were asked to describe what they liked or disliked about the text message survey, what they considered to be barriers and facilitators to responding, and what changes they would recommend regarding the survey design (ie, questions, timing, and frequency), including suggestions for additional information or questions they would have liked to receive. Participant responses were documented using memo field notes that were expanded upon immediately after the discussion.
Indicator | Text message question | Response options^a |
Employment | In the last 7 days, did you perform any activity for pay? | 01 Yes; 02 No |
Income earned from job | In the last 7 days, how much did you earn in total from working for someone else? | US $ |
Income earned from own business | In the last 7 days, how much did you earn from being self-employed or from your own business? | US $ |
Savings | In the last 7 days, how much did you deposit into a savings account? | US $ |
Housing stability | In the last 7 days, have you been without a place to stay? | 01 Yes; 02 No |
Financial distress | In the last 7 days, did you ask someone for money to meet your food, housing, or other living expenses? | 01 Yes; 02 No |
Money spent on HIV prevention | How much have you spent in the last 7 days on HIV prevention? | US $ |
Sex partners | In the last 7 days, how many people have you had sex with? | __ # people |
Sex while drunk or high | With any of these people, were you drunk or high while having sex? | 01 Yes; 02 No |
Condomless sex | In the last 7 days, how many times did you have sex without a condom? | __ # of times |
Noncondom HIV preventive methods | Not including a condom, what other method(s) did you use to prevent HIV in the last 7 days? | Free text |
Sex exchange | In the last 7 days, how many times did you receive money, food, or drugs in exchange for having sex? | __ # of times |
Discussion of HIV testing | In the last 7 days, did you discuss HIV testing with your sex partner(s)? | 01 Yes; 02 No |
Uptake of HIV testing | In the last 7 days, did you get tested for HIV? | 01 Yes; 02 No |
^a All questions included a response option of skip.
Analysis
To analyze the results of the text message survey, we first created a database in Excel that included a cell phone number for each participant; a unique participant study ID; demographic data on participants' age, gender, education level, years living in Baltimore, number of hours worked per week, and number of children living in and out of the household; the date and time of study enrollment; the date and time of all outgoing and incoming messages; and the numerical, textual, or free-form text message response to each of the 14 text message questions each week. Second, we calculated the study's primary feasibility measures, as listed above. Third, we calculated the frequencies of sexual and economic behaviors according to the specific responses for each weekly question. Finally, lessons learned from the study's process evaluation were analyzed across 5 implementation domains: acceptability, enrollment and registration, responsiveness, data quality, and data access. This process involved a close reading of the study's field notes, coding lessons learned by each implementation domain, and discussing findings with the study team.
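As a sketch of this workflow, separate weekly export files could be restructured into a single long-format table and tabulated as follows. The file paths and column names (question, response) are assumptions for illustration; the study team carried out this restructuring in Excel rather than with this exact approach.

```python
import glob

import pandas as pd

# Hypothetical weekly export files, eg, exports/week_1.csv ... exports/week_5.csv,
# each with columns: participant_id, question, response
frames = []
for path in sorted(glob.glob("exports/week_*.csv")):
    weekly = pd.read_csv(path)
    weekly["week"] = int(path.split("_")[-1].split(".")[0])  # week number from file name
    frames.append(weekly)

# Stack the weekly files into one long-format table (assumes at least one file exists)
combined = pd.concat(frames, ignore_index=True)

# Frequencies of each response per question per week (eg, counts of yes/no answers)
frequencies = (combined
               .groupby(["week", "question", "response"])
               .size()
               .reset_index(name="count"))

combined.to_csv("all_weeks_long.csv", index=False)
print(frequencies.head())
```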
Ethics Approval
This study received ethics approval from the Johns Hopkins Bloomberg School of Public Health institutional review board (IRB#00007563).
Availability of Data and Materials
The dataset analyzed during this study is available in the Mendeley repository [
].
Results
Sample Characteristics
A total of 17 participants were enrolled in the study, meeting 1 of the study's 3 feasibility criteria. The sample's demographic characteristics are described below. All participants (n=17/17, 100%) were African American (per inclusion criteria), living in Baltimore, and recruited from 1 of the 2 community organizations providing support to young adults experiencing homelessness. The mean age was 21.2 years (SD 2.1). Of the participants, 53% (n=9/17) were female and 47% (n=8/17) were male. About half of the participants (n=9/17, 53%) had not received a high school diploma or equivalent. None were currently enrolled in school. The majority of participants were unemployed (n=13/17, 76%), and 24% (n=4/17) were working part-time. Overall, 29% (n=5/17) were parents with children living in or outside of their household.

Sample characteristics | Ever responders | Nonresponders | Total |
Number of participants, n (%) | 11 (65%) | 6 (35%) | 17 (100%) |
Age (years), mean (SD) | 21.0 (1.9) | 21.5 (2.5) | 21.2 (2.1) |
African American^a, n (%) |
Yes | 11 (100) | 6 (100) | 17 (100) |
No | 0 (0) | 0 (0) | 0 (0) |
Gender, n (%) |
Male | 4 (36) | 4 (67) | 8 (47) |
Female | 7 (64) | 2 (33) | 9 (53) |
Recruited from community-based organizations for homeless young adults^a, n (%) |
Yes | 11 (100) | 6 (100) | 17 (100) |
No | 0 (0) | 0 (0) | 0 (0) |
Highest level of education, n (%) |
<12th grade | 5 (45) | 4 (67) | 9 (53) |
High school diploma or equivalent | 6 (55) | 2 (33) | 8 (47) |
Post-baccalaureate | 0 (0) | 0 (0) | 0 (0) |
Currently enrolled in school^a, n (%) |
Yes | 0 (0) | 0 (0) | 0 (0) |
No | 11 (100) | 6 (100) | 17 (100) |
Employment status^a, n (%) |
Unemployed | 8 (73) | 5 (83) | 13 (76) |
Employed part-time | 3 (27) | 1 (17) | 4 (24) |
Employed full-time | 0 (0) | 0 (0) | 0 (0) |
Number of years living in Baltimore, mean (SD) | 17.5 (6.1) | 21.5 (2.5) | 18.9 (5.4) |
Are parents, n (%) |
Yes | 4 (36) | 1 (17) | 5 (29) |
No | 7 (64) | 5 (83) | 12 (71) |
^a Per inclusion criteria.
Text Survey Responsiveness
Additional feasibility measures of the study are described below. Overall, 65% (n=11/17) of participants responded to at least 1 text message survey over the 5-week study period compared with 35% (n=6/17) of participants who never responded. The majority (n=7/11, 64%) of ever responders were young women. The majority (n=4/6, 67%) of never responders were young men ( ). Among those who ever responded, an average of 7.6 participants responded to the text message survey in any given week (69% response rate). Response rates among ever responders ranged from 64% to 82% across the 5-week study period, representing 62.7% of all survey questions in all 5 weeks. When participants responded in a given week, they also answered the majority of the 14 survey questions. The mean number of answered questions for responders in a given week was 12.6 (SD 2.7; 90% of all questions), ranging from 72% to 100% of all questions. This met the study's feasibility criterion of an average weekly question response rate of 70% or more among ever responders. An average of 6.4 participants (84%) answered all 14 text message survey questions in a given week, ranging from 57% to 100%. Participants responded on average 8.7 hours (SD 10.3) after receiving the survey. In week 1, participants responded the fastest, with an average of 1.7 hours (SD 2.2). The slowest mean time to response was 12.6 hours (SD 13.2) in week 3. This met the study's feasibility criterion of a mean response time of <24 hours.
The number of responders per question per week among ever-responding participants is presented below. The questions at the beginning of the survey had the highest response rates. Response rates were comparable across all questions in weeks 2 and 5 but tapered at the end of the survey in weeks 1, 3, and 4. In week 5, 1 participant answered the first question of the survey but did not answer any further questions. Participants were most responsive to questions about employment, condomless sex, and discussions with sex partners. Nonresponse was highest for questions relating to sex exchange and money spent on HIV prevention products or services. The text messaging app successfully sent and received 1289 text messages with few errors (0.1%), indicating relative efficiency and reliability ( ).
Study subgroups | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Total |
All participants (n=17), n |
Number of weekly text message surveys sent out | 17 | 17 | 17 | 17 | 17 | 85 |
Number of text messages sent and received (includes all welcome, survey, reminder, correction, and thank you messages) | 197 | 291 | 298 | 236 | 267 | 1289 |
Ever-responding participants (n=11) |
Participants^a who responded to the survey each week, n (%) | 7 (64) | 8 (72) | 9 (82) | 7 (64) | 7 (64) | 7.6 (69) |
Questions participants^a responded to each week, mean (SD); % of 14 | 10.1 (5.2); 72 | 14.0 (0.0); 100 | 13.6 (1.3); 97 | 13.3 (1.9); 95 | 12.1 (4.9); 86 | 12.6 (2.7); 90 |
Hours from sending survey to receiving participants'^a response each week, mean (SD) | 1.7 (2.2) | 10.2 (12.2) | 12.6 (13.2) | 4.9 (8.6) | 13.9 (15.4) | 8.7 (10.3) |
Participants^a,b who responded to all 14 questions each week, n (%) | 4 (57) | 8 (100) | 8 (89) | 6 (86) | 6 (86) | 6.4 (84) |
^a Never responders are excluded.
^b Nonresponders for the specific week are excluded.
Question number | Week 1, n | Week 2, n | Week 3, n | Week 4, n | Week 5, n |
1 | 7 | 8 | 9 | 7 | 7 |
2 | 7 | 8 | 9 | 7 | 6 |
3 | 6 | 8 | 9 | 7 | 6 |
4 | 6 | 8 | 9 | 7 | 6 |
5 | 6 | 8 | 9 | 7 | 6 |
6 | 6 | 8 | 9 | 7 | 6 |
7 | 5 | 8 | 9 | 7 | 6 |
8 | 5 | 8 | 9 | 7 | 6 |
9 | 5 | 8 | 9 | 7 | 6 |
10 | 4 | 8 | 9 | 6 | 6 |
11 | 4 | 8 | 8 | 6 | 6 |
12 | 4 | 8 | 8 | 6 | 6 |
13 | 4 | 8 | 8 | 6 | 6 |
14 | 4 | 8 | 8 | 6 | 6 |
Reported Economic and Sexual Behaviors
Weekly economic and sexual behaviors reported by the participants are shown below. Employment rates remained low, ranging from 14% to 43% over the study period. Mean earnings from employment by others or from the participant's own business ranged from US $37 (SD 62.8) to US $146 (SD 220.3) per week and from US $7 (SD 18.9) to US $55 (SD 107.2) per week, respectively ( ). The proportion of participants experiencing housing instability decreased from 43% to 0% over the course of the study period, as did the proportion of those requesting money from others to cover living expenses (57%-0%). For most weeks, no money was spent on HIV prevention services or products, such as condoms, HIV testing, lubricants, or antiviral medications. For 3 of the 5 weeks, approximately 14% of participants reported having sex while high or drunk at least once in the past week. Condomless sex was a common risk behavior, with 14%-75% of participants reporting condomless sex at least once in a given week. There were no reports of sex exchange for money, food, or housing. Use of other noncondom prevention methods was also low (11% in week 3). Participants were more likely to respond yes to the last two sexual behavior questions of the survey, which asked about discussing HIV testing with any of their sex partners (43%-67%) and receiving an HIV test in the past week (14%-43%). Participants used the skip response infrequently and only during weeks 1 and 2. When used, skip was most common for questions relating to sex exchange and money spent on HIV prevention services or products ( ). The dataset analyzed during this study is publicly available [ ].

Question number | Indicator | Number of times skip response was used | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 |
1 | Participants who performed any activity for pay, % | 0 | 14 | 38 | 33 | 43 | 43 |
2 | Earnings from job (US $), mean (SD) | 1 | 69.7 (94.5) | 145.6 (220.3) | 114.4 (141.9) | 36.6 (62.8) | 85.7 (127.7) |
3 | Earnings from self-employment or own business (US $), mean (SD) | 0 | 54.5 (107.2) | 18.6 (49.1) | 34.4 (48.8) | 7.1 (18.9) | 16.7 (40.8) |
4 | Reported savings (US $), mean (SD) | 1 | 48.6 (66.6) | 20.5 (35.2) | 28.9 (39.8) | 18.6 (32.9) | 18.3 (38.0) |
5 | Participants reporting having no place to stay, % | 0 | 43 | 25 | 22 | 14 | 0 |
6 | Participants who asked for money for living expenses, % | 0 | 57 | 38 | 33 | 14 | 0 |
7 | Reported spending on HIV prevention (US $), mean (SD) | 2 | 0 (0.0) | 51.4 (136.1) | 25.6 (66.2) | 0 (0.0) | 0 (0.0) |
8 | Number of sex partners in past week, mean (SD) | 0 | 1.0 (0.7) | 1.4 (0.7) | 0.8 (0.7) | 0.6 (0.5) | 0.5 (0.5) |
9 | Participants who were drunk or high while having sex (at least once), % | 0 | 14 | 13 | 0 | 14 | 0 |
10 | Participants who reported condomless sex at least once in the past week, % | 0 | 43 | 75 | 33 | 29 | 14 |
11 | Participants who reported noncondom prevention methods, % | 0 | 0 | 0 | 11 | 0 | 0 |
12 | Participants who reported sex exchange in the past week, % | 2 | 0 | 0 | 0 | 0 | 0 |
13 | Participants who discussed HIV testing with sex partners, % | 1 | 57 | 50 | 67 | 71 | 43 |
14 | Participants who received an HIV test, % | 1 | 14 | 25 | 22 | 14 | 43 |
Implementation Lessons Learned
The successes, challenges, and lessons learned in using text message surveys in this population are summarized below. Key successes included participant acceptability, willingness to respond to the survey, confirming readability and functionality using a mock text message survey at enrollment, having moderately high responsiveness, and building in quality checks. Implementation challenges were low responses to questions perceived as sensitive or stigmatizing, technological delays, and the time required to restructure text message data for analysis. Additional feedback from responders in a post-study discussion was that having the text message surveys arrive weekly and at the same time was helpful, as participants were always on their cell phones and available to respond quickly and conveniently. This, along with receiving cash payments, was viewed as a positive outcome. However, the reported weaknesses were that, for some, receiving the same set of questions each week was repetitive and may have contributed to response fatigue. Participants also asked whether informational text messages, such as job announcements or sexual health tips, could be provided as a reward for responding to each week's survey. The study team's observations while implementing the text message survey were that reducing text message wording, including response prompts (eg, reply with: yes/no, US $, and # of times), and sending reminders were important to facilitating participation.
Implementation domain | Successes | Challenges | Lessons learned |
Acceptability | Participants were eager to enroll and motivated by cash payments. Willingness to respond to sensitive questions was enhanced by privacy supports. | Responses declined at the end of the survey. The reasons for nonresponse are not well known because of loss to follow-up. | Participants valued text message contact but requested nonrepeated survey questions and texts on jobs or sexual health. |
Enrollment and registration | Readability and function of the survey were confirmed at enrollment for all participants who answered a mock survey and clarified points of confusion. | Some interested young adults did not have a working cell phone. Long wording of some questions appeared as multiple texts on small screens during enrollment. | Financial support for accessing cellular service may be needed to enroll more disconnected young adults. |
Responsiveness | Two-thirds of participants responded to the survey, representing a moderately high response rate. No participants used the opt-out function. | One-third of participants (mostly men) enrolled but never responded. One participant responded to only the first question. | Increasing incentives, reducing the number of questions, or reducing the frequency of surveys may improve responsiveness. |
Data quality | A 7-day window and sending surveys on the same day and time were used to reduce recall bias. Query text messages were sent for invalid responses. | All data were self-reported and not administered by a researcher. The recall period of later responders may have included overlapping days. | More efforts are needed to assess data quality beyond response prompts, such as a larger sample size and repeated responses over time. |
Data access | Data were available at low cost and in real time at the moment when the participant responded. | All output was generated into separate weekly files that required time-consuming restructuring. | Routinely restructuring data would facilitate real-time analysis of individual and aggregate statistics. |
Discussion
Principal Findings
The goal of this study was to examine the feasibility of a relatively new mode of data collection using text message surveys in an urban, ethnic minority setting with high HIV prevalence. We found that using weekly automated text message surveys with short assessments was feasible with vulnerable young adults. Data collection with this population can be challenging, given the unpredictability of young adults' schedules and the uncertainty regarding their interest in research participation. However, the majority of invited participants completed the survey and were receptive to answering the study's text message questions. To our knowledge, this is one of the first studies in an urban setting to use text message surveys to assess economic and sexual risk behaviors in economically vulnerable African American young adults, who also had little experience responding to text messages for research purposes.
The study's experience is informative with regard to 3 research areas: acceptability of text message surveys, survey responsiveness, and implications for future studies relating to efficiency and data quality. First, in the context of acceptability, several factors may have contributed to the study's generally positive reception. This study's recruitment process began with an introduction of the study to young adults in the presence of their peers at the CBO center. Therefore, potential participants had an opportunity to enjoy snacks, ask questions, and determine their own interests, including the interests of their peers, in participating in the study. The study team also explained the cash incentives being used to compensate individuals who completed the 5-week test cycle. Although eligible young adults appeared to be motivated by cash payments, other potential drivers of participation may have been the perceived benefit of participating in a study on HIV prevention with friends, including being prompted to think about or discuss HIV. Another driver of participation may have been interest in using text messages as a new means of income generation, since nearly all young adults had a cell phone and were underemployed. The study also invited each participant to try a mock text message survey on their cell phone in a private location at the CBO. This enabled them to see what they would be receiving each week and to confirm their capacity to respond. It was our experience that participants found responding to relatively sensitive text message questions on economic and sexual behaviors acceptable, given the readability of the questions, the short completion time required (about 3 to 4 minutes), the anonymity of their cell phone, the ability to use phone passwords for additional privacy, and the option to delete all text messages.
A second area of consideration is responsiveness to the text message survey. Among all participants, the response rate of 65%, representing nearly two-thirds of participants, was moderately high. For those individuals who responded to at least one text message survey, response rates were even higher in a given week, ranging from 64% to 82%, with participants answering about 90% of all questions. This study's response rates were higher than those of similar text message survey studies, including 3 studies assessing substance use and sexual risk behaviors with US young adults aged 18 to 25 years (49% response rate) [
], medication adherence in HIV-negative transgender men and women (39% response rate) [ ], and medication adherence from caregivers of HIV-infected children in Uganda (24% response rate) [ ]. On the other hand, our response rates were similar to those of 2 additional studies assessing drug adherence using text message surveys with lesbian, gay, bisexual, and transgender young adults in the United States living with HIV (61% response rate) [ ] and assessing quality of life via text message surveys with older patients with rheumatoid arthritis (69% response rate) [ ]. Although higher responsiveness may be needed if text messages are the sole form of evaluation, such engagement by predominantly financially and residentially unstable young adults is encouraging. In addition, the average time to response ranged from 2 to 12 hours, representing a relatively rapid response period compared with mail-in or online surveys that may experience several days or weeks between distribution and response. One potential contributing factor may have been that the study's in-person orientation process allowed participants to feel prepared in knowing what type of questions would be asked and how long it would take to respond. Sequentially sending each question only after the prior question had been answered further allowed participants to track and respond to questions at their own pace. Being female also appeared to aid responsiveness, although more research is needed to understand whether and why this may be the case.

To that end, it is important to consider the nonresponsiveness observed in this study. The reasons for nonresponse may have been poor network coverage, having one's cell phone lost/stolen, or lacking sufficient charge or cellular credit. The potential to earn cash payments spurred interest in many participants. However, for some, this enthusiasm may have waned over time. Nonresponsiveness may also have resulted from experiencing negative emotions when thinking about financial hardships or prior sex partners. It is possible that some participants simply wanted a break from the study but chose not to use the opt-out commands (leave or stop). Future studies should assess reasons for nonresponse, as this could increase the number of participants providing study data. Increasing incentives, reducing the number of questions, or reducing the frequency of text message surveys may also improve responsiveness. Paying participants more frequently rather than at the end of the study and requesting alternative forms of contact (ie, email and social media) to reconnect with nonresponders may additionally be helpful. Greater engagement might also be achieved if the text message surveys are concurrently embedded within an intervention or other in-person contact targeting the outcomes of the text assessment.
A final important area relates to implications for future research. Our text message surveys showed some promise as a measurement tool in behavioral research. Having repeated measures each week from participants provides stronger statistical power and enables trialists to better characterize fluctuations over time. Our text message survey was configured to re-query any invalid or out-of-range responses to maximize data quality over time. Once the survey flow was automated and launched, there was minimal maintenance. However, despite having data available from the moment participants responded, the process of restructuring and aggregating weekly data files was time-consuming and resulted in the team generally viewing and analyzing data at the end of the study, rather than on a weekly basis. Improvements in exporting and coding the messaging app's data would increase efficiency. Finally, given that text messaging technology is constantly evolving, as are young adults' cell phone behaviors, including the option to respond to surveys via instant messaging or other text messaging apps may improve participation.
The study’s small sample size was a limiting factor, as the study was not designed to determine efficacy or estimate prevalence of economic and sexual risk behaviors. Rather, the study aimed to conduct a rapid test cycle using a small group of young adults with a preliminary goal of assessing instrument feasibility for a larger intervention trial. We additionally selected young adults who had a working cell phone, were literate, and were receiving residential services at the participating CBOs. Although the participants were vulnerable in other ways, such a sampling strategy could mean that the findings are not generalizable to more disconnected young adults. In addition, although not using an interviewer to administer surveys may have increased participants’ responsiveness to sensitive questions, the use of self-administered assessments could have reduced data quality. Finally, although weekly assessments provided more frequent contact than traditional pre-post study designs, asking participants to recall economic and sexual behaviors in the last 7 days rather than the day before via daily surveys may have been challenging for some participants. Despite these limitations, the study was successful in monitoring behaviors over time. This study’s findings provide support for using text message surveys to collect data in future behavioral trials.
Conclusions
Text messages offer the potential to better evaluate HIV behavioral interventions using repeated longitudinal measures at lower cost and research burden. However, they have been underutilized in US minority settings. We found that using weekly automated text message surveys with short assessments was feasible with vulnerable young adults. Additional research should focus on maintaining high responsiveness, improving the efficiency of data analysis, and ensuring data quality.
Acknowledgments
The authors would like to thank the EMERGE (Engaging Microenterprise for Resource Generation and Health Empowerment) study group, which consists of the EMERGE study participants and graduate student research assistants at the Johns Hopkins University Bloomberg School of Public Health. The authors are also grateful to the directors, managers, and staff at YO!Baltimore (Elizabeth Torres Brown, Burgundi Allison, Sherry Kizielewicz, and Kierra Sanders) and AIRS (Anthony Butler, Monifa Riggins, and Irisha Burrell) and Jamison Merrill, who made this work possible. This research was funded through resources and services provided by the National Institute of Mental Health (NIMH) (grant: K01MH107310, principal investigator: Larissa Jennings Mayo-Wilson). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIMH.
Authors' Contributions
LJMW conceived the research study and managed the study’s implementation. NG, AL, FS, SL, and MJ aided in framing the issues of the study and providing technical expertise. MD analyzed the data. LJMW prepared the first draft of the manuscript. All authors contributed to editing and interpreting the results of the manuscript. All authors have read and approved the final manuscript.
Conflicts of Interest
None declared.
References
- Aguilera A, Schueller SM, Leykin Y. Daily mood ratings via text message as a proxy for clinic based depression assessment. J Affect Disord 2015 Apr 1;175:471-474 [FREE Full text] [CrossRef] [Medline]
- Bonar EE, Koocher GP, Benoit MF, Collins RL, Cranford JA, Walton MA. Perceived risks and benefits in a text message study of substance abuse and sexual behavior. Ethics Behav 2018;28(3):218-234 [FREE Full text] [CrossRef] [Medline]
- Brown 3rd W, Giguere R, Sheinfil A, Ibitoye M, Balan I, Ho T, et al. Challenges and solutions implementing an SMS text message-based survey CASI and adherence reminders in an international biomedical HIV PrEP study (MTN 017). J Biomed Inform 2018 Apr;80:78-86 [FREE Full text] [CrossRef] [Medline]
- DeJonckheere M, Nichols LP, Moniz MH, Sonneville KR, Vydiswaran VG, Zhao X, et al. MyVoice national text message survey of youth aged 14 to 24 years: study protocol. JMIR Res Protoc 2017 Dec 11;6(12):e247 [FREE Full text] [CrossRef] [Medline]
- Dowshen N, Kuhns LM, Gray C, Lee S, Garofalo R. Feasibility of interactive text message response (ITR) as a novel, real-time measure of adherence to antiretroviral therapy for HIV+ youth. AIDS Behav 2013 Jul;17(6):2237-2243. [CrossRef] [Medline]
- Pedersen S, Grønhøj A, Thøgersen J. Texting your way to healthier eating? Effects of participating in a feedback intervention using text messaging on adolescents' fruit and vegetable intake. Health Educ Res 2016 Apr;31(2):171-184. [CrossRef] [Medline]
- Schnall R, Okoniewski A, Tiase V, Low A, Rodriguez M, Kaplan S. Using text messaging to assess adolescents' health information needs: an ecological momentary assessment. J Med Internet Res 2013 Mar 6;15(3):e54 [FREE Full text] [CrossRef] [Medline]
- Lenhart A, Ling R, Campbell S, Purcell K. Teens and Mobile Phones. Pew Research Center. 2010. URL: https://www.pewresearch.org/internet/2010/04/20/teens-and-mobile-phones/ [accessed 2020-05-19]
- Lenhart A, Smith A, Anderson M, Duggan M, Perrin A. Teens, Technology and Friendships. Pew Research Center. 2015. URL: http://www.pewinternet.org/2015/08/06/teens-technology-and-friendships/ [accessed 2020-05-19]
- Porath S. Text messaging and teenagers: a review of the literature. J Res C Educ Technol 2011;7:86-99 [FREE Full text]
- Burke K. 73 Texting Statistics That Answer All Your Questions. Text Request. 2016. URL: http://jasonswebdblog.blogspot.com/2018/02/73-texting-statistics-that-answer-all.html [accessed 2020-05-19]
- Smith A. US Smartphone Use in 2015. Pew Research Center. 2015. URL: http://www.pewinternet.org/2015/04/01/us-smartphone-use-in-2015/ [accessed 2020-05-19]
- Belavy D. A mobile telephone-based SMS and internet survey system for self-assessment in Australian anaesthesia: experience of a single practitioner. Anaesth Intensive Care 2014 Nov;42(6):771-776 [FREE Full text] [CrossRef] [Medline]
- Lee SS, Xin X, Lee WP, Sim EJ, Tan B, Bien MP, et al. The feasibility of using SMS as a health survey tool: an exploratory study in patients with rheumatoid arthritis. Int J Med Inform 2013 May;82(5):427-434. [CrossRef] [Medline]
- Stanton MC, Mkwanda SZ, Debrah AY, Batsa L, Biritwum N, Hoerauf A, et al. Developing a community-led SMS reporting tool for the rapid assessment of lymphatic filariasis morbidity burden: case studies from Malawi and Ghana. BMC Infect Dis 2015 May 16;15:214 [FREE Full text] [CrossRef] [Medline]
- dal Grande E, Chittleborough CR, Campostrini S, Dollard M, Taylor AW. Pre-survey text messages (SMS) improve participation rate in an Australian mobile telephone survey: an experimental study. PLoS One 2016;11(2):e0150231 [FREE Full text] [CrossRef] [Medline]
- Iribarren SJ, Brown W, Giguere R, Stone P, Schnall R, Staggers N, et al. Scoping review and evaluation of SMS/text messaging platforms for mhealth projects or clinical interventions. Int J Med Inform 2017 May;101:28-40 [FREE Full text] [CrossRef] [Medline]
- Schober MF, Conrad FG, Antoun C, Ehlen P, Fail S, Hupp AL, et al. Precision and disclosure in text and voice interviews on smartphones. PLoS One 2015;10(6):e0128337 [FREE Full text] [CrossRef] [Medline]
- Brodey BB, Gonzalez NL, Elkin KA, Sasiela WJ, Brodey IS. Assessing the equivalence of paper, mobile phone, and tablet survey responses at a community mental health center using equivalent halves of a 'gold-standard' depression item bank. JMIR Ment Health 2017 Sep 6;4(3):e36 [FREE Full text] [CrossRef] [Medline]
- Lagerros YT, Sandin S, Bexelius C, Litton J, Löf M. Estimating physical activity using a cell phone questionnaire sent by means of short message service (SMS): a randomized population-based study. Eur J Epidemiol 2012 Jul;27(7):561-566. [CrossRef] [Medline]
- Chang T, Gossa W, Sharp A, Rowe Z, Kohatsu L, Cobb EM, et al. Text messaging as a community-based survey tool: a pilot study. BMC Public Health 2014 Sep 8;14:936 [FREE Full text] [CrossRef] [Medline]
- Vyas AN, Landry M, Schnider M, Rojas AM, Wood SF. Public health interventions: reaching Latino adolescents via short message service and social media. J Med Internet Res 2012 Jul 12;14(4):e99 [FREE Full text] [CrossRef] [Medline]
- Shiffman S, Stone AA, Hufford MR. Ecological momentary assessment. Annu Rev Clin Psychol 2008;4:1-32. [CrossRef] [Medline]
- Dowshen N, Kuhns LM, Johnson A, Holoyda BJ, Garofalo R. Improving adherence to antiretroviral therapy for youth living with HIV/AIDS: a pilot study using personalized, interactive, daily text message reminders. J Med Internet Res 2012 Apr 5;14(2):e51 [FREE Full text] [CrossRef] [Medline]
- Haberer JE, Musiimenta A, Atukunda EC, Musinguzi N, Wyatt MA, Ware NC, et al. Short message service (SMS) reminders and real-time adherence monitoring improve antiretroviral therapy adherence in rural Uganda. AIDS 2016 May 15;30(8):1295-1300 [FREE Full text] [CrossRef] [Medline]
- Devine S, Leeds C, Shlay JC, Leytem A, Beum R, Bull S. Methods to assess youth engagement in a text messaging supplement to an effective teen pregnancy program. J Biomed Inform 2015 Aug;56:379-386 [FREE Full text] [CrossRef] [Medline]
- Brown 3rd W, Giguere R, Sheinfil A. Feasibility and acceptability of an international SMS text message-based adherence and survey system in a biomedical HIV prevention study (MTN-017). AIDS Res Hum Retrov 2016;32:386-396 [FREE Full text]
- Chen Y, McFarland W, Raymond HF. Behavioral surveillance of heterosexual exchange-sex partnerships in San Francisco: context, predictors and implications. AIDS Behav 2011 Jan;15(1):236-242. [CrossRef] [Medline]
- Hunter LM, Reid-Hresko J, Dickinson TW. Environmental change, risky sexual behavior, and the HIV/AIDS pandemic: linkages through livelihoods in rural Haiti. Popul Res Policy Rev 2011 Oct;30(5):729-750 [FREE Full text] [CrossRef] [Medline]
- Marks G, Crepaz N, Senterfitt JW, Janssen RS. Meta-analysis of high-risk sexual behavior in persons aware and unaware they are infected with HIV in the United States: implications for HIV prevention programs. J Acquir Immune Defic Syndr 2005 Aug 1;39(4):446-453. [CrossRef] [Medline]
- New HIV Infections in the United States. Centers for Disease Control and Prevention. 2012. URL: https://www.cdc.gov/nchhstp/newsroom/docs/2012/hiv-infections-2007-2010.pdf [accessed 2020-05-19]
- Reports Archive: HIV Surveillance Reports Archive. Centers for Disease Control and Prevention. 2011. URL: https://www.cdc.gov/hiv/library/reports/hiv-surveillance-archive.html#supplemental-archive [accessed 2020-05-19]
- Naranbhai V, Karim QA, Meyer-Weitz A. Interventions to modify sexual risk behaviours for preventing HIV in homeless youth. Cochrane Database Syst Rev 2011 Jan 19(1):CD007501 [FREE Full text] [CrossRef] [Medline]
- HIV and STD Data and Resources: HIV/AIDS Data. Baltimore City Health Department. 2018. URL: https://health.baltimorecity.gov/hivstd-data-resources [accessed 2020-05-19]
- Miller A, Unick J, Harburger D. Maryland Youth Count 2017: A Summary of the Findings From Youth Reach MD’s Second Survey of Unaccompanied Youth & Young Adults Experiencing Homelessness. Youth Reach MD. 2017. URL: http://www.youthreachmd.com/content/wp-content/uploads/2018/02/YRMD-2017-Report-Executive-Summary-FINAL.pdf [accessed 2020-05-19]
- Report of the SB764/HB823 Task Force to Study Housing and Supportive Services for Unaccompanied Homeless Youth. Maryland Governor's Office for Children. 2013 Nov 1. URL: https://goc.maryland.gov/reports/ [accessed 2020-05-19]
- Rotheram-Borus M, Song J, Gwadz M, Lee M, van Rossem R, Koopman C. Reductions in HIV risk among runaway youth. Prev Sci 2003 Sep;4(3):173-187. [CrossRef] [Medline]
- Pfeifer RW, Oliver J. A study of HIV seroprevalence in a group of homeless youth in Hollywood, California. J Adolesc Health 1997 May;20(5):339-342. [CrossRef] [Medline]
- Boivin J, Roy E, Haley N, du Fort GG. The health of street youth: a Canadian perspective. Can J Public Health 2005;96(6):432-437 [FREE Full text] [Medline]
- Estimated HIV Incidence and Prevalence in the United States 2010–2016. Centers for Disease Control and Prevention. 2018. URL: https://www.cdc.gov/hiv/pdf/library/reports/surveillance/cdc-hiv-surveillance-supplemental-report-vol-24-1.pdf [accessed 2020-05-19]
- Children and HIV and AIDS: How Widespread is the AIDS Epidemic? UNICEF. 2008 Feb 26. URL: https://www.unicef.org/aids/index_epidemic.html [accessed 2020-05-19]
- Lopera MM, Einarson TR, Bula JI. Out-of-pocket expenditures and coping strategies for people living with HIV: Bogotá, Colombia, 2009. AIDS Care 2011 Dec;23(12):1602-1608. [CrossRef] [Medline]
- Barennes H, Frichittavong A, Gripenberg M, Koffi P. Evidence of high out of pocket spending for HIV care leading to catastrophic expenditure for affected patients in Lao people's democratic republic. PLoS One 2015;10(9):e0136664 [FREE Full text] [CrossRef] [Medline]
- Onwujekwe OE, Ibe O, Torpey K, Dada S, Uzochukwu B, Sanwo O. Examining geographic and socio-economic differences in outpatient and inpatient consumer expenditures for treating HIV/AIDS in Nigeria. J Int AIDS Soc 2016;19(1):20588 [FREE Full text] [CrossRef] [Medline]
- Ssewamala FM, Han C, Neilands TB, Ismayilova L, Sperber E. Effect of economic assets on sexual risk-taking intentions among orphaned adolescents in Uganda. Am J Public Health 2010 Mar;100(3):483-488 [FREE Full text] [CrossRef] [Medline]
- Odek WO, Busza J, Morris CN, Cleland J, Ngugi EN, Ferguson AG. Effects of micro-enterprise services on HIV risk behaviour among female sex workers in Kenya's urban slums. AIDS Behav 2009 Jun;13(3):449-461. [CrossRef] [Medline]
- Pronyk PM, Kim JC, Abramsky T, Phetla G, Hargreaves JR, Morison LA, et al. A combined microfinance and training intervention can reduce HIV risk behaviour in young female participants. AIDS 2008 Aug 20;22(13):1659-1665. [CrossRef] [Medline]
- Sherman SG, German D, Cheng Y, Marks M, Bailey-Kloche M. The evaluation of the JEWEL project: an innovative economic enhancement and HIV prevention intervention study targeting drug using women involved in prostitution. AIDS Care 2006 Jan;18(1):1-11. [CrossRef] [Medline]
- Raj A, Dasgupta A, Goldson I, Lafontant D, Freeman E, Silverman JG. Pilot evaluation of the making employment needs [MEN] count intervention: addressing behavioral and structural HIV risks in heterosexual black men. AIDS Care 2014 Feb;26(2):152-159 [FREE Full text] [CrossRef] [Medline]
- Dunbar MS, Dufour MK, Lambdin B, Mudekunye-Mahaka I, Nhamo D, Padian NS. The SHAZ! project: results from a pilot randomized trial of a structural intervention to prevent HIV among adolescent women in Zimbabwe. PLoS One 2014;9(11):e113621 [FREE Full text] [CrossRef] [Medline]
- Goodman ML, Selwyn BJ, Morgan RO, Lloyd LE, Mwongera M, Gitari S, et al. Sexual behavior among young carers in the context of a Kenyan empowerment program combining cash-transfer, psychosocial support, and entrepreneurship. J Sex Res 2016;53(3):331-345. [CrossRef] [Medline]
- Jennings L, Shore D, Strohminger N, Allison B. Entrepreneurial development for US minority homeless and unstably housed youth: a qualitative inquiry on value, barriers, and impact on health. Child Youth Serv Rev 2015 Feb;49:39-47. [CrossRef]
- Jennings L, Lee N, Shore D, Strohminger N, Allison B, Conserve DF, et al. US minority homeless youth's access to and use of mobile phones: implications for mhealth intervention design. J Health Commun 2016 Jul;21(7):725-733. [CrossRef] [Medline]
- Jennings L. Do men need empowering too? A systematic review of entrepreneurial education and microenterprise development on health disparities among inner-city black male youth. J Urban Health 2014 Oct;91(5):836-850 [FREE Full text] [CrossRef] [Medline]
- Mayo-Wilson LJ. Text Message Survey Pilot Data. Mendeley Data. 2019. URL: https://data.mendeley.com/datasets/3kxpn6nsx4/1 [accessed 2020-06-05]
- Haberer JE, Kiwanuka J, Nansera D, Wilson IB, Bangsberg DR. Challenges in using mobile phones for collection of antiretroviral therapy adherence data in a resource-limited setting. AIDS Behav 2010 Dec;14(6):1294-1301 [FREE Full text] [CrossRef] [Medline]
Abbreviations
CBO: community-based organization
HIV: human immunodeficiency virus
Edited by G Eysenbach; submitted 27.05.19; peer-reviewed by D Santa Maria, T Guetterman; comments to author 16.12.19; revised version received 11.03.20; accepted 29.03.20; published 17.07.20
Copyright©Larissa Jennings Mayo-Wilson, Nancy E Glass, Alain Labrique, Melissa Davoust, Fred M Ssewamala, Sebastian Linnemayr, Matthew W Johnson. Originally published in JMIR Formative Research (http://formative.jmir.org), 17.07.2020.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on http://formative.jmir.org, as well as this copyright and license information must be included.