Adapting a Telephone-Based, Dyadic Self-management Program to Be Delivered Over the Web: Methodology and Usability Testing

Original Paper

1Center for Innovation to Implementation, Veterans Affairs Palo Alto Health Care System, Menlo Park, CA, United States

2Stanford University, Palo Alto, CA, United States

3Learning Systems International Metcor, Washington, DC, United States

4Veterans Affairs Puget Sound Health Care System, Seattle, WA, United States

5Veterans Affairs Ann Arbor Health Care System, Ann Arbor, MI, United States

Corresponding Author:

Ranak Trivedi, PhD

Center for Innovation to Implementation

Veterans Affairs Palo Alto Health Care System

795 Willow Rd Bldg 324

152-MPD Ci2i

Menlo Park, CA, 94025

United States

Phone: 1 4157872805

Email: ranakt@stanford.edu


Background: The COVID-19 pandemic has amplified the need for web-based behavioral interventions to support individuals who are diagnosed with chronic conditions and their informal caregivers. However, most interventions focus on patient outcomes. Dyadic technology–enabled interventions that simultaneously improve outcomes for patients and caregivers are needed.

Objective: This study aimed to describe the methodology used to adapt a telephone-based, facilitated, and dyadic self-management program called Self-care Using Collaborative Coping Enhancement in Diseases (SUCCEED) into a self-guided, web-based version (web-SUCCEED) and to conduct usability testing for web-SUCCEED.

Methods: We developed web-SUCCEED in 6 steps: ideation—determining the intervention content areas; prototyping—developing wireframes illustrating the look and feel of the website; refining the prototype via focus group feedback; finalizing the module content; programming web-SUCCEED; and usability testing. A diverse team of stakeholders, including content experts, web designers, patients, and caregivers, provided input at various stages of development. Costs, including full-time equivalent (FTE) staff effort, were summarized.

Results: At the ideation stage, we determined the content of web-SUCCEED based on feedback from the program’s original pilot study. At the prototyping stage, the principal investigator and web designers iteratively developed prototypes that included inclusive design elements (eg, large font size). Feedback about these prototypes was elicited through 2 focus groups of veterans with chronic conditions (n=13). Rapid thematic analysis identified two themes: (1) web-based interventions can be useful for many but should include ways to connect with other users and (2) prototypes were sufficient to elicit feedback about the esthetics, but a live website allowing for continual feedback and updating would be better. Focus group feedback was incorporated into building a functional website. In parallel, the content experts worked in small groups to adapt SUCCEED’s content, so that it could be delivered in a didactic, self-guided format. Usability testing was completed by veterans (8/16, 50%) and caregivers (8/16, 50%). Veterans and caregivers gave web-SUCCEED high usability scores, noting that it was easy to understand, easy to use, and not overly burdensome. Notable negative feedback included “slightly agreeing” that the site was confusing and awkward. All veterans (8/8, 100%) agreed that they would choose this type of program in the future to access an intervention that aims to improve their health. Developing and maintaining the software and hosting together cost approximately US $100,000, excluding salary and fringe benefits for project personnel (steps 1-3: US $25,000; steps 4-6: US $75,000).

Conclusions: Adapting an existing, facilitated self-management support program for delivery via the web is feasible, and such programs can remotely deliver content. Input from a multidisciplinary team of experts and stakeholders can ensure the program’s success. Those interested in adapting programs should have a realistic estimate of the budget and staffing requirements.

JMIR Form Res 2023;7:e43903

doi:10.2196/43903


Introduction

Background

Evidence-based behavioral interventions provide individuals with chronic health conditions with a variety of tools for making behavior changes, monitoring their health status, and communicating effectively with health care teams. These behaviors are collectively termed "self-management," and those with chronic conditions are often supported by a relative or friend in maintaining these behaviors. In this way, the stress of managing chronic conditions is shared by the person with the diagnosis and their relatives and friends who provide direct care, self-management support, encouragement, and emotional support [1-3].

Dyadic behavioral interventions are designed to support both the patient and their caregivers in coping with emotional and practical challenges [4]. Well-designed, dyadic programs can improve patients’ adherence to self-management recommendations, quality of life, and self-efficacy while reducing hospitalization rates [5,6]. Most [7] dyadic interventions require real-time communication between intervention recipients and health coaches or facilitators, either in person or via telephone. Asynchronous communication [4,8,9] between program participants and facilitators is less common. Technology-enabled interventions have the potential to decrease the amount of resources needed per patient-caregiver dyad, and as a consequence, such programs can be more scalable and more easily modified and personalized [10].

Distance technology is particularly important for self-management interventions because the goal of such interventions is to change users’ behavior in their day-to-day lives. Technology-enabled dyadic self-management interventions have been found to be effective for individuals with many clinical conditions, notably heart failure, diabetes, cancer, and depression [11-13]. Systematic reviews have shown that technology-based interventions can improve knowledge, behaviors, and clinical outcomes for patients and their informal caregivers [14-16]. However, key gaps remain, as summarized by a systematic review of 101 studies representing 52 unique dyadic eHealth interventions [17]. First, only 18 interventions focused on adult dyads, and 9 focused specifically on adult dyads managing cancer. This highlights the research gap in technology-enabled interventions that address the needs of adult dyads managing common chronic conditions. Second, dyadic interventions developed so far have been focused on outcomes of the care recipient, with only 1 in 5 studies including outcomes for the caregiver. Third, dyadic interventions rarely address the strain on the dyadic relationship caused by chronic illness management, even though such challenges have been well documented [18-22]. We sought to develop a dyadic technology–enabled intervention that would simultaneously support the needs of patients and caregivers and their interpersonal relationship.

We previously developed and successfully pilot-tested a telephone-based dyadic self-management program called Self-care Using Collaborative Coping Enhancement in Diseases (SUCCEED) [23]. Over six 1-hour sessions, patient-caregiver dyads learned and practiced cognitive behavioral skills to reduce individual and relationship stress, improve positive emotions, improve communication and collaboration, increase pleasant activities, and maintain behavior changes despite challenges. Weekly homework involved developing an action plan to practice and sustain new skills. A pilot test of SUCCEED showed high acceptability and feasibility. However, as has been documented by others [24], dyads found it logistically challenging to find time weekly to participate in synchronous sessions with the facilitator. In addition, as sessions required a live facilitator, programs such as SUCCEED often have long wait times and limited reach [25].

Objective

We developed a web-based, self-guided version of SUCCEED to address barriers to accessing the program and challenges associated with scaling up. In this paper, we defined a process for adapting existing dyadic behavioral interventions to web-based delivery platforms and estimated the resources necessary to complete this process rigorously. Specifically, we described the process of adapting SUCCEED for web-based use and reported the results of initial usability testing. We also reported estimates of the costs and staffing resources required for this adaptation to support future planning.


Methods

Ethics Approval

This study was reviewed and approved by the institutional review board at the Department of Veterans Affairs (VA) Palo Alto Health Care System and Stanford University (institutional review board protocol #40022) and received approval from the local scientific review committee and the health system’s information security officer (#TRI0008). All study participants provided verbal informed consent, as a waiver of documentation had been approved.

Overview of the Study

The analyses presented in this paper represent the primary purpose of the study. Study data were deidentified before the analysis. A code linking the study ID numbers with the identifying information was maintained by the study team. All data including linked data files were stored behind a firewall on secured drives. Study participants in the focus groups were compensated with US $25, and those who participated in the usability testing were compensated with US $50. This study was conducted between 2016 and 2021. The web design was completed between 2016 and 2019 (before the COVID-19 pandemic), and usability testing was conducted between 2019 and 2021 (overlapping with the COVID-19 pandemic).

Our study was informed by input from multiple stakeholders. We established and sustained partnerships with the Veteran and Family Advisory Board at our institution and obtained their input at each stage of the study. We assembled a multidisciplinary team of content experts, methodologists, administrative personnel, web designers, and software engineers. Stakeholders’ input guided all aspects of the program development process, which included ideation, prototyping, refining the prototype, finalizing the module content and scripts, programming web-SUCCEED, and usability testing (Figure 1) [26]. We strived to use human-centered design principles in the development process, including input from patients and their caregivers at each stage, to ensure that our eventual program was not only theoretically sound but also valuable to end users. In this study, we were guided by the User Experience Honeycomb [27], a user experience framework, which notes that for a product to be valuable, it should be useful, usable, desirable, findable, credible, and accessible. In addition, to ensure human-centeredness, we optimized the user experience by eliciting users’ needs and values, including making our technology accessible across abilities and disabilities.

Figure 1. Timeline and development process for adapting the Self-care Using Collaborative Coping Enhancement in Diseases intervention to a web-based format. SUS: System Usability Scale.

Step 1—Ideation

The usefulness and credibility of the program were established during the development of the SUCCEED intervention [23]. The content foci for SUCCEED were based on cognitive behavioral theory and dyadic coping and communication frameworks, including the Dyadic Behavioral Health Change Model [23]. Training objectives were drawn from the original SUCCEED intervention and included skills to manage stress and negative emotions, improve interpersonal communication and collaboration, and build a fulfilling life in the context of chronic health conditions. Disease-specific content was eliminated from session 1 based on feedback we received from participants in the SUCCEED pilot study. The technology platform was selected based on input from members of the local Veteran and Family Council, who shared that veterans preferred web-based programs over mobile apps, primarily because the content was easy to view on a large screen.

Step 2—Prototyping

The web design was completed by a company experienced in developing learning systems for the Veterans Health Administration and was intended to maximize the desirability, usability, and accessibility domains of SUCCEED. We were committed to the principles of equity and inclusivity in our web-based platform and made accessibility central to the design. Our platform was designed to be compliant with Section 508 of the Rehabilitation Act [28]. Amended in 1998, the Rehabilitation Act is a federal law requiring federal agencies to “provide individuals with disabilities equal access to electronic information and data comparable with those who do not have disabilities, unless an undue burden would be imposed on the agency.” Section 508 documents the technical standards that must be met to ensure that technology platforms are accessible for individuals with physical, sensory, or cognitive disabilities. In collaboration with the web designers, the project team developed initial wireframes, that is, nonfunctional schematics depicting the framework and flow of the website. We incorporated inclusive design elements in the prototype, including using a sans serif font, large font size, and minimalist design, which are recommended for those with visual disability. After the wireframes were established, we added graphics, color, and styling. We designed the home page to include an introductory video and the content area of SUCCEED. Initial design concepts for module 1 were carried through in the planning of all 3 skills training modules, each of which included an overview of the content, a link to the action plan review, links to the module content, and a link to a blank action plan to be completed as homework.

Step 3—Refining the Prototype

Overview

We assessed the desirability of the program by obtaining feedback from focus group participants about the website prototypes created in Microsoft PowerPoint (Microsoft Corporation). Focus group participants were recruited via flyers posted in VA clinics that included a phone number to contact the study team if they were interested. Study participants were required to be aged at least 18 years and either have at least one chronic condition (veteran) or be a caregiver of someone with at least one chronic condition (caregiver). Participants were provided a stipend worth US $25 for participating in a 1-hour focus group to provide feedback about web-SUCCEED. Focus groups were facilitated by an expert in qualitative research and assisted by a study team member who took notes. A structured interview guide was developed. During the focus groups, participants were provided with the rationale of the web-SUCCEED program, and feedback was elicited about the utility of the program, color scheme, layout, stock photos, and readability. Sample questions included, “Would you use an online program to manage your health?” “What do you like most about this website? What do you like the least?” and “What about the webpage resonates with you, and what does not?” Participants were provided with writing materials and encouraged to provide both verbal and written feedback. Focus groups were audio recorded and professionally transcribed. Focus group data were analyzed for themes around the usefulness of a web-based format and the esthetics of the prototypes. Rapid analytic approaches were used to tabulate responses that would help address 2 questions: “Would an online program be useful in helping veterans with chronic health conditions manage their health?” and “How accessible and appealing were the prototypes of web-SUCCEED?”

Overall, 2 focus groups with a total of 13 veterans were conducted. Unfortunately, no caregivers volunteered for the focus groups. Typical of VA patient samples, 12 (92%) of the 13 participants across the 2 focus groups were men. Our sample was ethnically diverse (6/13, 46% were from underrepresented minority groups); of these 6 participants, 4 (31%) were African American and 2 (15%) were multiracial. The mean age was 68.3 years; 9 (69%) participants were retired, 2 (15%) were disabled, and the remaining 2 (15%) were employed. All participants (13/13, 100%) had at minimum a high school degree, and 31% (4/13) had a college degree or higher level of education. Of the 13 participants, 5 (38%) lived alone. Participants reported receiving care for their health conditions from a variety of caregivers, mainly significant others (3/13, 23%) and children (3/13, 23%).

Preliminary coding was conducted by the study team, and themes were finalized by the principal investigator (PI). In total, 2 themes were identified through the focus groups.

Theme 1—Web-Based Interventions Can Be Useful for Many but Should Include Ways to Connect With Other Users

Participants noted that web-based programs would be useful because they helped overcome logistical barriers, and they found My HealtheVet useful in communicating with their health care team. When asked whether they would use a program such as web-SUCCEED, participants noted, “I would try it” and “I think it’d be a good idea.” However, web-based programs were not universally acceptable; participants expressed concerns about the security of medical information and about web-based programs reducing interpersonal connections. A participant summarized the following:

But it sounds like everybody’s saying that if you have a chance to do it, do it both ways...there are some people, you know, it’s the actual face to face is better for certain people. And some people, like me...sometimes I just want to be on the computer.

Participants noted that web-SUCCEED should include strategies to communicate with the study team and with one another. We elicited feedback about whether discussion boards for communication among patient users should be included, as these have been shown to enhance programmatic engagement in self-management programs [29,30]. Although some participants (7/13, 54%) supported having a way to connect with other participants, others disagreed, noting that web-based connections were not as desirable as personal connections. Participants acknowledged that preferences for discussion boards likely varied. A veteran said the following:

I would use it. My wife wouldn’t use it...I like the idea of being able to send little messages back.

Alternatives to discussion boards were suggested, including group meetings at set intervals where people could meet in person or via teleconference calls to share stories. A participant compared this with the focus group participation and noted the following:

This type of sit-down group [is] useful...for the interpersonal connections that we’re able to make.

Another participant noted the following:

But you can’t throw computers away. And so the computers are just adjunct.

Focus group respondents suggested that we clearly include our contact information on the website and make it easy to communicate directly with the study team. Focus groups also addressed the pros and cons of a combined discussion board for veterans and caregivers.

Theme 2—Prototypes Were Sufficient to Elicit Feedback About the Esthetics, but a Live Website Would Elicit Better Feedback

Focus group participants indicated that they liked the color scheme and design depicted by the prototype and that they could relate to the people in the photos. Participants noted that the prototypes were “straightforward,” “not convoluted,” and “simple, but effective.” When asked, participants preferred having the choice of navigating between modules rather than having to follow a forced order. They also preferred a program they could engage in individually, even if their caregiver did not want to participate or if they did not have a caregiver.

Feedback from focus groups was summarized using templates provided by the design team to modify the website. This information was used to inform an extramural grant proposal that was important for securing funds to develop the website for web-SUCCEED. The multidisciplinary team supported by that grant included the lead (a clinical health psychologist and caregiver expert), web designers, psychologists, internists, and an epidemiologist.

Step 4—Finalizing the Module Content and Scripts

This step focused on the usefulness and credibility of the program while incorporating previous feedback related to the other domains. We used specification documents from the web design team as an outline for developing the content, layout, flow, and script for the narration. We sought to develop the module narratives without degrading the key behavioral content from SUCCEED. The initial narrative script was developed by the research team and 4 experts (2 internists, 1 cardiologist, and 1 clinical psychologist) based on the content in the original SUCCEED modules, and the remaining content experts on the team provided feedback. The finalized modules were the following:

  1. Introduction
  2. Module 1—skills to reduce stress and improve positive emotions
  3. Module 2—skills to reduce relationship stress and improve interpersonal relationships, which combined SUCCEED sessions 4 and 5
  4. Module 3—building a fulfilling life and maintaining behavior change, which was SUCCEED session 6

Example screenshots are provided in Figures 2 and 3. The introduction module provided a welcome message, an overview of the program and its rationale, steps to develop an action plan, and tips to navigate the website. The site required all participants to complete the introduction module before proceeding; after that, based on focus group feedback, the remaining modules could be completed in any order at the user’s preference. At the end of each module, participants developed an action plan to practice new skills and were prompted to provide feedback regarding the relevance of the content. We designed a resource page that included skills training exercises; a list of VA resources, including links to other VA programs that support those with chronic disease and their caregivers; community resources such as the Family Caregiver Alliance; and popular chronic disease management apps. All links referenced reliable information about chronic diseases for patients and their caregivers. Worksheets and action plan homework were also available to be downloaded from the website.

Figure 2. Web-based Self-care Using Collaborative Coping Enhancement in Diseases home page.
Figure 3. Example landing page.
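To make the navigation rule described in this step concrete, the following minimal sketch implements the gating logic (the introduction must be completed first; the 3 skills modules then unlock in any order). It is an illustration only, with hypothetical module identifiers, and is not the code used in web-SUCCEED.

```python
# Illustrative sketch of the module-gating rule described in Step 4:
# the introduction must be completed before any skills module unlocks,
# and the 3 skills modules may then be taken in any order.
# Module identifiers are hypothetical, not taken from web-SUCCEED.

INTRODUCTION = "introduction"
SKILLS_MODULES = {"module_1", "module_2", "module_3"}


def unlocked_modules(completed: set[str]) -> set[str]:
    """Return the set of modules a user may open next."""
    if INTRODUCTION not in completed:
        return {INTRODUCTION}  # only the introduction is available at first
    # After the introduction, any not-yet-completed skills module is available.
    return SKILLS_MODULES - completed


if __name__ == "__main__":
    print(unlocked_modules(set()))                          # {'introduction'}
    print(unlocked_modules({"introduction"}))               # all 3 skills modules
    print(unlocked_modules({"introduction", "module_2"}))   # module_1 and module_3
```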

Step 5—Programming Web-SUCCEED

We chose WordPress as our platform, which is both Section 508 compliant and widely used to support behavioral interventions in the VA. In addition to the design elements that were included in the prototype, we provided voice-over narration to assist those with visual disabilities and verbatim written text of all module content for those with hearing disabilities. When accessed via a PC, the web pages did not require scrolling, which streamlined access to information and made navigating the website easy for those with motor disabilities. We developed audio recordings of key exercises that could be accessed on the resources web page. Our team reviewed 600 stock photographs to choose those that represented a diversity of skin tones, ages, and relationships (eg, old woman–young woman, 2 men and a child, and Black man–Black woman).

To allow user access while ensuring system security, each participant was required to have an email address, which served as their default user ID that could not be changed. Users could change their display names after the initial log-on. In addition to unique IDs, each user set their own password, and we were able to log session activity according to participant. Users also were given a unique study identifier.
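A minimal data-structure sketch of the account scheme just described (email as a fixed default user ID, an editable display name, a separate study identifier, and per-participant session logging) is shown below. Field names and the log format are assumptions for illustration and do not reflect the actual WordPress implementation.

```python
# Minimal sketch of the account and session-logging scheme described above.
# Field names and the log format are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Account:
    email: str            # default user ID; fixed after account creation
    study_id: str         # unique study identifier, separate from the login
    display_name: str     # users may change this after the initial log-on
    session_log: list[tuple[datetime, str]] = field(default_factory=list)

    def log_activity(self, event: str) -> None:
        """Record a timestamped event so activity can be reviewed per participant."""
        self.session_log.append((datetime.now(timezone.utc), event))


if __name__ == "__main__":
    user = Account(email="participant@example.com", study_id="WS-001",
                   display_name="participant@example.com")
    user.display_name = "SunnyVet"  # display names were editable
    user.log_activity("opened introduction module")
    print(len(user.session_log), "event(s) logged for", user.study_id)
```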

A combined discussion board for patients and their caregivers was developed, and posts were reviewed by the moderator before being made public to the group. On the basis of the focus groups’ feedback, we configured 2 ways to communicate with the program team. A “Questions” button on the navigation bar allowed participants to connect with the team for nonurgent matters, such as inquiries about creating their first action plan. An “Emergency” button was placed on the navigation bar; clicking this button generated a prefilled message that included the sender’s email address and an optional text field. This page also listed the phone numbers of the National Suicide Help Line and a reminder to call 911 in case of an emergency. If a patient or caregiver generated an urgent message, the study PI, a licensed clinical psychologist, would be alerted immediately via email and would reach out to evaluate the situation and respond as needed. However, this did not occur during testing. The initial version of the website did not render well on iPads; therefore, these devices were not recommended for use with the intervention.
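The 2 communication pathways can be summarized as a simple routing rule: nonurgent “Questions” messages go to the study team, whereas “Emergency” messages additionally trigger an immediate alert to the PI. The sketch below illustrates that rule with hypothetical addresses and a stubbed notification function; the actual site implemented this in WordPress rather than Python.

```python
# Simplified sketch of the "Questions" vs "Emergency" routing described above.
# Addresses are hypothetical and notify() is a stand-in for the site's email
# notification mechanism; web-SUCCEED itself was built in WordPress.

STUDY_TEAM_INBOX = "study-team@example.org"   # hypothetical address
PI_ALERT_ADDRESS = "study-pi@example.org"     # hypothetical address


def notify(recipient: str, subject: str, body: str) -> None:
    """Stand-in for sending an email notification."""
    print(f"To: {recipient}\nSubject: {subject}\n{body}\n")


def route_message(button: str, sender_email: str, text: str = "") -> None:
    """Route a participant message based on which navigation-bar button was used."""
    body = f"From: {sender_email}\n{text}"
    if button == "questions":
        notify(STUDY_TEAM_INBOX, "Nonurgent participant question", body)
    elif button == "emergency":
        # Urgent messages also alert the PI immediately for clinical follow-up.
        notify(STUDY_TEAM_INBOX, "URGENT participant message", body)
        notify(PI_ALERT_ADDRESS, "URGENT: please follow up with participant", body)
    else:
        raise ValueError(f"Unknown button: {button}")


if __name__ == "__main__":
    route_message("questions", "participant@example.com",
                  "How do I create my first action plan?")
```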

Step 6—Usability Testing

Recruitment and Eligibility

Veterans and caregivers were recruited in person through the VA Palo Alto Health Care System (VAPAHCS) primary care clinics, women’s health clinics, and nephrology departments and via flyers placed throughout the VAPAHCS Palo Alto and Menlo Park health system campuses. Caregiver coordinators at VAPAHCS also helped disseminate information about the study and identify caregivers.

Studies have suggested that 80% to 85% of all usability problems can be uncovered by having 4 participants navigate web-based tools [31]; however, more recent evaluations suggest increasing that number [32]. Therefore, our goal was to recruit at least 5 veterans and 5 caregivers to participate in usability testing. Veterans were eligible if they were aged ≥18 years, had been diagnosed with a chronic condition that had clear self-management recommendations, and had a relative or friend who helped them manage their health condition. We excluded veterans who had cognitive impairments; were receiving intravenous chemotherapy, radiation therapy, or hemodialysis; lived in a skilled nursing facility; or had a paid home-based caregiver (eg, home nurse aide) who provided >50% of their home care. Eligible caregivers were aged at least 18 years, had been identified as primary caregivers by the veteran, and were not being treated for cancer and not undergoing hemodialysis.
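The sample-size guidance cited above is based on Nielsen and Landauer's problem-discovery model [31], in which the expected proportion of usability problems found by n participants is 1-(1-λ)^n, where λ is the average probability that a single participant uncovers a given problem. The snippet below evaluates this curve using the commonly cited average of λ≈0.31; that value is an assumption from the usability literature, not a quantity estimated in this study.

```python
# Expected proportion of usability problems found by n test participants,
# per the Nielsen-Landauer problem-discovery model [31]:
#     found(n) = 1 - (1 - lam) ** n
# lam = 0.31 is the frequently cited average per-participant detection rate;
# it is an assumption for illustration, not a value reported in this study.

def proportion_found(n: int, lam: float = 0.31) -> float:
    return 1 - (1 - lam) ** n


if __name__ == "__main__":
    for n in range(1, 11):
        print(f"{n:2d} participants -> {proportion_found(n):.0%} of problems expected")
    # With lam = 0.31, roughly 77% of problems are expected with 4 participants
    # and roughly 84% with 5, which is why small usability samples are common.
```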

Potential patients and caregivers were further screened for challenges with self-management and their degree of comfort with technology. To screen for challenges with self-management, we used an adapted version of the Diabetes Distress Screening scale that did not use diabetes-specific language [33]. Veterans were asked to rate the extent to which they felt overwhelmed by the demands of living with chronic illnesses and were failing regimens in any of their conditions. Caregivers were asked to what extent they felt overwhelmed in managing their care recipient’s chronic conditions and were failing in managing their care recipient’s conditions. Participants rated each item on a scale of 1 (not a problem) to 6 (serious problem). Veterans and caregivers were eligible if their average rating was at least 3 or if the sum of the 2 items was ≥6. Eligible and interested veterans and caregivers provided informed consent and contact information, including email. Although our goal was to create a program that would eventually address dyadic needs, we also allowed veterans and caregivers to participate individually for this usability study if they met all other criteria but the other person was not interested in participating.
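As a worked example of the 2-item screening rule just described (each item rated 1-6; eligible if the average rating is at least 3 or the sum of the 2 items is at least 6), the sketch below encodes the rule directly. Item wording is paraphrased and variable names are illustrative.

```python
# Sketch of the 2-item distress screening rule described above (adapted
# Diabetes Distress Screening scale items, each rated 1 = not a problem
# to 6 = serious problem). Eligible if the mean rating is >= 3 or the
# 2-item sum is >= 6; for 2 items these criteria coincide.
# Variable names are illustrative.

def is_distress_eligible(overwhelmed: int, failing: int) -> bool:
    for rating in (overwhelmed, failing):
        if not 1 <= rating <= 6:
            raise ValueError("Ratings must be on the 1-6 scale")
    total = overwhelmed + failing
    return total / 2 >= 3 or total >= 6


if __name__ == "__main__":
    print(is_distress_eligible(overwhelmed=4, failing=2))  # True  (mean 3.0)
    print(is_distress_eligible(overwhelmed=2, failing=2))  # False (mean 2.0)
```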

We iteratively refined the retention methods. After noting that previous participants were not progressing through the program, we sought to re-engage participants and encourage them to complete the study. These procedures involved calling 3 times at various times of day for participants who had provided consent but had not begun the program. For participants who had begun the program, we called and left voice mails multiple times, varying the times of day and calling over the weekend. One of our initial challenges was that our study team was unable to check website use for progress. Eventually, a study team member was trained by the web development team to check website use and progress. This allowed us to proactively reach out to participants who were not progressing through the program. Of the 53 individuals who were approached and were eligible, 17 (32%) patients and 16 (30%) caregivers were enrolled (total: 33/53, 62%). Our recruitment rate was higher than the average recruitment rate of 51.2% found in a systematic review of 53 trials of dyadic behavioral interventions [34]. Of these 33 participants, 8 (24%) patients and 8 (24%) caregivers completed the usability testing. The main reason for participants’ withdrawal was the stress of the pandemic.
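The re-engagement procedure described above amounts to a simple triage: participants who consented but never started received up to 3 calls at varied times of day, and participants who started but stalled received repeated calls and voice mails, including on weekends. A minimal sketch of that triage, with hypothetical field names and a module count of 4 (introduction plus 3 skills modules), is shown below.

```python
# Minimal sketch of the retention triage described above. The call cap and
# outreach rules follow the text; field names and the module count of 4
# (introduction plus 3 skills modules) are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Participant:
    study_id: str
    consented: bool
    modules_completed: int
    calls_made: int


def next_outreach(p: Participant, total_modules: int = 4) -> str:
    """Suggest the next retention action for a participant."""
    if not p.consented or p.modules_completed >= total_modules:
        return "none"
    if p.modules_completed == 0:
        # Consented but never started: call up to 3 times at varied times of day.
        return "call (vary time of day)" if p.calls_made < 3 else "none"
    # Started but stalled: keep calling and leaving voice mails, including weekends.
    return "call or voice mail (vary time of day, include weekends)"


if __name__ == "__main__":
    print(next_outreach(Participant("WS-002", consented=True, modules_completed=0, calls_made=1)))
    print(next_outreach(Participant("WS-003", consented=True, modules_completed=2, calls_made=5)))
```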

Measures and Procedures

After providing consent, participants were asked to complete baseline surveys to capture detailed demographic information, including their age, gender, race, ethnicity, education, and marital status and the relationship between the veteran and caregiver. Participants’ financial status was measured using a tool in which respondents rated the extent to which they could afford necessities and luxuries. Baseline surveys also assessed participants’ use of and comfort with technology. We asked a series of questions that assessed the frequency with which respondents used computers in their daily lives, the ways in which they used computers (eg, spreadsheets, Word documents, and internet), and their comfort with technology (eg, “I feel very comfortable learning how to use new programs” and “I try to avoid using computers when possible”). After completing the baseline measures, participants were emailed a unique link to log in to web-SUCCEED and were encouraged to complete all modules.

Upon completion of the program, we emailed a Qualtrics link with follow-up surveys designed to assess the usability of web-SUCCEED. Usability was measured using a modified version of the System Usability Scale [35], which scored items on a 7-point scale (instead of the currently recommended 5-point scale) where 1 was labeled “Strongly Disagree,” 4 was labeled “Neutral,” and 7 was labeled “Strongly Agree.” Example items include, “I thought this program was easy to understand” and “I would need help from a technical support person to be able to use this program.” The reliability coefficient α of the System Usability Scale is excellent, ranging between 0.85 and 0.92, and its concurrent validity correlation coefficient is 0.81 [36,37]. Scores >4 were coded as “agree,” scores of exactly 4 were coded as “neutral,” and scores <4 were coded as “disagree.” An exit survey administered via Qualtrics or telephone allowed participants to provide feedback about the program’s length, content, mode of delivery, perceived utility, and study burden. We tracked program costs through personnel full-time equivalent (FTE) effort, costs of building the website, and costs of conducting usability testing.
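To make the scoring rule concrete, the sketch below codes each 7-point response as agree (>4), neutral (=4), or disagree (<4) and computes a per-item mean and SD, mirroring how the usability results are summarized in the next section. The example responses are fabricated for illustration.

```python
# Sketch of how the modified 7-point usability items were summarized:
# responses > 4 coded "agree", = 4 "neutral", < 4 "disagree", plus a
# per-item mean and SD. The example responses are made up for illustration.
from statistics import mean, stdev


def code_response(score: int) -> str:
    if score > 4:
        return "agree"
    if score == 4:
        return "neutral"
    return "disagree"


def summarize_item(scores: list[int]) -> dict:
    return {
        "mean": round(mean(scores), 1),
        "sd": round(stdev(scores), 1),
        "agree_n": sum(code_response(s) == "agree" for s in scores),
    }


if __name__ == "__main__":
    example = [6, 7, 5, 6, 4, 7, 6, 5]  # hypothetical responses from 8 users
    print(summarize_item(example))       # {'mean': 5.8, 'sd': 1.0, 'agree_n': 7}
```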

It should be noted that we had designed a “think-aloud” protocol using the Health IT Usability Evaluation Model framework [38]. However, we were unable to conduct these sessions. Neither veterans nor caregivers were interested in attending 4 in-person sessions to assess the usability of each module. Once COVID-19 shelter-in-place restrictions were established, we offered to conduct these interviews over the telephone; however, only 6% (1/16) of the participants agreed to this modified approach. Therefore, we relied on the usability survey and exit interviews to obtain feedback.


Results

Overview

Overall, 16 participants, including 8 (50%) veterans and 8 (50%) caregivers, participated in usability testing. All participants (16/16, 100%) completed at least one module and the usability survey (Table 1). All veterans (8/8, 100%) in this study were men, whereas all caregivers (8/8, 100%) were women. Most participants (10/16, 63%) identified as White, except for a veteran who identified as having multiple ethnicities (Black, White, and Native American) and a caregiver who identified as Black or African American. Most veterans and caregivers (13/16, 81%) had at least a high school education, and most (11/16, 69%) reported that they were able to afford to pay their bills. Overall, about one-third (5/16, 31%) of our participants were disabled (2/8, 25% of veterans; 3/8, 38% of caregivers); however, the type of disability was not documented. All veterans (8/8, 100%) reported that a spouse or partner cared for them, whereas all caregivers (8/8, 100%) reported caring for themselves.

Most participants (12/16, 75%) were frequent users of technology and used computers for several tasks listed in Table 2. Comfort with using computers and the internet is shown in Figure 4.

Participants, especially veterans, rated web-SUCCEED high on usability. Mean scores for each item are provided in Figure 5. Veterans agreed that the site was easy to understand (mean score 5.8, SD 1.5), easy to use (mean score 6, SD 1), and easy to complete (mean score 6, SD 1.2) and were confident in their ability to use it (mean score 5.8, SD 0.7). All veterans (8/8, 100%) noted that they could learn to use the site quickly (mean score 6.3, SD 0.5). Notable negative feedback included “slightly agreeing” that the site was confusing (2/8, 25%) and awkward (1/8, 13%). All veterans (8/8, 100%) agreed that they would choose this type of program in the future to access an intervention that aims to improve their health (mean score 6.2, SD 0.9). Caregivers’ mean scores for each usability item were lower than those of veterans; however, we did not test for statistical differences. Caregivers agreed that the site was easy to understand and easy to use (mean score 5.1, SD 1.4 for both) and easy to complete (mean score 5.1, SD 1.5) and that most people would learn to use the site quickly (mean score 5, SD 1.3). Unlike the veterans, caregivers were neutral in their confidence in using the site (mean score 4.4, SD 1.1). Caregivers also gave slightly elevated mean scores for negative statements such as “I thought this program was too complex” and “Using this program felt awkward to me.”

Overall, 75% (12/16) of the participants disagreed with the statement that “they would need help from a technical support person” to navigate the website. However, many participants required support from the study team. The key issues included helping participants find the web-SUCCEED link in their email, helping with changing their password from the default, and encouraging them to complete the program. Overall, 13% (2/16) of the participants who accessed the program on their iPad noted that the “next” arrow was not visible. The study team manually reset the password for some participants to reduce participant frustration. For additional security, we included a task that would confirm that the user was human. Participants described this step as frustrating and time-consuming.

Feedback from participants included the following comments by veterans:

It was helpful to me.
It was pretty easy, my wife helped me with it; I’m not that good with computers, [but] it wasn’t that hard to use.
I’ve saved the URL and hope to access it in the future as a resource.

Caregivers were also generally positive in their feedback and noted the following:

I thought it was pretty user-friendly.
Overall, a good program.

Veterans noted that they would use an intervention such as this; recommend this intervention to others; and if permitted, continue to use web-SUCCEED beyond their study participation. Participants completed the modules in the order presented. All participants completed modules in multiple sittings. A veteran noted the following:

Sometimes it felt too long, but you could take breaks.

Many caregivers noted that they were able to navigate with ease, and those who initially found it challenging were able to complete web-SUCCEED with assistance from a member of the study team.

Table 1. Demographics of participants involved in usability testing.

Characteristics | Veterans (n=8) | Caregivers (n=8)
Age (years), mean (SD) | 66 (18) | 58 (16)
Gender (women), n (%) | 0 (0) | 7 (88)
Race (White), n (%) | 5 (63) | 5 (63)
Ethnicity (Hispanic), n (%) | 1 (13) | 1 (13)
Married or in a romantic partnership, n (%) | 5 (63) | 6 (75)
Highest grade completed, n (%)
  High school or GEDa | 1 (13) | 1 (13)
  Some college or 2-year degree | 2 (25) | 5 (63)
  4-year college | 1 (13) | 1 (13)
  Higher than 4-year college | 2 (25) | 0 (0)
Employment status, n (%)
  Employed at a job for pay—full time | 2 (25) | 1 (13)
  Homemaker—not currently working for pay | 0 (0) | 1 (13)
  Not employed—retired | 2 (25) | 2 (25)
  Not employed—disabled | 2 (25) | 3 (38)
Financial situation, n (%)
  After paying bills, has enough for special expenses | 3 (38) | 2 (25)
  Has enough to pay bills but little for special expenses | 2 (25) | 4 (50)
  Has enough to pay bills but only while cutting back | 1 (13) | 1 (13)
Number of people available to help, n (%)
  ≥2 | 5 (63) | 4 (50)
  1 | 1 (13) | 1 (13)
  0 | 0 (0) | 2 (25)

aGED: General Educational Development.

Table 2. Participants’ uses of computers.

Technology use | Veterans (n=8), n (%) | Caregivers (n=8), n (%)
Uses the computer...
  Everyday | 4 (50) | 2 (25)
  1-5 times a week | 1 (13) | 5 (63)
  Less than once a month | 1 (13) | 0 (0)
Uses computers for...
  Word processing | 4 (50) | 4 (50)
  Spreadsheets | 2 (25) | 1 (13)
  Photos | 3 (38) | 4 (50)
  Games | 3 (38) | 3 (38)
  Searching for information | 5 (63) | 5 (63)
  Buying products | 3 (38) | 4 (50)
  Social networking | 2 (25) | 3 (38)
  Email | 5 (63) | 5 (63)
  Searching for health information | 5 (63) | 3 (38)
  Buying medications or medical supplies | 2 (25) | 3 (38)
  Communicating with their health provider | 4 (50) | 5 (63)
  Assisting with making treatment decisions | 1 (13) | 0 (0)
  Social networking regarding health issues | 1 (13) | 1 (13)
  Other | 1 (13) | 5 (63)
Figure 4. Comfort with technology.
Figure 5. Usability test results.

Estimated Cost

The software development phase cost US $100,000 in direct costs paid to the team based in Washington, District of Columbia, responsible for development. This included a project manager and a web designer (ideation and prototyping: US $25,000; building the website: US $75,000). The other substantial contributor to cost was personnel. Project staff (excluding PI time) included a 0.5 full-time equivalent (FTE) master’s-level staff member during the software development phase, who served as the overall project coordinator and helped design the content and guides, and a total of 2.25 FTE for the usability testing. All personnel were part-time contributors to this project, which may have increased the time needed to conduct the project. Many were trainees and were not provided a salary or were compensated via a stipend for their time. Having dedicated, paid personnel for this project from a common source of funding may reduce both the cost and time required to adapt existing programs into a web-based format.
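For planning purposes, the budget figures reported above can be rolled up into a simple summary. The sketch below reproduces only the numbers stated in this paper (direct software costs and FTE effort, excluding personnel salaries) and is not a full costing model.

```python
# Simple roll-up of the development costs and staffing effort reported above.
# Only figures stated in the paper are used; personnel salaries are excluded,
# as in the text.

direct_costs_usd = {
    "ideation_and_prototyping": 25_000,   # steps 1-3
    "building_the_website": 75_000,       # steps 4-6
}

staff_fte = {
    "software_development_phase": 0.5,    # master's-level project coordinator
    "usability_testing": 2.25,
}

if __name__ == "__main__":
    print("Total direct software costs (US $):", sum(direct_costs_usd.values()))  # 100000
    for phase, fte in staff_fte.items():
        print(f"{phase}: {fte} FTE")
```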


Discussion

Principal Findings

Our first goal in this project was to develop and conduct usability testing of a web-based dyadic self-management program for adults managing chronic health conditions. Our second goal was to document the process of doing so to aid future adaptations. There were several key findings. First, we successfully used a systematic and rigorous process that included a multidisciplinary team, multiple stakeholder involvement, and human-centered design principles to adapt a facilitated dyadic self-management support program into a version that was web based and self-guided. Second, we were able to demonstrate the usability of this adapted web-based program for both veterans and caregivers. Third, we found that despite being self-guided, our program required engagement of the study team to encourage completion and solve technical questions.

One of the most important lessons learned from this process was that even “completely self-guided programs” require live technical support and encouragement from study staff. Although it is possible that the pandemic exacerbated usability problems, a recent review by Shaffer et al [17] concluded that the most efficacious programs had study staff maintain engagement with study participants. Other reviews have also found that web-based stress management interventions were most effective when web-based coaching was also provided [39]. These contacts can be useful to provide additional behavior change coaching, encourage active participation, and troubleshoot any challenges or misunderstandings that may arise among users. On the basis of the results of our pilot study and this review, our clinical trial will be using a “flipped classroom” format that uses both the web-based modules and facilitated telephone-based sessions to review the content, keep participants engaged, and solve technical or computer programming–related challenges.

Dyadic health behavior change is an emerging field, with previous studies examining technology-based interventions for patients and their caregivers managing various chronic diseases including cancer and diabetes [21]. One of the most well-known dyadic interventions was the Family Involvement, Optimistic Attitude, Coping Effectiveness, Uncertainty Reduction, Symptom Management (FOCUS) psychoeducational program, which aimed to improve outcomes in family involvement, patient and caregiver optimism, coping, uncertainty management, and symptom management for survivors of breast cancer and their family members. Dyads in the FOCUS program showed improvements in quality of life, emotional and functional well-being, and self-efficacy [22]. Trials of the dyadic CarePartner self-management interventions have also shown that providing self-management support to patient-caregiver dyads improves outcomes when compared with both usual care and patient-only interventions [5,40,41]. Web-SUCCEED builds on the success of these programs by explicitly supporting the role of interpersonal caregiving relationships in the self-management processes. Our newly funded randomized clinical trial of web-SUCCEED will further advance our knowledge of dyadic self-management across different chronic conditions.

The costs of developing multicomponent behavioral programs such as web-SUCCEED are not often published, despite being an important consideration, especially for early career investigators. We found that most of the costs were attributed to software development, followed by study personnel. Resourcing program adaptation through multiple funding sources caused delays and likely introduced inefficiencies in the development process. Future studies should use the cost guidelines that we provide and apply for sufficient funding to conduct both the adaptation and usability testing processes.

We missed the opportunity to ask participants why they did not use the discussion board. There may be several explanations, such as participants’ privacy concerns [42] and an insufficient number of participants at any given time to sustain robust conversations. Our study team did not provide specific prompts to encourage participation, and participants could have merely forgotten about this feature. This is supported by the finding that participants needed prompts to progress through the program. The literature about the use of study-specific discussion boards is mixed, and their utility remains an open question. Although some studies have found study-specific social media to support robust engagement, other studies highlight the benefits of using existing social media platforms, including their ubiquitous nature; minimal skills training of study participants; and vibrant, user-friendly graphics [43]. We also noted that, unlike the Veteran and Family Council, which recommended discussion boards to develop a sense of community, focus groups stated a preference for connecting live, either in person or over the telephone. As web-based gatherings have become normalized since the COVID-19 pandemic, future studies should consider this strategy of fostering a sense of community within the study context and making this forum available to both current participants and alumni.

Participants’ time to complete the program was considerably longer than expected. Although we had expected to deliver the program within 6 to 8 weeks, participants required 12 to 16 weeks. In most cases, delays were caused by participants forgetting about their study progress and participation. This was exacerbated by the fact that only the web design company had access to information about the participants’ progress. Although this additional step was a safeguard, it introduced substantial inefficiencies. Assigning a study team member to have administrative access allowed us to track progress, and we discovered that some participants viewed the introductory video but not the modules. Refining study procedures to track progress, engaging participants through weekly reminders, and offering technical help improved recruitment and retention.

On the basis of this experience, we offer the following recommendations for future efforts to adapt dyadic, facilitated behavioral interventions into self-guided, web-based versions (Textbox 1).

A strong multidisciplinary team with the required content, methodological, and technical expertise is critical. We recommend that research teams secure concurrent funding to support adaptation and usability testing to ensure quick iterations and more efficient use of resources. Interventions should follow the same conceptual framework as the original program while incorporating a usability framework. Research teams should follow a rigorous and iterative process that involves ideation, prototyping, refining, and usability testing, as is demonstrated in this study. An underreported domain in technology development is the cost of developing technology-enabled programs. Research teams should develop cost measures that include not only the cost of initial development but also the cost of maintenance and hosting. On the basis of our experience and current literature, study-specific discussion boards are controversial. If used, research teams should proactively engage users through both reminders and by initiating discussions. Tracking user engagement and tracking the success of each engagement strategy can provide valuable lessons about the usability and eventual scalability of programs. This can be accomplished by training a core research team to have administrator-level access to the software analytics. As with any study, research teams should develop protocols to maintain contact with participants and track the success of each strategy; use validated measures, including for usability; and finally, elicit feedback from participants via semistructured interviews.

Textbox 1. Recommendations.
  • Assemble a multidisciplinary team with content, methodological, and technical expertise.
  • Secure concurrent funding to support all steps of the adaptation and usability testing.
  • Maintain integrity to the theoretical underpinnings of the original program.
  • Use a strong usability framework to guide the adaptation.
  • Follow an iterative process involving ideation, prototyping, refining, and usability testing.
  • Develop and track cost, including cost of maintenance and hosting.
  • Proactively engage participants in program-specific discussion boards.
  • Develop and track the success of engagement protocols.
  • Train the study staff to use software analytics to track user engagement in real time.
  • Use validated measures to measure usability and acceptability.
  • Elicit feedback about usability and acceptability of interventions from participants via semistructured interviews.

Strengths and Limitations

A key strength of the study was the extensive engagement of and feedback from a Veteran and Family Advisory Board; a multidisciplinary team of content experts, methodologists, administrative personnel, web designers, and software engineers; and study participants. Furthermore, we used a rigorous and iterative approach that allowed us to continually improve our product from an initial telephone-based program to a prototype of web-SUCCEED to a beta version of web-SUCCEED that we tested for usability. Our study also had important limitations in addition to those noted previously. First, the logistical challenges of having individuals attend multiple in-person sessions were a key barrier to our ability to conduct think-aloud sessions. We sought to pivot to conducting these over the telephone, but participants were less willing to engage in this way. This step could have given us important data regarding the subjective experience of navigating web-SUCCEED. Second, the focus groups comprised only veterans. Caregivers’ input may have modified the web design in ways that were different from the veterans’ input. This may be a reason why caregivers’ usability scores were generally lower than those of the veterans. However, it should be noted that caregivers missed providing feedback only on the look and feel of the inanimate wireframes and were otherwise involved in all aspects of developing SUCCEED and usability testing. Third, owing to funding gaps, the adaptation and usability testing process of web-SUCCEED took longer than anticipated. To avoid this, researchers should consider applying for funding to conduct all aspects of adaptation that are described in this paper. We are currently launching a randomized clinical trial to evaluate the effectiveness of web-SUCCEED and have enhanced our procedures to address the limitations that we described in this paper, including improved staffing, regularly scheduled calls with participants, procedures to track engagement, and modifications to the software to improve usability.

Conclusions

The COVID-19 pandemic has been a catalyst to rapidly develop and deploy tools that enhance technology-enabled care [44]. The pandemic has also spurred new frameworks that can be used to adapt existing interventions to different contexts [45,46]. Although our study predates the pandemic, the principles we followed are consistent with the new frameworks. Adapting existing interventions is potentially cheaper than creating programs de novo, as the empirical base of the programs has been established. Our experience of developing web-SUCCEED and recommendations provide a road map to adapt other dyadic self-management programs. In parallel, more studies are needed to expand the body of knowledge about adapting and testing traditional-modality interventions into web-based interventions and to understand the strengths and limitations of each approach.

Acknowledgments

This project was funded by a locally initiated project from the Center for Innovation to Implementation, Department of Veterans Affairs (VA) Health Services Research and Development (HSR&D; PPO-16-139), and Stanford Presence program. RT was funded by a VA HSR&D Career Development Award (CDA 09-206), and DMZ was funded by VA HSR&D Career Development Award (CDA 12-173). JDP, CT, and KH were funded through VA Senior Research Career Scientist Awards. MBH was supported by an Advanced Fellowship in Health Services Research through the VA Office of Academic Affairs. The authors thank Dr Josef Ruzek, Ms Cindie Slightam, and Dr Andrea Nevedal for their contributions during the early stages of this project. The views expressed in this paper are those of the authors and do not necessarily reflect the views of the VA. Preliminary results of this study were presented at the Virtual Annual Meeting of the Society of Behavioral Medicine in April 2021.

Data Availability

The data sets generated and analyzed during this study are available from the corresponding author upon reasonable request and consistent with our authorization from our institutional review board.

Conflicts of Interest

None declared.

  1. Lee AA, Piette JD, Heisler M, Janevic MR, Langa KM, Rosland AM. Family members' experiences supporting adults with chronic illness: a national survey. Fam Syst Health. Dec 2017;35(4):463-473. [FREE Full text] [CrossRef] [Medline]
  2. AARP; National Alliance for Caregiving. Caregiving in the United States 2020. American Association of Retired Persons. May 14, 2020. URL: https://www.aarp.org/ppi/info-2020/caregiving-in-the-united-states.html [accessed 2023-05-10]
  3. Ramchand R, Tanielian T, Fisher MP, Vaughan CA, Trail TE, Epley C, et al. Hidden heroes: America's military caregivers - executive summary. Rand Health Q. Jun 01, 2014;4(2):14. [FREE Full text] [Medline]
  4. Hartmann M, Bäzner E, Wild B, Eisler I, Herzog W. Effects of interventions involving the family in the treatment of adult patients with chronic physical diseases: a meta-analysis. Psychother Psychosom. 2010;79(3):136-148. [CrossRef] [Medline]
  5. Piette JD, Striplin D, Marinec N, Chen J, Trivedi RB, Aron DC, et al. A mobile health intervention supporting heart failure patients and their informal caregivers: a randomized comparative effectiveness trial. J Med Internet Res. Jun 10, 2015;17(6):e142. [FREE Full text] [CrossRef] [Medline]
  6. Chen HL, Liu K, You QS. Effects of couple based coping intervention on self-efficacy and quality of life in patients with resected lung cancer. Patient Educ Couns. Dec 2017;100(12):2297-2302. [CrossRef] [Medline]
  7. Piette JD, Striplin D, Aikens JE, Lee A, Marinec N, Mansabdar M, et al. Impacts of post-hospitalization accessible health technology and caregiver support on 90-day acute care use and self-care assistance: a randomized clinical trial. Am J Med Qual. May 2021;36(3):145-155. [CrossRef] [Medline]
  8. Martire LM, Lustig AP, Schulz R, Miller GE, Helgeson VS. Is it beneficial to involve a family member? A meta-analysis of psychosocial interventions for chronic illness. Health Psychol. Nov 2004;23(6):599-611. [CrossRef] [Medline]
  9. Stahl ST, Rodakowski J, Saghafi EM, Park M, Reynolds CF, Dew MA. Systematic review of dyadic and family-oriented interventions for late-life depression. Int J Geriatr Psychiatry. Sep 2016;31(9):963-973. [FREE Full text] [CrossRef] [Medline]
  10. Kvedar JC, Fogel AL, Elenko E, Zohar D. Digital medicine's march on chronic disease. Nat Biotechnol. Mar 2016;34(3):239-246. [CrossRef] [Medline]
  11. Pang Y, Zhang X, Gao R, Xu L, Shen M, Shi H, et al. Efficacy of web-based self-management interventions for depressive symptoms: a meta-analysis of randomized controlled trials. BMC Psychiatry. Aug 11, 2021;21(1):398. [FREE Full text] [CrossRef] [Medline]
  12. Zhang X, Ma L, Feng L. Web-based self-management intervention for patients with cancer: a meta-analysis and systematic review. J Nurs Scholarsh. Sep 2022;54(5):598-606. [CrossRef] [Medline]
  13. Portz JD. A review of web-based chronic disease self-management for older adults. Gerontechnology. Mar 2017;16(1):12-20. [FREE Full text] [CrossRef] [Medline]
  14. Wasilewski MB, Stinson JN, Cameron JI. Web-based health interventions for family caregivers of elderly individuals: a scoping review. Int J Med Inform. Jul 2017;103:109-138. [CrossRef] [Medline]
  15. Irani E, Niyomyart A, Hickman Jr RL. Systematic review of technology-based interventions targeting chronically ill adults and their caregivers. West J Nurs Res. Nov 2020;42(11):974-992. [FREE Full text] [CrossRef] [Medline]
  16. Ploeg J, Ali MU, Markle-Reid M, Valaitis R, Bartholomew A, Fitzpatrick-Lewis D, et al. Caregiver-focused, web-based interventions: systematic review and meta-analysis (part 2). J Med Internet Res. Oct 26, 2018;20(10):e11247. [FREE Full text] [CrossRef] [Medline]
  17. Shaffer KM, Tigershtrom A, Badr H, Benvengo S, Hernandez M, Ritterband LM. Dyadic psychosocial eHealth interventions: systematic scoping review. J Med Internet Res. Mar 4, 2020;22(3):e15509. [CrossRef]
  18. Lee AA, Heisler M, Trivedi R, Leukel P, Mor MK, Rosland AM. Autonomy support from informal health supporters: links with self-care activities, healthcare engagement, metabolic outcomes, and cardiac risk among Veterans with type 2 diabetes. J Behav Med. Apr 2021;44(2):241-252. [FREE Full text] [CrossRef] [Medline]
  19. Trivedi RB, Slightam C, Nevedal A, Guetterman TC, Fan VS, Nelson KM, et al. Comparing the barriers and facilitators of heart failure management as perceived by patients, caregivers, and clinical providers. J Cardiovasc Nurs. Sep 2019;34(5):399-409. [CrossRef] [Medline]
  20. Rosland AM, Heisler M, Choi HJ, Silveira MJ, Piette JD. Family influences on self-management among functionally independent adults with diabetes or heart failure: do family members hinder as much as they help? Chronic Illn. Mar 2010;6(1):22-33. [FREE Full text] [CrossRef] [Medline]
  21. Risbud RD, Kim JS, Trivedi RB. It takes a village: interpersonal factors that enhance management of heart failure. J Cardiovasc Nurs. Sep 2022;37(5):E160-E168. [CrossRef] [Medline]
  22. Nelson KE, Saylor MA, Anderson A, Buck H, Davidson PM, DeGroot L, et al. "We're all we got is each other": mixed-methods analysis of patient-caregiver dyads' management of heart failure. Heart Lung. Sep 2022;55:24-28. [FREE Full text] [CrossRef] [Medline]
  23. Trivedi R, Slightam C, Fan VS, Rosland AM, Nelson K, Timko C, et al. A couples' based self-management program for heart failure: results of a feasibility study. Front Public Health. Aug 29, 2016;4:171. [FREE Full text] [CrossRef] [Medline]
  24. Ratcliff CG, Vinson CA, Milbury K, Badr H. Moving family interventions into the real world: what matters to oncology stakeholders? J Psychosoc Oncol. Mar 2019;37(2):264-284. [FREE Full text] [CrossRef] [Medline]
  25. Northouse LL, Mood DW, Schafenacker A, Montie JE, Sandler HM, Forman JD, et al. Randomized clinical trial of a family intervention for prostate cancer patients and their spouses. Cancer. Dec 15, 2007;110(12):2809-2818. [FREE Full text] [CrossRef] [Medline]
  26. Zulman DM, Schafenacker A, Barr KL, Moore IT, Fisher J, McCurdy K, et al. Adapting an in-person patient-caregiver communication intervention to a tailored web-based format. Psychooncology. Mar 2012;21(3):336-341. [FREE Full text] [CrossRef] [Medline]
  27. User experience basics. Usability.gov. URL: https://www.usability.gov/what-and-why/user-experience.html [accessed 2021-10-21]
  28. Information and communication technology: revised 508 standards and 255 guidelines. U.S. Access Board. URL: https://www.access-board.gov/ict/ [accessed 2022-10-21]
  29. Lorig KR, Laurent DD, Deyo RA, Marnell ME, Minor MA, Ritter PL. Can a back pain e-mail discussion group improve health status and lower health care costs?: a randomized study. Arch Intern Med. Apr 08, 2002;162(7):792-796. [CrossRef] [Medline]
  30. Shaw BR, Han JY, Baker T, Witherly J, Hawkins RP, McTavish F, et al. How women with breast cancer learn using interactive cancer communication systems. Health Educ Res. Feb 2007;22(1):108-119. [CrossRef] [Medline]
  31. Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. In: Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems. Presented at: CHI '93; April 24-29, 1993; Amsterdam, The Netherlands. p. 206-213. URL: https://dl.acm.org/doi/10.1145/169059.169166 [CrossRef]
  32. Bevan N, Barnum CM, Cockton G, Nielsen J, Spool JM, Wixon DR. The "magic number 5": is it enough for web testing? In: Proceedings of the CHI '03 Extended Abstracts on Human Factors in Computing Systems. Presented at: CHI EA '03; April 5-10, 2003; Ft. Lauderdale, FL, USA. p. 698-699. URL: https://dl.acm.org/doi/abs/10.1145/765891.765936 [CrossRef]
  33. Fisher L, Glasgow RE, Mullan JT, Skaff MM, Polonsky WH. Development of a brief diabetes distress screening instrument. Ann Fam Med. May 2008;6(3):246-252. [FREE Full text] [CrossRef] [Medline]
  34. Trivedi RB, Szarka JG, Beaver K, Brousseau K, Nevins E, Yancy Jr WS, et al. Recruitment and retention rates in behavioral trials involving patients and a support person: a systematic review. Contemp Clin Trials. Sep 2013;36(1):307-318. [CrossRef] [Medline]
  35. Brooke J. SUS: a 'quick and dirty' usability scale. In: Jordan PW, Thomas B, McClelland IL, Weerdmeester B, editors. Usability Evaluation in Industry. Boca Raton, FL, USA: CRC Press; 1996:189-194.
  36. Landauer T. Behavioral research methods in human-computer interaction. In: Helander MG, Landauer TK, Prabhu PV, editors. Handbook of Human-Computer Interaction. Amsterdam, The Netherlands: Elsevier; 1997:203-227.
  37. Lewis JR, Sauro J. The factor structure of the system usability scale. In: Proceedings of the 1st International Conference on Human Centered Design: Held as Part of HCI International 2009. Presented at: HCD '09; July 19-24, 2009; San Diego, CA, USA. p. 94-103. URL: https://dl.acm.org/doi/10.1007/978-3-642-02806-9_12 [CrossRef]
  38. Brown 3rd W, Yen PY, Rojas M, Schnall R. Assessment of the health IT usability evaluation model (Health-ITUEM) for evaluating mobile health (mHealth) technology. J Biomed Inform. Dec 2013;46(6):1080-1087. [FREE Full text] [CrossRef] [Medline]
  39. Heber E, Ebert DD, Lehr D, Cuijpers P, Berking M, Nobis S, et al. The benefit of web- and computer-based interventions for stress: a systematic review and meta-analysis. J Med Internet Res. Feb 17, 2017;19(2):e32. [FREE Full text] [CrossRef] [Medline]
  40. Aikens JE, Valenstein M, Plegue MA, Sen A, Marinec N, Achtyes E, et al. Technology-facilitated depression self-management linked with lay supporters and primary care clinics: randomized controlled trial in a low-income sample. Telemed J E Health. Mar 2022;28(3):399-406. [FREE Full text] [CrossRef] [Medline]
  41. Rosland A, Piette JD, Trivedi R, Lee A, Stoll S, Youk AO, et al. Effectiveness of a health coaching intervention for patient-family dyads to improve outcomes among adults with diabetes: a randomized clinical trial. JAMA Netw Open. Nov 01, 2022;5(11):e2237960. [FREE Full text] [CrossRef] [Medline]
  42. Partridge SR, Gallagher P, Freeman B, Gallagher R. Facebook groups for the management of chronic diseases. J Med Internet Res. Jan 17, 2018;20(1):e21. [FREE Full text] [CrossRef] [Medline]
  43. Pagoto S, Waring ME, May CN, Ding EY, Kunz WH, Hayes R, et al. Adapting behavioral interventions for social media delivery. J Med Internet Res. Jan 29, 2016;18(1):e24. [FREE Full text] [CrossRef] [Medline]
  44. Zhou X, Snoswell CL, Harding LE, Bambling M, Edirippulige S, Bai X, et al. The role of telehealth in reducing the mental health burden from COVID-19. Telemed J E Health. Apr 2020;26(4):377-379. [CrossRef] [Medline]
  45. Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts-the ADAPT guidance. BMJ. Aug 03, 2021;374:n1679. [FREE Full text] [CrossRef] [Medline]
  46. Morton K, Ainsworth B, Miller S, Rice C, Bostock J, Denison-Day J, et al. Adapting behavioral interventions for a changing public health context: a worked example of implementing a digital intervention during a global pandemic using rapid optimisation methods. Front Public Health. Apr 26, 2021;9:668197. [FREE Full text] [CrossRef] [Medline]

Abbreviations

FOCUS: Family Involvement, Optimistic Attitude, Coping Effectiveness, Uncertainty Reduction, Symptom Management
PI: principal investigator
SUCCEED: Self-care Using Collaborative Coping Enhancement in Diseases
VA: Department of Veterans Affairs
VAPAHCS: Veterans Affairs Palo Alto Health Care System


Edited by A Mavragani; submitted 28.10.22; peer-reviewed by D Petrovsky, J Pearson; comments to author 18.01.23; revised version received 31.03.23; accepted 03.04.23; published 16.06.23.

Copyright

©Ranak Trivedi, Sierra Kawena Hirayama, Rashmi Risbud, Madhuvanthi Suresh, Marika Blair Humber, Kevin Butler, Alex Razze, Christine Timko, Karin Nelson, Donna M Zulman, Steven M Asch, Keith Humphreys, John D Piette. Originally published in JMIR Formative Research (https://formative.jmir.org), 16.06.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.