Original Paper
Abstract
Background: The recent growth of eHealth is unprecedented, especially after the COVID-19 pandemic. Within eHealth, wearable technology is increasingly being adopted because it can offer the remote monitoring of chronic and acute conditions in daily life environments. Wearable technology may be used to monitor and track key indicators of physical and psychological stress in daily life settings, providing helpful information for clinicians. One of the key challenges is to present extensive wearable data to clinicians in an easily interpretable manner to make informed decisions.
Objective: The purpose of this research was to design a wearable data dashboard, named CarePortal, to present analytic visualizations of wearable data that are meaningful to clinicians. The study was divided into 2 main research objectives: to understand the needs of clinicians regarding wearable data interpretation and visualization and to develop a system architecture for a web application to visualize wearable data and related analytics.
Methods: We used a wearable data set collected from 116 adolescent participants who experienced trauma. For 2 weeks, participants wore a Microsoft Band that logged physiological sensor data such as heart rate (HR). A total of 834 days of HR data were collected. To design the CarePortal dashboard, we used a participatory design approach that interacted directly with clinicians (stakeholders) with backgrounds in clinical psychology and neuropsychology. A total of 8 clinicians were recruited from the Rhode Island Hospital and the University of Massachusetts Memorial Health. The study involved 5 stages of participatory workshops and began with an understanding of the needs of clinicians. A User Experience Questionnaire was used at the end of the study to quantitatively evaluate user experience. Physiological metrics such as daily and hourly maximum, minimum, average, and SD of HR and HR variability, along with HR-based activity levels, were identified. This study investigated various data visualization graphing methods for wearable data, including radar charts, stacked bar plots, scatter plots combined with line plots, simple bar plots, and box plots.
Results: We created the CarePortal dashboard after understanding the clinicians’ needs. Results from our workshops indicate that, overall, clinicians preferred aggregate information, such as daily HR, instead of continuous HR and wanted to see trends in wearable sensor data over a period (eg, days). In the User Experience Questionnaire, CarePortal received a score of 1.4 indicating that it was exciting to use (question 5) and a similar score indicating that it was leading edge (question 8). On average, clinicians reported that CarePortal was supportive and could be useful in making informed decisions.
Conclusions: We concluded that the CarePortal dashboard integrated with wearable sensor data visualization techniques would be an acceptable tool for clinicians to use in the future.
doi:10.2196/46866
Introduction
Overview
According to surveys conducted by the American Hospital Association and the Centers for Disease Control and Prevention, the use of eHealth systems increased dramatically from 2010 to 2020 [ , ]. eHealth offers the integration of modern everyday technologies, such as wearable devices and smartphones, for the diagnosis and longitudinal symptom monitoring of chronic conditions such as Parkinson disease, dementia, diabetes, hypertension, cardiovascular diseases, and mental disorders [ - ]. Wearable technologies such as smartwatches, smart rings, and smart garments are becoming popular because they enable the gathering of quantified health information in daily life environments such as homes, nursing homes, and assisted living facilities [ ]. The health data from wearables include, but are not limited to, physiological parameters (heart rate [HR], blood oxygen saturation, respiration rate, and body temperature) and activity and behavioral data (step count, calories burned, active periods, and sedentary periods) [ - ].

One of the key challenges is to make these massive multimodal data (collected from wearables) interpretable to clinicians for clinical decision support. Owing to busy schedules and time constraints, clinicians seek meaningful and interpretable methods for wearable data visualization. To ensure usability, it is important to integrate clinicians’ needs and preferences into the process of designing interpretable dashboards. A clinician-centered dashboard demands new ways to analyze and visualize data for clinicians who are looking for specific health outcomes relevant to their patients and practices. Therefore, it is essential to understand the workflow of clinicians. For example, clinicians use specific software systems (electronic health records [EHRs] and patient portals) to gather information and make decisions. It is important not to burden clinicians with excessive data visualization. At the same time, dashboards must not omit important features derived from the continuous stream of wearable data.
To address these challenges and meet the needs of clinicians, our research was aimed at designing a data visualization dashboard, CarePortal, to display wearable device data to clinicians. CarePortal is envisioned as a potential solution that can reduce the complexity of wearable sensor data interpretation and visualization and help clinicians make informed decisions more efficiently. This study aimed to answer the following broad research question: “What are the vital elements of a data analytics dashboard that can help to visualize, analyze, and interpret patient wearable data for clinicians to support an informed decision-making process in an easily interpretable way?”
Contributions of this research are as follows:
- Participatory design with clinicians: we aimed to design and test a user interface (UI) for the CarePortal data dashboard that visualizes symptomatic health data from a smartwatch in an interpretable manner. We used a participatory design approach involving interactions with clinicians (with backgrounds in clinical psychology and neuropsychology). A 5-stage participatory design workshop study was conducted that allowed the evolution of the CarePortal dashboard through the iterative feedback of clinicians about the ways they prefer to visualize wearable device data. The workshops started with an understanding of the needs and workflow of the clinicians. Subsequently, the dashboard design was iterated from low-fidelity prototypes to an interactive web application.
- A web application dashboard architecture for longitudinal symptomatic data from a wearable smartwatch: we used a wearable data set collected from 116 adolescent participants who were exposed to trauma [ ]. Participants wore a Microsoft Smart Band 2 that logged physiological sensor data such as HR. A total of 834 days of HR data were collected. In this study, we developed a data processing pipeline for wearable device data (from the Microsoft Band) and the CarePortal dashboard software architecture.
Background and Related Works
Participatory design is a methodology that promotes the iterative engagement of end users in the design process of solutions [ - ]. In participatory design, each phase is planned by reflecting on the results of the previous phase with respect to the participants’ contributions. In general, participatory design can be divided into several phases, including understanding the needs and problems of end users, brainstorming ideas for possible solutions, developing prototypes, testing and refining prototypes, and evaluation. One of the key benefits of participatory design is that end users have a say in the design process and are often included as co-designers because they are invited to proactively collaborate and brainstorm potential solutions [ ].

In the areas of eHealth and digital health, participatory design processes are widely used to ensure that the challenges of end users are addressed in end products that are designed and tested interactively for high user acceptance and satisfaction. Seals et al [ ] inquired about the needs of clinicians visualizing wearable data from remote gait assessments in people with multiple sclerosis. This was a participatory design study involving researchers as participants from different domains, such as human-computer interaction, biomedicine, neurology, and rehabilitation. Their participatory design process resulted in insights pointing toward the need for quantitative sensor data that can help track specific rehabilitation goals. The study also reported the need to balance a quick overview against a detailed understanding of the critical information. This study demonstrates how participatory design can help identify key insights into what matters most to end users and form recommendations on potential solutions.

Regarding patient-centered design, participatory design processes are widely adopted to meet the needs of patients through customized health apps for various applications, such as care planning [ ], self-management of asthma [ ], and procedure education [ ]. However, the use of participatory design is limited in meeting the needs of clinicians, who are an integral part of health care and need to understand the status quo of their patients. Clinicians are busy professionals who see and treat many patients daily. They follow standard protocols and workflows. They need to make clinical decisions based on the information gathered from patients and self-reported evaluations (often retrospective reports, daily logs, or notes). Modern technologies such as eHealth and wearable devices can add a significant burden to clinicians if these technologies are not integrated within their workflow according to their needs.

This study offers insights into how wearable device data can be integrated into clinicians’ workflow. It investigates the visualization of wearable device data gathered daily, as clinicians do not have the time and energy to inspect raw data because of their busy schedules. Accordingly, it is important for software and app developers to assess clinician preferences for data visualization, ensuring that the end product will facilitate informed decisions. This study leveraged participatory design processes with clinicians to develop a finalized web application dashboard, the product of iterative feedback and testing from clinicians as end users.
Methods
Overview
This participatory design study involved the creation of a software framework for the development of rapid web app iterations. Hence, our research was divided into 2 parts. First, we implemented a preliminary software architecture for CarePortal that could host and process the wearable device data. The software architecture also allowed us to design and iterate web application interfaces to visualize real-world wearable data. Later, the interfaces were used during the participatory design workshops with clinicians.
Dashboard Software Architecture
Wearable Sensor Data Set
The CarePortal dashboard was designed to visualize smartwatch sensor data previously collected in a clinical study (in the years 2017-2020) [ ]. This study included 116 patients who presented to the emergency department following traumatic injury. The Microsoft Smart Band 2, paired with a custom-made companion Android app, was deployed in the participants’ daily lives for 2 weeks. The app was developed using the Microsoft Software Development Kit to collect sensor logs. The app collected data from the photoplethysmography sensor, which were then stored in comma-separated values files. A total of 834 days of HR data were collected. A data preparation and processing pipeline (discussed in detail in the following sections) was developed to meet the dashboard requirements raised by clinicians.

Wearable Data Set Preparation
This section describes the data preparation methods applied to the wearable sensor data set. This includes identifying and removing duplicate data from the data set.
Duplicate Data: Identification and Removal
To check for duplicate data in this data set, the timestamps for each day for each participant were analyzed. We found that 5 of the 116 adolescent participants had over 50% duplicate data for unknown reasons. However, as this data set was longitudinal and each participant performed the study for several days, not all of a participant’s days were duplicated; we found that at least 1 day of data was duplicated for each of these 5 participants. Therefore, CarePortal dropped the days for which ≥50% of the data were duplicated, 9 days in total out of the 834 days of data. After removing the duplicates, the data were forwarded to the processing pipeline.
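As a rough illustration of this cleaning rule, the following is a minimal pandas sketch, assuming each participant’s log is a long-format data frame with one row per HR sample and a `timestamp` column; the column name and the exact handling at the 50% threshold are our assumptions, as the paper does not publish its cleaning code:

```python
import pandas as pd

def drop_duplicate_days(df: pd.DataFrame, threshold: float = 0.5) -> pd.DataFrame:
    """Drop whole days in which at least `threshold` of the samples share a
    timestamp with another sample, then de-duplicate the remaining rows."""
    out = df.copy()
    out["date"] = pd.to_datetime(out["timestamp"]).dt.date
    # Mark every row whose timestamp occurs more than once that day
    out["is_dup"] = out.duplicated(subset="timestamp", keep=False)
    dup_frac = out.groupby("date")["is_dup"].mean()  # fraction duplicated per day
    bad_days = dup_frac[dup_frac >= threshold].index
    kept = out[~out["date"].isin(bad_days)].drop_duplicates(subset="timestamp")
    return kept.drop(columns=["date", "is_dup"])
```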
Feature Extraction: Wearable Sensor Data Processing
After assessing the requirements of clinicians in workshops 1, 2, and 3 (discussed in later sections), we incorporated HR-based features into our processing pipeline. To process and extract features from the wearable sensor data set, we focused on (1) HR-based statistical features, (2) HR variability (HRV)–based features, and (3) HR-based activity levels.
HR-Based Statistical Features
On the basis of inputs from clinicians, we selected the maximum, minimum, SD, and average HR as statistical features. These features present simple statistical metrics that can easily be interpreted by clinicians. We calculated the HR value’s daily and hourly maximum, minimum, SD, and average. Daily statistics will allow clinicians to quickly analyze several days of data, and hourly statistics will allow clinicians to see the details of specific events of interest.
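A compact sketch of how these aggregates can be computed with pandas, assuming the cleaned HR samples are held in a series indexed by timestamp (an assumption about the internal data layout, which the paper does not specify):

```python
import pandas as pd

def hr_statistics(hr: pd.Series) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Daily and hourly maximum, minimum, SD, and average of a
    datetime-indexed heart rate series."""
    aggs = ["max", "min", "std", "mean"]
    daily = hr.resample("1D").agg(aggs)   # one row per day
    hourly = hr.resample("1h").agg(aggs)  # one row per hour
    return daily, hourly
```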
HRV-Based Features
We used this metric in the CarePortal dashboard, as research suggests that HRV is a relevant metric for measuring stress [ , ]. Similar to the HR, both daily and hourly HRV were calculated; in addition, we computed the maximum, minimum, and average HRV for the day [ ]. Equation 1 shows the calculation of HRV:

HRV = √[(1/(n − 1)) × Σ (x_(i+1) − x_i)²], with the sum running from i = 1 to n − 1 (1)

where n represents the total number of samples, i indicates the increment counter variable, and x_i represents the HR value at the ith index.
HR-Based Activity Levels
For clinicians, it was important to know when the HR was high and when it was low, which can be attributed to high or low activity or stress levels. This was observed in the phase 1 interviews. To achieve this, we calculated HR-based activity levels following the guidelines of the American Heart Association [ ]. The maximum HR (max HR) was computed using the formula given in equation 2; age was set to 15 years (the average age of the adolescent participants). On the basis of max HR, we calculated 4 different activity levels: the ≤50%, 50% to 70%, 70% to 85%, and ≥85% activity levels (equations 3-6).

Max HR = 220 − Age (2)
Level 1: HR≤(max HR) × 50% (3)
Level 2: (max HR) × 50%<HR<(max HR) × 70% (4)
Level 3: (max HR) × 70%<HR<(max HR) × 85% (5)
Level 4: HR≥(max HR) × 85% (6)
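Equations 2 to 6 translate directly into a small classifier. The sketch below is our transcription, not published code; it resolves the boundary the strict inequalities leave open (an HR of exactly 70% of max HR) by assigning level 3:

```python
def activity_level(hr: float, age: int = 15) -> int:
    """Map a single HR sample to activity levels 1-4 (equations 2-6)."""
    max_hr = 220 - age          # equation 2; age 15 gives max HR = 205
    if hr <= 0.50 * max_hr:
        return 1                # less than moderately intense (equation 3)
    elif hr < 0.70 * max_hr:
        return 2                # moderately intense (equation 4)
    elif hr < 0.85 * max_hr:
        return 3                # high intensity (equation 5); exactly 70% falls here
    else:
        return 4                # beyond intense activity (equation 6; >=85%, ~174 bpm)
```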
CarePortal Dashboard Development
The dashboard web application development process involved an interdisciplinary team of UI and user experience (UX) designers and software engineers. The dashboard design was divided into three parts:
- Data visualization (frontend): the CarePortal dashboard was developed using the Python Flask micro web framework, combining HTML, CSS, JavaScript, and Python. The dashboard used the JavaScript version of the Plotly framework for visualizations, which allowed us to customize or disable built-in features that clinicians did not require.
- Curated database (backend): the CarePortal dashboard used Google Drive to store the curated database and an application programming interface from the Google Cloud Platform to query the data from Google Drive. The database was formatted as a JavaScript Object Notation (JSON) file, where each metric was stored as a key-value pair and each day’s and hour’s data values were kept separate (a sketch of this layout follows the list below).
- Dashboard deployment: we used Apache 2 Web Server Gateway Interface configurations on a Linux server to deploy the dashboard application. Once the application was deployed, a custom domain name was used. This domain name was then sent to clinicians who tested the dashboard web application.
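To make the three-part architecture concrete, here is a minimal sketch of a Flask endpoint serving the curated per-day and per-hour JSON described above. The key names, example values, and the in-memory dictionary (standing in for the Google Drive database) are illustrative assumptions, not the published schema:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the curated Google Drive database; every key name and
# value below is hypothetical and for illustration only.
CURATED = {
    "patient_21": {
        "hr": {
            "daily": {"2019-05-01": {"max": 160, "min": 55, "avg": 82, "sd": 14}},
            "hourly": {"2019-05-01T09": {"max": 120, "min": 70, "avg": 95, "sd": 9}},
        },
        "hrv": {"daily": {"2019-05-01": {"max": 8.2, "min": 1.1, "avg": 4.3}}},
        "activity_levels": {"daily": {"2019-05-01": {"level_1": 0.74, "level_2": 0.18,
                                                     "level_3": 0.06, "level_4": 0.02}}},
    }
}

@app.route("/api/patients/<patient_id>/<metric>")
def get_metric(patient_id: str, metric: str):
    """Return one metric for one patient; the Plotly frontend renders it."""
    return jsonify(CURATED.get(patient_id, {}).get(metric, {}))

if __name__ == "__main__":
    app.run(debug=True)  # production used Apache 2 + WSGI instead
```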
CarePortal: Participatory Design Study
To design the CarePortal dashboard web application, it was of utmost importance to consider feedback from end users (clinicians) during the design process. Participatory design allowed us to create the interfaces with them in an iterative process that helped us understand which metrics (discussed above) were useful for clinicians to analyze patient data.
Five one-on-one design workshops with clinicians were conducted in this study. Workshops 1, 2, and 3 were conducted sequentially, whereas workshops 4 and 5 were conducted after a few months of analyzing and reflecting on the data from the first 3 interviews and developing the web dashboard accordingly. Owing to the restrictions of the COVID-19 pandemic, all workshops were conducted virtually using the Zoom platform. Each workshop lasted approximately 35 to 40 minutes. To analyze the qualitative data from these workshops, the research team carefully took notes and coded clinicians’ responses from the Zoom recordings.

Ethical Considerations
This study was approved by the institutional review board (IRB) of the University of Rhode Island (#1776865-1). Video recordings of clinicians were kept on a local hard drive, and only researchers approved by the IRB had access to these videos. As this research was classified as exempt by the IRB, we obtained digitally signed consent from the participants. The participants volunteered for this study, and no compensation was provided.
Clinician Recruitment
Clinicians with backgrounds in psychology and neuropsychology and clinical practice in behavior therapy were recruited for this study. Clinicians in these areas of expertise were selected to best match the presenting concerns of the research population receiving behavior therapies. Clinicians from hospitals and clinics in and around Rhode Island, the United States, were recruited via an email campaign. This email included a short abstract about the study and the time commitment required of the clinicians.
Workshop 1: Understanding Clinician Needs
The main aim of workshop 1 was to gain an empathetic understanding of clinicians’ daily practice and the tools they use during their treatment process.
For this, questions regarding their behavior therapy practice, treatment understanding, and use of technology in day-to-day clinical practice were asked.
The following list presents the questions asked during this workshop. This was a conversational workshop: it began with a discussion of the meeting agenda, followed by a request for permission to record the meeting. To analyze the qualitative data from this interview, methods for creating user personas were applied [ ]. For each clinician, a user persona was created that included a short biography, pain points (user concerns), goals, and the treatment process.

Part 1: behavioral health understanding
- How would you describe your practice?
- How does your therapy work?
- What type of patient populations do you serve?
- How often do you see your clients?
- Do you interact with your clients outside of office hours? If so, how?
Part 2: treatment understanding
- Can you walk me through the initial evaluation process?
Part 3: technology use in clinical practice
- Do you see wearable technology playing a role in your therapy now or in the future?
- How did the COVID pandemic impact your practice? In what ways did it change?
Workshop 2: Graph Wireframes
The second workshop incorporated initial low-fidelity wireframes. This workshop was aimed at understanding how clinicians would like to visualize patients’ health data collected from wearables. The workshop protocol consisted of 2 parts. The first part was an interactive conversation in which we asked questions to understand what kind of health data are helpful for clinicians. In the second part, the aim was to understand what types of graphs or visualizations were preferred by clinicians. For this purpose, low-fidelity wireframes were designed using the Figma prototyping tool (developed by Figma Inc) and presented as slides via Zoom screen sharing. The wireframes consisted of different types of graphs: (1) a bar plot representing the hourly HR and (2) a radar plot representing the hourly HR variability.

Workshop 3: Low-Fidelity Dashboard Prototypes
The main aim of this workshop was to show clinicians low-fidelity prototypes of the CarePortal dashboard and to better understand the requirements of clinicians based on those prototypes. For this purpose, 3 low-fidelity prototypes of CarePortal were presented to the clinicians. These prototypes were created after analyzing the requirements gathered in previous workshops. Figma was used for prototyping, which allowed us to quickly iterate the design prototypes. The prototypes were presented to clinicians via Google Slides using the share screen feature of Zoom.

Workshop 4: Medium-Fidelity Interactive Prototype
In this workshop, the aim was to let clinicians use the CarePortal dashboard prototype remotely and to receive feedback on it. A link to the medium-fidelity dashboard prototype, a functional Python Flask web app, was provided to the clinicians, who could interact directly with the dashboard using their own machines. This prototype was developed based on the feedback received from previous workshops, and the suggested UX features (eg, hovering over legends for graphs and loading spinners) were implemented on the dashboard. To understand clinicians’ perspectives during this workshop, clinicians shared their screens so that the researcher could observe their interactions, and clinicians could provide feedback while moving their mouse pointer over the screen. A detailed feature description of this prototype version is provided in the Dashboard UI section.
Workshop 5: High-Fidelity Interactive Prototype
In this workshop, the primary aim was to have clinicians interact with different versions of graphs to give us feedback on their experience and preference.
This version of the CarePortal dashboard was designed based on clinicians’ feedback from the workshop 4 deployment. Three data metrics were presented to the clinicians: (1) HR, (2) HRV, and (3) activity levels (HR based). Options for different types of graphs were provided for each data type, implemented in the form of a carousel on the dashboard. At the end of this interview, the User Experience Questionnaire (UEQ) was administered to clinicians to quantitatively analyze the usability of the dashboard [ ]. After participants gave permission to record the session, they were asked to share their screens. This allowed researchers to observe how clinicians navigated the dashboard. The workshop began with a guided tour of the CarePortal dashboard. Once the workshop was completed, a link to a Google Form containing the UEQ questions was provided. As in workshop 4, we implemented UX features in this dashboard version, including hover-over legends and loading spinners. The following sections describe the changes made to the CarePortal dashboard UI based on this feedback.

Dashboard UI
Overview
In this section, we describe the UI of the deployed CarePortal dashboard. This prototype encompassed three sections: (1) the overall progress page, (2) the HR page, and (3) the activity level page. Within each page, a carousel was implemented so that clinicians could view different types of graphs for the same information on the 3 pages.

Overall Progress Page
Once the clinician selected a patient on the patients page, they were directed to the overall progress page. The overall progress page displayed all patient data, including the maximum, minimum, and average HRs for all days, in a card view. If the maximum HR of a particular day was above level 3 (>174 beats/min, ie, 85% of max HR: 0.85 × [220 − 15] ≈ 174), calculated as discussed in the Wearable Sensor Data Processing section, the maximum HR bar within that card view was coded red.

HR Page
On the HR page, the idea was to provide an overview of the HR data by presenting (1) a raw data plot of HR over the whole day and (2) a scatter plot of hourly averages to present the information in aggregate form. A day filter option was provided to clinicians to filter the data for each day. Plot A displays the raw HR plot, and plot B presents the hourly HR plot.

HR Activity Levels Page
On the activity level page, a pie chart (plot C) was presented, showing the percentage of activity at each level calculated based on HR. Four activity levels were derived: (1) less than moderately intense (level 1), (2) moderately intense (level 2), (3) high intensity (level 3), and (4) beyond intense activity (level 4). Similar to the HR page, a filter option was provided to clinicians to filter the daily data. The percentage levels were additionally presented in a table, as small percentages were difficult to see in the pie chart.

Loading Spinner
A loading spinner provided feedback to users after they clicked the “click here to view details” link while the page loaded. This is shown in plot A in a green circle (Patient 21).

On Hover Color Change
When the mouse hovered over a particular card view, the border turned orange to indicate the selected card. This was implemented on both the patient selection page (plot A) and the overall progress page (plot B).

Navigation Bar (Navbar)
A hover effect was implemented in the navbar of the web application, turning a navbar link orange when it was hovered over or selected. This was implemented in both the horizontal and vertical navbars. When users logged in, they were greeted by their username on the vertical navbar in the top-left corner, as shown in plot B.

Results
In this section, we discuss the results of the participatory design workshops with clinicians along with qualitative and quantitative analysis.
Clinician Recruitment
Recruitment resulted in a total of 8 clinicians from Rhode Island Hospital and University of Massachusetts Memorial Health. As summarized in the table below, a total of 3 clinicians were recruited in workshops 1, 2, and 3, whereas 6 clinicians were recruited in workshops 4 and 5. One clinician dropped out in workshop 3. A similar case occurred in workshop 5, where 1 participant withdrew owing to personal emergencies; therefore, we did not include their comments in this section. Clinicians recruited in workshops 1, 2, and 3 were different from those recruited in workshops 4 and 5; however, Participant 1 was the same for all workshops.

Participant number | Gender | Clinical background | Specialty | Workshop 1 | Workshop 2 | Workshop 3 | Workshop 4 | Workshop 5 |
P1 | Female | Clinical psychology | Anxiety disorders | Yes | Yes | Yes | Yes | Yes |
P2 | Male | Neuropsychology | Neurodegenerative disorders | Yes | Yes | No | —a | — |
P3 | Female | Music therapy | Mental health disorders | Yes | Yes | Yes | — | — |
P4 | Male | Neuropsychology | Cognitive impairments | — | — | — | Yes | Yes |
P5 | Male | Neuropsychology | Epilepsy | — | — | — | Yes | No |
P6 | Male | Clinical psychology | Family therapy | — | — | — | Yes | Yes |
P7 | Female | Clinical psychology | Emergency department | — | — | — | Yes | Yes |
P8 | Female | Clinical psychology | Adolescents with stress | — | — | — | Yes | Yes |
a: Did not participate in the corresponding workshop.
Workshop 1: Understanding Clinician Needs (Insights)
In this workshop, we gained insights into the treatment process for anxiety disorders. We found that the first step in the treatment process was to collect the patient’s prior history, symptoms, and active problems. This information was entered into an EHR system; Participant 1 used SimplePractice as the EHR system. The technology used daily by Participant 2 included phone and fax, whereas Participant 3 used Spotify and HIPAA (Health Insurance Portability and Accountability Act)–compliant Google Drive. Participant 1 followed a 16-week protocol on average. Clients were scheduled to see the clinician on a weekly basis, and each meeting lasted approximately 45 minutes to 1 hour. During each session, the clinician took notes following the Data, Assessment, and Plan format, a clinical note-taking technique. In the first 8 weeks of therapy, a skill-building process was followed, in which the clinician taught patients how anxiety feels in the body and what to do if an anxiety event occurs. Participant 1 mentioned, “We try to identify what their anxious patterns are, what their anxious thoughts are, and then replacing those with more positive coping like thoughts.” In the next 8 weeks, exposure therapy with response prevention was administered. In this therapy, a fear hierarchy is created, which is a list of things the client is concerned about. Participant 1 described to us what that means explicitly:
We make a list of all the things that make the client anxious, so let’s just say for example, they are in because they are really afraid of their best friend’s dog, and wouldn’t go over to the best friend’s house, and this was causing a lot of problems because the families were friends. So, what we do is create a fear hierarchy and identify things that make the client scared and how we are going to work on those in the session. We would look at pictures of calm cute dogs and on a scale of 1-10 the fear is maybe a 2 out of 10. We would work on this until the client told us their anxiety went down from a 2 to a 1.
One of the pain points in this process for Participant 1 was being unable to track how anxiety felt while the client was doing their therapy homework. As Participant 1 said, “We ask the clients to do homework. It would be interesting to be able to take that piece of it and look at what their anxiety feels while they are doing their homework, because all we have now is self-reports from the client, and it is probably not that accurate.”
Participant 2 used prior information about patients’ health as the primary source of data for diagnosis. Therefore, Participant 2 did not see patients on a regular basis; instead, they prepared a treatment plan and recommendations for patients after diagnosis. Participant 3, with a background in music-based therapy, mentioned the use of software such as Spotify and HIPAA-compliant G Suite applications, for example, Excel for data analysis and Google Docs for note-taking. Participant 3 also mentioned that one of the limitations of using Excel for data analysis is the large amount of data to analyze, as it is not possible to look at everything via Excel sheets. Participant 2 mentioned, “Currently we use Excel to store and conduct our data analysis however an app would be more helpful to track quantifiable data.” This motivated us to create data visualization mechanisms that can be used to easily interpret data and provide actionable insights to clinicians. The clinicians’ biographies, pain points, general goals, and daily used technology are described in the table below.

Participant no. | Brief biography | Pain points | Goals | Preferred health parameters |
P1 | P1 holds a Doctor of Philosophy in clinical psychology, works at a clinical psychology clinic, and specializes in seeing children with anxiety disorders. | | Help her clients, who are diagnosed with anxiety disorders, get better through evidence-based therapy. | |
P2 | P2 is a neuropsychologist who sees children from preschool to high school age and conducts neurological evaluations of attention, memory function, motor coordination, judgment, reasoning, and emotional and mental health concerns. | | Diagnose patients and create a treatment plan for them. | |
P3 | P3 is a music therapist who sees children with a wide variety of mental health issues. Patients’ therapy goals are nonmusic goals; music is the means to reach those goals. She works with a variety of persons, such as patients, caregivers, and family members, to increase awareness of brain conditions and reduce social stigma. | | Help clients with mental health issues using music; interested in the use of rhythm and gait. | |
a: HIPAA: Health Insurance Portability and Accountability Act.
b: EHR: electronic health record.
Workshop 2: Graph Wireframes—Insights
In this workshop, 2 types of graph wireframes were shown to clinicians. All 3 clinicians preferred the bar plot version of the graph, and the radar chart was described as hard to interpret. Participant 1 said, “It will be important to keep the visuals very simple. Because these are master’s level (degree type) clinicians, they have not necessarily got a lot of research training, they also don’t have a lot of time, so this has to be something they can look at very quickly and extract meaningful information.” Participant 1 also suggested that instead of displaying only the maximum, minimum, and average numbers, “it would be helpful if you presented the number of minutes there was a maximum.” Participant 3 mentioned that “instead of just showing maximum, minimum and average a baseline to compare the patient data with will be helpful.”

A daily view of the data was preferred by both Participant 1 and Participant 2 in this version. Participant 1 explicitly said, “instead of hourly heart rate, a daily/weekly view would be better.” For the radar chart, Participant 1 said, “Looks cool, but most clinicians are trained on a horizontal axis. This would probably confuse people.” In the second prototype design, we incorporated a bar plot and a daily view of the HR data.
Workshop 3: Low-Fidelity Dashboard Prototypes (Insights)
In this workshop, clinicians aimed to understand how mood, exercise, and sleep were tracked. Participant 2 asked, “Can you tell me more about what is on the left mood, exercise, and sleep? What is the input?” They wanted to know the sources of the data. Clinicians also wanted a daily view of the data. Participant 2 asked, “For this one, what are number of spikes always per week?” In addition, Participant 1 shared similar thoughts and said, “I think clinicians are going to use it across the day because there would be multiple instances across the day.” This led us to create 2 views for the deployment study in the high-fidelity prototypes: the first presenting daily data and the second presenting hourly data.
Overall, in our first 3 workshops, we learned a great deal about the needs of clinicians and how they would like wearable data to be visualized. Here, we summarize the key considerations for the deployment prototypes in the next phase of interviews: (1) clinicians prefer a daily view of the data, and (2) clinicians prefer simpler visualizations; for example, the bar plot version is easier to interpret than the radar plot.
Workshop 4: Medium-Fidelity Interactive Prototype (Insights)
In the following sections, we describe the insights gained from workshop 4. We divide this section by presenting an analysis of the overall progress page, the HR page, and the activity levels page.
Insight 4.1: Overall Progress Page (“A Graphical Presentation Is More Interpretable”)
In this view, clinicians shared the unanimous concern that the numbers were too much to interpret and that analyzing data in this view was difficult. Participant 7 offered the following critique:
Numbers are important to have, they probably would not be the first thing I would put forward because they are hard to digest. First thing I noticed was there was a lot of data on the page, so it took me a second to get oriented.
Graphs showing trends were more desirable for this view. Participant 4 explicitly described to us how a graph might look on the overall progress page: “It would be interesting to present a view like the graph view of the heart rate tab. Maybe in the overall progress tab you could have a graph with x axis as the days and three lines for max, min, average HR. That could be kind of cool to see that.” This suggestion was incorporated into the second high-fidelity prototype. A detailed overview of the overall progress page is presented in plot B.

Insight 4.2: Overall Progress Page (“Clickable Card View Navigation”)
On the overall progress page, the card views looked as though they should be clickable. Participant 4 said, “If I could click on the maximum heart rate coded in red and it would bring me to that day’s max heart data, as I would like to see the activity level of that day.” Instead of going through each day for the different data types presented, clinicians felt that this navigation method would be much more intuitive. P5 put it simply: “If I click on it, that just takes me to that day’s activity level. That makes it more efficient.”
Insight 4.3: HR Page (“Excessive Detail in Raw HR Plot”)
Clinicians preferred to first see a high-level overview of the HR data, such as the hourly average plot (plot A), and only then a more detailed plot, that is, the raw HR plot (plot B). This is because clinicians wanted to see an overview of the data first and then the details. Participant 1 mentioned the following:

Start off with a big picture, click on it and go deeper, click on it go deeper. I probably would not put the raw heart rate plot first. It may be a good idea for a cardiologist to see the raw heart rate graphs upfront but for me I would like to put it back.
Participant 8 shared similar thoughts and suggested improvements on this plot:
I would think about presenting data in a most aggregate sort of easy to interpret way at first. If I were to present this, I would present a graph of 10 days and they if you put a range, like a light bar for where the range is and if it goes out of range may be color that in red. So, when I look, I can see in seconds this person has hit the target for almost the entire week, so like very high level and easy to interpret data.
In contrast, Participant 1 felt that the raw HR plot provided a good overview for the following reason:
I like the raw heart rate better because for me I do a lot with EEGs and sleep. It just looks more familiar. You can see general trends in a more holistic way and the hourly plot has less info to work with.
Insight 4.4: Activity Levels Page (“Activity Levels Are Informative”)
Quantifying activity level information was very informative to clinicians (plot C shows the activity levels page). Participant 6 gave us key details on why he thought so:

I think this will be really helpful and I will tell you why. Before I do any kind of psychotherapy, I am a big fan of basic self-care, basic behavior management. I talk about sleep, diet, exercise, and social support with patients. If someone had anxiety or depression, my spiel is if you can get your HR up at least 20 mins a day just enough to break a sweat and get your blood flowing. There is research that shows that it is as effective as any kind of antidepressant in helping with depression. This is great for monitoring that, what I do now is just ask whether you got any exercise or not, did you go for a walk, did you do anything that got your blood pumping a little bit. If you are wearing the watch, I can pull this up, look at it and talk about it. So, I think this is great in that regard.
[Participant 6]
Even though the information was useful, the pie chart presenting the percentage of activity at each level was difficult to interpret because it did not display the time of day the activity occurred. Participants shared key insights into improving this graph. Participant 8 suggested, “number of minutes would be more helpful because I am thinking 10% of what? Does that mean day, and does that day include sleep time?” In addition, Participant 6 asked, “Is there a way to know they have taken the watch off?” Participant 7 suggested the following:
Ways that this might be more helpful would be if this was displayed as a trend over time. So, it would be easy to capture all that data for a week. That gives a high-level overview.
Workshop 5: High-Fidelity Interactive Prototype
Insight 5.1: HR Page (“Identifying Peaks and Trends”)
All clinicians preferred the first plot, that is, the scatter plot combined with the line plot (plot A). Participant 7 and Participant 8 provided some key details regarding why this was the case. Participant 7 said, “It’s easy to look at upfront. I like the fact that some things are red so it drew my eye to those things, so clearly there was something about those days where they were out of range or hit some target,” whereas Participant 8 mentioned, “My preference would be to see this one, maybe because I am more used to seeing data this way. I like this because it just highlights certain important ones.” Participant 1 shared a similar opinion: “I like the data points better, and to me I can see this better. I like this one the best, as it is easy to see what is happening, and I can see across days. I like the idea of being able to hover over them and see the information.” In the line plot (plot C), overall trends were visible to clinicians, but identifying the peaks was difficult, as Participant 8 said, “I have to go searching for red dots in here.” Participant 1 also mentioned, “I like the data points better. I mean I can see the trend here but the peaks don’t stand out.” For the box plot (plot B), clinicians were not able to see a trend over the days, which is why they did not prefer this plot; as Participant 1 put it, “I guess it’s interesting. The issue I have with it is that it’s harder to get a sense of the days.” Participant 8 felt that box plots were more appropriate for research than for clinical interpretation and said, “I don’t know how I would interpret this clinically. From a research perspective I know what it means.”

Insight 5.2: HR Page (“Range Slider Is Useful but Not Intuitive”)
Clinicians in this study appreciated the range slider feature, although it was not intuitive to them at first. Participant 1 explicitly said, “I don’t think I would figure that out, what those things are and that I can move those. I thought they were just like a design there, but I like this feature, it’s nice.” Participant 8 shared similar thoughts and said, “I think the ability to hone in on certain timeframes is good, but I think this mechanism is not intuitive. I wouldn’t have been able to figure out what this is until you told me.” Participant 4 and Participant 6 both argued that this feature was very useful but not intuitive, and they shared key insights on how to improve it. Participant 4 suggested, “Probably make that like an arrow or something, make that so it’s obvious that that’s what you are doing,” as shown in plot A. Participant 4 also suggested the following:

I think you would have to do some kind of tutorial where you can show the mouse when it hovers over this happens, when the mouse hovers over the reset axis button of the plot this happens, or when you hover over the range slider feature this happens.
Insight 5.3: HR Variability Page (Bar vs Line vs Scatter Plot)
Most clinicians preferred the bar plot of the 3 presented plots. The reasons included the following: (1) the data point legend was visible at first sight without having to hover over it, and (2) the trend was more clearly visible in the bar plot than in the scatter plot. In the line plot, even though the trend was visible, clinicians did not want to hover over the line to see the data points. All 3 HRV plots are shown in plots D to F. Participant 8 shared, “This one [bar plot] I like better because I can see the numbers at a glance.” Participant 1 wanted to see consistency in the presented plots and said, “I don’t have a strong preference for this. I think it might make sense to keep it consistent with the heart rate plot [scatter + line] if that’s what you are going to use.”

Insight 5.4: HR Variability Page (“Interpreting HRV”)
One of the concerns during the presentation of HRV as a metric was that clinicians did not know how to incorporate it into daily clinical practice. Participant 8 shared the concern that “HR is a parameter that all physicians are familiar with. HRV is a measure physicians don’t really consider in practice. I use it in my research, but you don’t really use it in practice much.” Clinicians also wanted a color-coded HRV bar plot but debated whether it would make sense, as Participant 1 said, “For Heart Rate Variability I know there is a lot more controversy on how to interpret that, what is good, what is bad, so it’s probably harder to code, and may be it doesn’t make sense to do that from the perspective of HRV. I am not an expert on that so I can’t really speak to it.” In the HRV view, clinicians preferred the bar plot over the other plots.

Insight 5.5: Activity Levels Page (“All in One Plot”)
The consensus among clinicians was that plot A was the preferred plot on the activity levels page because it showed all the data simultaneously; for example, all 4 activity levels are visible at once in the stacked bar plot (plot B), and clinicians did not want to click on a button every time. Participant 1 expressed it simply: “I like this chart [plot A] over the other one for this purpose, because everything is all together.” Other clinicians shared the same sentiment; Participant 8 mentioned, “I definitely like this one better. You can just click on something, and you can look at what their whole day looks like.”

User Experience Questionnaire
The UEQ was completed by 5 clinician participants (participants 1, 4, 6, 7, and 8). In this section, we analyze the outcomes of the UEQ for each question [ ]. Observing the results, we learn that, on average, clinicians reported that the CarePortal dashboard was supportive, scoring the application 1.4 on a 7-point Likert-type scale ranging from −3 to +3. A similar positive score was achieved regarding ease of use for question 2 (complicated or easy). The highest scores were observed for questions 5 and 9: clinicians indicated that the application was exciting and toward the leading edge. The lowest score was observed for question 4 (clear or confusing), in contrast with question 2, in which participants rated whether the application was complicated or easy. We hypothesize that clinicians may have found it confusing to navigate multiple graphs for the same information; this carousel of alternative graphs was removed in a later version of the CarePortal dashboard. The questions selected from the UEQ [ ] are listed in the table below, and a small scoring sketch follows the table.

The UEQ benchmark data set contains data from 21,175 people across 468 studies concerning different products (business software, web pages, web shops, and social networks). These data were collected using the full UEQ questionnaire. The benchmark data set was used to compare the hedonic quality (ie, the pleasing appearance) and the pragmatic quality (ie, the practical functionality) of the CarePortal dashboard. Comparing our results with the benchmark UEQ data set, we observe that the pragmatic quality of our application was below average, implying that 50% of the benchmark data set results were better. The hedonic quality was above average, implying that 25% of the benchmark data set results were better. Overall, our app was above average, implying that 25% of the benchmark data set results were better.
Negative anchor (−3) | Positive anchor (+3) |
Complicated | Easy |
Inefficient | Efficient |
Confusing | Clear |
Boring | Exciting |
Not interesting | Interesting |
Conventional | Inventive |
Obstructive | Supportive |
Usual | Leading edge |
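For readers unfamiliar with UEQ scoring, each item above is answered on this −3 to +3 scale, and item scores are averaged across respondents. A toy computation follows; the individual responses are invented solely to illustrate the arithmetic (only the obstructive/supportive mean, 1.4, is anchored to the score reported above):

```python
import statistics

# Hypothetical per-clinician answers on the -3..+3 UEQ scale (n=5).
# Only the obstructive/supportive mean (1.4) is chosen to match the
# reported result; the other numbers are purely illustrative.
responses = {
    "complicated/easy": [1, 2, 1, 1, 2],
    "confusing/clear": [0, 1, 1, 0, 1],
    "boring/exciting": [2, 2, 1, 2, 2],
    "obstructive/supportive": [1, 2, 1, 2, 1],  # mean = 1.4
    "usual/leading edge": [2, 1, 2, 2, 2],
}

for item, scores in responses.items():
    print(f"{item}: mean = {statistics.mean(scores):+.2f}")
```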
Discussion
Overview
In this study, we designed and implemented an interpretable wearable sensor data visualization dashboard for clinicians with backgrounds in psychology and neuropsychology. We aimed to answer the following research question: “What are the vital elements of a data analytics dashboard that can help to visualize, analyze, and interpret patient wearable data for clinicians to support an informed decision-making process in an easily interpretable way?” To answer this question, we used a participatory design process to identify the needs of clinicians and to involve them in the dashboard design process. We not only analyzed qualitative data from the workshops but also implemented CarePortal as a Python Flask data dashboard. Clinicians used the CarePortal dashboard in a remote setting and evaluated its usability.
Principal Findings
From this study, we concluded that visualizations of raw HR data might be less interpretable than visualizations providing an overview of the HR data during the day (eg, plot A). We also learned that clinicians prefer data plots that show trends rather than distribution statistics of the data, such as the box plots in plot B. In addition, visualizations that offer an overview of several days of data in a single presentation can be easily interpreted by clinicians. Finally, we conclude that clinicians prefer plots such as those shown in plots 10A and 12A, as these include all the factors discussed earlier. The finalized CarePortal dashboard takes a user from the loading page to the patient page, where a particular patient can be selected. Once the patient is selected, the dashboard allows the user to visualize the clinical data, and the menu bar on the side allows toggling between data types. In this study, we found that clinicians not only want to see the wearable sensor data but also want context for what patients are doing; for example, when the HR is high, clinicians want to know whether the patient is running up a staircase, late for work, or experiencing a stressful event. Contextualizing wearable sensing data with this type of information can provide clinicians with detailed insights into what is happening when patients do not visit their office.

We found 2 other similar studies that designed wearable sensor data dashboards: (1) for neurologists treating Parkinson disease and (2) a gait analysis tool for clinicians. Elm et al [ ] found that, in a data dashboard for clinicians, medication compliance and symptoms were the most beneficial features, followed by the severity of electronic patient-reported outcomes. Activity level and nighttime activity, both sensor-derived data points, were the least beneficial components for clinicians but still supported the clinical assessment two-thirds of the time. Similar to our dashboard, Elm et al [ ] presented hourly and daily metrics: the daily metrics included trends over a period of several days, whereas the hourly metrics included symptoms present over an hour. Seals et al [ ] developed a clinician-centered dashboard for gait analysis, in which they found that clinicians prefer a longitudinal view; clinicians wanted to understand trends in how patients performed the walking test during different seasons, such as winter and summer. Although this prior work provides insights into clinicians’ data visualization preferences, we observed a common theme between our participatory design study and the previously described studies: clinicians not only want to see quantitative data but also would like to see an overlay of qualitative self-reports by patients. This is because a high HR can be one factor indicating stress level; however, it is not the only one. For example, Participant 2 mentioned, “The high heart rate can be because of the person running or walking up the stairs.” Therefore, it is important to know what the person is doing when the HR is high. A tool that displays this type of information can be a powerful source of information. In the future, we would like to incorporate accelerometer-based activity data along with HR data to offer additional context to clinicians for improved decision-making. Nevertheless, CarePortal seemed promising to clinicians, providing insights related to patients’ daily health and activity markers to improve clinical decision-making.
Although we obtained promising results for interpretable visualization techniques, there are some limitations to our study. In this work, we presented only wearable sensor data to clinicians; prior work [ ] suggested that electronic patient-reported outcomes along with medication data are most beneficial, but we did not introduce these in the final version of the dashboard owing to the lack of data availability. Another limitation of this study is that we have not yet integrated CarePortal into an EHR system, from which we could integrate other patient health data, such as blood pressure. During our first workshop, we found that clinicians used separate EHR platforms in their practice that were not integrated with each other; therefore, it was not possible to obtain data directly from their EHR systems. In this work, we presented only HR data from a smartwatch; however, today’s wearables can provide much more information, such as blood pressure, sleep quality, and sedentary behavior, and integrating these data could prove valuable for clinicians. Another limitation of this study is that we did not aim to explore how confident clinicians would be with the wearable data presented to them. Therefore, in further studies, we aim to explore how many such inputs clinicians can deal with in a real-world setting and to determine clinicians’ confidence in viewing such wearable data in their day-to-day practice.

Acknowledgments
Funding for the collection of adolescent heart rate data was provided by the National Institutes of Health (R01MH108641; principal investigator: NRN). The authors are thankful to all the clinicians who participated in this study.
Data Availability
As the study participants who helped collect the wearable data did not consent to share the data beyond the research team, the authors are unable to share the wearable sensor data set at this point.
Conflicts of Interest
KM is a co-founder of EchoWear, a digital health startup company. This is to affirm that the work presented in this journal article is entirely independent of KM's responsibilities and role at EchoWear. He has no financial or personal interests that could inappropriately influence, or be perceived to influence, the content of the research, review, or editorial process associated with this JMIR manuscript. NRN is an unpaid member of the Scientific Advisory Board for Ilumivu, a software company for research with mobile technologies. The work presented here does not use Ilumivu software.
References
- Fact sheet: telehealth. American Hospital Association. URL: https://www.aha.org/system/files/2019-02/fact-sheet-telehealth-2-4-19.pdf [accessed 2023-11-10]
- Koonin LM, Hoots B, Tsang CA, Leroy Z, Farris K, Jolly T, et al. Trends in the use of telehealth during the emergence of the COVID-19 pandemic - United States, January-March 2020. MMWR Morb Mortal Wkly Rep. Oct 30, 2020;69(43):1595-1599. [FREE Full text] [CrossRef] [Medline]
- Elm JJ, Daeschler M, Bataille L, Schneider R, Amara A, Espay AJ, et al. Feasibility and utility of a clinician dashboard from wearable and mobile application Parkinson's disease data. NPJ Digit Med. Sep 25, 2019;2(1):95. [FREE Full text] [CrossRef] [Medline]
- Erb MK, Karlin DR, Ho BK, Thomas KC, Parisi F, Vergara-Diaz GP, et al. mHealth and wearable technology should replace motor diaries to track motor fluctuations in Parkinson's disease. NPJ Digit Med. Jan 17, 2020;3(1):6. [FREE Full text] [CrossRef] [Medline]
- Angelucci A, Kuller D, Aliverti A. A home telemedicine system for continuous respiratory monitoring. IEEE J Biomed Health Inform. Apr 2021;25(4):1247-1256. [CrossRef] [Medline]
- Iqbal SM, Mahgoub I, Du E, Leavitt MA, Asghar W. Advances in healthcare wearable devices. npj Flex Electron. 2021;5(1):9. [FREE Full text] [CrossRef]
- Constant M, Brick L, Nugent N, Armey M, Mankodiya K. Smartwatch-driven multisensory recorder: design and testing of a smartwatch-based framework to support psychiatric disorders. In: Proceedings of the 2016 IEEE MIT Undergraduate Research Technology Conference. Presented at: URTC '16; November 4-6, 2016:1-4; Cambridge, MA. URL: https://ieeexplore.ieee.org/document/8284078 [CrossRef]
- Muller MJ, Kuhn S. Participatory design. Commun ACM. Jun 1993;36(6):24-28. [FREE Full text] [CrossRef]
- Robertson T, Simonsen J. Participatory design: an introduction. In: Simonsen J, Robertson T, editors. Routledge International Handbook of Participatory Design. London, United Kingdom. Routledge; 2012;1-17.
- Simonsen J, Robertson T. Routledge International Handbook of Participatory Design. New York, NY. Routledge; 2013.
- Robertson T, Wagner I. Ethics: engagement, representation and politics-in-action. In: Simonsen J, Robertson T, editors. Routledge International Handbook of Participatory Design. London, United Kingdom. Routledge; 2012;64-85.
- Seals A, Dove G, Nov O, Charvet L, Rizzo JR, Sanchez R, et al. ‘Are they doing better in the clinic or at home?’: understanding clinicians’ needs when visualizing wearable sensor data used in remote gait assessments for people with multiple sclerosis. In: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. Presented at: CHI '22; April 29-May 5, 2022:1-16; New Orleans, LA. URL: https://dl.acm.org/doi/10.1145/3491102.3501989 [CrossRef]
- Dykes PC, Stade D, Chang F, Dalal A, Getty G, Kandala R, et al. Participatory design and development of a patient-centered toolkit to engage hospitalized patients and care partners in their plan of care. AMIA Annu Symp Proc. Nov 2014;2014:486-495. [FREE Full text] [Medline]
- Davis SR, Peters D, Calvo RA, Sawyer SM, Foster JM, Smith L. "Kiss myAsthma": using a participatory design approach to develop a self-management app with young people with asthma. J Asthma. Sep 28, 2018;55(9):1018-1027. [FREE Full text] [CrossRef] [Medline]
- Boyd AD, Moores K, Shah V, Sadhu E, Shroff A, Groo V, et al. My interventional drug-eluting stent educational app (MyIDEA): patient-centered design methodology. JMIR Mhealth Uhealth. Jul 02, 2015;3(3):e74. [FREE Full text] [CrossRef] [Medline]
- Kim HG, Cheon EJ, Bai DS, Lee YH, Koo BH. Stress and heart rate variability: a meta-analysis and review of the literature. Psychiatry Investig. Mar 2018;15(3):235-245. [FREE Full text] [CrossRef] [Medline]
- Castaldo R, Melillo P, Bracale U, Caserta M, Triassi M, Pecchia L. Acute mental stress assessment via short term HRV analysis in healthy adults: a systematic review with meta-analysis. Biomed Signal Process Control. Apr 2015;18:370-377. [FREE Full text] [CrossRef]
- Shaffer F, Ginsberg JP. An overview of heart rate variability metrics and norms. Front Public Health. Sep 28, 2017;5:258. [FREE Full text] [CrossRef] [Medline]
- Physical activity guidelines advisory committee report, 2008. U.S. Department of Health and Human Services. 2008. URL: https://health.gov/sites/default/files/2019-10/CommitteeReport_7.pdf [accessed 2023-11-10]
- Mulder S, Yaar Z. The User Is Always Right: A Practical Guide to Creating and Using Personas for the Web. Indianapolis, IN. New Riders; 2007.
- Schrepp M, Hinderks A, Thomaschewski J. Applying the user experience questionnaire (UEQ) in different evaluation scenarios. In: Marcus A, editor. Design, User Experience, and Usability: Theories, Methods, and Tools for Designing the User Experience. Cham, Switzerland. Springer; 2014;383-392.
Abbreviations
EHR: electronic health record |
HIPAA: Health Insurance Portability and Accountability Act |
HR: heart rate |
HRV: heart rate variability |
IRB: institutional review board |
UEQ: User Experience Questionnaire |
UI: user interface |
UX: user experience |
Edited by A Mavragani; submitted 28.02.23; peer-reviewed by M Grady, J Thomas; comments to author 26.07.23; revised version received 31.08.23; accepted 08.09.23; published 05.12.23.
Copyright © Shehjar Sadhu, Dhaval Solanki, Leslie A Brick, Nicole R Nugent, Kunal Mankodiya. Originally published in JMIR Formative Research (https://formative.jmir.org), 05.12.2023.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.