This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on http://formative.jmir.org, as well as this copyright and license information must be included.
Smartphones are positioned to transform the way health care services gather patient experience data through advanced mobile survey apps, which we refer to as smart surveys. Compared with traditional methods of survey data capture, smartphone sensing survey apps can elicit multidimensional, in situ user experience data in real time with unprecedented detail, responsiveness, and accuracy.
This study aimed to explore the context and circumstances under which patients are willing to use their smartphones to share data on their service experiences.
We conducted in-person, semistructured interviews (N=24) with smartphone owners to capture their experiences, perceptions, and attitudes toward smart surveys.
Analysis revealed that the classical dimensions of perceived risk raised minimal concerns about the use of smartphones to collect patient service experience feedback. However, the identity of data recipients and trust in the doctor-patient relationship, the reliability of the communication channel, the altruistic motivation to contribute to health service quality for others, and the risk of losing agency over shared information were identified as determinants of patients’ adoption of smart surveys.
On the basis of these findings, we provide recommendations for the design of smart surveys in practice and suggest a need for privacy design tools for voluntary, health-related technologies.
High-quality patient-centered care is widely recognized as a priority in health care [
Paper-based collection of experience surveys remains time-consuming, expensive, and limited by factors such as nonresponse, recall bias, and inadequate sample size [
Patient experience surveys are validated questionnaires, which are developed by health services experts to understand patient perceptions of their health care experience and serve an integral role in patient engagement and service improvement [
The health care literature has found no significant differences in data equivalence or validity between paper- and Web-based surveys [
Smart surveys, which we define as smartphone survey apps that use advanced functionality to provide more intuitive surveys or gather more contextual complementary data in addition to participant responses, introduce a number of unfamiliar behaviors for patients to undertake that may work as barriers to adoption. For example, they require users to download an app to their personal smartphone and to disclose potentially sensitive information via a digital channel that may be perceived as public or insecure. To better identify and understand how these barriers may impact the adoption of smart surveys, we turn to the theory of perceived risk [
A user’s perceptions of risk can have a negative effect on information system (IS) adoption [
Seven dimensions of perceived risk framework.
Risk dimension | Definition |
Performance | The possibility that a product or service is not performing the way it was designed or advertised, therefore failing to deliver the expected benefits. |
Financial | The possibility that the use of a product or service will cause undesired financial loss (due to purchase and incurring fees or fraud). |
Time | The possibility that a product or service will cause the consumer to lose time from researching the product, learning the use, or returning the product if it underperforms. |
Psychological | The risk that the purchase or performance of a product or service will cause a negative effect on the consumer’s mind or self-perception (eg, frustration or loss of self-esteem). |
Social | The potential loss of status in the consumer’s social group resulting from the use of a product or service. |
Physical | The possibility that the use of a product or service may be harmful or injurious to the consumer’s health. |
Privacy | The potential for personal information being shared without consent and/or used for purposes other than originally intended. |
Overall | A general measure of perceived risk when all criteria are considered together. |
Consumer behavior and IS research has found perceived risk and its antecedents to be key predictors of electronic service adoption; for example, perceived risk and its dimensions are inhibitors of technology acceptance model variables [
Perceived risk impacts attitudes toward adopting mobile e-services [
Previous research has categorized users based on the intensity of their perceptions of privacy risk. Westin [
Dupree et al [
Fundamentalists (high knowledge and motivation): like Westin’s privacy fundamentalists [
Lazy experts (high knowledge and low motivation): these individuals share the same technical knowledge as fundamentalists, but often choose convenience over security and socialization over privacy. They continue to put effort into protecting their privacy, however not to the extent where they would limit their interactions with society.
Technicians (medium knowledge and high motivation): have less technical knowledge compared with the fundamentalists and lazy experts. However, they show limited trust in privacy settings and are highly motivated to protect their privacy, often choosing privacy over being social. They tend to form their attitudes more intuitively but will change their behavior when provided with evidence.
Amateurs (medium knowledge and medium motivation): these individuals are just learning about security concepts. They are not nearly as motivated or knowledgeable as the other previously mentioned groups. Despite having limited knowledge, this group will still act to protect themselves from privacy threats.
Marginally concerned (low knowledge and low motivation): with limited knowledge about security concepts, they trust networks and websites which claim to be safe. They are aware of potential privacy threats but feel these threats are unlikely to happen to them.
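As a purely illustrative sketch (not part of the original study or of Dupree et al's work), the knowledge-motivation pairings above can be expressed as a simple lookup; the persona labels follow the clusters described above, while the function name and "Undefined" fallback are our own:

```python
# Illustrative mapping of Dupree et al's privacy personas, keyed by
# (knowledge, motivation) level pairs as described in the text above.
PERSONAS = {
    ("high", "high"): "Fundamentalist",
    ("high", "low"): "Lazy expert",
    ("medium", "high"): "Technician",
    ("medium", "medium"): "Amateur",
    ("low", "low"): "Marginally concerned",
}

def classify_persona(knowledge: str, motivation: str) -> str:
    """Return the persona for a (knowledge, motivation) pair, or
    'Undefined' for combinations outside the five named clusters."""
    return PERSONAS.get((knowledge.lower(), motivation.lower()), "Undefined")
```

For example, a participant with medium knowledge and high motivation maps to "Technician", while a pairing outside the five clusters (such as low knowledge with high motivation) falls back to "Undefined".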
Morton and Sasse [
The purpose of this study was to understand what beliefs, perceptions, and attitudes influenced patients’ intentions to share health service experience feedback using their smartphones, in particular, what role perceived risk plays in this process. Health care providers are increasingly being held accountable for the quality of services they provide; however, data collection is expensive, response rates are low, and turnaround times can be long. Although mHealth apps are common in the sector, and smartphones have been used to collect experience data in other industries [
We recruited participants from a local university between January and February 2017 using posters, email, and snowball sampling techniques. Participants were classified according to their privacy persona and their dimensions of perceived risk, and their responses were sequentially analyzed to allow researchers to evaluate the breadth of our sample and to ensure that individuals with different technical backgrounds as well as varying degrees of privacy tolerance related to information sharing were included in the study. Recruitment and analysis proceeded until saturation [
MetricWire's smartphone app served as the platform for administering the patient experience smart survey.
Participants were welcomed upon their arrival and were given an overview of the purpose of the study and the data collection process by a researcher. Participants reviewed and signed an informed consent form and provided demographic information. Data were collected from participants using short questionnaires followed by a semistructured, in-depth interview. The information gathered from the questionnaires and think-aloud technique provided information for classification of participants into privacy persona clusters as well as complementary data for further context.
In the first questionnaire, participants were asked to rate their perspectives on privacy and security (PAS) (
Using cognitive interviewing techniques [
Upon completion of each participant interview, participant responses were transcribed manually from the digital recordings and thematically analyzed using QSR International’s NVivo 11 [
To assign participants to 1 of Dupree’s clusters, each participant’s data were reviewed (DN), and the participant was preliminarily assigned. Following a process of discussion (JM, JW, and PM), participants were reassigned as necessary until each cluster remained stable. Participant data were collected until we had at least one participant from each of the Dupree clusters identified, and saturation was achieved, where no new themes or evidence emerged from the interview transcripts [
Ethics approval for this study was sought and obtained jointly from the ethics committees at Wilfrid Laurier University and the University of Waterloo (#4690). All participants provided written informed consent before participating in the study.
We conducted 24 semistructured interviews with Canadian smartphone owners (7 male and 17 female) with varying educational backgrounds, technical knowledge, and motivations to protect privacy. All participants were registered university students, half at the graduate level; as such, they are “digital natives” who are confident using smartphones and mobile apps [
Respondents were classified according to Dupree et al’s [
To develop an understanding of the core issues facing smart survey adoption, we also categorized responses according to the dimensions of perceived risk [
Participants classified by privacy persona.
Privacy persona | Knowledge | Motivation | Statistics, n (%) |
Marginally concerned | Low | Low | 8 (33) |
Technicians | Medium | High | 7 (29) |
Amateurs | Medium | Medium | 5 (21) |
Lazy experts | High | Low | 2 (8) |
Fundamentalists | High | High | 1 (4) |
Undefined | Low | High | 1 (4) |
Number of participants who classified dimensions of perceived risk as either “likely” or “very likely.”
Type of perceived risk | Statistics, n (%) |
Privacy and security | 18 (75) |
Performance | 12 (50) |
Time | 4 (17) |
Financial | 2 (8) |
Physical | 1 (4) |
Psychological | 0 (0) |
Social | 0 (0) |
A number of themes emerged from our analysis of cognitive and in-depth interview transcripts: (1) perceived risks associated with smart survey use, (2) loss of information agency, and (3) trusted data collectors and altruistic intentions. These are organized according to the focus of this study: first, how perceived risk impacts the propensity to use smartphones to provide service feedback using our perceived risk framework, and second, the role of participants’ key beliefs, perceptions, and attitudes in that process.
Although performance risk was the most cited type of risk, participants perceived it to be minimal when downloading or using smart surveys: the likelihood of performance risk was rated as “Very Unlikely” by 25% (6 out of 24) of participants and as “Unlikely” by 46% (11 out of 24). Some participants attributed this low risk to smart surveys being simpler in design than other apps on their phone; others pointed to functionality that allowed them to audit their responses before data were submitted:
...from my point of view, it doesn’t look too fancy or a gaming application with a lot of coding and stuff...I feel like chances of it not working...will be low.
It’s not quite as advanced as some other apps. And since it submits [data] all at once... I would be able to look at all the information before it’s submitted.
Some participants commented that smart surveys’ voluntary nature mitigated any associated risks related to time. Others disagreed, saying time loss from downloading and using the app was “very likely”; as they perceived that only health care providers would ultimately benefit from the data, they saw no offsetting personal benefit to mitigate the time risk:
Very likely, because it does benefit just the company, not really yourself. And like I said, it already takes a long time as an app it downloads and all that stuff...
The majority of participants felt that the possibility of financial loss associated with the app was either “Very Unlikely” (71%; 17 out of 24 participants) or “Unlikely” (21%; 5 out of 24 participants). The perception of low financial risk was attributed to smart surveys being free to download and not requesting any financial information:
As a patient, would I have to pay money to download the app?...In this case, there doesn’t seem like there’s any chance that I would be losing money with Smart surveys. I don’t think it’s asking for credit card information or anything.
When asked to judge their perception of psychological risk associated with smart surveys, nearly all participants (83%; 20 out of 24) rated their perceived psychological risk to be “Very Unlikely.” Participants were familiar with providing feedback and with using smartphone apps:
...it’s voluntary if there was something I didn’t want to say or discuss, I wouldn’t have taken it.
Overall, participants perceived a very low possibility of social risk, noting that completing surveys on a smartphone was socially acceptable and could be done privately:
I’m on my phone a lot anyways. I’m answering surveys. I don’t think anyone would think of me differently because it’s just surveys.
There was little to suggest that participants perceived any physical risk associated with this technology and noted that it was comparable with any other app on their smartphone:
Well, it’s just filling out buttons on a survey. I don’t think there should be health issues any more than health issues from just using a smartphone.
In line with the six dimensions discussed above, participants rated the likelihood of “overall” risk associated with smart surveys as low. However, when discussing overall risk, the predominant concerns related to PAS risk, including the loss or misuse of sensitive information associated with their location and activity:
Personally, I don’t like the idea of data being collected on me...If there’s an app that could literally tell you physically where you’re being, that’s part of the metadata government can collect on you.
Similarly, participants displayed heightened sensitivity and apprehension about the possibility of the app being used to retrieve additional information unrelated to the research:
Maybe if I download some app, maybe someone can get your personal information on your phone.
Moreover, 1 participant was concerned that a third party, such as an employer or insurance company, could use the collected data to deny individuals employment or insurance claims:
If it’s not associated with my insurance company in any way, and it’s only for the health care to improve their staff’s interaction with their patient. I don’t think it would be likely [I would consider it a risk].
Yes. I just think I would just want to know what is being used and who’s using it. And if someone could tell me that, then it might change my mind from not giving out that information to giving information.
I don’t consider the information to be very sensitive. Even if it does go into the wrong hands, which would be weird, I probably wouldn’t mind too much.
Another participant expressed a belief that mobile apps may be less secure than traditional desktop apps and that the use of smartphones introduces risks such as susceptibility to hacking, in-device vulnerabilities, and device loss:
It’s not 100% safe...I’m not sure apps interact with each other in a smartphone...if other apps can steal information from another app. It’s not 100% safe.
I think it’s safe. It’s not risky to share feedback. But you never know. Sometimes people can get your secure passwords, your bank passwords.
...it’s not really safe to send it through the smartphone...a smartphone can easily go into the wrong hands. It could get stolen, or even borrowed, maybe you just left it somewhere...
Location (Global Positioning System [GPS]) data were an area of particular sensitivity. The majority of participants (71%; 17 out of 24) were reluctant to disclose their location data for service quality improvement. Many chose not to share location information for reasons of privacy, safety, and battery life:
If it’s on all the time, I feel like someone’s following me all the time or someone can see that they’re following me and it probably drains out my battery too.
Other issues included concern over the perceived lack of standards surrounding the handling and storage of patient experience data. The heightened sensitivity was not surprising given the considerable attention to PAS risks associated with mHealth apps [
...I mean, it’s kind of normal for me to do surveys on a computer but doing it on the phone is a little awkward. Another reason, I guess, I’m not too fond of reading too much on a smartphone ’cause I have a smaller screen and the text is small.
Importantly, the third-party mobile app for data collection using a smartphone was perceived as distinct from the health care facility requesting the data, which participants generally trusted to comply with ethical treatment of their data:
...because it’s health care facility. I have complete trust in them.
The identity of those who receive and interpret patient experience data was an important consideration for participants when deciding to complete a patient experience smart survey. More than half of the participants (58%; 14 out of 24) mentioned concerns over who received and viewed their information. Knowing who the users were and how the data would be used helped them decide whether or not to share feedback. Sharing experience data with their care providers was not a barrier, given its less sensitive nature and thus diminished consequences if mishandled.
For some participants with altruistic intentions, the value of providing feedback depended on its reaching someone who could act on it:
I want my feedback to improve the service. I don’t write my feedback for someone who can’t change anything or improve anything.
If I share my data with the doctor, the administrator will not benefit me if they look into my data. Anyone who’s not really involved with the service. If I want to share my information in my smartphone, I want to give it to the doctor directly...It’s also the benefit of the smartphone, it can give it directly to the doctor.
Participants expressed concern that collected data may be used for purposes beyond what was initially intended or disclosed, specifically that it might affect their “information agency.” This differed from their privacy concerns, where privacy risk is defined as the potential loss of personal information without the consumers’ knowledge following the use of a service or a product [
Maybe it’s stuff you don’t necessarily want a third party to know and they do know it because sometimes certain third party companies display ads based on what you’ve done if you see an ad that’s something related to you in story that you’ve done.
In general, the participants exhibited comments and knowledge consistent with their Dupree classification.
Mobile apps are increasingly being used to gather real-time clinical and ecological patient data and to help manage workload in the health services sectors [
The study used an app that participants had to download, retain on their phone, and manage, along with its alerts, over time. The most commonly perceived risk was PAS, consistent with other mHealth and wearables literature [
When individuals do trust their health care provider, the presence of trust reflects a belief that the provider has the ability and motivation to make changes that result in service improvements [
Furthermore, our results suggest a need to better identify any complementary uses and recipients of survey data. Research ethics standards of practice require that researchers inform participants how data will be used at the beginning of a survey; however, this is not necessarily the case for private or nonprofit health service providers. Outside of personal health information, the use and management of which is often governed by legislation, consumers have very little control over what data are stored and shared for commercial use [
Smart surveys functionality should foster trust between patients and providers by identifying the recipients of feedback data and communicating when it is read and what improvements to care are made as a result.
Participants lacked confidence in the security of smartphones, which they sometimes perceived as second-class computing devices compared with desktop personal computers for completing surveys. For example, participants expressed concerns about installing apps on their personal devices and uncertainty about how data might be shared between apps. This was notable among the amateurs and marginally concerned. For individuals with more in-depth technical knowledge, such as the fundamentalists, disclosure of implementation details was equally important: the types of permissions the app required, the type and location of the servers on which the data would be stored, whether information would be encrypted, and how long their information would be retained. However, it should be noted that this was the exception. As with Gkioulos et al, we found that digital natives tended to ignore or be complacent about privacy policies [
For technical users, provide optional information about where data will be stored, for how long, and whether it will be encrypted.
Where possible, smart surveys should provide optional modalities to complete experience surveys on devices other than smartphones.
Although there were disagreements about the sensitivity of feedback data, participants were consistently hesitant about unauthorized use of or access to data, particularly their location (ie, GPS) [
The perceived risk of losing agency represents a significant barrier to the adoption of “advanced” smart survey features such as geofencing (the use of technology to create a virtual geographic boundary that triggers an action or alert when a mobile device enters or leaves the area). For example, smart surveys can reduce recall bias by prompting patients for feedback soon after they leave a physician’s office, instead of days or months later, as with traditional survey methods. The majority of participants found location-based prompts too intrusive and risky and had location services disabled on their phones. This finding is consistent with prior research demonstrating that concerns for privacy are higher when a service is based on tracking the user’s location [
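To illustrate the geofencing mechanism described above (a minimal sketch, not the study's or MetricWire's implementation; the clinic coordinates and 100 m radius are hypothetical), a feedback prompt could be triggered when a device exits a circular boundary around a clinic:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_prompt(device, clinic, radius_m=100, was_inside=True):
    """Trigger a survey prompt only on exit: the device was inside the
    geofence (e.g., at the clinic) and is now outside it."""
    inside = haversine_m(device[0], device[1], clinic[0], clinic[1]) <= radius_m
    return was_inside and not inside
```

In practice, platform geofencing APIs deliver such enter/exit transitions to the app, which would then schedule the survey notification; the point of the sketch is simply that the trigger requires continuous access to the device's location, which is precisely what most participants had disabled.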
Smart surveys should support alternatives to location services for prompting patients for feedback, for example, quick response codes or calendar integration.
The themes identified through our interviews helped to develop an understanding of barriers to smartphone-based patient experience surveys. Nevertheless, we are careful to acknowledge limitations to this study. First, the attitudes and perceptions of risk held by our participants were captured at 1 point in time, and attitudes toward adoption can change over time as users become more familiar with technology [
We used nonprobability convenience sampling and a nonsystematic recruitment process for this exploratory study; we did not anticipate our findings would be exhaustive; however, we believe that they add to the understanding of this emerging domain. On the basis of our findings, we believe that individuals with higher technical knowledge and motivation to protect their privacy were under-represented. Finally, the strength of qualitative research is its ability to describe and understand both obvious and latent phenomena contained within the “thick descriptions” provided by interview data. Although our interpretation of these exploratory data is nongeneralizable, the use of in-depth interview methodology provides researchers with an appreciation of the complexity and context of this relatively new research domain. It should also be acknowledged that with every new innovative technology, the patterns of risk and security concerns may differ from those of ostensibly similar legacy systems [
The use of smartphone-based patient experience surveys provides new and exciting opportunities for health care providers to assess and improve the quality of health services. We conducted 24 semistructured interviews with smartphone users to explore the types of perceived risks that may exist when using smart surveys in the context-sensitive health services sector. The results demonstrate that the classical dimensions of perceived risk raised minimal concerns for the use of smartphones to collect patient service experience feedback. However, PAS risk associated with trust in the doctor-patient relationship, the reliability of the communication channel, and the risk of potential loss of agency over shared information may inhibit adoption. Conversely, altruistic motivations to contribute to health service quality for others may facilitate patients’ adoption of smart surveys. We conclude that barriers and enablers of adoption of novel technologies may change from sector to sector and should be further explored.
Questionnaire: participant perspectives on privacy and security.
Questionnaire: perceived risk associated with the use of smart surveys.
In-depth interview (Think Aloud) prompts.
Global Positioning System
information system
mobile health
privacy and security
This work is funded by a Social Science and Humanities Research Council of Canada Insight Development Grant (#430-2016-00858). We would like to thank Mr Brian Stewart and the team at MetricWire for their support of this study.
None declared.