Original Paper
Abstract
Background: Working with eHealth requires health care organizations to make structural changes in the way they work. Organizational structure and process must be adjusted to provide high-quality care. This study is a follow-up study of a systematic literature review on optimally organizing hybrid health care (eHealth and face to face) using the Donabedian Structure-Process-Outcome (SPO) framework to translate the findings into a modus operandi for health care organizations.
Objective: This study aimed to develop an SPO-based quality assessment model for organizing hybrid health care using an accompanying self-assessment questionnaire. Health care organizations can use this model and a questionnaire to manage and improve their hybrid health care.
Methods: Concept mapping was used to enrich and validate evidence-based knowledge from a literature review using practice-based knowledge from experts. First, brainstorming was conducted. The participants listed all the factors that contributed to the effective organization of hybrid health care and the associated outcomes. Data from the brainstorming phase were combined with data from the literature study, and duplicates were removed. Next, the participants rated the factors on importance and measurability and grouped them into clusters. Finally, using multivariate statistical analysis (multidimensional scaling and hierarchical cluster analysis) and group interpretation, an SPO-based quality management model and an accompanying questionnaire were constructed.
Results: All participants (n=39) were familiar with eHealth and were health care professionals, managers, researchers, patients, or eHealth suppliers. The brainstorming and literature review resulted in a list of 314 factors. After removing the duplicates, 78 factors remained. Using multivariate statistical analyses and group interpretations, a quality management model and questionnaire incorporating 8 clusters and 33 factors were developed. The 8 clusters included the following: Vision, strategy, and organization; Quality information technology infrastructure and systems; Quality eHealth application; Providing support to health care professionals; Skills, knowledge, and attitude of health care professionals; Attentiveness to the patient; Patient outcomes; and Learning system. The SPO categories were positioned as overarching themes to emphasize the interrelations between the clusters. Finally, a proposal was made to use the self-assessment questionnaire in practice, allowing measurement of the quality of each factor.
Conclusions: The quality of hybrid care is determined by organizational, technological, process, and personal factors. The 33 most important factors were clustered in a quality management model and self-assessment questionnaire called the Hybrid Health Care Quality Assessment. The model visualizes the interrelations between the factors. Using a questionnaire, each factor can be assessed to determine how effectively it is organized and developed over time. Health care organizations can use the Hybrid Health Care Quality Assessment to identify improvement opportunities for solid and sustainable hybrid health care.
doi:10.2196/38683
Keywords
Introduction
Background
In recent years, the use of eHealth has expanded, encouraged by the increasing pressure on health care [ , ] and growing interest in patient empowerment [ , ]. On the one hand, an aging population and an increase in chronic diseases are causing a higher and more complex demand for health care. In addition, the COVID-19 pandemic has accelerated the pressure on health care [ - ]. Therefore, innovations such as eHealth are required to maintain accessibility and high quality of health care [ - ]. On the other hand, digital health technologies have significantly accelerated patients' involvement [ - ]. In line with these developments, health care organizations have intensively integrated eHealth into traditional face-to-face consultations [ ]. The combination of eHealth and face-to-face consultations can be defined as hybrid health care [ , ]. A few examples of hybrid health care are telemonitoring systems for patients with chronic diseases [ , ], web-based video coaching [ , ], and direct web-based access to the medical records of patients [ , ], all of which are integrated into traditional health care.
Although health care organizations are increasingly providing hybrid health care, integrating eHealth into the daily care process is challenging. Working with hybrid health care requires organizations to change the way they work. The roles of health care providers and patients are changing, and the available resources are used differently [ , , , ]. Organizational structure and work processes must be adapted to ensure high-quality hybrid care [ - ]. Several studies have examined ways to promote eHealth adoption, such as increasing the adaptability of the technology or the value for stakeholders [ , ]. However, it remains challenging to organize hybrid health care effectively and sustainably [ ]. There is a need for further research on how hybrid health care can be improved to add value for patients and health care providers when they work with eHealth. Therefore, we recently performed a systematic literature review on optimally organizing hybrid health care [ ].
In that systematic literature review, the Donabedian Structure-Process-Outcome (SPO) framework was used to identify indicators related to the integration of eHealth into health care organizations [ , - ]. According to Donabedian, health care quality is based on the aspects of these 3 categories and their relationships. The SPO framework and its categories are described in detail in the literature review [ ]. In the review, we identified 111 potential indicators under the SPO categories that impact eHealth integration. The study demonstrated that 3 principles are important for successful integration. First, the patient's role must be centrally placed in the organization of hybrid care. Second, technology must be well attuned to the organizational structure and daily care process. Third, the deployment of human resources must be aligned with the desired results [ ].
Objectives
To translate the findings from the literature study into a modus operandi for health care organizations, we aimed to develop a model that can help health care organizations organize hybrid health care and identify improvement opportunities for a solid and sustainable integration of eHealth. To achieve this aim, the objectives of this concept mapping study were to (1) enrich and validate evidence-based knowledge from the literature review with practice-based knowledge from experts and (2) develop an SPO-based model for organizing hybrid health care with an accompanying self-assessment questionnaire.
Methods
Concept Mapping
Concept mapping is a highly structured methodology for organizing ideas from different stakeholders and other data sources to produce a common framework for complex topics that can be used for evaluation or planning [ - ]. The method integrates qualitative data collection with quantitative analysis to construct an interpretable pictorial view of different ideas and concepts and how these are interrelated [ , ]. Concept mapping has been used worldwide for a diverse range of health care projects and studies to develop conceptual frameworks, as well as for health and eHealth evaluations [ - ].
In this study, the 6-step concept mapping approach of Trochim and McLinden [ ] was followed [ ] to develop a usable, tailored, SPO-based quality management model for hybrid health care and an accompanying questionnaire. The 6 steps of concept mapping are as follows: (1) preparation, (2) idea generation, (3) sorting and rating, (4) concept mapping analysis, (5) map interpretation, and (6) utilization. Each step involves different activities leading to an output, which serves as the input for the next step. The steps and activities are explained in and in the paragraphs below. All the steps were supported by the GroupWisdom webtool [ , ].
Step 1: Preparation
Concept mapping is most effective when multiple stakeholders participate in all the steps of the concept mapping process [ ]. There is no strict limit to the number of participants, which can range from small groups of 8 to 15 people to groups of hundreds of participants [ ]. For this study, we recruited participants with eHealth experience who were employed by health care organizations, as well as patients with eHealth experience. The amount or kind of eHealth experience, health care setting, or disease was not relevant for inclusion. The goal was to create a diverse group in which different experiences, perceptions, and viewpoints complemented each other. We aimed to include a mix of health care professionals, patient experts (patients and caregivers), managers, directors, project leaders, researchers, and eHealth suppliers.
Potential participants were approached to attend both the brainstorming in step 2 and the sorting and rating in step 3. Participants were invited via the research team's network, social media, and snowball sampling. Before agreeing to participate, participants received an information letter about the concept mapping method, the study's purpose, and the SPO framework. None of the potential participants were familiar with the results of our previous literature study. A selected group was asked to participate in step 4 (concept mapping analysis), step 5 (interpretation), and step 6 (utilization), which will be explained in the subsequent sections.
Step 2: Idea Generation
Web-Based Brainstorming
In step 2, data from the participants were collected and combined with data from the literature study. Idea generation with participants was organized through brainstorming. Brainstorming is the most common method used in concept mapping and can be conducted either as group brainstorming or as individual brainstorming [ ]. In this study, web-based brainstorming was conducted by the participants. Participants received a link via email with instructions, giving them access to the web-based brainstorming program of the GroupWisdom webtool. Before starting the brainstorming session, informed consent was provided, and participant characteristics (age, eHealth experience, professional background, and work setting) were collected to generate general background information about the participants. When the brainstorming session started, the following instruction was presented: "Name all factors, which you believe contribute to effective organization of patient care with eHealth, and what the outcomes of this care should be. Keep the 'Structure-Process-Outcome' framework in mind."
For 23 days, the participants could list as many factors as they considered essential contributors to effective hybrid health care. Participants could see each other's inputs and save their brainstorming results in the meantime. They received reminders after 10 and 15 days.
Editing Brainstorming and Literature Study Data
After closing the web-based brainstorming session, the brainstorming and literature study data were combined for sorting and rating. A manageable amount of data for sorting and rating is ideally ≤100 factors, to prevent redundancy and a loss of participants' motivation [ , ]. To generate a final set of up to 100 factors, duplicates and factors that did not match the brainstorming instructions were removed. For this purpose, each factor was assessed independently by the authors RT-S and ET-K. The assessments were compared, and disagreements were resolved by discussion between RT-S and ET-K. Next, RT-S edited the remaining factors for grammar and spelling. Authors MK and AR reviewed the editing process to check whether they would arrive at the same selection and wording and made recommendations where appropriate. Finally, the set was entered into the GroupWisdom webtool, serving as the input for the sorting and rating activities.
Step 3: Sorting and Rating
At the beginning of step 3, the participants received instructions for the sorting and rating tasks. For the sorting task, the participants were asked to cluster the factors into self-created clusters and assign names to the clusters. The participants were instructed to keep the Donabedian SPO categories in mind while sorting each factor into the self-created clusters. For the rating task, each participant was asked to rate each factor on importance and measurability on a 5-point Likert scale, ranging from 1 (not important at all or not feasible to measure) to 5 (very important or very feasible to measure), by answering the questions, "How important is this factor for effective patient care with eHealth?" and "How feasible to measure is this factor?"
The participants had 3 weeks to complete the sorting and rating. They could save their activities and return later and received reminders after 10 and 15 days. The sorting data were approved for concept mapping analysis when a participant had completed at least 75% of the sorting activity and had created at least 3 clusters [ ]. The rating data were included when the participant had rated at least one factor.
Step 4: Concept Mapping Analysis
Concept mapping analysis consisted of 4 main activities: (1) generating a point map with the sorting data, (2) grouping factors into clusters using hierarchical cluster analysis, (3) selecting a concept map from the hierarchical cluster analysis, and (4) computing average ratings for each factor and cluster of the selected concept map [ ]. All computations were based on the concept mapping approach of Kane et al [ , ] and were conducted using the GroupWisdom webtool.
Generating a Point Map With the Sorting Data
The sorting data were analyzed to create a point map [ , , , ]. A point map is a 2-dimensional map in which each point represents a factor [ ]. The point map visually displayed the locations of all factors. Factors closer to each other on the point map were sorted together more frequently by the participants, whereas more distant factors on the map were sorted together less frequently [ , , ]. The point map was constructed using a similarity matrix and a multidimensional scaling algorithm. First, the similarity matrix indicated the number of times the various factors were grouped together. Next, the multidimensional scaling algorithm plotted the factors as points on the map [ , , ]. Subsequently, a stress value (0-1) was calculated, indicating the degree to which the distances on the point map fit the original similarity matrix [ , ]. The better the fit, the lower the stress value.
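To illustrate this computation, the following is a minimal sketch in Python, using made-up sorting data and the NumPy, scikit-learn, and SciPy libraries; the data, variable names, and stress formula are illustrative assumptions and do not reproduce the exact GroupWisdom implementation.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial.distance import pdist, squareform

# Hypothetical sorting data: each participant grouped factor indices into
# self-created clusters (here 8 factors and 2 participants for illustration).
sortings = [
    [{0, 1, 2}, {3, 4}, {5, 6, 7}],
    [{0, 1}, {2, 3, 4}, {5, 6, 7}],
]
n_factors = 8

# Similarity matrix: how often each pair of factors was sorted together.
similarity = np.zeros((n_factors, n_factors))
for participant in sortings:
    for group in participant:
        for i in group:
            for j in group:
                if i != j:
                    similarity[i, j] += 1

# Turn co-occurrence counts into dissimilarities for multidimensional scaling.
dissimilarity = similarity.max() - similarity
np.fill_diagonal(dissimilarity, 0)

# Multidimensional scaling plots the factors as points on a 2-dimensional map.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
points = mds.fit_transform(dissimilarity)

# Kruskal-type stress (0-1): lower values mean the point map distances
# fit the original (dis)similarity matrix better.
map_distances = squareform(pdist(points))
stress = np.sqrt(((map_distances - dissimilarity) ** 2).sum()
                 / (dissimilarity ** 2).sum())
print(points.shape, round(float(stress), 2))
```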
Grouping Factors Into Clusters With Hierarchical Cluster Analysis
The point map provided the input for the hierarchical cluster analysis. The hierarchical cluster analysis grouped the factors into clusters [ ] using the Ward algorithm [ ]. The algorithm proposed several concept map solutions, in which 2 clusters were merged at each successive solution.
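As an illustration of this step, the short sketch below (assuming the `points` array from the previous sketch and the SciPy library) shows how the Ward algorithm can generate a tree of candidate cluster solutions in which 2 clusters merge at each step; it is not the exact GroupWisdom computation.

```python
from scipy.cluster.hierarchy import linkage, fcluster

# Ward's hierarchical clustering on the 2-dimensional point map coordinates;
# `points` is the (n_factors x 2) array produced by multidimensional scaling.
merge_tree = linkage(points, method="ward")

# Each successive solution merges 2 clusters of the previous one; extract,
# for example, candidate concept map solutions with 3 up to 11 clusters.
solutions = {k: fcluster(merge_tree, t=k, criterion="maxclust")
             for k in range(3, 12)}
print(solutions[3])  # cluster label of each factor in the 3-cluster solution
```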
Selecting a Concept Map
From the proposed concept map solutions, a concept map that made sense for conceptualization was selected. There is no single correct number of clusters or mathematical decision criterion for selecting a concept map solution [ , ]. In this study, the number of clusters for the concept map was selected by determining a range of the highest and lowest numbers of clusters to review. This range was defined by the average number of clusters created by the participants plus and minus its SD. Subsequently, the cluster solutions in this range were reviewed to select the cluster level by following the cluster tree in the Methods section of the studies by Trochim [
] and Kane et al [ ]. Finally, in a meeting, 2 authors (RT-S and ET-K) and 2 participants reviewed the merging of clusters, beginning with the highest number of clusters and moving to the lowest. The 2 study participants were asked to join this meeting because of their extensive experience with eHealth, daily care processes, research, operational management, and concept mapping.
After establishing the number of clusters in the concept map, each factor was reviewed for compatibility with its cluster and to determine whether it was appropriate to move the factor to a different cluster. A cluster and its content were considered appropriate for inclusion when they were deemed essential and usable for the quality management model [ ]. In addition, each cluster received a name and description based on the cluster names that emerged from the sorting activity.
Computing Mean Ratings for Each Cluster and Factor of the Selected Concept Map
After the cluster map was selected, the relationships between the ratings were computed using pattern-matches and Go-zones [ ]. A pattern-match and its Pearson product-moment correlation (r value) were calculated to compare how the clusters of the selected concept map were rated on importance and measurability. The pattern-match visualized the mean ratings of each cluster in a ladder graph, connecting the mean importance and measurability ratings of each cluster with lines [ , ]. The r value represented the strength of the correlation between the 2 mean ratings across all clusters [ , ].
Finally, multiple Go-zones were computed: a Go-zone of the total point map and Go-zones per cluster of the selected concept map. A Go-zone is a 4-quadrant x-y graph [ ], visualizing the mean rating of each factor on the questions "How important is this factor?" and "How feasible to measure is this factor?" The minimum and maximum values for each axis were the minimum and maximum average Likert scores, respectively. The upper-right quadrant is called the Go-zone because it shows factors rated above the mean for both importance and measurability [ , ]. The pattern-match and Go-zones showed how important and measurable each cluster and its factors were rated for quality assessment by the individual participants during the sorting and rating step.
The selected concept map, with its calculations of importance and measurability for each cluster and factor, formed the basis for interpretation in the next step [ ].
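The following sketch (with made-up ratings, and the pandas and SciPy libraries as assumptions) illustrates the pattern-match and Go-zone computations described above; it is a simplified stand-in for the GroupWisdom output.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical mean participant ratings (1-5 Likert) per factor, with the
# cluster each factor belongs to in the selected concept map.
ratings = pd.DataFrame({
    "factor": [f"factor_{i}" for i in range(6)],
    "cluster": ["A", "A", "B", "B", "C", "C"],
    "importance": [4.2, 3.9, 4.5, 3.6, 4.1, 3.8],
    "measurability": [3.8, 4.0, 4.3, 3.5, 4.2, 3.7],
})

# Pattern-match: mean importance and measurability per cluster, plus the
# Pearson product-moment correlation (r value) between the two sets of means.
cluster_means = ratings.groupby("cluster")[["importance", "measurability"]].mean()
r_value, _ = pearsonr(cluster_means["importance"], cluster_means["measurability"])

# Go-zone: the upper-right quadrant contains factors rated above the mean on
# both "How important is this factor?" and "How feasible to measure is this factor?"
in_go_zone = ((ratings["importance"] > ratings["importance"].mean())
              & (ratings["measurability"] > ratings["measurability"].mean()))
print(cluster_means.round(2), round(r_value, 2),
      ratings.loc[in_go_zone, "factor"].tolist())
```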
Step 5: Interpretation of the Concept Map
The selected concept map, with its pattern-match and Go-zones, was discussed with an advisory board. On the basis of the pattern-match and Go-zones, the advisory board decided which clusters and factors should be included in the quality management model and the accompanying questionnaire. The advisory board consisted of 4 study participants from the brainstorming and sorting steps, of whom 2 had also participated in step 4 (concept mapping analysis). The advisors were chosen because they could be future users of the model. In addition, all had extensive experience with eHealth, with the health care business, and as health care professionals (general practitioners, nurses, anesthetists, and clinical psychologists) in different health care settings.
The advisors voted individually on which clusters and factors of the selected concept map should be included in the quality management model and questionnaire to ensure usability. Using a web-based survey, the following questions were asked: "Which cluster should be included in the quality management model based on the mean cluster rating scores of the pattern matches? Please, specify your choice." and "On which factors should the questionnaire give focus? Guide your choice by the Go-zones of each cluster and the Go-zone of the total point map. Please specify your choice." The advisors could not see each other's votes. With 75% (3/4) agreement or more, the clusters and factors concerned were operationalized in the quality assessment model and questionnaire. Where there was less agreement, the advisors viewed all responses, including the comments, and were asked to vote again. This process was repeated until 75% consensus was reached. The web-based survey results were used as input to develop the quality management model and its questionnaire.
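A minimal sketch of the consensus rule used in these voting rounds is given below; the vote data, function name, and threshold handling are hypothetical illustrations of the 75% agreement criterion described above.

```python
from collections import Counter

def tally(votes, threshold=0.75):
    """Split items into those reaching >= threshold agreement among the
    advisors (decided) and those that go into another voting round (revote)."""
    decided, revote = {}, []
    for item, item_votes in votes.items():
        choice, count = Counter(item_votes).most_common(1)[0]
        if count / len(item_votes) >= threshold:
            decided[item] = choice
        else:
            revote.append(item)
    return decided, revote

# Hypothetical first-round votes of the 4 advisors on 2 clusters.
votes = {
    "Organization outcomes": ["exclude", "exclude", "exclude", "include"],
    "Attentiveness to the patient": ["include", "include", "include", "include"],
}
print(tally(votes))  # both items reach 75% agreement, so no second round is needed
```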
Step 6: Utilization
Quality Management Model
The remaining clusters and their positions in the selected concept map provided the blueprint for the quality management model. First, the excluded clusters and factors were removed from the concept map. Second, the concept map with the remaining clusters was used to produce a logic model. A logic model is a framework that visualizes the interrelations between the clusters in graphic form and is therefore valuable for quality evaluation [ ]. The SPO framework [ , ] was used to identify logical interrelationships between the clusters. Accordingly, noticeable SPO connections between the clusters were drawn on the map by RT-S. A simplified version of the logic model was designed for clarity and readability. Authors SW, ET-K, and RT-S discussed the design of the quality management model to ensure the usability and clarity of the model.
Self-assessment Questionnaire
The questionnaire was drafted by RT-S with the remaining factors, taking the advisors' comments into account. The questionnaire should give health care organizations insight into the quality of their hybrid care and how that quality develops over time. On the one hand, the questionnaire must be easy to use and uniformly applicable, independent of the type of health care organization, type of eHealth, and disease. On the other hand, the questionnaire results must provide specific guidance for improving the quality of specific clusters and factors.
The concept model and questionnaire were submitted to the advisors for peer review of usability and clarity. Their comments were processed by RT-S, resulting in an improved draft. Finally, ET-K and SW peer reviewed the last draft to ensure that the advisors' comments were implemented entirely in the quality management model and the related questionnaire.
Ethics Approval
Approval by an ethics committee was not needed because no intervention or trial occurred in the sense that the research participants were subjected to actions or had modes of behavior imposed on them [ ].
Results
Participant Characteristics (Step 1)
A total of 39 people participated in this study. The participants had a mean age of 45.2 (SD 11.1) years and were mainly working in family medicine (12/39, 31%) or in a hospital (10/39, 26%), in a management function (16/39, 41%) or as a health care professional (14/39, 36%). A total of 59% (23/39) of the participants estimated their eHealth experience to be extensive. The 3 most commonly used eHealth tools were apps (37/147, 25.2%), web portals (35/147, 23.8%), and video communication (34/147, 23.1%). An overview of the participants' characteristics is shown in .
Of the 39 participants, 38 (97%) completed the brainstorming session. In all, 18% (7/38) of the participants dropped out after the brainstorming session, and a new participant joined the sorting and rating phase. In total, 79% (31/39) of the participants completed the sorting and rating phase ( ).
Variables | Values
Age (years), mean (SD) | 45.2 (11.1)
Main work setting, n (%)
  Family medicine | 12 (31)
  Hospital | 10 (26)
  Mental health clinic | 5 (13)
  Nursing and residential care | 5 (13)
  eHealth supplier | 4 (10)
  Research institute | 2 (5)
  Patient experts (self-employed) | 1 (3)
Main profession, n (%)a
  Manager, director, or project leader | 16 (41)
  Health care professional (eg, physician, nurse, therapist, or psychologist) | 14 (36)
  Patient expert (eg, patient or caregiver) | 5 (13)
  Researcher | 3 (8)
  Unknown | 1 (3)
eHealth technology experience, n (%)b
  Apps | 37 (25.2)
  Web portals (eg, electronic health records or personal care records) | 35 (23.8)
  Video communication | 34 (23.1)
  Sensors and wearables | 23 (15.6)
  Artificial intelligence | 13 (8.8)
  Domotics and robotics | 10 (6.8)
Estimated level of experience with eHealth, n (%)
  Extensive experience | 23 (59)
  Moderate experience | 15 (38)
  Limited experience | 1 (3)
aMany participants had dual roles, from which they were asked to choose one role.
bParticipants could select multiple answers.
Idea Generation (Step 2)
Brainstorming during idea generation resulted in a list of 203 factors. A total of 111 potential indicators were extracted from the literature study [ ]. Both lists were aggregated, resulting in a list of 314 factors. Editing of the data led to a final list of 78 factors. These 78 factors served as inputs for the sorting and rating activity. The list of 78 factors is provided in .
Sorting and Rating (Step 3)
The rating data of 32 participants were included in this study. All factors received mean rating scores of >3.1 for both importance and measurability. The mean ratings on the questions, "How important is this factor for successful integration of eHealth?" and "How feasible to measure is this factor?" are described in .
The sorting data of 8 participants were excluded, with the reason "less than 75% sorted" (n=4, 50%) or "sorted in two clusters" (n=4, 50%). The mean number of clusters in the approved data was 7 (SD 3.5), with a range of 3 to 15 clusters.
Concept Mapping Analysis (Step 4)
Visual Representation
The point map in shows how the 78 factors are related according to the sorting data. The point map had a stress value of 0.26, indicating a good fit with the original similarity matrix [ , ]. The point map displays the locations of all factors: factors that were frequently sorted together by the participants are plotted closer together, whereas unrelated factors are plotted farther from each other. The number of points corresponds to the number of factors presented in .
Selecting the Concept Map
Concept map solutions ranging from the 11-cluster to the 3-cluster option were reviewed (mean 7, SD 3.5). The 9-cluster concept map was selected as making the most sense for conceptualization. A few factors (n=14) were unanimously relocated to other clusters, leading to the concept map shown in . The relocated factors and the reasons for relocation are presented in . The 9 clusters were labeled and received a short description, as described in . The number of points corresponds to the number of factors presented in . The clusters represent how the participants sorted the factors into self-created clusters, using the proposed cluster labels.
Cluster numbera | Cluster label | Description | Included factors, n
1 | Quality information technology infrastructure and systems | Conditions concerning technology, information technology systems, and data. | 6 |
2 | Quality eHealth application | Conditions concerning the eHealth application. | 4 |
3 | Learning system: evaluation and improvement | Evaluation and realignment with stakeholders and the patient care objectives for a continuous development. | 4 |
4 | Vision, strategy, and organization | Responsibilities of the health care organization concerning vision, strategy, policy, leadership, funding, and work process designs. | 16 |
5 | Providing support to health care professionals | Conditions arranged by the health care organization to encourage the use of eHealth among its health care professionals. | 10 |
6 | Skills, knowledge, and attitude of health care professionals | Health care professionals’ ability to provide hybrid care. | 10 |
7 | Attentiveness to the patient | Organize the daily care process in line with the patient’s needs, demand for care, and its capacity. | 13 |
8 | Organization outcomes | Outcomes for the health care organization; for example, quality health care provision and health care logistics. | 5 |
9 | End results for the patient | Outcomes for the patients; for example, health, added value, satisfaction, ownership, and convenience. | 10 |
aThe number corresponds to the number of the cluster in .
Mean Ratings for Each Cluster and Factor of the Selected Concept Map
The pattern-match showed that all clusters had a mean score between 3.75 and 4.27 on importance and a mean score between 3.79 and 4.10 on measurability ( ). The cluster with the highest mean score on importance was Attentiveness to the patient (mean 4.27, SD 0.27), and the cluster with the highest mean score on measurability was End results for the patient (mean 4.10, SD 0.17). In contrast, the cluster with the lowest mean score on importance was Organization outcomes (mean 3.75, SD 0.36), whereas the cluster Quality eHealth application (mean 3.79, SD 0.45) had the lowest mean score on measurability. The r value was 0.63, indicating a moderate positive correlation between the importance and measurability ratings of the clusters. The mean ratings of the factors and the Go-zones per cluster are included in .
Interpretation of the Concept Map (Step 5)
The pattern-match and Go-zones were used as input to determine which clusters and factors of the selected concept map should be included in the quality management model and questionnaire. Decisions were made in 2 voting rounds. Of the 9 clusters, the cluster Organization outcomes was not included in the quality management model, based on the voting (3/4, 75% of the advisors had doubts about including the cluster in the model) and after discussion with the research team. The factors included in the questionnaire were those placed in the Go-zone of the total point map or the Go-zones of the clusters. As a result, 8 clusters remained in the model and 33 factors remained in the questionnaire, providing a manageable utility for quality assessment ( ). presents the responses and comments of the advisory board during the voting rounds.
The included clusters and factors.
Quality information technology infrastructure and systems (1)
- Information technology architecture available within the health care organization (1).
- Back-up scenario during technical problems (12).
Quality eHealth application (2)
- The eHealth application is user-friendly (35).
Learning system: evaluation and improvement (3)
- Cocreation: eHealth is developed, implemented and redeveloped with different stakeholders (8).
- Monitoring and evaluation of service and treatment results (58).
Vision, strategy, and organization (4)
- Support the implementation and development of eHealth in the organization with good project management (4).
- Mobilizing funding for working with eHealth (16).
- Clear internal policies regarding the use of eHealth (18).
- Vision supported by the line, “Why are we doing this?” (21).
- Care delivery with eHealth complies with laws and regulations (41).
- Financial reimbursements for eHealth deployment (42).
- Redesign the current work process and review what contributes to the desired care outcomes (47).
Providing support toward health care professionals (5)
- Health care professionals have easy access to information technology resources; for example, device, internet, screen, or headset (2).
- Embedding eHealth in the daily practice of health care professionals (11).
- Training and supervision for health care professionals (15).
- Help desk for health care professionals (17).
- Information on the treatment with eHealth is clear and accessible to the health care professional (19).
Skills, knowledge, and attitude of health care professionals (6)
- Good balance between face to face and eHealth for the health care professional (46).
- The health care professional has confidence in the eHealth application (70).
- The health care professional is satisfied with working with eHealth (74).
Attentiveness to the patient (7)
- Clear communication to the patient about how care is offered (10).
- Personalized care, considering patient needs with regard to (deployment of) eHealth (13).
- The patient has easy access to the necessary information technology resources; for example, device, Internet, and so on (30).
- Patients receive practical support in using the eHealth application; for example, a help desk (49).
- The patient has confidence in the eHealth application (67).
- The patient has the flexibility to use eHealth wherever and whenever it is convenient (72).
End results for the patient (9)
- The patient can integrate the use of eHealth in their daily life (33).
- Treatment with eHealth has a positive influence on the patient’s health (64).
- Treatment with eHealth contributes to the patient’s self-reliance (65).
- The patient is satisfied (68).
- The patient has easy access to care (71).
- eHealth provides logistical convenience for the patient (73).
- eHealth has added value for the patient (75).
Utilization (Step 6)
Utilization Model
The clusters and factors excluded in the voting rounds were removed from the selected concept map. The remaining clusters (n=8) and their factors (n=33) led to nonoverlapping clusters on the concept map. Above the clusters, the SPO categories were positioned as overarching themes to emphasize the interrelations between the clusters. In addition, a complex cluster map can be simplified into a logic model. A-C show the simplification of the model.
The overarching categories (structure, process, and outcome) and the clusters' interconnections refer to the Donabedian SPO framework [ , ]. The cluster Learning system is visualized by the arrows with the dashed lines. The numbers inside the clusters represent the number of factors included.
Utilization Questionnaire
The remaining 33 factors were included in the questionnaire, in which each factor can be measured on how effectively it is organized and how it develops over time. The advisory board noted that measuring the quality progress of hybrid health care is very important, in addition to learning and continuous improvement with stakeholders. Subsequently, the idea was to enrich the questionnaire with a quality progress tracker based on the plan-do-check-act (PDCA) cycle of Deming [ ]. Incorporating the PDCA cycle makes it possible to assess quality easily and uniformly, with tailored feedback for health care organizations. PDCA is a well-known cycle method for continuous improvement and quality measurement [ ]. The PDCA cycle assesses each factor's quality by measuring the extent to which the objective is tangible (plan), the plan is implemented (do), the plan is realized (check), and feedback on the quality of the execution is provided to make improvements (act) [ ]. Each factor can be monitored on the quality level of the PDCA cycle using a Likert score (0-10). A score of 0 means that there is no plan to improve the factor concerned, and a score of 10 means continuous improvement with stakeholders. The Likert scoring is based on the PDCA cycle and the 2 factors of the cluster Learning system, which include the following: (1) Cocreation: eHealth is being developed and implemented with various stakeholders and (2) Monitoring and evaluation of service and treatment outcomes. Using the PDCA cycle in combination with a Likert score provides a health care organization with insight into improvement possibilities for each factor or cluster.
Finally, the model and questionnaire were given a more convenient working name, the Hybrid Health Care Quality Assessment (HHQA). The HHQA model and questionnaire, with suggestions on how to use them, are explained in .
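To illustrate how such a PDCA-based progress tracker could be scored, the sketch below aggregates hypothetical 0-10 factor scores per cluster; the factor names are taken from the included factors listed above, while the scores and the helper code are illustrative assumptions rather than the published HHQA scoring procedure.

```python
from statistics import mean

# Hypothetical HHQA self-assessment responses: each factor gets a 0-10 score
# reflecting its position on the plan-do-check-act cycle (0 = no plan to
# improve the factor, 10 = continuous improvement with stakeholders).
responses = {
    "Vision, strategy, and organization": {
        "Clear internal policies regarding the use of eHealth": 6,
        "Financial reimbursements for eHealth deployment": 3,
    },
    "Attentiveness to the patient": {
        "Clear communication to the patient about how care is offered": 8,
        "The patient has confidence in the eHealth application": 5,
    },
}

# Aggregate per cluster so the organization can see where each part of its
# hybrid care stands on the PDCA cycle and track progress between assessments.
cluster_scores = {cluster: mean(factors.values())
                  for cluster, factors in responses.items()}
lowest_scoring = min(cluster_scores, key=cluster_scores.get)
print(cluster_scores, lowest_scoring)
```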
Discussion
Principal Findings
In this concept mapping study, we aimed to develop an SPO-based model and an accompanying self-assessment questionnaire for hybrid health care. By combining practice-based knowledge from eHealth users with an evidence-based literature review, we found that organizational, technological, process, and personal factors affect the quality of hybrid health care. Health care organizations must understand that these factors play a role in organizing hybrid health care and should be familiar with ways to improve them. The authors developed the HHQA, which can be used to systematically assess and improve the quality of hybrid health care.
The HHQA model includes 8 clusters. Cluster 1 (Vision, strategy, and organization) includes the responsibilities of the management to set the vision, strategy, policy, leadership, finance, and project management. Cluster 2 (Quality information technology infrastructure and systems) focuses on the information technology infrastructure and back-up scenarios for information technology issues. Cluster 3 (Quality eHealth application) concerns the user-friendliness of the digital health application itself. Cluster 4 (Providing support toward health care professionals) and cluster 5 (Skills, knowledge, and attitude of health care professionals) include factors concerning health care providers. Cluster 4 focuses on factors that should be arranged for the individual health care professional by the care organization, and cluster 5 includes the responsibilities of the professional. The patient is central in cluster 6 (Attentiveness to the patient). This cluster contains the measurement of factors that allow patients to increase their self-management and that consider the individual patient's needs. Patient centeredness is also reflected in cluster 7 (Patient outcomes), which includes factors such as the patient's health outcomes, added value, satisfaction, ownership, and convenience. Finally, cluster 8 (Learning system) forms the relationship between the continued development of hybrid health care with stakeholders and the health care provision objectives. The factors in cluster 8 provide insight into where alignment can be improved with other organizational criteria and actions, such as cost-benefit or capacity management.
The interdependencies of the clusters are logically expressed in the HHQA model through the overarching categories of the Donabedian SPO framework. Moreover, according to eHealth users, the clusters consist of the most important factors for the quality of hybrid health care. Using the questionnaire, each factor (33 in total) can be measured to determine how effectively it is organized and how it develops over time. Subsequently, the main results of the questionnaire are shown at the cluster level, and it is possible to zoom in on the relevant factors for each cluster.
Comparison With Literature
In our previous literature review [ ], we concluded that the capabilities of patients, health care professionals, and technology play a crucial role in the quality of hybrid health care. We also concluded that offering hybrid health care requires adjusting the daily care process and appropriate process monitoring. The conclusions from the literature review are reflected in the HHQA clusters: the patient's role is visible in the clusters Attentiveness to the patient and Patient outcomes; the health care professional's role is central in the clusters Providing support toward health care professionals and Skills, knowledge, and attitude of health care professionals; and technology is covered in the clusters Quality information technology infrastructure and systems and Quality eHealth application. The adjustment of the daily care processes is elaborated in the cluster Vision, strategy, and organization. Finally, monitoring is embedded in the cluster Learning system and the PDCA progress tracker.
The 8 clusters of the HHQA model fit the 3 overarching categories of the Donabedian SPO framework. According to Donabedian [ ], health care quality is based on aspects of these 3 categories and their relationships. The interaction between the categories can be bidirectional and is an "unbroken chain of antecedents, followed by intermediate ends, which are themselves the means to still further ends" [ ]. Our research translated the complex interaction between the categories structure, process, and outcome into user language.
The HHQA connects essential contributors to the quality of hybrid health care with a progress tracker. The relationship between quality contributors and continuous improvement also appears in the European Foundation for Quality Management Model (EFQM) [ , ]; the nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework [ ]; and the Consolidated Framework for Implementation Research (CFIR) [ , ]. All of these models approach the organizational structure, process, and outcomes with continuous improvement in a structured manner, but with different focus areas. For example, the EFQM is not specific to health care, in contrast to the NASSS and CFIR. The NASSS focuses on the adoption of technology and on reducing implementation complexity, whereas the CFIR emphasizes implementation in general. However, none of them is specified for the quality assessment and improvement of hybrid health care.
Nevertheless, it is interesting to conduct a detailed examination of the assessment questionnaires of the EFQM and NASSS. The EFQM deploys the Results-Approach-Deployed-Assessment-Refinement (RADAR) method [ , ], a questionnaire to assess quality improvement for each EFQM criterion, which incorporates the continuous improvement cycle. The assessment using the RADAR method is similar to the PDCA cycle in our questionnaire, as both monitor continuous quality improvement by completing a cycle of planning, executing, monitoring, and refining. However, the RADAR, like the EFQM model, is not specified for hybrid health care. In addition, the NASSS comes with a questionnaire to monitor the complexity of technology implementation in health care [ ], but its focus is on project management rather than on the hybrid health care process itself. Furthermore, there are other questionnaires measuring the quality of eHealth [ - ] or the quality of health care [ , ]. However, these questionnaires are concerned with the quality assessment of eHealth nationwide [ , ], the quality of a specific digital health application [ , ], or the quality of a specific disease pathway [ , ]. To the best of our knowledge, the HHQA is the first questionnaire measuring the quality of hybrid health care at an organizational level, taking the roles of the patient, health care professionals, and technology into account, accompanied by an improvement progress tracker. Therefore, the authors recommend using the HHQA to measure and improve the quality of hybrid health care.
Strengths and Limitations
This study has several strengths. First, the HHQA was developed in cocreation with stakeholders who are direct users of eHealth. Therefore, the HHQA content was drawn from inside the health care system itself and was not conceived or imposed from outside the health care organizations. Second, the stakeholders chose the included clusters and factors; the researchers only played a facilitating role. Consequently, the clusters and factors accurately reflect the stakeholders' views and values, expressed in their own words and visual representations. Third, the stakeholder group was diverse and consisted of representatives of health care professionals, patients, managers, researchers, and eHealth designers. Nevertheless, the stress value of the point map shows that the stakeholders' outcomes are highly compatible. Therefore, the study results are likely to be generalizable to everyday practice. Fourth, the model and questionnaire were developed by combining scientific and practice-based knowledge. Together, these strengths result in important factors for effective hybrid health care that cover different users' needs and organizational requirements.
Our study had some limitations. First, the questionnaire has not yet been tested in health care organizations; this will be done in a follow-up study. Although eHealth users from different health care organizations have reviewed the model and questionnaire, the model and questionnaire may still be too abstract for daily practice, as is often the case in scientific research [ - ]. A follow-up study could provide concrete recommendations on how to use the HHQA. Second, it is conceivable that other factors and clusters would emerge with other participants and in other health care environments. We attempted to overcome this problem by creating a diverse group of participants with different backgrounds, various eHealth experiences, and different kinds of health care settings. In addition, combining idea generation through brainstorming with the results of a systematic literature review reduces the risk of bias. Third, based on the analysis in the concept mapping phase, 14 factors were moved to other clusters. However, some of these factors were moved far across the map, which was not entirely in line with the spirit of group concept mapping. Nevertheless, we deemed it necessary to move these factors for substantive reasons. Fourth, the advisory group consisted of 4 participants. We wanted to avoid overquestioning the participants and, therefore, deliberately selected a group of delegates who reflected the diversity among the participants and who also had experience with quality management and concept mapping. Combined with in-depth preparation and discussion within the research group, this appeared to be the most feasible solution.
Finally, it is worth pointing out that the HHQA gives a first general impression of improvement opportunities, as there is much to be gained in taking the roles of the patient, health care professionals, and the technology used into account [ ]. Furthermore, the authors will continue with follow-up research and warmly welcome replication of the study to improve the HHQA, taking different users and health care environments into account.
Conclusions
This study developed a quality management model and an accompanying self-assessment questionnaire tailored for hybrid health care, the HHQA. A quality model for hybrid care is indispensable for effectively integrating eHealth into regular care and delivering high-quality health care. The HHQA covers all relevant aspects for the assessment and sustainable improvement of hybrid health care and the interrelations of eHealth with organizational, technical, and human factors. The next step is to validate and apply the HHQA model and questionnaire in practice.
Acknowledgments
The authors thank the experts who participated in the brainstorming and sorting phase, Arjen Huizinga (MiGuide), Bart van Pinxteren (eHuisarstenkompas; Huisartsen Oog in Al; Huisartsen Utrecht Stad; Nederlands Huisartsen Genootschap; Spindok); Bart Timmers (Groepspraktijk Huisartsen Bergh), Beverly Rose (Vereniging van Ervaringsdeskundigen), Corine van Barneveld (Saltro), Caroline Meijer (Leiden University Medical Centre), Erwin van Boxtel (Thebe), Folkert de Winter (Sint Antonius Ziekenhuis), Geert-Jan van Hal (self-employed), Hans in het Veen (Franciscus Gasthuis and Vlietland), Heleen Krabben (Medicinemen), Ineke Kamp, Irvin Talboom (Huisartsen Zorggroep Breda), Jan Frans Mutsaerts (Het Huisartsenteam; Familiedokters), Jeanette Ploeger (Minddistrict), Joris Arts (DiSofa; Geestelijke GezondheidsZorg Noord Holland Noord), Joyce Bierbooms (Geestelijke GezondheidsZorg Eindhoven), Judie Knol (de Neckar), Kim Brons (Leiden University Medical Centre; Process in progress), Leonoor van Dam van Isselt (Leiden University Medical Centre, Pieter van Foreest; Vilente), Maarten Ellenbroek (Topaz), Marijke de Vries (Leiden University Medical Centre), Marjolein Eldenhorst (Leiden University Medical Centre), Marjon Peters (Insight4u), Maryse Spapens (Zorgkompaz), Melchior Nierman (Atal Medial), Mieke Klerkx (self-employed), Lya van der Veen, Pascale Schure (Huisartsen Linschoten; Unilabs Group), Paul Haarkamp (Carinova), Ryan Esser (Incluzio), Stephanie Wouthuis (Gezondheidscentrum De Boog; Huisartsenpraktijk Nijdam; Saltro), Stijn de Ruijter (doccs huisartsenpraktijk), Wim Green (Leiden University Medical Centre).
The authors extend special thanks to Cynthia Hallensleben (Gezondheidscentrum Spoorlaan; Leiden University Medical Centre), Fred de Boer (Leiden University Medical Centre), Nathan Bachran (Geestelijke GezondheidsZorg Oost Brabant), and Tobias Bonten (Huisartsenpraktijk L Broek; Leiden University Medical Centre) for their advice on the development of the model and questionnaire, to Marc Smelik (IE Business School) for reviewing the manuscript textually, and to Léon Tossaint (former chief executive officer of the European Foundation for Quality Management Model) for his explanation of the Results-Approach-Deployed-Assessment-Refinement method of the European Foundation for Quality Management Model.
Conflicts of Interest
None declared.
Mean (SD) rating scores of clusters and factors.
DOCX File, 38 KB
Relocation factors and their reasons.
DOCX File, 16 KB
Results voting "which clusters and factors to include" and given comments.
DOCX File, 292 KB
Suggestion utilization Hybrid Health Care Quality Assessment questionnaire.
XLSX File (Microsoft Excel File), 84 KB
References
- van der Kleij RM, Kasteleyn MJ, Meijer E, Bonten TN, Houwink EJ, Teichert M, et al. SERIES: eHealth in primary care. Part 1: concepts, conditions and challenges. Eur J Gen Pract 2019 Oct;25(4):179-189. [CrossRef] [Medline]
- Nijland N. Grounding eHealth: towards a holistic framework for sustainable eHealth technologies. University of Twente. 2011 Jan 21. URL: https://research.utwente.nl/en/publications/grounding-ehealth-towards-a-holistic-framework-for-sustainable-eh [accessed 2020-07-30]
- Hibbard JH, Mahoney ER, Stock R, Tusler M. Do increases in patient activation result in improved self-management behaviors? Health Serv Res 2007 Aug;42(4):1443-1463. [CrossRef] [Medline]
- Boers SN, Jongsma KR, Lucivero F, Aardoom J, Büchner FL, de Vries M, et al. SERIES: eHealth in primary care. Part 2: exploring the ethical implications of its application in primary care practice. Eur J Gen Pract 2020 Dec;26(1):26-32. [CrossRef] [Medline]
- van Hattem NE, Silven AV, Bonten TN, Chavannes NH. COVID-19's impact on the future of digital health technology in primary care. Fam Pract 2021 Nov 24;38(6):845-847. [CrossRef] [Medline]
- Golinelli D, Boetto E, Carullo G, Nuzzolese AG, Landini MP, Fantini MP. Adoption of digital technologies in health care during the COVID-19 pandemic: systematic review of early scientific literature. J Med Internet Res 2020 Nov 06;22(11):e22280 [FREE Full text] [CrossRef] [Medline]
- Thulesius H. Increased importance of digital medicine and eHealth during the Covid-19 pandemic. Scand J Prim Health Care 2020 Jun;38(2):105-106. [CrossRef] [Medline]
- Health at a Glance 2021. Organisation for Economic Cooperation and Development. 2021. URL: https://www.oecd-ilibrary.org/social-issues-migration-health/health-at-a-glance-2021_ae3016b9-en [accessed 2022-03-08]
- Zorgkeuzes in Kaart 2020: Analyse van beleidsopties van politieke partijen voor de zorg. Centraal Planbureau. 2020 Jul 24. URL: https://www.cpb.nl/zorgkeuzes-in-kaart-2020 [accessed 2021-11-25]
- Mann DM, Chokshi SK, Kushniruk A. Bridging the gap between academic research and pragmatic needs in usability: a hybrid approach to usability evaluation of health care information systems. JMIR Hum Factors 2018 Nov 28;5(4):e10721 [FREE Full text] [CrossRef] [Medline]
- Heckemann B, Wolf A, Ali L, Sonntag SM, Ekman I. Discovering untapped relationship potential with patients in telehealth: a qualitative interview study. BMJ Open 2016 Mar 02;6(3):e009750 [FREE Full text] [CrossRef] [Medline]
- Aardoom JJ, van Deursen L, Rompelberg CJ, Standaar LM, Suijkerbuijk AW, van Tuyl LH, et al. Indicatoren E-healthmonitor 2021-2023 en doelstellingen voor e-health. Rijksinstituut voor Volksgezondheid en Milieu. 2021. URL: https://rivm.openrepository.com/handle/10029/624864 [accessed 2022-03-09]
- Snyder H, Engström J. The antecedents, forms and consequences of patient involvement: a narrative review of the literature. Int J Nurs Stud 2016 Jan;53:351-378. [CrossRef] [Medline]
- Bentvelsen RG, van der Vaart R, Veldkamp KE, Chavannes NH. Systematic development of an mHealth app to prevent healthcare-associated infections by involving patients: ‘Participatient’. Clin eHealth 2021;4:37-44 [FREE Full text] [CrossRef]
- Bruce CR, Harrison P, Nisar T, Giammattei C, Tan NM, Bliven C, et al. Assessing the impact of patient-facing mobile health technology on patient outcomes: retrospective observational cohort study. JMIR Mhealth Uhealth 2020 Jun 26;8(6):e19333 [FREE Full text] [CrossRef] [Medline]
- Matamala-Gomez M, Maisto M, Montana JI, Mavrodiev PA, Baglio F, Rossetto F, et al. The role of engagement in teleneurorehabilitation: a systematic review. Front Neurol 2020 May 6;11:354 [FREE Full text] [CrossRef] [Medline]
- Tossaint-Schoenmakers R, Versluis A, Chavannes N, Talboom-Kamp E, Kasteleyn M. The challenge of integrating eHealth into health care: systematic literature review of the Donabedian model of structure, process, and outcome. J Med Internet Res 2021 May 10;23(5):e27180 [FREE Full text] [CrossRef] [Medline]
- Chan SR, Torous J, Hinton L, Yellowlees P. Mobile tele-mental health: increasing applications and a move to hybrid models of care. Healthcare (Basel) 2014 May 06;2(2):220-233 [FREE Full text] [CrossRef] [Medline]
- Hughes MC, Gorman JM, Ren Y, Khalid S, Clayton C. Increasing access to rural mental health care using hybrid care that includes telepsychiatry. J Rural Ment Health 2019 Jan;43(1):30-37. [CrossRef]
- van Buul AR, Derksen C, Hoedemaker O, van Dijk O, Chavannes NH, Kasteleyn MJ. eHealth program to reduce hospitalizations due to acute exacerbation of chronic obstructive pulmonary disease: retrospective study. JMIR Form Res 2021 Mar 18;5(3):e24726 [FREE Full text] [CrossRef] [Medline]
- Dijkstra A, Heida A, van Rheenen PF. Exploring the challenges of implementing a Web-based telemonitoring strategy for teenagers with inflammatory bowel disease: empirical case study. J Med Internet Res 2019 Mar 29;21(3):e11761 [FREE Full text] [CrossRef] [Medline]
- Hinman RS, Nelligan RK, Bennell KL, Delany C. "Sounds a bit crazy, but it was almost more personal:" a qualitative study of patient and clinician experiences of physical therapist-prescribed exercise for knee osteoarthritis via Skype. Arthritis Care Res (Hoboken) 2017 Dec;69(12):1834-1844. [CrossRef] [Medline]
- Hadjistavropoulos HD, Nugent MM, Dirkse D, Pugh N. Implementation of Internet-delivered cognitive behavior therapy within community mental health clinics: a process evaluation using the consolidated framework for implementation research. BMC Psychiatry 2017 Sep 12;17(1):331 [FREE Full text] [CrossRef] [Medline]
- Talboom-Kamp E, Tossaint-Schoenmakers R, Goedhart A, Versluis A, Kasteleyn M. Patients' attitudes toward an online patient portal for communicating laboratory test results: real-world study using the eHealth impact questionnaire. JMIR Form Res 2020 Mar 04;4(3):e17060 [FREE Full text] [CrossRef] [Medline]
- Tossaint-Schoenmakers R, Kasteleyn M, Goedhart A, Versluis A, Talboom-Kamp E. The impact of patient characteristics on their attitudes toward an online patient portal for communicating laboratory test results: real-world study. JMIR Form Res 2021 Dec 17;5(12):e25498 [FREE Full text] [CrossRef] [Medline]
- Chavennes NH. eHealth in Disease Management: doel of tool? Leiden University. 2015. URL: https://scholarlypublications.universiteitleiden.nl/handle/1887/51560 [accessed 2021-04-29]
- Mitchell M, Getchell M, Nkaka M, Msellemu D, Van Esch J, Hedt-Gauthier B. Perceived improvement in integrated management of childhood illness implementation through use of mobile technology: qualitative evidence from a pilot study in Tanzania. J Health Commun 2012;17 Suppl 1:118-127. [CrossRef] [Medline]
- Budhwani S, Fujioka JK, Chu C, Baranek H, Pus L, Wasserman L, et al. Delivering mental health care virtually during the COVID-19 pandemic: qualitative evaluation of provider experiences in a scaled context. JMIR Form Res 2021 Sep 21;5(9):e30280 [FREE Full text] [CrossRef] [Medline]
- Swinkels IC, Huygens MW, Schoenmakers TM, Oude Nijeweme-D'Hollosy W, van Velsen L, Vermeulen J, et al. Lessons learned from a living lab on the broad adoption of eHealth in primary health care. J Med Internet Res 2018 Mar 29;20(3):e83 [FREE Full text] [CrossRef] [Medline]
- van Gemert-Pijnen JE, Nijland N, van Limburg M, Ossebaard HC, Kelders SM, Eysenbach G, et al. A holistic framework to improve the uptake and impact of eHealth technologies. J Med Internet Res 2011 Dec 05;13(4):e111 [FREE Full text] [CrossRef] [Medline]
- Granja C, Janssen W, Johansen MA. Factors determining the success and failure of eHealth interventions: systematic review of the literature. J Med Internet Res 2018 May 01;20(5):e10235 [FREE Full text] [CrossRef] [Medline]
- Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A'Court C, et al. Beyond adoption: a new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res 2017 Nov 01;19(11):e367 [FREE Full text] [CrossRef] [Medline]
- Ross J, Stevenson F, Lau R, Murray E. Factors that influence the implementation of e-health: a systematic review of systematic reviews (an update). Implement Sci 2016 Oct 26;11(1):146. [CrossRef] [Medline]
- Donabedian A. The quality of care. How can it be assessed? JAMA 1988 Sep 23;260(12):1743-1748. [CrossRef] [Medline]
- Donabedian A. Evaluating the quality of medical care. 1966. Milbank Q 2005;83(4):691-729. [CrossRef] [Medline]
- Rademakers J, Delnoij D, de Boer D. Structure, process or outcome: which contributes most to patients' overall assessment of healthcare quality? BMJ Qual Saf 2011 Apr;20(4):326-331. [CrossRef] [Medline]
- Kane M, Trochim WM. Using concept mapping in planning. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:135-156.
- Rosas SR, Kane M. Quality and rigor of the concept mapping methodology: a pooled study analysis. Eval Program Plann 2012 May;35(2):236-245. [CrossRef] [Medline]
- Trochim WM. Concept mapping: soft science or hard art? Eval Program Plann 1989 Jan;12(1):87-110. [CrossRef]
- van Bon-Martens MJ, Achterberg PW, van de Goor IA, van Oers HA. Towards quality criteria for regional public health reporting: concept mapping with Dutch experts. Eur J Public Health 2012 Jun;22(3):337-342. [CrossRef] [Medline]
- Groupwisdom. URL: https://groupwisdom.com [accessed 2021-03-08]
- Trochim WM, McLinden D. Introduction to a special issue on concept mapping. Eval Program Plann 2017 Feb;60:166-175. [CrossRef] [Medline]
- Minkman M, Ahaus K, Fabbricotti I, Nabitz U, Huijsman R. A quality management model for integrated care: results of a Delphi and Concept Mapping study. Int J Qual Health Care 2009 Feb;21(1):66-75. [CrossRef] [Medline]
- Bonten TN, Rauwerdink A, Wyatt JC, Kasteleyn MJ, Witkamp L, Riper H, EHealth Evaluation Research Group. Online guide for electronic health evaluation approaches: systematic scoping review and concept mapping study. J Med Internet Res 2020 Aug 12;22(8):e17774 [FREE Full text] [CrossRef] [Medline]
- Rauwerdink A, Kasteleyn MJ, Haafkens JA, Chavannes NH, Schijven MP, steering committee, of the Citrien fund program eHealth. A national eHealth vision developed by University Medical Centres: a concept mapping study. Int J Med Inform 2020 Jan;133:104032. [CrossRef] [Medline]
- van Engen-Verheul M, Peek N, Vromen T, Jaspers M, de Keizer N. How to use concept mapping to identify barriers and facilitators of an electronic quality improvement intervention. Stud Health Technol Inform 2015;210:110-114. [Medline]
- Svobodova I, Filakovska Bobakova D, Bosakova L, Dankulincova Veselska Z. How to improve access to health care for Roma living in social exclusion: a concept mapping study. Int J Equity Health 2021 Feb 12;20(1):61 [FREE Full text] [CrossRef] [Medline]
- Hargett CW, Doty JP, Hauck JN, Webb AM, Cook SH, Tsipis NE, et al. Developing a model for effective leadership in healthcare: a concept mapping approach. J Healthc Leadersh 2017 Aug 28;9:69-78 [FREE Full text] [CrossRef] [Medline]
- van Bon-Martens MJ, van de Goor LA, Holsappel JC, Kuunders TJ, Jacobs-van der Bruggen MA, te Brake JH, et al. Concept mapping as a promising method to bring practice into science. Public Health 2014 Jun;128(6):504-514. [CrossRef] [Medline]
- Kane M, Trochim WM. An introduction to concept mapping. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:1-26.
- Kane M, Trochim WM. Preparing for concept mapping. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:28-48.
- Kane M, Trochim WM. Generating the ideas. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:49-66.
- Trochim WM. An introduction to concept mapping for planning and evaluation. Eval Program Plann 1989 Jan;12(1):1-16 [FREE Full text] [CrossRef]
- Kane M, Trochim WM. Concept mapping analysis. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:87-110.
- Group Concept Mapping Steps. Groupwisdom. URL: https://groupwisdom.com/GCMRG#GCM [accessed 2021-03-08]
- Kane M, Trochim WM. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2007.
- Kane M, Trochim WM. Interpreting the maps. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:112-134.
- Trochim W, Kane M. Concept mapping: an introduction to structured conceptualization in health care. Int J Qual Health Care 2005 Jun;17(3):187-191. [CrossRef] [Medline]
- Kane M, Trochim WM. Using concept mapping in evaluation. In: Kane M, Trochim WM, editors. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA, USA: Sage Publications; 2011:157-174.
- Medical Research Involving Human Subjects Act (WMO). Central Committee on Research Involving Human Subjects. URL: https://english.ccmo.nl/investigators/legal-framework-for-medical-scientific-research/laws/medical-research-involving-human-subjects-act-wmo [accessed 2021-11-04]
- Harteloh PP, Casparie AF. Kwaliteit van zorg. Van een zorginhoudelijke benadering naar een bedrijfskundige aanpak [Quality of care: from a care-content approach to a business management approach]. Utrecht, The Netherlands: Elsevier; Jul 1, 1994.
- Fonseca L, Amaral A, Oliveira J. Quality 4.0: the EFQM 2020 model and industry 4.0 relationships and implications. Sustainability 2021 Mar 12;13(6):3107. [CrossRef]
- European Foundation for Quality Management. URL: https://www.efqm.org/ [accessed 2022-03-04]
- Sarkies M, Long JC, Pomare C, Wu W, Clay-Williams R, Nguyen HM, et al. Avoiding unnecessary hospitalisation for patients with chronic conditions: a systematic review of implementation determinants for hospital avoidance programmes. Implement Sci 2020 Oct 21;15(1):91 [FREE Full text] [CrossRef] [Medline]
- Versluis A, van Luenen S, Meijer E, Honkoop PJ, Pinnock H, Mohr DC, et al. SERIES: eHealth in primary care. Part 4: addressing the challenges of implementation. Eur J Gen Pract 2020 Dec;26(1):140-145. [CrossRef] [Medline]
- European Foundation for Quality Management. The EFQM Excellence Model: Large Company, Operational and Business Unit Version. Brussels, Belgium: European Foundation for Quality Management; 2003.
- Sokovic M, Pavletic D, Pipan MK. Quality improvement methodologies – PDCA Cycle, RADAR Matrix, DMAIC and DFSS. J Achiev Mater Manuf Eng 2010;43(1):476-483 [FREE Full text]
- Greenhalgh T, Maylor H, Shaw S, Wherton J, Papoutsi C, Betton V, et al. The NASSS-CAT tools for understanding, guiding, monitoring, and researching technology implementation projects in health and social care: protocol for an evaluation study in real-world settings. JMIR Res Protoc 2020 May 13;9(5):e16861 [FREE Full text] [CrossRef] [Medline]
- Mousavi SM, Takian A, Tara M. Design and validity of a questionnaire to assess national eHealth architecture (NEHA): a study protocol. BMJ Open 2018 Dec 22;8(12):e022885 [FREE Full text] [CrossRef] [Medline]
- Vis C, Bührmann L, Riper H, Ossebaard HC. Health technology assessment frameworks for eHealth: a systematic review. Int J Technol Assess Health Care 2020 Jun;36(3):204-216. [CrossRef] [Medline]
- Currie WL. TEMPEST: an integrative model for health technology assessment. Health Policy Technol 2012 Mar;1(1):35-49. [CrossRef]
- ISO/TS 82304-2:2021 – Health software — Part 2: Health and wellness apps — Quality and reliability. International Organization for Standardization. 2021. URL: https://www.iso.org/standard/78182.html [accessed 2022-03-17]
- Campmans-Kuijpers MJ, Lemmens LC, Baan CA, Gorter KJ, Groothuis J, van Vuure KH, et al. Defining and improving quality management in Dutch diabetes care groups and outpatient clinics: design of the study. BMC Health Serv Res 2013 Apr 05;13:129 [FREE Full text] [CrossRef] [Medline]
- Zonneveld N, Vat LE, Vlek H, Minkman MM. The development of integrated diabetes care in the Netherlands: a multiplayer self-assessment analysis. BMC Health Serv Res 2017 Mar 21;17(1):219 [FREE Full text] [CrossRef] [Medline]
- Banks GC, Barnes CM, Jiang K. Changing the conversation on the science–practice gap: an adherence-based approach. J Manag 2021 Feb 22;47(6):1347-1356. [CrossRef]
- Wandersman A. Community science: bridging the gap between science and practice with community-centered models. Am J Community Psychol 2003 Jun;31(3-4):227-242. [CrossRef]
- Disler RT, Gallagher RD, Davidson PM. Factors influencing self-management in chronic obstructive pulmonary disease: an integrative review. Int J Nurs Stud 2012 Feb;49(2):230-242. [CrossRef] [Medline]
Abbreviations
CFIR: Consolidated Framework for Implementation Research
EFQM: European Foundation for Quality Management Model
HHQA: Hybrid Health Care Quality Assessment
NASSS: nonadoption, abandonment, scale-up, spread, sustainability
PDCA: plan-do-check-act
RADAR: Results-Approach-Deployed-Assessment-Refinement
SPO: Structure-Process-Outcome
Edited by A Mavragani; submitted 14.04.22; peer-reviewed by M Antoniou, G Deckard; comments to author 08.05.22; revised version received 01.06.22; accepted 07.06.22; published 07.07.22
Copyright © Rosian Tossaint-Schoenmakers, Marise J Kasteleyn, Anneloek Rauwerdink, Niels Chavannes, Sofie Willems, Esther P W A Talboom-Kamp. Originally published in JMIR Formative Research (https://formative.jmir.org), 07.07.2022.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.