Original Paper
Abstract
Background: Congenital heart disease (CHD) is a birth defect of the heart that requires long-term care and often leads to additional health complications. Effective educational strategies are essential for improving health literacy and care outcomes. Despite affecting around 40,000 children annually in the United States, there is a gap in understanding children’s health literacy, parental educational burdens, and the efficiency of health care providers in delivering education.
Objective: This qualitative pilot study aims to develop tailored assessment tools to evaluate educational needs and burdens among children with CHD, their parents, and health care providers. These assessments will inform the design of medical education toys to enhance health management and outcomes for pediatric patients with CHD and key stakeholders.
Methods: Through stakeholder feedback from pediatric patients with CHD, parents, and health care providers, we developed three tailored assessments in two phases: (1) iterative development of the assessment tools and (2) pilot testing. In the first phase, we defined key concepts, conducted a literature review, and created initial drafts of the assessments. During the pilot-testing phase, 12 participants were recruited at the M Health Fairview Pediatric Specialty Clinic for Cardiology—Explorer in Minneapolis, Minnesota, United States. We gathered feedback using qualitative methods, including cognitive interviews (think-aloud techniques and verbal probing) and observations of nonverbal cues. The data were analyzed to identify the strengths and weaknesses of each assessment item and areas for improvement.
Results: The 12 participants included children with CHD (n=5), parents (n=4), and health care providers (n=3). The results showed the feasibility and effectiveness of the tailored assessments. Participants showed high levels of engagement and found the assessment items relevant to their education needs. Iterative revisions based on participant feedback improved the assessments’ clarity, relevance, and engagement for all stakeholders, including children with CHD.
Conclusions: This pilot study emphasizes the importance of iterative assessment development, focusing on multistakeholder engagement. The insights gained from the development process will guide the creation of tailored assessments and inform the development of child-led educational interventions for pediatric populations with CHD.
doi:10.2196/63818
Introduction
Background
Congenital heart disease (CHD) is a heart abnormality present at birth, requiring intensive medical care and often leading to life-threatening complications [ ]. It affects around 40,000 newborns annually in the United States and 1% globally, posing lifelong challenges for children, families, and health care systems [ - ]. Despite significant advancements in diagnosis and treatment, CHD remains a chronic condition that demands ongoing care and specialized educational resources [ , ]. Tailored education is critical for addressing the needs of children with CHD, providing accessible educational support for parents, and equipping health care providers with efficient educational strategies. However, no tools currently exist to understand children’s health literacy, assess parental educational burdens, or measure health care providers’ efficiency in delivering education [ , ].
Children with CHD face unique barriers in understanding their condition due to developmental challenges, such as difficulty grasping abstract medical concepts or relating them to their lived experiences. These limitations can hinder their ability to adhere to treatment plans, actively participate in care, or respond effectively during emergencies and transitions to adult care [ , , - ]. Despite these challenges, pediatric patients are often excluded from health literacy efforts, as most resources are designed for parents or caregivers. This exclusion highlights the critical need for age-appropriate tools that empower children to engage with their care actively, ultimately improving adherence and long-term health outcomes [ , - ].
Parents, meanwhile, bear a significant educational burden. They must interpret medical jargon, simplify it for their child, and act as intermediaries with health care providers, all while managing the emotional strain of caregiving and the cognitive load of understanding complex medical information. This burden can increase stress, reduce caregiving effectiveness, and impact family well-being. Low parental health literacy further compounds these challenges, as it is linked to medical errors and poorer health outcomes for children [ - ]. Furthermore, health care providers face the challenge of balancing clear communication with time and resource constraints, often struggling to deliver education tailored to the needs of both children and parents [ , - ]. Addressing these gaps is essential for improving communication, care coordination, and health outcomes for children with CHD and their families [ , ].
Study Objectives
As part of a multiphase research project, this pilot study aims to close these gaps by developing tailored assessment tools for key stakeholders in pediatric CHD education. This research comprises two phases: (1) iterative development of the assessment tools and (2) pilot testing to create, refine, and validate 3 CHD-specific assessments based on stakeholder feedback in a real-world setting. We gathered feedback using cognitive interviews and observations to assess each assessment item [ - ]. We iteratively developed face-to-face and computer-based assessments based on this feedback from all stakeholders, including children. The assessments measure changes in children’s knowledge, parental educational burdens, and health care provider efficiency before and after interventions. The results show that developing and refining these assessments is essential before introducing our medical education toy. The objectives of this study are 3-fold:
- To assess changes in knowledge of children with CHD before and after educational interventions to empower pediatric engagement in health care innovation.
- To measure the educational burden experienced by parents before and after interventions, focusing on their needs and challenges in navigating CHD-related educational responsibilities.
- To assess the efficiency and effectiveness of health care providers in communicating essential information to children with CHD and their families to improve care coordination and patient outcomes.
Prior Work
Assessments Developed for Pediatric Patients With CHD, Parents, and Health Care Providers
Assessments are essential for evaluating the health status and practices of pediatric patients, their parents, and health care providers across various domains [ - ]. These domains include health-related quality of life, emotional well-being, physical or psychological health, social support, and behavioral problems. For pediatric patients aged 8-18 years, assessments often involve parent proxies completing tools for younger children [ - ]. Commonly used tools in this context include the Pediatric Quality of Life Inventory (PedsQL), Pediatric Cardiac Quality of Life Inventory (PCQLI) [ - ], Child Health Questionnaire (CHQ), and Child Behavior Checklist (CBCL). However, there is a notable gap in health literacy measures for children younger than 9 years [ ]. While some tools, such as the Food Label Literacy for Applied Nutrition Knowledge (FLLANK) questionnaire [ ], exist, none are specifically tailored to disease populations such as those with CHD.
Parental well-being and caregiving burden assessments are also critical for understanding and supporting effective caregiving. However, the specific burden related to education remains underexplored. Existing tools, such as the Parenting Stress Index/Parental Stress Scale (PSI/PSS) and the Family Impact Module (FIM) of the PedsQL, focus on parental stress, caregiving difficulty, and family functioning [ - ]. Similarly, tools such as the Caregiver Health Self-Assessment, which aims to improve the caregiver-provider dyadic relationship (commonly for older adult patients), do not address the educational needs of caregivers of patients with CHD [ - ].
For health care providers, efficiency assessments—measuring the ability to minimize wasted time and maximize outcomes—remain an ongoing challenge despite progress in evaluating physician and hospital care effectiveness [ , ]. While tools such as Key Performance Indicators (KPIs), Lean Six Sigma (LSS), and Performance Metrics are widely used to evaluate provider and hospital performance, their primary focus is on patient care outcomes (eg, patient satisfaction). These methods often overlook the educational challenges faced by health care providers themselves [ - ]. Addressing these gaps can enhance our understanding of the needs and well-being of CHD stakeholders, including children, enabling the development of more targeted and effective educational interventions. More details about these assessments are available in .
Empowering Pediatric Engagement in Health Care Interventions
Engaging children effectively in health care interventions and assessments is challenging due to their unique developmental hurdles [ - ]. Younger children face difficulties participating because of limited attention spans and cognitive abilities, requiring reliance on parental feedback as a proxy for designing and testing interventions [ , ]. To address these challenges, studies suggest using interviews, focus groups, and activity-based methods [ - ].
Prior studies in design, human-computer interaction, and health care domains indicate that adolescents aged 13-17 years engage well in interviews and focus groups. In contrast, younger children, particularly those aged 4-12 years, benefit more from creative techniques such as activity-based methods [ - ]. Furthermore, studies highlight the importance of social factors, such as ongoing support, in alleviating children’s potential stress during research activities [ ]. Empowering children through age-appropriate strategies, coupled with family facilitation, enhances their sense of ownership in health care decision-making. These approaches not only improve engagement but also strengthen the research process by addressing children’s developmental needs and promoting their active participation [ - ].
Methods
We developed and refined 3 CHD-specific assessments to evaluate children’s knowledge, parental educational burdens, and health care provider efficiency in delivering education. This process included defining key concepts, conducting a literature review, and designing initial assessment tools. These assessments were subsequently pilot-tested with stakeholders at the M Health Fairview Pediatric Specialty Clinic for Cardiology—Explorer in Minneapolis, Minnesota, United States, to ensure validity and relevance.
Assessment Development
CHD Health Literacy Children Assessment
We developed the CHD Health Literacy Children Assessment (CHD-HLCA) to evaluate CHD health literacy in children aged 4-10 years before and after educational interventions. This tool draws inspiration from the FLLANK questionnaire [ ] and incorporates storytelling techniques to enhance engagement [ ]. The assessment consists of 10 questions featuring simple black and white icons without color or intricate details to reduce visual distractions and avoid potential psychological and physiological influences of color. Instead, children use colored markers to answer, fostering their engagement and enthusiasm during the assessment process [ , ].
This self-report assessment is administered with the assistance of the research team or parents, particularly for younger children. The assessment covers various knowledge dimensions, including understanding CHD, preparing for doctor visits, and practicing self-care [ ]. Questions offer 2 comparison options to minimize complexity, using simple icons and illustrations to reduce distractions and prevent cognitive overload. A “can’t tell” option is also included to ensure that children feel comfortable expressing uncertainty. Furthermore, a Smiley Face Likert scale question assesses their general knowledge about their heart [ ], a common method in children’s surveys. The assessment is conducted face-to-face with parental involvement, fostering a supportive environment where children can freely express themselves through drawing, crafting, or pointing to icons. It takes approximately 5-10 minutes per child and evaluates 4 key domains specific to CHD: Conceptual Knowledge, Comprehension, Appraisal, and Application/Function ( [ ]).
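To illustrate how responses to this kind of instrument could be recorded and summarized digitally, the sketch below models a two-option comparison item with a “can’t tell” choice and a domain tag. The data model, field names, and per-domain tally are illustrative assumptions for this sketch, not the CHD-HLCA’s actual implementation or scoring rules.

```python
from dataclasses import dataclass
from typing import List, Optional

# Domains taken from the CHD-HLCA description; the data model and tally
# below are illustrative assumptions, not the study's implementation.
DOMAINS = ["Conceptual Knowledge", "Comprehension", "Appraisal", "Application/Function"]

@dataclass
class ComparisonItem:
    prompt: str            # question read aloud to the child
    options: List[str]     # 2 comparison options shown as simple icons
    correct_option: int    # index of the option reflecting the target knowledge
    domain: str            # one of DOMAINS

@dataclass
class ItemResponse:
    item: ComparisonItem
    chosen: Optional[int]  # index of the chosen option, or None for "can't tell"

def domain_scores(responses: List[ItemResponse]) -> dict:
    """Tally correct answers per domain; a 'can't tell' response counts as not correct."""
    scores = {d: 0 for d in DOMAINS}
    for r in responses:
        if r.chosen is not None and r.chosen == r.item.correct_option:
            scores[r.item.domain] += 1
    return scores

# Hypothetical item and a single recorded response
item = ComparisonItem(
    prompt="Which picture shows where your heart is?",
    options=["chest icon", "knee icon"],
    correct_option=0,
    domain="Conceptual Knowledge",
)
print(domain_scores([ItemResponse(item=item, chosen=0)]))
```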
CHD Parental Educational Burden Assessment
The CHD Parental Educational Burden Assessment (CHD-PEBA) is a 30-question assessment designed to evaluate parental educational responsibilities. It uses Likert scales, multiple-choice questions, matrix format, and open-ended responses to gather both quantitative and qualitative data on parental challenges. Inspired by Neuro-QoL and the Caregiver Health Self-Assessment Questionnaire [ , , ], this self-report assessment examines parental understanding, coping mechanisms, and support needs in educating children about CHD. The assessment evaluates parents’ knowledge of CHD, their perception of their children’s understanding of the condition, and their confidence and stress levels in providing educational support. It also explores preferred information sources, information-seeking behaviors, and the time and effort parents dedicate to educating their children. Demographic data are collected to contextualize responses. Administered on the web, the survey takes approximately 5-10 minutes to complete. It serves as both a pre- and postintervention tool, measuring changes in parental educational burden over time.
CHD Healthcare Provider Educational Efficiency Assessment
The CHD Healthcare Provider Educational Efficiency Assessment (CHD-HEEA) is a 20-question self-report assessment that evaluates health care providers’ efficiency in educating children with CHD and their families. It assesses the practices of pediatric cardiologists, fellows, nurses, and child life specialists through various question formats, including Likert scales, matrix questions, and multiple-choice and open-ended responses. Inspired by tools such as the Physician Task Checklist and Lean Six Sigma [ , , ], the assessment examines educational practices, challenges, and strategies. Key areas of evaluation include the time and effort spent on educating children with CHD and their families, perceptions of current educational methods, use of preappointment materials and self-education resources, encouragement of questions from caregivers and children, and strategies to streamline education while maintaining information accessibility. It also collects some demographic data to provide context for the responses. Administered on the web, the survey takes approximately 5-10 minutes to complete. It serves as a pre- and postassessment tool to evaluate changes in health care provider practices after educational interventions to optimize care coordination and patient education. Additional details about the 3 assessments are provided in .
Pilot Testing of Assessment Tools
During the pilot phase, we tested 3 different assessments with CHD stakeholders at the M Health Fairview Pediatric Specialty Clinic for Cardiology—Explorer.
Study Population
The study defined specific criteria for each stakeholder group. Children with CHD were required to be between 4 and 10 years of age, diagnosed by a health care provider, and fluent in English. Parents or guardians had to speak English and be confirmed as caregivers by a health care provider. Health care providers had to be actively involved in CHD care and fluent in English.
Participants in this pilot study included children with CHD (n=5), their parents (n=4), and health care providers (n=3). The children were four 7-year-olds (3 males and 1 female) and one 5-year-old female (n=5). The parental and caregiver group comprised 5 mothers aged 25-44 years, including 2 White/Caucasian and 3 African American/Black participants, all with bachelor’s degrees and reporting middle to high-level incomes (n=5). However, one of the parents did not complete the assessment, resulting in a final parental group of 4 (n=4). The health care provider group included 2 pediatric cardiology fellows and 1 physician assistant student, aged 25-44 years, with 1-5 years of experience in pediatric cardiology (n=3).
Study Recruitment and Informed Consent
The medical team facilitated communication with potential participants (children with CHD and their families) through purposive sampling. Parents signed parental permission forms for their children, and children provided assent. We administered the children’s assessment while their parents were present but ensured that the parents did not guide their answers. After providing their own informed consent, parents received a link to complete the computer-based parental assessment. They could complete it at the clinic using the researcher’s laptop or later at home. Parents shared their experiences and provided feedback on the assessment, including its clarity and length, either during or after completion. Health care providers also participated by completing their assessments and providing feedback on their experiences. All participants received a gift card for their time. We transcribed participants’ input and feedback and anonymized results and transcripts to protect privacy.
Data Collection
We used qualitative methods, including cognitive interviews, observational notes, and interactive techniques, to gather feedback from children, parents, and health care providers [ , , , ]. These methods captured participants’ impressions, behaviors, and engagement, providing rich data for iterative refinements of the assessments.
Logistics and Setup
To accommodate the clinic setting and time constraints, we conducted parent and child assessments simultaneously. Parents began slightly later than their children to ensure that the children were comfortable and understood the process before starting. This approach minimized parental influence on children’s responses while saving time. After completing their assessments, children engaged in painting activities, allowing parents to complete their assessments without distractions. For health care providers, assessments were scheduled flexibly, either on the web or in person, to accommodate their busy schedules.
Cognitive and Observational Methods
Cognitive interviews and verbal probing revealed how participants interpreted and responded to survey questions. As a qualitative method widely used in survey design, cognitive interviews helped identify ambiguities, cognitive challenges, and difficulties in understanding. Participants verbalized their thoughts concurrently (using the think-aloud technique) or retrospectively (recalling and explaining their thought processes after completing the assessment) [ - ]. Observational techniques captured nonverbal cues, such as hesitation, frustration, or excitement, along with body language and engagement levels. These observations were particularly valuable for younger children, who often struggled with abstract concepts or articulating their thoughts [ ]. Combining these methods provided detailed feedback that informed refinements to our assessments.
Engaging Children
We used developmentally appropriate and interactive methods to engage children effectively. A researcher read questions aloud to younger children, with parents assisting if the child felt uncomfortable interacting with the research team. To make the activity engaging, we incorporated storytelling and asked children to role-play as detectives solving questions. Prompts such as “Hey, detective! Let’s figure out which, what, or where!” encouraged participation. Children used markers and craft materials, such as multicolor pom-poms, to point to their answers, making the process interactive and enjoyable. For older children who could read independently, we encouraged self-guided responses and asked clarifying questions to explore their thought processes. The think-aloud technique and verbal probing provided deeper insights into their reasoning [ - ]. To ensure that children understood the questions rather than guessing, we rephrased questions after they provided answers. For example, if a child pointed to an icon for chest pain, we asked, “Do you know where your chest is?” and used physical prompts to confirm their understanding. This approach validated their answers and often prompted children to share personal experiences, enriching the feedback [ , , , ].
Refining the Assessments
We iteratively refined the assessments, tailoring questions to each stakeholder group by focusing on their thoughts, understanding, relevance, completeness, survey length, and overall experience.
For the children’s assessment, we prioritized meeting the developmental needs of children aged 4-10 years based on criteria provided by health care providers. These criteria outlined essential knowledge for this age group while addressing the additional challenges faced by younger or newly diagnosed children. Testing revealed that younger children often struggled with multistep questions, so we simplified these into single, clear actions. Language and visuals were adjusted for clarity while retaining enough detail to engage older children. Older children provided feedback on question relevance and reflected on how they might have responded when younger, offering insights that shaped refinements.
Each testing round informed adjustments to ensure that the questions were clear, precise, and engaging for all stakeholders in CHD pediatric care. These assessments now support pre- and postassessment stages to validate interventions, including our ongoing study of medical educational toys for CHD pediatric populations.
The table below outlines the development and testing process, with sample prompts provided in .

| Step | Description | Stakeholders involved | Outcomes |
| --- | --- | --- | --- |
| 1. Define key concepts | Define health literacy, parental educational burden, and health care provider efficiency | Research team, pediatric cardiologist | Established key concepts for the assessment framework |
| 2. Conduct literature review | Review existing best practices and assessments | Research team | Insights for developing new assessments |
| 3. Develop initial assessments | Draft assessments for children, parents, and health care providers | Research team, pediatric cardiologist | Initial assessments ready for review |
| 4. Pilot testing—first iteration | Test initial assessments with children with CHD, parents, and health care providers | Children with CHD, parents, and health care providers | Feedback on relevance, clarity, and appropriateness |
| 5. Analyze feedback | Analyze feedback from pilot testing | Research team, pediatric cardiologist | Identify strengths, weaknesses, and improvements |
| 6. Refine assessments | Modify assessments based on feedback | Research team | Improved assessments |
| 7. Pilot testing—further iterations | Conduct additional testing and refinement | Children with CHD, parents, and health care providers | Continuous improvement and validation |
| 8. Final analysis | Analyze all data and feedback to finalize assessments | Research team, pediatric cardiologist | Validated assessments for implementation |
Data Analysis
Our research team held weekly meetings to analyze assessment data and stakeholder feedback. Using qualitative methods, including thematic analysis, we identified common themes from feedback and observational data [ , ]. The first author coded the data to highlight relevant themes, focusing on the assessment’s reliability, engagement, and responsiveness to meet all stakeholders’ needs. During each round of pilot testing, we prioritized feedback using four criteria: (1) its potential impact on the assessment’s effectiveness, (2) alignment with intended outcomes, (3) feasibility of incorporating changes, and (4) stakeholder preferences. Feedback was categorized by feasibility, clarity, relevance to educational needs, and participant engagement, then ranked by the frequency and significance of reported issues. We implemented revisions iteratively, prioritizing aspects that most improved comprehension and ease of response. This systematic process enhanced the assessments’ accuracy and relevance across stakeholder groups. During weekly meetings, the team reviewed how each piece of feedback aligned with the established criteria and worked collaboratively to resolve discrepancies. This iterative process led to the development of a set of codes such as “strengths,” “weaknesses,” and “areas of improvement.” These ongoing discussions and thematic analyses refined the assessments and informed adjustments for subsequent pilot tests [ ].
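As a rough illustration of how coded feedback could be tallied and prioritized between pilot rounds, the sketch below weights hypothetical feedback items by frequency and significance. The codes, weights, and example entries are assumptions made for illustration only; the study’s actual analysis was qualitative and resolved through team discussion rather than automated scoring.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical coded feedback entries; the categories, weights, and examples
# are assumptions for illustration, not the study's actual coding scheme.
@dataclass
class Feedback:
    stakeholder: str   # "child", "parent", or "provider"
    category: str      # eg, "clarity", "relevance", "engagement", "feasibility"
    significance: int  # 1 (minor) to 3 (major), assigned during team review

def rank_issues(items):
    """Rank feedback categories by frequency weighted by significance."""
    weighted = Counter()
    for f in items:
        weighted[f.category] += f.significance
    return weighted.most_common()

feedback = [
    Feedback("child", "clarity", 3),        # multistep question was too complex
    Feedback("child", "engagement", 1),     # preferred game-like comparison items
    Feedback("parent", "relevance", 2),     # question assumed receipt of materials
    Feedback("parent", "clarity", 2),       # minor wording change suggested
    Feedback("provider", "feasibility", 1), # reorder matrix questions
]

# Highest-weighted categories would be addressed first in the next revision cycle
print(rank_issues(feedback))
```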
Ethical Considerations
The study received ethical approval from the University of Minnesota institutional review board (STUDY00020670). The medical team facilitated communication with potential participants—children with CHD and their families—through purposive sampling. Informed consent was obtained from all participants: parents also signed parental permission forms for their children, and children provided assent. Participants were informed about the study’s purpose, procedures, and their right to withdraw at any time. Data were anonymized and deidentified during transcription and analysis, with all personal information securely stored in adherence to institutional guidelines. Participants received a gift card as compensation for their time, ensuring fairness and transparency.
Results
We identified consistent themes across the three assessments: (1) assessment engagement, (2) relevance and structure of assessment, and (3) opportunities offered by the assessments. These themes guided iterative revisions before each new pilot test. In this section, we summarize the findings from (1) children’s assessments, (2) parents’ assessments, and (3) health care providers’ assessments. Some feedback from key stakeholders on the 3 survey experiences can be found in .
Assessment Engagement
Both children (C) and parents (P) actively engaged in the children’s assessment process. Parents were surprised by their children’s interest and enjoyment, with one parent expressing, “I wasn’t sure she’d even talk. Wow!” The interactive elements, such as choosing pens or markers and storytelling, appeared not only to boost curiosity and concentration during the assessment but also to ease children’s anxiety and fear. For instance, C3 quickly transitioned from nervousness about another medical procedure to excitement upon seeing colorful markers, exclaiming, “Yay, I can paint here in the doctor’s office!” Initially, she hid under the bed due to fear when we entered the examination room. However, she relaxed and became comfortable after seeing the markers and being invited to answer the questions using different colors. Even children like C2, initially focused on their iPad, became engaged, asking, “Can I use all colors? I can answer anything like this!”
Parents appreciated the playful and instructive design of the assessments, with one commenting, “It’s like teaching by itself through playtime. She’s telling me all you've asked her.” After children such as C4 finished their assessments while their mothers were still being assessed, we provided them with paper to draw on. They became so engrossed that they asked us to stay even after the doctor’s arrival ( ). Initially, parental influence affected C1’s responses, but gentle interventions and prioritizing children’s uninfluenced responses over accuracy fostered independent responses in other pilot tests. Through face-to-face interaction using cognitive interviews akin to semistructured interviews [ , ], we collected less biased data directly from children rather than from their parents, maintained their attention, and reduced parental influence.
Similarly, in the clinic, parents provided feedback after their children’s assessments through a think-aloud and verbal probing method [ , ] while answering the computer-based parents’ assessment using Typeform (Typeform SL). During the assessment, P4 noted, “You know what to ask.” P3 mentioned that they would change only some of the wording. However, despite this feedback, P3 described the survey as “clear, easy, feeling good; finally, somebody asks!” This feedback highlighted the ease of use, clear directions, and meeting the participants’ needs, facilitating effective data collection. The health care providers’ assessment also received positive feedback for its engaging interface, with H1 noting, “Much better than usual surveys,” and H3 stating, “...really interactive with photos and video, and buttons. It’s easy....” Such feedback underscores the effectiveness of engaging participants and facilitating data collection.
Relevance and Structure of Assessment
The effectiveness of engaging all stakeholders through assessments depends on their relevance and well-structured design. Although relevance remained high across all assessments, we noticed an interesting trend involving the Smiley Face Likert question during the children’s assessment. Despite our efforts to adjust the question wording, children consistently chose the happiest face in the initial version of the assessment. They emphasized their positive feelings associated with smiley faces by saying, “I chose it because I like it!” We expected this issue but still tested the Smiley Face Likert question due to its common use in children’s surveys. To improve engagement and reliability, we found that comparison questions, structured like a right or wrong game format, were more effective. Children showed high concentration levels and often asked, “Is this right?” They felt like they were participating in a game, increasing their involvement in the assessment process. Incorporating feedback from our medical team and parents highlighting the assessment’s educational value, we replaced the Likert question with a prompt offering a choice between physical activity and screen time ( [ ]), addressing another relevant habit for children with CHD. We also adjusted the question order to observe response variations, albeit with limited reliability because it was implemented with only participants C2 and C3 immediately after the initial version.
P2, P3, P4, and P5 actively engaged with the questions in the parents’ assessment, finding them clear, easy, and relevant. However, P2 noted assumptions in specific questions regarding the prior receipt of educational material during doctor visits. They proposed a preliminary question to confirm whether educational material was used before assessing its appropriateness ( ). They asked, “How do you know if we got any educational material before asking how good it was? It’d make more sense to check if we received any and then ask about it.” Interestingly, when explicitly asked about this question, P3, a follow-up patient, did not express the same concern. Another participant suggested improving the context of the question about comfort levels during clinic visits (follow-ups or surgeries), which we addressed in the later version. We condensed the questionnaire to 25 items in the latest parent assessment by adding a matrix question.
In response to initial feedback on the health care providers’ assessment, H1 proposed categorizing questions differently for new and follow-up patients. All participants provided positive feedback regarding the matrix questions ( ). H2 preferred to keep the existing matrix questions but suggested reordering them for better clarity and efficiency. They emphasized that combining related aspects into single questions helps respondents compare options and find the correct answer more easily: “Having one question for each of these aspects would make the questionnaire longer and harder to compare when answering; it just seems more practical this way.” Furthermore, H2 recommended adding a specific demographic—vulnerable parents—to measure parental CHD information instead of using a general category. These suggestions were integrated into version 03, reducing the length to 15 items.
Opportunities Offered by the Assessments
We used patients’ wait time between their initial screenings and the meeting with their physician to complete the assessments. This integration not only streamlined care coordination and reduced wait times but also engaged families and alleviated worries, as noted by P4, “It keeps us good busy!” This suggests that our design intervention (an educational toy) can be implemented during patients’ wait time as well. The assessments also doubled as educational tools for children, engaging them in learning about their condition and easing anxiety. Parents found solace in sharing burdens, while health care providers gained insights into their practices and sought potential solutions. These assessments exceeded their initial purposes, offering education, support, and reflection opportunities for child-led approaches in pediatric care.
Discussion
Principal Results
During face-to-face interviews, we identified areas for refinement in assessment items, such as clarifying the context of health care experiences for children’s comfort-level ratings or specifying the type of visit for health care providers. Observational notes highlighted instances where assumptions in specific questions, such as the prior receipt of educational material during doctor visits, caused confusion among interviewees. To alleviate this issue, we proposed preliminary questions. Despite these challenges, participants generally interpreted the majority of survey items as intended, with occasional adjustments to wording choices for better comprehension. Furthermore, we changed the order of questions based on feedback and observations to improve the structure of the assessment.
The pilot testing evaluated the feasibility and effectiveness of the tailored assessments. Iterative revisions further improved their clarity and appropriateness. We ensured the validity of the assessments through a collaborative and iterative approach, incorporating perspectives from all stakeholders. As a result, the assessments offer a comprehensive understanding of educational needs and burdens, providing valuable insights to guide the development of targeted educational interventions.
Comparison With Prior Work
The tailored assessments developed in this pilot study address a gap in the existing literature by focusing on the educational needs of pediatric patients with CHD, their parents, and health care providers [ , , ]. While existing assessment tools mainly measure health-related quality of life and emotional well-being, they offer limited insight into health literacy and the unique educational challenges associated with CHD [ - ]. Moreover, these assessments are primarily developed quantitatively and lack qualitative insights into educational needs and burdens. We used qualitative methods to explore how respondents interpret, understand, and respond to specific survey items. This approach offered a more comprehensive understanding of the questions that assessments should address [ - , - ].
Through collaboration with stakeholders, including children, we developed and refined assessments to evaluate children’s knowledge, parental educational burdens, and health care provider efficiency in pediatric CHD care. Including children in the development process was crucial due to the lack of tailored educational materials for children with CHD and the complexities involved in assessing this population [ - , ]. While parents were present during the children’s assessments to provide comfort, their involvement was carefully managed to ensure that it did not interfere with the children’s meaningful participation. This approach contrasts with prior methods, where parents often act as proxies, potentially biasing the results [ , ]. Feedback from all stakeholders was collected post assessment to gain insights into their experiences and further improve the assessments.
We conducted pilot tests for two reasons: (1) to refine the assessments before involving more stakeholders, specifically children, and (2) to ensure the assessments were well adapted to meet all stakeholders’ needs before the actual study [ , ]. This pilot study revealed critical gaps in existing tools and advanced the methodology for developing educational assessments. Emphasizing collaborative, iterative, and direct engagement with children, parents, and health care providers ultimately leads to the design of tailored assessments. These assessments are crucial for informing the development of effective, child-led educational interventions for pediatric populations with CHD, demonstrating feasibility before broader implementation.
The assessments developed in this study have potential applications in routine clinical practice, offering health care providers a structured tool to assess and enhance CHD-related health literacy among pediatric patients and their families. By integrating these assessments into preappointment resources or waiting room activities, health care providers can identify educational gaps early and address them proactively. Furthermore, the assessments could complement existing educational interventions, providing feedback on the effectiveness of child- and family-centered resources to improve health literacy. This integration would not only inform the design of more effective child- and family-centered resources but also support continuous improvement of educational interventions. This approach aligns with broader goals in pediatric health care to support lifelong health management through early literacy interventions.
Limitations
Although the pilot testing provided valuable insights, several limitations are acknowledged. First, despite efforts to minimize parental influence, their presence may have still impacted some children’s responses. Future research should explore strategies to reduce this influence further and ensure more independent responses from children. Second, the sample distribution was not spread evenly across age groups, with only 1 child aged 5 years and 4 children aged 7 years. This uneven distribution may affect the representativeness of the findings. Future studies should aim for a more balanced age distribution. Third, conducting assessments for both children and parents at the clinic during their visit might have influenced their responses, as they were exposed to medical procedures. Furthermore, the researcher’s presence during the assessment of parents and children could have influenced their feedback. Finally, while general accessibility was considered, specific adaptations for educational disabilities were not within the scope of this pilot study.
Conclusions
This pilot study aims to improve educational interventions and care coordination through tailored assessments shaped by stakeholder feedback, including that of affected children. The findings emphasize the importance of interactive and qualitative methods that foster multistakeholder engagement and ensure question relevance. These assessments can potentially enhance child-led interventions and improve outcomes for patients with CHD from childhood through to adult care. However, the iterative development process highlighted several challenges, such as managing parental presence to encourage independent responses from children and the logistical complexities of recruiting diverse participants within clinical settings. It is also important to ensure that assessments for younger children are read aloud without influencing their responses. Integrating these assessments into real-world clinical workflows requires adaptability to avoid disrupting patient care. Addressing these practical challenges will be essential to scaling and sustaining the use of these assessments in diverse health care contexts.
Acknowledgments
The authors gratefully acknowledge the funding provided by the Kusske Design Initiative (KDI) and Dr Gwenyth Fischer, MD. Their sincere thanks extend to the M Health Fairview Explorer Clinic and all pilot participants. Finally, they thank former students Grace Rubas, Jessica Jenkins, Jonathan Jakubas, Levi Skelton, and Andy Thai for their contributions.
Conflicts of Interest
None declared.
References
- Centers for Disease Control and Prevention. Data and statistics on congenital heart defects. 2023. URL: https://www.cdc.gov/heart-defects/data/index.html [accessed 2023-09-22]
- GBD 2017 Congenital Heart Disease Collaborators. Global, regional, and national burden of congenital heart disease, 1990-2017: a systematic analysis for the Global Burden of Disease Study 2017. Lancet Child Adolesc Health. 2020;4(3):185-200. [FREE Full text] [CrossRef] [Medline]
- Liu Y, Chen S, Zühlke L, Black GC, Choy M, Li N, et al. Global birth prevalence of congenital heart defects 1970-2017: updated systematic review and meta-analysis of 260 studies. Int J Epidemiol. 2019;48(2):455-463. [FREE Full text] [CrossRef] [Medline]
- Chong LSH, Fitzgerald DA, Craig JC, Manera KE, Hanson CS, Celermajer D, et al. Children's experiences of congenital heart disease: a systematic review of qualitative studies. Eur J Pediatr. 2018;177(3):319-336. [CrossRef] [Medline]
- Lee A, Bailey B, Cullen-Dean G, Aiello S, Morin J, Oechslin E. Transition of care in congenital heart disease: ensuring the proper handoff. Curr Cardiol Rep. 2017;19(6):55. [CrossRef] [Medline]
- Moons P, De Volder E, Budts W, De Geest S, Elen J, Waeytens K, et al. What do adult patients with congenital heart disease know about their disease, treatment, and prevention of complications? A call for structured patient education. Heart. 2001;86(1):74-80. [FREE Full text] [CrossRef] [Medline]
- Barbazi N, Shin JY, Hiremath G, Lauff CA. Exploring health educational interventions for children with congenital heart disease: a scoping review. JMIR Pediatr Parent. 2024;8:e64814. [FREE Full text] [CrossRef]
- Wray J, Maynard L. The needs of families of children with heart disease. J Dev Behav Pediatr. 2006;27(1):11-17. [CrossRef] [Medline]
- Veldtman GR, Matley SL, Kendall L, Quirk J, Gibbs JL, Parsons JM, et al. Illness understanding in children and adolescents with heart disease. Heart. 2000;84(4):395-397. [FREE Full text] [CrossRef] [Medline]
- Goossens E, Fieuws S, Van Deyk K, Luyckx K, Gewillig M, Budts W, et al. Effectiveness of structured education on knowledge and health behaviors in patients with congenital heart disease. J Pediatr. 2015;166(6):1370-1376.e1. [CrossRef] [Medline]
- Burns J, Higgins C, Ganigara M, Kalivas B, Basken A. Health literacy in CHD. Cardiol Young. 2022;32(9):1369-1372. [CrossRef] [Medline]
- van der Heijden Z, de Gooijer F, Camps G, Lucassen D, Feskens E, Lasschuijt M, et al. User requirements in developing a novel dietary assessment tool for children: mixed methods study. JMIR Form Res. 2024;8:e47850. [FREE Full text] [CrossRef] [Medline]
- Francis-Oliviero F, Loubières C, Grové C, Marinucci A, Shankland R, Salamon R, et al. Improving children's mental health literacy through the cocreation of an intervention and scale validation: protocol for the CHILD-Mental Health literacy research study. JMIR Res Protoc. 2023;12:e51096. [FREE Full text] [CrossRef] [Medline]
- Gupta A, Cafazzo JA, IJzerman MJ, Swart JF, Vastert S, Wulffraat NM, et al. Genomic health literacy interventions in pediatrics: scoping review. J Med Internet Res. 2021;23(12):e26684. [FREE Full text] [CrossRef] [Medline]
- Shin JY, Holtz B. Identifying opportunities and challenges: how children use technologies for managing diabetes. 2020. Presented at: IDC '20: Interaction Design and Children; 2020 June 24:495-507; London, United Kingdom. URL: https://dl.acm.org/doi/10.1145/3392063.3394444
- Shin JY, Holtz BE. Towards better transitions for children with diabetes: user experiences on a mobile health app. 2019. Presented at: Proceedings of the 18th ACM International Conference on Interaction Design and Children; 2019 June 15; Boise, ID. [CrossRef]
- DeHoff BA, Staten LK, Rodgers RC, Denne SC. The role of online social support in supporting and educating parents of young children with special health care needs in the United States: a scoping review. J Med Internet Res. 2016;18(12):e333. [FREE Full text] [CrossRef] [Medline]
- Shin JY, Kedroske J, Vue R, Sankaran R, Chaar D, Churay T, et al. Design considerations for family-centered health management: preliminary findings with pediatric BMT patients. 2018. Presented at: Proceedings of the 17th ACM Conference on Interaction Design and Children; 2018 June 22; Trondheim, Norway. [CrossRef]
- Chaar D, Shin JY, Mazzoli A, Vue R, Kedroske J, Chappell G, et al. A mobile health app (Roadmap 2.0) for patients undergoing hematopoietic stem cell transplant: qualitative study on family caregivers' perspectives and design considerations. JMIR Mhealth Uhealth. 2019;7(10):e15775. [FREE Full text] [CrossRef] [Medline]
- Rodts ME, Unaka NI, Statile CJ, Madsen NL. Health literacy and caregiver understanding in the CHD population. Cardiol Young. 2020;30(10):1439-1444. [CrossRef] [Medline]
- Tennant R, Allana S, Mercer K, Burns CM. Exploring the experiences of family caregivers of children with special health care needs to inform the design of digital health systems: formative qualitative study. JMIR Form Res. 2022;6(1):e28895. [FREE Full text] [CrossRef] [Medline]
- Shin JY, Okammor N, Hendee K, Pawlikowski A, Jenq G, Bozaan D. Development of the socioeconomic screening, active engagement, follow-up, education, discharge readiness, and consistency (SAFEDC) model for improving transitions of care: participatory design. JMIR Form Res. 2022;6(4):e31277. [FREE Full text] [CrossRef] [Medline]
- Hölgyesi Á, Luczay A, Tóth-Heyn P, Muzslay E, Világos E, Szabó AJ, et al. The impact of parental electronic health literacy on disease management and outcomes in pediatric type 1 diabetes mellitus: cross-sectional clinical study. JMIR Pediatr Parent. 2024;7:e54807. [FREE Full text] [CrossRef] [Medline]
- Sakamoto M, Ishikawa H, Suzuki A. Evaluation of parents' use of a child health care information app and their health literacy: cross-sectional study. JMIR Pediatr Parent. 2024;7:e48478. [FREE Full text] [CrossRef] [Medline]
- Castor C, Lindkvist RM, Hallström IK, Holmberg R. Health care professionals' experiences and views of eHealth in pediatric care: qualitative interview study applying a theoretical framework for implementation. JMIR Pediatr Parent. 2023;6:e47663. [FREE Full text] [CrossRef] [Medline]
- Lindström NB, Pozo RR. Perspectives of nurses and doulas on the use of information and communication technology in intercultural pediatric care: qualitative pilot study. JMIR Pediatr Parent. 2020;3(1):e16545. [FREE Full text] [CrossRef] [Medline]
- Arya B, Glickstein JS, Levasseur SM, Williams IA. Parents of children with congenital heart disease prefer more information than cardiologists provide. Congenit Heart Dis. 2013;8(1):78-85. [FREE Full text] [CrossRef] [Medline]
- Van Orne J. Care coordination for children with medical complexity and caregiver empowerment in the process: a literature review. J Spec Pediatr Nurs. 2022;27(3):e12387. [CrossRef] [Medline]
- Willis GB, Artino AR. What do our respondents think we're asking? Using cognitive interviewing to improve medical education surveys. J Grad Med Educ. 2013;5(3):353-356. [FREE Full text] [CrossRef] [Medline]
- Balza JS, Cusatis RN, McDonnell SM, Basir MA, Flynn KE. Effective questionnaire design: how to use cognitive interviews to refine questionnaire items. J Neonatal Perinatal Med. 2022;15(2):345-349. [FREE Full text] [CrossRef] [Medline]
- Kedroske J, Koblick S, Chaar D, Mazzoli A, O'Brien M, Yahng L, et al. Development of a national caregiver health survey for hematopoietic stem cell transplant: qualitative study of cognitive interviews and verbal probing. JMIR Form Res. 2020;4(1):e17077. [FREE Full text] [CrossRef] [Medline]
- Eccles DW, Arsal G. The think aloud method: what is it and how do I use it? Qual Res Sport Exerc Health. 2017;9(4):514-531. [CrossRef]
- Presser S, Couper MP, Lessler JT, Martin E, Martin J, Rothgeb JM, et al. Methods for testing and evaluating survey questions. Public Opin Q. 2004;68(1):109-130. [CrossRef]
- Collins D. Pretesting survey instruments: an overview of cognitive methods. Qual Life Res. 2003;12(3):229-238. [CrossRef] [Medline]
- Mingjie Z, Li W, Jianxin Z. Developing a computer system for health assessment. 2009. Presented at: Proceedings of the 2nd International Conference on Interaction Sciences: Information Technology, Culture and Human; 2009 November 26:303-306; New York, NY. [CrossRef]
- Morris M, Intille S. HCI challenges in health assessment. 2005. Presented at: CHI '05 Extended Abstracts on Human Factors in Computing Systems; 2005 April 7; New York, NY. [CrossRef]
- Ricci M, Cimini A, Gli?ovi? K, Jiménez RJ, Medina-García R, Steinböck M, et al. Quality of life assessment methodology in TeNDER project. 2022. Presented at: Proceedings of the 15th International Conference on Pervasive Technologies Related to Assistive Environments; 2022 June 29; New York, NY. [CrossRef]
- Ayers TS, Sandler IN, West SG, Roosa MW. Children's Coping Strategies Checklist. 1991. URL: https://psycnet.apa.org/doiLanding?doi=10.1037%2Ft42024-000 [accessed 2024-12-13]
- ScienceDirect Topics. Child behavior checklist—an overview. URL: https://www-sciencedirect-com.ezp2.lib.umn.edu/topics/psychology/child-behavior-checklist [accessed 2024-03-12]
- Haley SM, Coster WI, Kao YC, Dumas HM, Fragala-Pinkham MA, Kramer JM, et al. Lessons from use of the pediatric evaluation of disability inventory: where do we go from here? Pediatr Phys Ther. 2010;22(1):69-75. [FREE Full text] [CrossRef] [Medline]
- Horner-Johnson W, Krahn G, Andresen E, Hall T, Rehabilitation ResearchTraining Center Expert Panel on Health Status Measurement. Developing summary scores of health-related quality of life for a population-based survey. Public Health Rep. 2009;124(1):103-110. [FREE Full text] [CrossRef] [Medline]
- Hullmann SE, Ryan JL, Ramsey RR, Chaney JM, Mullins LL. Measures of general pediatric quality of life: Child Health Questionnaire (CHQ), DISABKIDS Chronic Generic Measure (DCGM), KINDL-R, Pediatric Quality of Life Inventory (PedsQL) 4.0 Generic Core Scales, and Quality of My Life Questionnaire (QoML). Arthritis Care Res (Hoboken). 2011;63 Suppl 11:S420-S430. [FREE Full text] [CrossRef] [Medline]
- Child Outcomes Research Consortium. Strengths and Difficulties Questionnaire (SDQ). URL: https://www.corc.uk.net/outcome-experience-measures/strengths-and-difficulties-questionnaire-sdq/ [accessed 2024-12-20]
- Marino BS, Tomlinson RS, Wernovsky G, Drotar D, Newburger JW, Mahony L, et al. Pediatric Cardiac Quality of Life Inventory Testing Study Consortium. Validation of the pediatric cardiac quality of life inventory. Pediatrics. 2010;126(3):498-508. [FREE Full text] [CrossRef] [Medline]
- Pearson. Pediatric Evaluation of Disability Inventory. URL: https://www.pearsonassessments.com/store/usassessments/en/Store/Professional-Assessments/Developmental-Early-Childhood/Pediatric-Evaluation-of-Disability-Inventory/p/100000505.html [accessed 2024-12-20]
- Cleveland Clinic Children's. Pediatric Cardiac Quality of Life Inventory (PCQLI). URL: https://my.clevelandclinic.org/pediatrics/medical-professionals/pediatric-cardiac-quality-life-inventory [accessed 2024-12-20]
- PedsQL TM (Pediatric Quality of Life Inventory TM). URL: https://www.pedsql.org/about_pedsql.html [accessed 2024-12-20]
- Health Literacy Tool Shed. URL: https://healthliteracy.tuftsmedicine.org/ [accessed 2024-12-20]
- Reynolds JS, Treu JA, Njike V, Walker J, Smith E, Katz CS, et al. The validation of a food label literacy questionnaire for elementary school children. J Nutr Educ Behav. 2012;44(3):262-266. [CrossRef] [Medline]
- Child Outcomes Research Consortium. Parental Stress Scale (PSS). URL: https://www.corc.uk.net/outcome-experience-measures/parental-stress-scale-pss/ [accessed 2024-12-20]
- American Psychological Association. Parenting Stress Index. URL: https://www.apa.org/pi/about/publications/caregivers/practice-settings/assessment/tools/parenting-stress [accessed 2024-12-20]
- Ríos M, Zekri S, Alonso-Esteban Y, Navarro-Pardo E. Parental stress assessment with the Parenting Stress Index (PSI): a systematic review of its psychometric properties. Children (Basel). 2022;9(11):1649. [FREE Full text] [CrossRef] [Medline]
- Varni JW, Sherman SA, Burwinkle TM, Dickinson PE, Dixon P. The PedsQL family impact module: preliminary reliability and validity. Health Qual Life Outcomes. 2004;2:55. [FREE Full text] [CrossRef] [Medline]
- HealthinAging. org. Caregiver Self Assessment Questionnaire. URL: https://www.healthinaging.org/tools-and-tips/caregiver-self-assessment-questionnaire [accessed 2024-12-20]
- Epstein-Lubow G, Gaudiano BA, Hinckley M, Salloway S, Miller IW. Evidence for the validity of the American Medical Association's caregiver self-assessment questionnaire as a screening measure for depression. J Am Geriatr Soc. 2010;58(2):387-388. [CrossRef] [Medline]
- Hanmer J, Jensen RE, Rothrock N, HealthMeasures Team. A reporting checklist for HealthMeasures' patient-reported outcomes: ASCQ-Me, Neuro-QoL, NIH Toolbox, and PROMIS. J Patient Rep Outcomes. 2020;4(1):21. [FREE Full text] [CrossRef] [Medline]
- de Brantes F. Measuring Provider Efficiency Version 1. 2004. URL: https://www.commonwealthfund.org/sites/default/files/documents/___media_files_publications_other_2004_dec_measuring_provider_efficiency __version_1_0__a_collaborative_multi_stakeholder_effort_measurproviderefficiency1_12312004_pdf.pdf [accessed 2004-12-03]
- Palmer S, Torgerson DJ. Economic notes: definitions of efficiency. BMJ. 1999;318(7191):1136. [FREE Full text] [CrossRef] [Medline]
- Bailie R, Bailie J, Larkins S, Broughton E. Editorial: continuous quality improvement (CQI)—advancing understanding of design, application, impact, and evaluation of CQI approaches. Frontiers in Public Health. 2017. URL: https://www.frontiersin.org/journals/public-health/articles/10.3389/fpubh.2017.00306 [accessed 2024-02-24]
- Henrique DB, Godinho Filho M. A systematic literature review of empirical research in Lean and Six Sigma in healthcare. Total Qual Manag Bus Excell. 2018;31(3-4):429-449. [CrossRef]
- Hussey PS, de Vries H, Romley J, Wang MC, Chen SS, Shekelle PG, et al. A systematic review of health care efficiency measures. Health Serv Res. 2009;44(3):784-805. [FREE Full text] [CrossRef] [Medline]
- O'Neill SM, Hempel S, Lim YW, Danz MS, Foy R, Suttorp MJ, et al. Identifying continuous quality improvement publications: what makes an improvement intervention 'CQI'? BMJ Qual Saf. 2011;20(12):1011-1019. [FREE Full text] [CrossRef] [Medline]
- Rathi R, Vakharia A, Shadab M. Lean Six Sigma in the healthcare sector: a systematic literature review. Mater Today Proc. 2022;50:773-781. [FREE Full text] [CrossRef] [Medline]
- Shepard L, Kagan SL, Wurtz E. Principles and recommendations for early childhood assessments. National Education Goals Panel. 1998. URL: https://eric.ed.gov/?id=ED416033 [accessed 2024-03-16]
- Committee on the Science of Children Birth to Age 8: Deepening and Broadening the Foundation for Success, Board on Children, Youth, and Families, Institute of Medicine. In: National Research Council. Transforming the Workforce for Children Birth Through Age 8: A Unifying Foundation. Washington, DC. National Academies Press; 2015.
- Banker A, Lauff C. Usability testing with children: history of best practices, comparison of methods & gaps in literature. URL: https://dl.designresearchsociety.org/drs-conference-papers/drs2022/researchpapers/225 [accessed 2022-06-25]
- Vang PC, Lauff CA. Reflections on data collection during toy prototype development in a design studio course. 2024. Presented at: Proceedings of the 23rd Annual ACM Interaction Design and Children Conference; 2024 June 20; New York, NY. [CrossRef]
- Koutná V, Blatný M, Jelínek M. Concordance of child self-reported and parent proxy-reported posttraumatic growth in childhood cancer survivors. Cancers (Basel). 2021;13(16):4230. [FREE Full text] [CrossRef] [Medline]
- Smriti D, Kao TSA, Rathod R, Shin JY, Peng W, Williams J, et al. Motivational interviewing conversational agent for parents as proxies for their children in healthy eating: development and user testing. JMIR Hum Factors. 2022;9(4):e38908. [FREE Full text] [CrossRef] [Medline]
- Angell C, Alexander J, Hunt JA. ‘Draw, write and tell’: a literature review and methodological development on the ‘draw and write’ research method. J Early Child Res. 2014;13(1):17-28. [CrossRef]
- McWhirter J. The draw and write technique as a versatile tool for researching children's understanding of health and well-being. Int J Health Promot Educ. 2014;52(5):250-259. [CrossRef]
- Teela L, Verhagen LE, van Oers HA, Kramer EEW, Daams JG, Gruppen MP, et al. Pediatric patient engagement in clinical care, research and intervention development: a scoping review. J Patient Rep Outcomes. 2023;7(1):32. [FREE Full text] [CrossRef] [Medline]
- Wang Q, Hay M, Clarke D, Menahem S. Adolescents' drawings of their cardiac abnormality. Cardiol Young. 2011;21(5):556-561. [CrossRef] [Medline]
- Noonan RJ, Boddy LM, Fairclough SJ, Knowles ZR. Write, draw, show, and tell: a child-centred dual methodology to explore perceptions of out-of-school physical activity. BMC Public Health. 2016;16:326. [FREE Full text] [CrossRef] [Medline]
- Flynn R, Walton S, Scott SD. Engaging children and families in pediatric health research: a scoping review. Res Involv Engagem. 2019;5:32. [FREE Full text] [CrossRef] [Medline]
- Druin A. The role of children in the design of new technology. Behav Inf Technol. 2002;21(1):1-25. [CrossRef]
- Sanders EBN, Brandt E, Binder T. A framework for organizing the tools and techniques of participatory design. 2010. Presented at: Proceedings of the 11th Biennial Participatory Design Conference; 2010 November 29:195-198; New York, NY. [CrossRef]
- Read JC, MacFarlane S. Using the fun toolkit and other survey methods to gather opinions in child computer interaction. 2006. Presented at: Proceedings of the 2006 Conference on Interaction Design and Children; 2006 June 9; New York, NY. [CrossRef]
- Bradbury-Jones C, Taylor J. Engaging with children as co-researchers: challenges, counter-challenges and solutions. Int J Soc Res Methodol. 2013;18(2):161-173. [CrossRef]
- Abdelghani R, Law E, Desvaux C, Oudeyer PY, Sauzéon H. Interactive environments for training children's curiosity through the practice of metacognitive skills: a pilot study. Association for Computing Machinery; 2023. Presented at: Proceedings of the 22nd Annual ACM Interaction Design and Children Conference; 2023 June 23:495-501; New York, NY. [CrossRef]
- Boberg M, Karapanos E, Holopainen J, Lucero A. PLEXQ: Towards a Playful Experiences Questionnaire. New York, NY. Association for Computing Machinery; 2015. Presented at: Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play; 2015 October 7; London. [CrossRef]
- Kellett M, Forrest R, Dent N, Ward S. ‘Just teach us the skills please, we'll do the rest’: empowering ten‐year‐olds as active researchers. Children Soc. 2006;18(5):329-343. [CrossRef]
- Druin A. Cooperative inquiry: developing new technologies for children with children. New York, NY. Association for Computing Machinery; 1999. Presented at: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; 1999 May 20; Pittsburgh, PA. [CrossRef]
- Yale-Griffin Prevention Research Center. Food Label Literacy for Applied Nutrition Knowledge (FLLANK). URL: https://yalegriffinprc.griffinhealth.org/products-resources/prc-products/food-label-literacy [accessed 2023-12-20]
- Stals S, Voysey I, Baillie L. Let's make this fun!: activities to motivate children & teens to complete questionnaires. New York, NY. Association for Computing Machinery; 2024. Presented at: ACM/IEEE International Conference on Human-Robot Interaction; 2024 March 15; Boulder, CO. [CrossRef]
- Gaines KS, Curry ZD. The inclusive classroom: the effects of color on learning and behavior. J Fam Consum Sci Educ. 2011;29(1):46-57. [FREE Full text]
- Barbazi N, Wang CX. Perceiving through colors: visual supports for children with autism. Hum Factors Aging Spec Needs. 2023;88:104-112. [CrossRef]
- Hall L, Hume C, Tazzyman S. Five degrees of happiness: effective smiley face Likert scales for evaluating with children. New York, NY. Association for Computing Machinery; 2016. Presented at: Proceedings of the 15th International Conference on Interaction Design and Children; 2016 June 24:311-321; Manchester, UK. [CrossRef]
- National Cancer Institute at the National Institutes of Health. Health measures. URL: https://healthcaredelivery.cancer.gov/healthmeasures/ [accessed 2024-12-20]
- HealthMeasures. Neuro-QoL. 2024. URL: https://www.healthmeasures.net/explore-measurement-systems/neuro-qol [accessed 2024-02-17]
- Stein DM, Wrenn JO, Stetson PD, Bakken S. What "to-do" with physician task lists: clinical task model development and electronic health record design implications. AMIA Annu Symp Proc. 2009;2009:624-628. [FREE Full text] [Medline]
- Braun V, Clarke V. Thematic analysis. In: APA Handbook of Research Methods in Psychology Vol 2. Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological. Washington, DC. American Psychological Association; 2012:57-71.
- Creswell JW, Creswell JD. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Washington, DC. SAGE Publications; 2017.
- Martin B, Hanington B. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions. Gloucester, MA. Rockport Publishers; 2012.
- Creswell JW, Poth CN. Qualitative Inquiry and Research Design: Choosing Among Five Approaches. Washington, DC. SAGE Publications; 2016.
- DeJonckheere M, Vaughn LM. Semistructured interviewing in primary care research: a balance of relationship and rigour. Fam Med Community Health. 2019;7(2):e000057. [FREE Full text] [CrossRef] [Medline]
- de Souza RH, Dorneles CF. Searching & ranking questionnaires: an approach to calculate similarity between questionnaires. New York, NY. Association for Computing Machinery; 2019. Presented at: Proceedings of the ACM Symposium on Document Engineering 2019; 2019 September 26:1-9; Berlin, Germany. [CrossRef]
- Coyne I. Children's participation in consultations and decision-making at health service level: a review of the literature. Int J Nurs Stud. 2008;45(11):1682-1689. [CrossRef] [Medline]
- Hassan ZA, Schattner P, Mazza D. Doing a pilot study: why is it essential? Malays Fam Physician. 2006;1(2-3):70-73. [FREE Full text] [Medline]
- Schroder C, Medves J, Paterson M, Byrnes V, Chapman C, O'Riordan A, et al. Development and pilot testing of the collaborative practice assessment tool. J Interprof Care. 2011;25(3):189-195. [CrossRef] [Medline]
Abbreviations
CBCL: Child Behavior Checklist
CHD: congenital heart disease
CHD-HEEA: Congenital Heart Disease Healthcare Provider Educational Efficiency Assessment
CHD-HLCA: Congenital Heart Disease Health Literacy Children Assessment
CHD-PEBA: Congenital Heart Disease Parental Educational Burden Assessment
CHQ: Child Health Questionnaire
FIM: Family Impact Module
FLLANK: Food Label Literacy for Applied Nutrition Knowledge
KPI: key performance indicator
LSS: Lean Six Sigma
PCQLI: Pediatric Cardiac Quality of Life Inventory
PedsQL: Pediatric Quality of Life Inventory
PSI/PSS: Parenting Stress Index/Parental Stress Scale
Edited by A Mavragani; submitted 30.06.24; peer-reviewed by F Yang, LB Carey; comments to author 13.11.24; revised version received 02.12.24; accepted 02.12.24; published 27.01.25.
Copyright © Neda Barbazi, Ji Youn Shin, Gurumurthy Hiremath, Carlye Anne Lauff. Originally published in JMIR Formative Research (https://formative.jmir.org), 27.01.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.