Depression is associated with a negative bias in the interpretation of facial emotional expressions [1,2]. This bias has been proposed to play an important role in the onset and maintenance of depression, as successful pharmacological interventions are associated with reductions in negative bias [3]. Accordingly, it was previously assumed that treating depression would also improve this emotion recognition bias.
JMIR Serious Games 2025;13:e65103
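The negative interpretation bias described above is often operationalized as the proportion of ambiguous facial stimuli that a participant labels with a negative emotion. A minimal sketch of that computation (the label set, function name, and example data are all hypothetical, not from the cited study):

```python
# Hypothetical sketch: a negative interpretation bias score computed as
# the share of ambiguous-face trials labeled with a negative emotion.
NEGATIVE_LABELS = {"sad", "angry", "fearful", "disgusted"}

def negative_bias(responses):
    """responses: emotion labels given to ambiguous face stimuli."""
    if not responses:
        raise ValueError("no responses recorded")
    negative = sum(1 for label in responses if label in NEGATIVE_LABELS)
    return negative / len(responses)

# Example: 6 of 8 ambiguous faces read as negative -> bias score 0.75
score = negative_bias(["sad", "angry", "happy", "sad",
                       "fearful", "neutral", "sad", "angry"])
print(score)
```

A score near 0.5 on genuinely ambiguous stimuli would suggest no bias; treatment response would be tracked as a decrease in this score over time.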

Confidentiality concerns can limit the traditional use of patient photographs, especially when facial features are essential [4]. Using widely available AI text-to-image tools, we aimed to create images portraying distinct facial signs important for medical trainees: hypothyroidism (myxedema) and Horner syndrome [5,6]. These tools generate unique, high-quality images from text prompts, drawing on learned probability distributions rather than preexisting images [7].
JMIR Med Educ 2024;10:e52155
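The workflow above hinges on composing a text prompt that names the clinical sign and its visible features before submitting it to a text-to-image tool. A minimal sketch of such prompt composition (the helper name and template wording are illustrative assumptions, not the authors' actual prompts):

```python
# Hypothetical helper: composing a text prompt for an AI text-to-image
# tool to depict a clinical facial sign. The template wording is an
# illustrative assumption, not the prompt used in the cited study.
def sign_prompt(sign, features):
    """Build a descriptive prompt from a sign name and visible features."""
    feature_list = ", ".join(features)
    return (f"Clinical teaching photo of a patient's face showing {sign}: "
            f"{feature_list}. Neutral background, realistic lighting.")

prompt = sign_prompt("hypothyroidism (myxedema)",
                     ["periorbital puffiness", "dry coarse skin",
                      "loss of the outer third of the eyebrows"])
print(prompt)
```

The resulting string would then be passed to whichever text-to-image service is in use; because the image is synthesized from learned distributions, no real patient's likeness is exposed.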

The perception and processing of facial expressions and emotions through images is a long-standing research field [1], and the use of facial emotion stimuli has become increasingly common. Various sets of facial expressions have been developed for research purposes, covering different expressions across ethnicities [2]. The need for facial expression samples of differing ethnicities follows the rationale that emotions are processed more readily for "in-group" faces than for "out-group" faces.
JMIR Form Res 2023;7:e44632

In this regard, the evaluation of facial expressions in children is common to various existing observational pain assessment tools and is a valid means of assessing pain [6]. However, in clinical practice these facial expressions are evaluated through direct observation, a process limited by the challenges of human decoding and by inherent subjectivity [7,8].
J Med Internet Res 2023;25:e41992

More recently, reduced facial expressivity and movement, measured with standardized coding schemes applied to videos of patient interviews, differentiated depressed patients with and without suicide risk [9], and altered vocal characteristics have been observed in acutely suicidal patients [5].
A number of visual and auditory characteristics can be quantified directly, including gross motor activity [10], head movement variability [11-13], facial activity [14], and properties of speech [15].
J Med Internet Res 2021;23(6):e25199
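One of the quantifiable characteristics listed above, head movement variability, can be derived from a per-frame series of tracked head positions. A minimal sketch, assuming positions are available as (x, y) pixel coordinates (the function name and example trajectory are hypothetical, not from the cited studies):

```python
import math
import statistics

# Hypothetical sketch: head movement variability as the standard
# deviation of frame-to-frame head displacement magnitudes, given a
# per-frame series of tracked (x, y) head positions.
def head_movement_variability(positions):
    """Population std. dev. of consecutive-frame displacement magnitudes."""
    displacements = [
        math.dist(prev, curr) for prev, curr in zip(positions, positions[1:])
    ]
    return statistics.pstdev(displacements)

# Example trajectory: mostly still, then a larger shift between frames.
track = [(100, 100), (101, 100), (101, 101), (110, 108)]
print(head_movement_variability(track))
```

A near-motionless patient yields a value close to zero, while irregular bursts of movement inflate it; analogous per-frame series could quantify gross motor or facial activity.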