Search Results (1 to 7 of 7 Results)

BERTopic [29] is an algorithm that uses pretrained embedding models to create word and document embeddings, so that documents occupying nearby regions of the vector space can be grouped together to form topics. By default, BERTopic combines Bidirectional Encoder Representations from Transformers (BERT) embeddings with a term frequency–inverse document frequency algorithm that weighs the importance of terms within each cluster and builds each topic's term representation from those weights [60].
JMIR Infodemiology 2025;5:e65632
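The class-based TF-IDF weighting mentioned above can be sketched in a few lines of plain Python. This is a toy illustration of the weighting scheme (the clusters and tokens are invented), not the BERTopic implementation: each cluster is treated as one long document, and a term's weight is its within-cluster frequency scaled by log(1 + A / tf), where A is the average token count per cluster and tf is the term's corpus-wide frequency.

```python
import math
from collections import Counter

def c_tf_idf(classes):
    """Toy class-based TF-IDF (c-TF-IDF) over pre-clustered token lists.

    `classes` maps a topic id to the tokens of all documents in that
    cluster. Weight of term t in class c: tf(t, c) * log(1 + A / tf(t)),
    where A is the average token count per class and tf(t) is the
    term's frequency across all classes.
    """
    counts = {c: Counter(tokens) for c, tokens in classes.items()}
    total = Counter()
    for cnt in counts.values():
        total.update(cnt)
    avg_len = sum(len(tokens) for tokens in classes.values()) / len(classes)
    return {
        c: {t: tf * math.log(1 + avg_len / total[t]) for t, tf in cnt.items()}
        for c, cnt in counts.items()
    }

clusters = {
    0: "app phone app mobile health phone".split(),
    1: "depression mood therapy mood".split(),
}
weights = c_tf_idf(clusters)
top = max(weights[0], key=weights[0].get)  # most representative term of cluster 0
```

Terms frequent inside one cluster but rare elsewhere (here, "mood" for cluster 1) receive the highest weights and become that topic's representative terms.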

Probing Public Perceptions of Antidepressants on Social Media: Mixed Methods Study
We used BERTopic [44], a topic modeling approach leveraging transformers and class-based term frequency-inverse document frequency to generate coherent topics. To enhance interpretability, we used GPT-4 to refine topic labels by analyzing keywords and representative documents. The following prompt was used (Textbox 1).
JMIR Form Res 2025;9:e62680
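The study's actual prompt (Textbox 1) is not reproduced in the excerpt, but the label-refinement step it describes — sending a topic's keywords and representative documents to an LLM — can be sketched with a hypothetical prompt template. The function name and wording below are assumptions for illustration, not the authors' prompt.

```python
# Hypothetical prompt template for LLM-based topic-label refinement;
# the study's actual prompt (Textbox 1) is not reproduced here.
def build_label_prompt(keywords, representative_docs):
    """Assemble a prompt asking an LLM to name a topic from its
    c-TF-IDF keywords and a few representative documents."""
    docs = "\n".join(f"- {d}" for d in representative_docs)
    return (
        "You are labeling topics from a topic model.\n"
        f"Keywords: {', '.join(keywords)}\n"
        f"Representative documents:\n{docs}\n"
        "Return a short, human-readable topic label."
    )

prompt = build_label_prompt(
    ["ssri", "side", "effects", "weight"],
    ["Started an SSRI last month and the side effects are rough."],
)
```

The resulting string would then be sent to the LLM's chat endpoint, and the returned label substituted for the raw keyword list.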

BERTopic [17] is a more recent topic-modeling technique that has gained popularity for its ease of interpretation and ability to leverage Hugging Face transformers and class-based Term Frequency–Inverse Document Frequency (c-TF-IDF) to create dense clusters.
JMIR Public Health Surveill 2024;10:e59193

Built on the foundations of bidirectional encoder representations from transformers (BERT), BERTopic introduces a novel approach to topic modeling [29,30]. Unlike traditional unsupervised models such as latent Dirichlet allocation, which rely on the “bag-of-words” model [31], BERTopic avoids the loss of semantic information, significantly improving the accuracy of the generated topics and yielding more interpretable term compositions for each topic, which greatly facilitates their classification.
J Med Internet Res 2024;26:e48330
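The semantic information loss attributed to bag-of-words models above is easy to demonstrate: two paraphrases that share no tokens have zero bag-of-words similarity, even though a BERT-style embedding would place them close together. A minimal sketch with invented sentences:

```python
import math
from collections import Counter

def bow_cosine(a, b):
    """Cosine similarity between bag-of-words vectors of two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * math.sqrt(
        sum(v * v for v in vb.values())
    )
    return dot / norm

# Paraphrases with no shared tokens: bag-of-words sees them as unrelated,
# while a BERT-style embedding would score them as highly similar.
sim = bow_cosine("physicians treat patients", "doctors care for the sick")
```

BERTopic's contextual embeddings sidestep this failure mode by comparing meaning rather than exact token overlap.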

Machine Learning–Based Approach for Identifying Research Gaps: COVID-19 as a Case Study
For clustering the sentences into semantically similar topics, we used the BERTopic algorithm [25], an unsupervised learning algorithm for topic modeling built on Bidirectional Encoder Representations from Transformers (BERT). BERTopic does not require labeled data, as it extracts topics from the input text in an unsupervised way [26].
JMIR Form Res 2024;8:e49411
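Because no labels are involved, the grouping step can be illustrated with any distance-based clustering over document embeddings. The sketch below uses a greedy distance threshold over invented 2-D coordinates as a stand-in for the dimensionality reduction and density-based clustering (UMAP + HDBSCAN) that BERTopic runs by default:

```python
import math

def cluster(points, eps=1.0):
    """Greedy distance-threshold clustering over toy 2-D 'embeddings' --
    a stand-in for the UMAP + HDBSCAN steps BERTopic actually runs.
    No labels are used: groups emerge from geometry alone."""
    labels, centers = [], []
    for x, y in points:
        for i, (cx, cy) in enumerate(centers):
            if math.hypot(x - cx, y - cy) <= eps:
                labels.append(i)
                break
        else:  # no nearby center: start a new cluster
            centers.append((x, y))
            labels.append(len(centers) - 1)
    return labels

# Two tight groups of sentence embeddings (toy coordinates).
labels = cluster([(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)])
```

The two nearby pairs fall into separate groups without any labeled examples, which is the sense in which the excerpt calls the method unsupervised.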

At this stage, we did not perform any further data cleaning, so as to preserve the natural structure of the comments: the BERTopic library was developed for natural text and has its own mechanisms for handling noise and outliers.
BERTopic is a topic modeling technique that uses state-of-the-art language models and applies a class-based term frequency–inverse document frequency (c-TF-IDF) procedure, which calculates how relevant a word is to a class of documents, to generate topic representations [22].
J Med Internet Res 2023;25:e45249
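BERTopic's built-in handling of noise, mentioned above, comes from its density-based clustering: documents that fall in no dense region are assigned the outlier topic -1 rather than being forced into a topic. A minimal sketch of working with such assignments (the topic list here is invented):

```python
# Toy per-document topic assignments, in the shape BERTopic's
# fit_transform returns; -1 marks outlier documents.
topics = [0, 0, 1, -1, 1, -1, 2]

outlier_rate = topics.count(-1) / len(topics)   # share of noise documents
kept = [t for t in topics if t != -1]           # assignments used for analysis
```

Inspecting the outlier rate is a quick sanity check that skipping aggressive pre-cleaning did not leave most documents unassigned.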

Examples of recently developed neural topic models include top2vec [10] and BERTopic [11]. In this study, we focused on the BERTopic model.
BERTopic begins by embedding the documents observed in the study corpus into a latent embedding space. Many methods exist for embedding discrete linguistic units (words, sentences, paragraphs, documents, etc) into such a space.
JMIR Med Inform 2022;10(12):e40102
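The embedding step described above maps each document to a point in a vector space; any function with that shape can stand in for illustration. As a crude, invented substitute for the pretrained sentence encoders BERTopic typically uses, the sketch below averages fixed 2-D word vectors so that thematically similar documents land near each other:

```python
# Toy document embedding: average fixed word vectors (an invented stand-in
# for the pretrained sentence encoders BERTopic typically uses).
WORD_VECS = {
    "phone": (1.0, 0.0), "app": (0.9, 0.1),
    "mood": (0.0, 1.0), "therapy": (0.1, 0.9),
}

def embed(doc):
    """Map a document to a point by averaging its known word vectors."""
    vecs = [WORD_VECS[w] for w in doc.split() if w in WORD_VECS]
    n = len(vecs)
    return tuple(sum(c) / n for c in zip(*vecs))

e1 = embed("phone app")     # technology-flavored document
e2 = embed("mood therapy")  # mental-health document
```

In the real pipeline, a pretrained encoder would produce these vectors, and the clustering and c-TF-IDF steps would then operate on them.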