%0 Journal Article %@ 2561-326X %I JMIR Publications %V 8 %N %P e53918 %T Chinese Oncologists’ Perspectives on Integrating AI into Clinical Practice: Cross-Sectional Survey Study %A Li,Ming %A Xiong,XiaoMin %A Xu,Bo %A Dickson,Conan %+ Department of Health Policy and Management, Bloomberg School of Public Health, Johns Hopkins University, 615 North Wolfe Street, Baltimore, MD, 21205, United States, 1 410 955 3543, cdickso1@jh.edu %K artificial intelligence %K AI %K machine learning %K oncologist %K concern %K clinical practice %D 2024 %7 5.6.2024 %9 Original Paper %J JMIR Form Res %G English %X Background: The rapid development of artificial intelligence (AI) has generated significant interest in its potential applications in oncology. Although AI-powered tools are already being implemented in some Chinese hospitals, their integration into clinical practice raises several concerns for Chinese oncologists. Objective: This study aims to explore the concerns of Chinese oncologists regarding the integration of AI into clinical practice and to identify the factors influencing these concerns. Methods: A total of 228 Chinese oncologists participated in a cross-sectional web-based survey conducted from April to June 2023 in mainland China. The survey gauged their concerns about AI using multiple-choice questions and evaluated their views on the statements “The impact of AI on the doctor-patient relationship” and “AI will replace doctors.” The data were analyzed using descriptive statistics, and univariate analyses were used to examine associations between the oncologists’ backgrounds and their concerns. Results: The study revealed that the most prominent concerns were the potential for AI to mislead diagnosis and treatment (163/228, 71.5%); overreliance on AI (162/228, 71%); data and algorithm bias (123/228, 54%); issues with data security and patient privacy (123/228, 54%); and a lag in laws, regulations, and policies keeping pace with AI’s development (115/228, 50.4%). Oncologists with a bachelor’s degree expressed heightened concerns about data and algorithm bias (34/49, 69%; P=.03) and the lag in laws, regulations, and policies (32/49, 65%; P=.046). Regarding AI’s impact on doctor-patient relationships, 53.1% (121/228) saw a positive impact, whereas 35.5% (81/228) found it difficult to judge, 9.2% (21/228) feared increased disputes, and 2.2% (5/228) believed there would be no impact. Although sex differences were not significant (P=.08), perceptions varied: male oncologists tended to be more positive than female oncologists (74/135, 54.8% vs 47/93, 50.5%). Oncologists with a bachelor’s degree (26/49, 53%; P=.03) and experienced clinicians (≥21 years; 28/56, 50%; P=.054) found it the hardest to judge. Those with IT experience were significantly more positive (25/35, 71%) than those without (96/193, 49.7%; P=.02). Opinions on the possibility of AI replacing doctors were diverse: 23.2% (53/228) strongly disagreed, 14% (32/228) disagreed, 29.8% (68/228) were neutral, 16.2% (37/228) agreed, and 16.7% (38/228) strongly agreed. There were no significant associations with demographic or professional factors (all P>.05). Conclusions: Addressing oncologists’ concerns about AI requires collaborative efforts from policy makers, developers, health care professionals, and legal experts. Emphasizing transparency, human-centered design, bias mitigation, and education about AI’s potential and limitations is crucial.
Through close collaboration and a multidisciplinary strategy, AI can be effectively integrated into oncology, balancing benefits with ethical considerations and enhancing patient care. %M 38838307 %R 10.2196/53918 %U https://formative.jmir.org/2024/1/e53918 %U https://doi.org/10.2196/53918 %U http://www.ncbi.nlm.nih.gov/pubmed/38838307