Background: Neurocognitive disorders are often accompanied by behavioral symptoms such as anxiety, depression, and/or apathy. These symptoms can occur very early in the disease progression and are often difficult to detect and quantify in nonspecialized clinical settings.
Objective: In this study, we focus on apathy, one of the most common and debilitating neuropsychiatric symptoms in neurocognitive disorders. Specifically, we investigated whether facial expressivity, extracted through computer vision software, correlates with the severity of apathy symptoms in elderly subjects with neurocognitive disorders.
Methods: A total of 63 subjects (38 females and 25 males) with neurocognitive disorder participated in the study. Apathy was assessed using the Apathy Inventory (AI), a scale comprising 3 domains of apathy: loss of interest, loss of initiation, and emotional blunting. The higher the scale score, the more severe the apathy symptoms. Participants were asked to recall a positive and a negative event of their life while their voice and face were recorded using a tablet device. Action units (AUs), which are basic facial movements, were extracted using OpenFace 2.0. The intensity and presence of 17 AUs were extracted for each video frame in both positive and negative storytelling. The average intensity and activation frequency of each AU were calculated for each participant in each video. Partial correlations (controlling for the level of depression and cognitive impairment) were performed between these indexes and the AI subscales.
Results: AU intensity and frequency were negatively correlated with apathy scale scores, in particular with the emotional blunting component. The more severe the apathy symptoms, the less expressivity participants displayed in specific emotional and nonemotional AUs while recalling an emotional event. Different AUs showed significant correlations depending on the sex of the participant and the task’s valence (positive vs negative story), suggesting the importance of assessing male and female participants independently.
Conclusions: Our study suggests the value of employing computer vision-based facial analysis to quantify facial expressivity and assess the severity of apathy symptoms in subjects with neurocognitive disorders. This may represent a useful tool for preliminary apathy assessment in nonspecialized settings and could be used to complement classical clinical scales. Future studies including larger samples should confirm the clinical relevance of this kind of instrument.
Apathy is one of the most common neuropsychiatric symptoms in neurocognitive disorders (NCDs) such as Alzheimer disease, Parkinson disease, Huntington disease, and vascular dementia, and it is also prevalent in several psychiatric pathologies such as schizophrenia and major depression. It can be defined as a reduction in goal-directed behavior that persists over time, causing impairment in global functioning. Three dimensions of apathy have been identified: loss/reduction in goal-directed behavior and goal-directed cognitive activity (eg, reduced interests and reduced indoor and outdoor activities), emotions (eg, emotional blunting), and social interactions (eg, reduced interactions with family members and friends). Apathy is a debilitating symptom: it significantly decreases the quality of life of patients with NCD and their caregivers, increases the risk of institutionalization in outpatients, and increases mortality in nursing home residents (even after controlling for depression). Apathy has also been associated with faster cognitive and functional decline. A recent study suggested that apathy significantly increases the risk of developing dementia, resulting in a loss of autonomy in activities of daily living. As apathetic persons interact less with their surroundings and show less interest in their family and in leisure activities, they are significantly less stimulated, which may accelerate the progression of NCD. Critically, preliminary evidence suggests that interventions targeting apathy in people with mild cognitive impairment (through repetitive transcranial magnetic stimulation) may be effective in improving global cognitive functioning, suggesting that identifying apathy early in disease progression and putting early treatment options in place could offer new opportunities for dementia prevention.
Currently, apathy is assessed using various clinical scales or questionnaires (including self-reports and scales completed by caregivers and/or the clinician; see Radakovic et al for a review), which allow for quantification of apathy symptoms on continuous scales. Furthermore, apathy can be assessed using the Apathy Diagnostic Criteria (ADC), which allow for classification of patients as apathetic versus nonapathetic based on the observed symptomatology. All these instruments carry a risk of bias resulting from the assessor’s subjectivity. For instance, self-report clinical scales rely on the patient’s ability to recall a change in their activities, interests, or emotional reactivity, which might be difficult considering that this requires memory and insight to be preserved to some extent. Similarly, the clinician may have access only to limited information concerning the patient’s changes in everyday activities, resulting in underestimation of apathy.
Another obstacle is that apathy can often be misdiagnosed as depression because of frequent comorbidities such as fatigue and anhedonia and a considerable overlap in key symptoms such as diminished interest, psychomotor retardation, and social withdrawal. However, apathy and depression can be dissociated based on emotional deficits: the emotion dimension in apathy can be described as a blunted affect, whereas depression is characterized by the presence of negative emotions and sadness. Early differential diagnosis is of great importance, since treating apathy with antidepressants can aggravate symptoms. Therefore, there is an urgent need to detect apathy at early stages using objective, noninvasive methods to provide timely treatment and prevent it from aggravating cognitive symptoms. There is today growing interest in finding new sensitive measures to detect behavioral aspects of dementia, as they may represent a heavier burden for caregivers than the cognitive dysfunctions themselves.
New information and communication technologies can provide objective and more sensitive measures of human behaviors and have been recommended for the assessment of neurocognitive disorders and apathy. The behavioral correlates of apathy have been investigated through oculomotor movement using eye tracking, global activity or sleep disturbances using accelerometers, voice features, and facial expressivity. The added value of combining multimodal measures has been demonstrated before in depression by merging audio features with facial activity. Associating these new markers with other recent methods such as ecological momentary assessment, which allows patients to report symptoms remotely, could represent the future of more naturalistic psychiatric evaluations. These techniques allow continuous tracking of changes in mood and cognition, enabling timely prevention of further decline.
Today, facial expressivity, which may be altered in apathetic subjects, can be measured automatically by means of computer vision-based facial analysis methods. The Facial Action Coding System (FACS) allows detection of facial behavior by identifying specific facial muscle movements called action units (AUs). Some AUs are specifically linked to emotions, such as AU 6 (cheek raiser) and AU 12 (lip corner puller, smile) being linked to joy. Some AUs are mainly associated with positive emotions, while others are more associated with negative emotions (see the table below for a description of AUs). Girard and colleagues found that automatic facial expression analysis was consistent with manual coding, validating its use in clinical research. In their study, they analyzed AUs one by one to extract a pattern linked to depression. Seidl and colleagues studied the association of cognitive decline and facial expressivity. Their results showed that apathy moderated the effect of cognitive decline on facial expression in Alzheimer disease and was significantly correlated with decreased general and specific facial expressivity.
|AUa number||FACSb name||Emotion valence related|
|1||Inner brow raiser||–c|
|2||Outer brow raiser||+d|
|5||Upper lid raiser||–|
|10||Upper lip raiser||–|
|12||Lip corner puller||+|
|15||Lip corner depressor||–|
aAU: action unit.
bFACS: Facial Action Coding System.
cNegative emotional valence.
dPositive emotional valence.
eEither positive or negative valence.
In a previous pilot study, we aimed to identify facial behavior associated with apathy through automated video analysis and to investigate how facial variations (expression and movement) could help characterize this symptom. Our algorithm was able to classify subjects as apathetic versus nonapathetic with 84% accuracy. To validate the deep learning algorithm, we relied on the leave-one-out cross-validation technique, which consists of training a model on n–1 available subjects and validating it on the remaining subject. In another research work, we employed a different approach with the same dataset: here again we used deep learning techniques to build models but added audio features to the video ones. The final model was 76% accurate in discriminating apathetic versus nonapathetic subjects.
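The leave-one-out procedure described above can be sketched as follows; the classifier and the synthetic data are placeholders for illustration, not the deep learning model or the clinical dataset used in the study.

```python
# Leave-one-out cross-validation: train on n-1 subjects, validate on the
# held-out subject, and average accuracy over all n folds.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))           # 20 synthetic subjects x 5 features
y = (X[:, 0] > 0).astype(int)          # stand-in binary label (apathetic or not)

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LogisticRegression().fit(X[train_idx], y[train_idx])
    correct += int(model.predict(X[test_idx])[0] == y[test_idx][0])

accuracy = correct / len(X)
print(f"LOOCV accuracy: {accuracy:.2f}")
```

Each subject serves as the validation case exactly once, which makes full use of a small sample at the cost of n model fits.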
Despite these promising findings, a key limitation of deep learning techniques for such studies is that the generally low number of included subjects makes it difficult to generalize the resulting models. In addition, while correct classification of apathetic versus nonapathetic subjects is an important challenge, from a clinical point of view it is crucial to understand precisely which features are most sensitive for apathy assessment and to investigate the links between these features and the degree of apathy severity, as well as its subdomains, as measured by continuous scales.
Therefore, with this exploratory study we aim to better understand the quantitative relationship between facial expressivity and apathy and its subdomains (emotional blunting, loss of initiation, loss of interest), independently of depression and level of cognitive decline, which have previously been shown to affect emotional expressivity. For this, we included a larger sample of participants and correlated basic facial AU activation and intensity with apathy scale scores.
We hypothesize that the higher the apathy score of a participant, the less frequent and less intense their facial expressivity would be, especially for AUs involved in positive or negative emotions.
A total of 63 subjects were included in the study; 7 participants had subjective memory complaints and the rest were diagnosed with NCDs (39 mild and 17 major), including 11 subjects with Alzheimer disease, 22 subjects with vascular dementia, and 11 subjects with affective disorders. Participants were recruited from the Memory Center of Nice University Hospitals, France, and from the Cognition Behavioral Technology research lab of the Université Côte d’Azur in the context of a motivation activation research protocol. Participants with mild NCD had previously been followed at the Memory Center. Participants were not included if they had sensory or motor impairments interfering with completion of the protocol. For each participant, clinicians assessed the global level of cognitive impairment using the Mini-Mental State Examination (MMSE), the presence of depression using the Neuropsychiatric Inventory (NPI), the presence of apathy using the ADC, and the severity of apathy symptoms using the Apathy Inventory (AI). The AI is divided into three subscales: loss of initiation (AI-initiation), loss of interest (AI-interest), and emotional blunting (AI-affect). The NPI depression score is calculated from the rated intensity (1=mild, 2=moderate, 3=severe) and frequency (1=rarely to 4=very often) of symptoms. The demographic and clinical profiles of participants are presented below. Additional demographic features (including comparisons between apathetic and nonapathetic subjects based on the ADC) are reported in the auxiliary tables.
|Characteristic||Total (n=63), mean (SD)||Females (n=38), mean (SD)||Males (n=25), mean (SD)||Cohen d|
|Age in years||72.95 (8.40)||72.66 (8.89)||73.40 (7.76)||0.09|
|MMSEa||24.06 (3.84)||23.68 (4.09)||24.64 (3.43)||0.25|
|NPIb depression||1.17 (1.95)||1.13 (2.03)||1.24 (1.85)||0.06|
|AIc affect||0.33 (0.74)||0.24 (0.68)||0.48 (0.82)||0.33|
|AI initiation||1.32 (1.47)||1.00 (1.38)||1.80 (1.50)||0.56|
|AI interest||1.17 (1.33)||0.92 (1.30)||1.56 (1.29)||0.49|
|AI total||2.81 (3.03)||2.11 (2.83)||3.88 (3.07)||0.61|
aMMSE: Mini-Mental State Examination.
bNPI: Neuropsychiatric Inventory.
cAI: Apathy Inventory.
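The NPI depression score described above can be illustrated with a minimal sketch, assuming the standard NPI convention of multiplying frequency by severity; the function name is hypothetical, not from the study’s code.

```python
def npi_domain_score(frequency: int, severity: int) -> int:
    """NPI domain score: frequency (1=rarely to 4=very often) x severity (1=mild to 3=severe)."""
    if not (1 <= frequency <= 4 and 1 <= severity <= 3):
        raise ValueError("frequency must be 1-4 and severity 1-3")
    return frequency * severity

# A symptom occurring often (frequency 3) with mild severity (1) scores 3.
print(npi_domain_score(3, 1))  # -> 3
```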
The study was performed as defined in the Declaration of Helsinki. The protocol was approved by the ethics committee (Comité de Protection de Personnes—CPP Est III, France; MoTap: RCB ID No. 2017-A01366-4). Informed written consent was obtained from all participants before the study.
Free Emotional Speech Task
Participants were asked to recall a positive and a negative event of their life in a maximum of 1 minute each. This free speech task requires only a low cognitive load and is expected to trigger an emotional response. Participants were audio and video recorded using a tablet device in a quiet room of the Resources and Research Memory Center at the Claude Pompidou Institut in Nice. The tablet was facing the subjects but did not hide the investigator, allowing a more natural interaction. In a previous study, we analyzed the audio recordings alone and found that certain voice features were associated with the presence of apathy.
Video Features Extraction
For feature extraction, we used OpenFace 2.0, an open-source facial behavior analysis toolkit based on deep neural networks. OpenFace can detect a single face or multiple faces in an image using pretrained models. Many basic features can be extracted, such as facial landmarks, eye positions, and head positions. Higher level features are also provided by OpenFace for a video: head direction and movements, eye gaze estimation, and expressed emotions. Emotions are mainly computed by combining several AUs. Since the accuracy of emotion detection is highly dependent on the datasets used for training the neural networks, and since we are working with a small dataset of elderly people that does not allow us to retrain the model, the accuracy obtained for detected emotions was unsatisfactory. We therefore investigated the use of AUs as middle-level features to detect apathy. We used the 17 main AUs computed by OpenFace: 1, 2, 4, 5, 6, 7, 9, 10, 12, 14, 15, 17, 20, 23, 25, 26, and 45. For each frame of the recorded videos, OpenFace provides AU intensity and presence. We then calculated 2 measures for each AU in both videos: the average intensity and the activation frequency (the fraction of frames in which the AU was present). Intensity ranges from 0 (absent) through 1 (present at minimum intensity) to 5 (present at maximum intensity), with continuous values in between. The mean intensity for each AU was calculated as the average score across all video frames. To obtain a more general measure of AU activation, we also computed for each subject the mean activation and intensity across all AUs.
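The two per-video measures can be computed from OpenFace’s per-frame output roughly as follows; the column naming (intensity columns ending in `_r` on a 0 to 5 scale, presence columns ending in `_c` as 0/1) follows the OpenFace 2.0 CSV convention, and the frame values here are invented for illustration.

```python
import pandas as pd

# Toy stand-in for a few frames of one participant's video, using
# OpenFace-style columns: AUxx_r = intensity (0-5), AUxx_c = presence (0/1).
frames = pd.DataFrame({
    "AU01_r": [0.0, 1.2, 2.5, 0.8],
    "AU01_c": [0, 1, 1, 1],
    "AU12_r": [0.0, 0.0, 3.1, 2.9],
    "AU12_c": [0, 0, 1, 1],
})

intensity_cols = [c for c in frames.columns if c.endswith("_r")]
presence_cols = [c for c in frames.columns if c.endswith("_c")]

mean_intensity = frames[intensity_cols].mean()    # average intensity per AU
activation_freq = frames[presence_cols].mean()    # fraction of frames with the AU active

# Per-subject global summaries across all AUs, as used in the analyses.
global_intensity = mean_intensity.mean()
global_frequency = activation_freq.mean()
print(global_intensity, global_frequency)
```

In the real pipeline the same summaries would be computed over all 17 AUs and all frames of each recorded story.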
Statistical analyses were performed using SPSS Statistics version 23.0.0 for Mac (IBM Corporation) and R version 3.6.3 (R Foundation for Statistical Computing). To investigate the relationships between apathy and facial expressivity, we performed partial correlations between AU activation and intensity and the AI subscales, using depression (NPI depression) and level of cognitive impairment (MMSE score) as covariates, for both the positive and the negative story. As most of the clinical scales were not normally distributed (as indexed by Shapiro-Wilk tests), Spearman rho correlations were employed. We controlled for the effect of cognitive impairment because it might affect the task itself of recalling an emotional event, and we similarly controlled for the effects of depressive symptoms. In the analyses run on each specific AU, we separated males and females, as studies have found that women and men can express their emotions differently.
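A partial Spearman correlation of the kind used here can be sketched by ranking all variables, regressing the covariate ranks out of the two scores, and correlating the residuals. The data below are synthetic, with a negative association built in; the variable names are illustrative stand-ins for the study’s measures.

```python
import numpy as np
from scipy import stats

def partial_spearman(x, y, covariates):
    """Spearman correlation between x and y, controlling for covariates."""
    rx, ry = stats.rankdata(x), stats.rankdata(y)
    # Design matrix: intercept plus the ranked covariates.
    Z = np.column_stack([np.ones(len(rx))] + [stats.rankdata(c) for c in covariates])
    # Residualize the ranks against the covariate ranks.
    res_x = rx - Z @ np.linalg.lstsq(Z, rx, rcond=None)[0]
    res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
    return stats.pearsonr(res_x, res_y)

rng = np.random.default_rng(1)
au_intensity = rng.normal(size=63)                      # stand-in AU measure
ai_affect = -0.4 * au_intensity + rng.normal(size=63)   # built-in negative link
npi_depression = rng.normal(size=63)                    # covariate stand-ins
mmse = rng.normal(size=63)

rho, p = partial_spearman(au_intensity, ai_affect, [npi_depression, mmse])
print(f"partial rho = {rho:.2f}, P = {p:.3f}")
```

Correlating residuals of ranked variables is one standard way to obtain a Spearman partial correlation; dedicated routines (eg, in R or SPSS, as used in the study) give equivalent estimates.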
A total of 63 subjects (38 females and 25 males) were included in the study. The demographic and clinical profiles of participants are presented above. Sociodemographic features of apathetic and nonapathetic subjects and correlations among the different clinical scales are reported in the auxiliary tables.
The average activation frequency and intensity for each AU are reported in the auxiliary tables. In the positive story, the average global intensity of all AUs was 0.52 (SD 0.15) and the average global frequency was 0.29 (SD 0.07). In the negative story, the average global intensity was 0.51 (SD 0.28) and the global frequency was 0.28 (SD 0.08). Average presence and intensity for each AU for males and females are also presented in the auxiliary tables. In the negative story, AU 7 was the most intense facial expression in the whole sample (mean 1.60 [SD 0.84] on a 0 to 5 scale), followed by AU 10 (mean 1.02 [SD 0.47]) and AU 4 (mean 0.45 [SD 0.57]). AU 5 was the most frequently seen in the video (mean 0.68 [SD 0.30]), followed by AU 4 (mean 0.37 [SD 0.34]) and AU 7 (mean 0.37 [SD 0.31]). In the positive story, AU 7 was also the most intense (mean 1.68 [SD 0.86]), followed by AU 10 (mean 1.10 [SD 0.50]) and AU 6 (mean 0.85 [SD 0.53]). AU 5 was the most frequently seen (mean 0.63 [SD 0.33]), followed by AU 7 (mean 0.40 [SD 0.32]) and AU 23 (mean 0.39 [SD 0.29]).
AU Correlations to Scales
A general analysis considering the global activation and intensity across AUs revealed a small but significant negative partial correlation between global AU intensity and the AI-affect score in the negative story (rs(61)=–0.26, P=.04), suggesting that more apathetic participants showed lower AU intensity in the negative story. No other significant correlation was found between the apathy scales and global activation.
Spearman partial correlations between each AU and apathy scale score, controlled for the NPI depression and MMSE scores and divided by sex, are presented below. Only significant correlations are reported; all correlations are presented in the auxiliary tables.
For females, we found significant negative partial correlations between AUs and apathy, predominantly with the AI-affect subscale score, in both negative and positive stories (all of medium effect size, rho ranging from –0.33 to –0.47). All correlations were negative: the more severe the AI-affect score, the less frequent or less intense the AU. Specifically, significant correlations were found for AU 1, 2, 9, 10, 12, 25, and 45. In the positive story, the mean activations of AU 10 and 25 were correlated with the AI-affect score. In the negative story, the mean activations of AU 1, 2, 9, 10, 12, and 45 were correlated with AI-affect scores, the mean activations of AU 10 and 12 were correlated with the AI-total score, and the mean activation of AU 10 was correlated with the AI-initiation score.
For males, significant correlations were found for AU 1, 12, 14, 15, 17, 20, and 45. All of these correlations were negative, with effect sizes ranging from medium to large (rho ranging from –0.42 to –0.63). In the positive story, the mean intensity of AU 1 was correlated with the AI-initiation score, AU 12 with the AI-interest score, AU 14 with the AI-total score, and AU 17 with the AI-affect score. The mean activation of AU 14 was correlated with the AI-initiation, AI-interest, and AI-total scores, and the mean activation of AU 45 with the AI-interest and AI-total scores. In the negative story, the mean intensity of AU 1 was correlated with the AI-initiation and AI-total scores, and the mean intensities of AU 15, 17, and 20 with the AI-affect score. The mean activation of AU 28 was correlated with the AI-initiation and NPI-apathy scores.
|Inner brow raiser||–0.340||—c||—||—|
|Outer brow raiser||–0.422||—||—||—|
|Upper lip raiser||–0.350||–0.426||—||–0.337|
|Lip corner puller||—||—||–0.350|
|Inner brow raiser||–0.361||—||—||—|
|Upper lip raiser||–0.461||—||—||—|
|Inner brow raiser||—||–0.432||—||–0.426|
|Lip corner depressor||–0.628||—||—||—|
|Inner brow raiser||—||–0.472||—||—|
|Lip corner puller||—||—||–0.418||—|
aAU: action unit.
bAI: Apathy Inventory.
cNo significant correlation.
In this exploratory study, we aimed to verify whether facial expressivity, assessed using automatic video analysis, correlates with apathy severity in elderly subjects with NCDs. Specifically, we aimed to explore which particular AUs (in terms of intensity and activation frequency) are associated with apathy and its different subdomains in male and female subjects, employing 2 different emotion-related tasks (telling a positive and a negative story).
To assess apathy, we used the AI scale (clinician version) and its subdomains: reduced affect, loss of interest, and loss of initiation. We extracted AUs from video recordings of participants while they performed an emotional task consisting of recalling a positive and a negative event from their life. Studies have found that women and men can express their emotions differently; thus, we separated male and female participants in the analyses. We hypothesized that the AUs implicated in positive and negative emotions (joy, sadness, disgust, anger, surprise) would correlate with apathy levels: specifically, the more severe the apathy symptomatology, the lower the emotional activation/intensity. We classified AUs by emotional valence (positive, negative, neutral) according to Haines et al.
Partial correlations (controlling for depressive symptoms and level of cognitive impairment) showed that, overall, apathetic participants showed lower average AU intensity in the negative story. This result suggests that more apathetic subjects had less intense facial expressivity compared with less apathetic subjects, at least while telling a negative event. At the level of single AU analysis, several AUs were significantly less expressed with more pronounced apathy symptoms in the upper region (inner and outer brow raiser) and lower region (nose and lip movement) of the face. Moreover, the relevant AUs were different depending on sex and the task’s emotional valence.
For instance, for men, AU 12 (smile) and AU 14 (dimpler), both considered positive AUs, were significantly less expressed the more severe the behavioral and cognitive dimensions of apathy in the positive story. AU 15 (lip corner depressor) and AU 20 (lip stretcher), both considered negative AUs, correlated significantly with the affect dimension in the negative story. AU 1 (inner brow raiser), involved in sadness, anger, and fear, was less expressed in both stories the more severe the initiation symptoms. For women, AU 1 (inner brow raiser) intensity and AU 4 (brow lowerer) frequency (both involved in sadness) were significantly correlated with AI-affect in the negative story. Thus, some of the expected facial expressions are significantly less expressed in more apathetic subjects. For men, this was observed in both positive and negative storytelling; for women, this was mainly observed in the negative story. For women, AU 25 (lip part), involved in talking, was significantly less frequent in the positive story, in line with apathetic subjects being less talkative. Different AUs were correlated with apathy scales depending on the task’s emotional valence. In the negative story, mainly negative valence AUs were correlated with apathy symptoms for men, whereas both positive and negative AUs showed significant correlations for women. In the positive story, positive AUs (12 and 14) and neutral AUs (17 and 45) were significantly less expressed for men; for women, only two AUs (10 and 25) showed significant correlations, one of negative and one of positive valence. Our findings suggest that the more severe the apathy symptoms, the less expressed are the AUs expected for the task’s valence.
Correlations With the Affect Apathy Dimension
For both men and women in the negative story, the majority of AUs were associated with the affect subdomain. AI-affect (blunted affect) is mainly assessed through facial emotional responses in clinical interviews, which could explain this finding. Depending on the patient’s emotional responsiveness during the interaction, the clinician will infer the presence of potential emotional blunting. This is one of the main biases in the evaluation of apathy, due to a lack of objectivity (the clinician can miss cues) and context (patients are in a particular environment that does not reflect the way they experience emotions in their daily lives). Similar studies found that flattened affect in patients with schizophrenia or depression is reflected in decreased spontaneous facial expression compared with control groups. Furthermore, these studies found that depressed patients showed even less facial expressivity than subjects with schizophrenia.
Interestingly, blunted affect was the subdomain of apathy most linked to lack of facial expressivity for women. This could be explained by the hypothesis that women tend to express their emotions more. Studies have shown that emotional expressivity and emotional memory retrieval differ by sex: men experience emotions more strongly, whereas women tend to express them more. Another explanation would be that women are expected to express more happiness and sadness, and a lack of facial expressivity due to apathy would be identified more quickly than in men, who are less expected to display these emotions. Future studies should investigate the link between subdomains of apathy and sex. We found that for women, AU 9 (nose wrinkler, involved in disgust) intensity was significantly correlated with AI-affect in the negative story. In one study, the authors predicted the sex of a person simply from their emotional expressions of happiness and disgust but not from expressions such as sadness and surprise. This AU is expected to be more pronounced and intense in women but seems significantly less expressed in people with apathy.
Correlations With the Loss of Interest and Lack of Initiation Dimensions
Only a few AUs were linked to loss of interest and lack of initiation. For women, only AU 10 was linked to lack of initiation, and none to loss of interest. However, for men, in the positive story, all the significant correlations with AUs (such as 1 and 14) were linked to either lack of initiation or loss of interest. One explanation could be that the positive condition triggered memories that might have involved activities such as gatherings with family or friends, eliciting less intense positive emotions the more severe the apathy symptoms. The emotional valence (positive, negative, or neutral) and arousal of an event affect the way emotions are stored and retrieved in autobiographical memory. Positive memories are retrieved with richer details, supposedly due to a social bonding role, while negative memories serve more of a survival role, helping ensure the events are not repeated. In the positive story, women often recalled the birth of a child or a wedding day, while men recalled more diverse events.
Our results, despite being preliminary, are in line with previous research showing that, beyond cognitive deficits, apathy was significantly correlated with decreased overall and specific facial expression. Similar results have been found in depression. Girard et al compared specific AUs with depression severity and found that depressed people smiled less and expressed the emotion of contempt more. In apathy, the AUs involved in these emotions were all decreased in the male sample. However, different AUs were involved in the female sample, which underlines how important it is to consider sex differences in the study of emotional expressivity. Overall, apathy seems to have an impact on global facial expressivity and can thus be detected by automated video analysis. Nevertheless, at this stage it remains a rather sensitive but not specific tool for detecting variations in emotional facial expression (which in turn can be an indicator of apathy, depression, negative symptoms, etc). Peham et al studied the facial emotional behavior of female subjects suffering from mental disorders (personality disorder, depression, anxiety, eating disorder, etc) during a clinical interview and found no distinctive disease-specific patterns.
In depression detection, increasing research effort has been devoted to merging audio and video features, with encouraging results demonstrating that more precise assessments can be obtained when both modalities are combined. We previously demonstrated that, simply by extracting and analyzing voice features from the free emotional tasks, apathy prediction can be quite accurate. Therefore, it can be assumed that, as a next step, the fusion of audio and video features will improve on our current results, supporting the long-term goal of validating such technology for use in daily clinical practice.
The main study limitations are the relatively small sample size coupled with the rather high number of statistical tests performed. As this study was exploratory, we did not apply statistical corrections for multiple comparisons, which increases the probability of type I errors. This is one of the first studies to analyze correlations between facial expressivity and apathy, and these results could be used to formulate more precise hypotheses on specific AU involvement, to be tested with more robust statistical methods in future studies. Another issue is that this study relied on clinicians to rate levels of apathy and other neuropsychiatric scales. Future research should employ multiple types of scales, such as self-administered scales, to understand whether there is a gap between how patients feel in their daily lives and what the clinician observes. Combining ecological momentary assessment with these novel measures (facial behavior, voice parameters, activity monitoring, etc) and biomarkers (magnetic resonance imaging) should also be considered to cover all available types of assessment, since there is no existing gold standard for apathy scales. More variables should be considered, such as fatigue and anhedonia, as both are related to apathy.
Overall, it can be concluded that computer vision-based facial analysis showed promising results in detecting blunted affect and global apathy in neurocognitive disorders. These preliminary results should be corroborated by further studies including a larger sample size, allowing researchers to test a reduced number of relevant hypotheses and apply corrections for multiple comparisons. As the effect sizes found in our study were medium to large, reduced facial expressivity may represent a promising proxy for emotional blunting. Specifically, as hypothesized, our results suggested that the presence of AUs relating to positive emotions is particularly relevant for assessing apathy during positive storytelling, while the presence of AUs relating to negative emotions may be relevant during negative storytelling, especially for men. Women may show a wider range of emotions in both positive and negative storytelling, suggesting the value of assessing AUs related to both positive and negative emotions in both stories.
More research is needed to identify specific facial expressions associated with apathy and to combine this method with other technologies such as automatic speech analysis and eye tracking to provide additional information for differential diagnosis. Identifying multimodal digital biomarkers (eg, voice features, activity patterns, eye paths, facial expression) and combining them with ecological momentary assessment could be the future of neuropsychiatric and cognitive assessment, allowing early detection of changes and therefore better adapted treatment. These technologies could facilitate continuous monitoring to prevent relapses in depression or psychotic crises in schizophrenia, or to detect early signs of cognitive deficits and behavioral changes in neurocognitive disorders.
This work was supported by the Association Innovation Alzheimer, the JL Noisiez Foundation, and by the French government, through the UCA-JEDI (Cote d’Azur University: Joint, Excellent, and Dynamic Initiative) Investments in the Future project managed by the National Research Agency (reference number ANR-15-IDEX-01). This work was done in the context of the Digital Medicine–Brain, Cognition, Behavior program of the University Côte d’Azur. We thank all patients who participated in our study.
AK and PR designed the studies. RZ and AK ran the experiments. RZ, VM, RF, and RG analyzed the results. GS, JJ, and RG extracted the facial features. All authors participated in writing the manuscript.
Conflicts of Interest
Auxiliary tables (DOCX file, 64 KB).
- Husain M, Roiser JP. Neuroscience of apathy and anhedonia: a transdiagnostic approach. Nat Rev Neurosci 2018 Aug;19(8):470-484. [CrossRef] [Medline]
- Robert P, Lanctôt KL, Agüera-Ortiz L, Aalten P, Bremond F, Defrancesco M, et al. Is it time to revise the diagnostic criteria for apathy in brain disorders? The 2018 international consensus group. Eur Psychiatry 2018 Oct;54:71-76 [FREE Full text] [CrossRef] [Medline]
- Bakker C, de Vugt ME, van Vliet D, Verhey FRJ, Pijnenburg YA, Vernooij-Dassen MJFJ, et al. Predictors of the time to institutionalization in young- versus late-onset dementia: results from the Needs in Young Onset Dementia (NeedYD) study. J Am Med Dir Assoc 2013 Apr;14(4):248-253. [CrossRef] [Medline]
- Nijsten JMH, Leontjevas R, Pat-El R, Smalbrugge M, Koopmans RTCM, Gerritsen DL. Apathy: risk factor for mortality in nursing home patients. J Am Geriatr Soc 2017 Oct;65(10):2182-2189. [CrossRef] [Medline]
- Starkstein SE, Jorge R, Mizrahi R, Robinson RG. A prospective longitudinal study of apathy in Alzheimer's disease. J Neurol Neurosurg Psychiatry 2006 Jan;77(1):8-11 [FREE Full text] [CrossRef] [Medline]
- Robert PH, Berr C, Volteau M, Bertogliati C, Benoit M, Sarazin M, PréAL study. Apathy in patients with mild cognitive impairment and the risk of developing dementia of Alzheimer's disease: a one-year follow-up study. Clin Neurol Neurosurg 2006 Dec;108(8):733-736. [CrossRef] [Medline]
- Ruthirakuhan M, Herrmann N, Vieira D, Gallagher D, Lanctôt KL. The roles of apathy and depression in predicting Alzheimer disease: a longitudinal analysis in older adults with mild cognitive impairment. Am J Geriatr Psychiatry 2019 Aug;27(8):873-882 [FREE Full text] [CrossRef] [Medline]
- Padala PR, Padala KP, Lensing SY, Jackson AN, Hunter CR, Parkes CM, et al. Repetitive transcranial magnetic stimulation for apathy in mild cognitive impairment: a double-blind, randomized, sham-controlled, cross-over pilot study. Psychiatry Res 2018 Mar;261:312-318 [FREE Full text] [CrossRef] [Medline]
- Manera V, Abrahams S, Agüera-Ortiz L, Bremond F, David R, Fairchild K, et al. Recommendations for the nonpharmacological treatment of apathy in brain disorders. Am J Geriatr Psychiatry 2020 Apr;28(4):410-420. [CrossRef] [Medline]
- Radakovic R, Harley C, Abrahams S, Starr JM. A systematic review of the validity and reliability of apathy scales in neurodegenerative conditions. Int Psychogeriatr 2015 Jun;27(6):903-923. [CrossRef] [Medline]
- Clarke DE, Ko JY, Kuhl EA, van Reekum R, Salvador R, Marin RS. Are the available apathy measures reliable and valid? A review of the psychometric evidence. J Psychosom Res 2011 Jan;70(1):73-97 [FREE Full text] [CrossRef] [Medline]
- Yeager CA, Hyer L. Apathy in dementia: relations with depression, functional competence, and quality of life. Psychol Rep 2008 Jun;102(3):718-722. [CrossRef] [Medline]
- Tagariello P, Girardi P, Amore M. Depression and apathy in dementia: same syndrome or different constructs? A critical review. Arch Gerontol Geriatr 2009;49(2):246-249. [CrossRef] [Medline]
- Ang Y, Lockwood P, Apps MAJ, Muhammed K, Husain M. Distinct subtypes of apathy revealed by the Apathy Motivation Index. PLoS One 2017;12(1):e0169938 [FREE Full text] [CrossRef] [Medline]
- Papastavrou E, Kalokerinou A, Papacostas SS, Tsangari H, Sourtzi P. Caring for a relative with dementia: family caregiver burden. J Adv Nurs 2007 Jun;58(5):446-457. [CrossRef] [Medline]
- Matsumoto N, Ikeda M, Fukuhara R, Shinagawa S, Ishikawa T, Mori T, et al. Caregiver burden associated with behavioral and psychological symptoms of dementia in elderly people in the local community. Dement Geriatr Cogn Disord 2007;23(4):219-224. [CrossRef] [Medline]
- Gros A, Bensamoun D, Manera V, Fabre R, Zacconi-Cauvin A, Thummler S, et al. Recommendations for the use of ICT in elderly populations with affective disorders. Front Aging Neurosci 2016;8:269 [FREE Full text] [CrossRef] [Medline]
- König A, Aalten P, Verhey F, Bensadoun G, Petit P, Robert P, et al. A review of current information and communication technologies: can they be used to assess apathy? Int J Geriatr Psychiatry 2014 Apr;29(4):345-358. [CrossRef] [Medline]
- Chau SA, Chung J, Herrmann N, Eizenman M, Lanctôt KL. Apathy and attentional biases in Alzheimer's disease. J Alzheimers Dis 2016;51(3):837-846. [CrossRef] [Medline]
- Mulin E, Zeitzer JM, Friedman L, Le Duff F, Yesavage J. Relationship between apathy and sleep disturbance in mild and moderate Alzheimer's disease: an actigraphic study. J Alzheimers Dis 2011;25(1):85-91. [CrossRef] [Medline]
- Valembois L, Oasi C, Pariel S, Jarzebowski W, Lafuente-Lafuente C, Belmin J. Wrist actigraphy: a simple way to record motor activity in elderly patients with dementia and apathy or aberrant motor behavior. J Nutr Health Aging 2015 Aug;19(7):759-764. [CrossRef] [Medline]
- Linz N, Klinge X, Tröger J. Automatic detection of apathy using acoustic markers extracted from free emotional speech. 2018 Presented at: 2nd Workshop on AI for Aging, Rehabilitation and Independent Assisted Living (ARIAL) IJCAI'18; 2018; Stockholm.
- König A, Linz N, Zeghari R, Klinge X, Tröger J, Alexandersson J, et al. Detecting apathy in older adults with cognitive disorders using automatic speech analysis. J Alzheimers Dis 2019;69(4):1183-1193. [CrossRef] [Medline]
- Happy S, Dantcheva A, Das A, Zeghari R, Robert P, Bremond F. Characterizing the state of apathy with facial expression and motion analysis. 2019 Presented at: 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG); 2019; Lille p. 1-8. [CrossRef]
- Sharma G, Joshi J, Zeghari R, Guerchouche R. Audio-visual weakly supervised approach for apathy detection in the elderly. 2020 Presented at: 2020 International Joint Conference on Neural Networks (IJCNN); 2020; Glasgow p. 1-7. [CrossRef]
- Seidl U, Lueken U, Thomann PA, Kruse A, Schröder J. Facial expression in Alzheimer's disease: impact of cognitive deficits and neuropsychiatric symptoms. Am J Alzheimers Dis Other Demen 2012 Mar;27(2):100-106 [FREE Full text] [CrossRef] [Medline]
- Joshi J, Goecke R, Alghowinem S, Dhall A, Wagner M, Epps J, et al. Multimodal assistive technologies for depression diagnosis and monitoring. J Multimodal User Interfaces 2013 Sep 7;7(3):217-228. [CrossRef]
- Cohn JF, Kruez TS, Matthews I, Yang Y, Nguyen MH, Padilla MT, et al. Detecting depression from facial actions and vocal prosody. Proc 3rd Int Conf Affective Comput Intelligent Interaction Workshops 2009. [CrossRef]
- Cummins N, Scherer S, Krajewski J, Schnieder S, Epps J, Quatieri TF. A review of depression and suicide risk assessment using speech analysis. Speech Commun 2015 Jul;71:10-49. [CrossRef]
- Kubiak T, Smyth J. Connecting domains—ecological momentary assessment in a mobile sensing framework. In: Baumeister H, Montag C, editors. Digital Phenotyping and Mobile Sensing. Studies in Neuroscience, Psychology and Behavioral Economics. Cham: Springer; 2019.
- Ramsey AT, Wetherell JL, Depp C, Dixon D, Lenze E. Feasibility and acceptability of smartphone assessment in older adults with cognitive and emotional difficulties. J Technol Hum Serv 2016;34(2):209-223 [FREE Full text] [CrossRef] [Medline]
- Ekman P, Friesen WV. Manual for the Facial Action Coding System. Palo Alto: Consulting Psychologists Press; 1978.
- Girard JM, Cohn JF, Mahoor MH, Mavadati SM, Hammal Z, Rosenwald DP. Nonverbal social withdrawal in depression: evidence from manual and automatic analysis. Image Vis Comput 2014 Oct;32(10):641-647 [FREE Full text] [CrossRef] [Medline]
- Haines N, Southward MW, Cheavens JS, Beauchaine T, Ahn W. Using computer-vision and machine learning to automate facial coding of positive and negative affect intensity. PLoS One 2019;14(2):e0211735 [FREE Full text] [CrossRef] [Medline]
- Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. Washington: American Psychiatric Association; 2013.
- Folstein MF, Folstein SE, McHugh PR. "Mini-mental state." A practical method for grading the cognitive state of patients for the clinician. J Psychiatr Res 1975 Nov;12(3):189-198. [CrossRef] [Medline]
- Cummings JL, Mega M, Gray K, Rosenberg-Thompson S, Carusi DA, Gornbein J. The Neuropsychiatric Inventory: comprehensive assessment of psychopathology in dementia. Neurology 1994 Dec;44(12):2308-2314. [CrossRef] [Medline]
- Robert PH, Clairet S, Benoit M, Koutaich J, Bertogliati C, Tible O, et al. The apathy inventory: assessment of apathy and awareness in Alzheimer's disease, Parkinson's disease and mild cognitive impairment. Int J Geriatr Psychiatry 2002 Dec;17(12):1099-1105. [CrossRef] [Medline]
- Baltrusaitis T, Robinson P, Morency L. OpenFace: an open source facial behavior analysis toolkit. 2016 Presented at: 2016 IEEE Winter Conference on Applications of Computer Vision (WACV); 2016; Lake Placid p. 1-10. [CrossRef]
- Vail A, Grafsgaard J, Boyer K, Wiebe E, Lester J. Gender differences in facial expressions of affect during learning. Proc 2016 Conf User Modeling Adaptation Personalization 2016:65-73. [CrossRef]
- Xia B. Which facial expressions can reveal your gender? A study with 3D faces. ArXiv. Preprint posted online May 1, 2018 [FREE Full text] [CrossRef]
- McDuff D, Kodra E, Kaliouby RE, LaFrance M. A large-scale analysis of sex differences in facial expressions. PLoS One 2017;12(4):e0173942 [FREE Full text] [CrossRef] [Medline]
- Trémeau F, Malaspina D, Duval F, Corrêa H, Hager-Budny M, Coin-Bariou L, et al. Facial expressiveness in patients with schizophrenia compared to depressed patients and nonpatient comparison subjects. Am J Psychiatry 2005 Jan;162(1):92-101. [CrossRef] [Medline]
- Dimberg U, Lundquist LO. Gender differences in facial reactions to facial expressions. Biol Psychol 1990 Apr;30(2):151-159. [CrossRef] [Medline]
- Piefke M, Weiss PH, Markowitsch HJ, Fink GR. Gender differences in the functional neuroanatomy of emotional episodic autobiographical memory. Hum Brain Mapp 2005 Apr;24(4):313-324 [FREE Full text] [CrossRef] [Medline]
- Deng Y, Chang L, Yang M, Huo M, Zhou R. Gender differences in emotional response: inconsistency between experience and expressivity. PLoS One 2016;11(6):e0158666 [FREE Full text] [CrossRef] [Medline]
- Hess U, Adams R, Kleck R. When two do the same, it might not mean the same: the perception of emotional expressions shown by men and women. In: Group Dynamics and Emotional Expression. Cambridge: Cambridge University Press; 2009:33-50.
- Ford JH, Addis DR, Giovanello KS. Differential effects of arousal in positive and negative autobiographical memories. Memory 2012;20(7):771-778 [FREE Full text] [CrossRef] [Medline]
- Peham D, Bock A, Schiestl C, Huber E, Zimmermann J, Kratzer D, et al. Facial affective behavior in mental disorder. J Nonverbal Behav 2015 Jul 11;39(4):371-396. [CrossRef]
- Cummings J, Friedman JH, Garibaldi G, Jones M, Macfadden W, Marsh L, et al. Apathy in neurodegenerative diseases: recommendations on the design of clinical trials. J Geriatr Psychiatry Neurol 2015 Sep;28(3):159-173. [CrossRef] [Medline]
- Skorvanek M, Gdovinova Z, Rosenberger J, Saeedian RG, Nagyova I, Groothoff JW, et al. The associations between fatigue, apathy, and depression in Parkinson's disease. Acta Neurol Scand 2015 Feb;131(2):80-87. [CrossRef] [Medline]
Abbreviations
ADC: Apathy Diagnostic Criteria
AI: Apathy Inventory
AU: action unit
MMSE: Mini-Mental State Examination
NCD: neurocognitive disorder
NPI: Neuropsychiatric Inventory
UCA-JEDI: Cote d’Azur University: Joint, Excellent, and Dynamic Initiative
Edited by J Torous; submitted 02.10.20; peer-reviewed by W Eikelboom, N Nazari; comments to author 30.10.20; revised version received 23.11.20; accepted 18.01.21; published 31.03.21.
Copyright
©Radia Zeghari, Alexandra König, Rachid Guerchouche, Garima Sharma, Jyoti Joshi, Roxane Fabre, Philippe Robert, Valeria Manera. Originally published in JMIR Formative Research (http://formative.jmir.org), 31.03.2021.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on http://formative.jmir.org, as well as this copyright and license information must be included.