Search Results

  • Item
    From face to face: the contribution of facial mimicry to cognitive and emotional empathy
    (Abingdon : Routledge, 2019) Drimalla, Hanna; Landwehr, Niels; Hess, Ursula; Dziobek, Isabel
    Despite advances in the conceptualisation of facial mimicry, its role in the processing of social information is a matter of debate. In the present study, we investigated the relationship between mimicry and cognitive and emotional empathy. To assess mimicry, facial electromyography was recorded for 70 participants while they completed the Multifaceted Empathy Test, which presents complex context-embedded emotional expressions. As predicted, inter-individual differences in emotional and cognitive empathy were associated with the level of facial mimicry. For positive emotions, the intensity of the mimicry response scaled with the level of state emotional empathy. Mimicry was stronger in the emotional empathy task than in the cognitive empathy task. The specific empathy condition could be successfully detected from facial muscle activity at the level of single individuals using machine learning techniques. These results support the view that mimicry occurs as a context-dependent tool for affiliation and that it is involved in cognitive as well as emotional empathy.
  • Item
    Towards the automatic detection of social biomarkers in autism spectrum disorder: introducing the simulated interaction task (SIT)
    ([Basingstoke] : Macmillan, 2020) Drimalla, Hanna; Scheffer, Tobias; Landwehr, Niels; Baskow, Irina; Roepke, Stefan; Behnia, Behnoush; Dziobek, Isabel
    Social interaction deficits are evident in many psychiatric conditions and specifically in autism spectrum disorder (ASD), but they are hard to assess objectively. We present a digital tool to automatically quantify biomarkers of social interaction deficits: the simulated interaction task (SIT), which entails a standardized 7-min simulated dialog via video and the automated analysis of facial expressions, gaze behavior, and voice characteristics. In a study with 37 adults with ASD without intellectual disability and 43 healthy controls, we show the potential of the tool as a diagnostic instrument and for better description of ASD-associated social phenotypes. Using machine-learning tools, we detected individuals with ASD with an accuracy of 73%, sensitivity of 67%, and specificity of 79%, based on their facial expressions and vocal characteristics alone. In particular, reduced social smiling and facial mimicry, as well as higher voice fundamental frequency and harmonics-to-noise ratio, were characteristic of individuals with ASD. The time-effective and cost-effective computer-based analysis outperformed a majority vote and performed on par with clinical expert ratings. © 2020, The Author(s).
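
The first result reports that the empathy condition could be detected from facial muscle activity of single individuals using machine learning. The abstract does not name the classifier or feature set, so the following is only a minimal sketch: a nearest-centroid model over two hypothetical EMG features (zygomaticus and corrugator activity), with toy training data chosen purely for illustration.

```python
# Hypothetical sketch: classifying the empathy condition (emotional vs.
# cognitive) from facial-EMG features. The classifier, the two features,
# and all numbers below are illustrative assumptions, not the study's method.
from statistics import mean

def centroid(rows):
    """Per-feature mean across a list of feature vectors."""
    return [mean(col) for col in zip(*rows)]

def fit(samples):
    """samples: {label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(rows) for label, rows in samples.items()}

def predict(model, x):
    """Assign x to the label whose centroid is nearest (squared Euclidean)."""
    return min(model, key=lambda lbl: sum((a - b) ** 2
                                          for a, b in zip(model[lbl], x)))

# Toy EMG features: [zygomaticus activity, corrugator activity] (arbitrary units)
train = {
    "emotional": [[0.9, 0.2], [0.8, 0.3], [1.0, 0.1]],  # stronger mimicry
    "cognitive": [[0.3, 0.4], [0.2, 0.5], [0.4, 0.3]],  # weaker mimicry
}
model = fit(train)
print(predict(model, [0.85, 0.25]))  # classified as "emotional"
```

The point of the sketch is only the pipeline shape (features per trial, supervised fit, per-individual prediction); any standard classifier could fill the same role.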
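
The second result reports the SIT classifier's screening performance as rates only (73% accuracy, 67% sensitivity, 79% specificity over 37 ASD participants and 43 controls). The snippet below shows how those three rates relate to a confusion matrix; the counts used are hypothetical, chosen to land close to the reported rates for an 80-person sample.

```python
# Screening metrics from confusion-matrix counts. The counts (25/12/34/9)
# are hypothetical illustrations, not figures reported in the paper.
def metrics(tp, fn, tn, fp):
    """Accuracy, sensitivity (recall on ASD), specificity (recall on controls)."""
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# 25 of 37 ASD participants and 34 of 43 controls classified correctly
acc, sens, spec = metrics(tp=25, fn=12, tn=34, fp=9)
print(f"accuracy={acc:.2f} sensitivity={sens:.2f} specificity={spec:.2f}")
```

Note that sensitivity and specificity are computed within each group, so with unbalanced groups (37 vs. 43) the overall accuracy is not their simple average.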