Effect of Action Units, Viewpoint and Immersion on Emotion Recognition Using Dynamic Virtual Faces.
International Journal of Neural Systems (IF 8) Pub Date: 2023-10-01, DOI: 10.1142/s0129065723500533
Miguel A Vicente-Querol 1, Antonio Fernández-Caballero 1, 2, 3, Pascual González 1, 2, 3, Luz M González-Gualda 4, Patricia Fernández-Sotos 3, 4, José P Molina 1, 2, Arturo S García 1, 2
Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the unique contributions of different factors to the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), that is, VHs with facial animations, were used to represent the six basic emotions and a neutral expression. The results highlight the important role of accurate implementation of AUs in virtual faces for emotion recognition. Furthermore, frontal views outperform mid-profile views in both test conditions, while immersive VR yields a slight improvement in emotion recognition over the desktop screen. This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.

Updated: 2023-10-01