Nonverbal Emotional Expression Database Development in the Context of Psychological Therapy
Keywords:
emotion, nonverbal expression, psychology, database, Artificial Intelligence (AI)
Abstract
This study aims to develop a database of nonverbal emotional expressions in the context of psychological counseling. The study focused on two nonverbal expression parameters: expressions conveyed through facial muscles and eye movements, and expressions conveyed through voice modulation. An experimental research method was employed in which a psychological counseling scenario was simulated with the participants. During each 15-25-minute counseling session, the participant was randomly assigned one mood induction based on the circumplex model of emotion, and their nonverbal emotional expressions were recorded as video clips. The recordings were then coded by lay individuals and psychologists according to the Facial Action Coding System and an analysis of expressions conveyed through voice modulation. Each 15-25-minute video clip yielded a facial expression dataset of 3,000-8,000 sequential frames capturing various emotional states, while the vocal expressions, being incomplete phrases, carried no linguistic meaning. The emotions identified in the recordings varied with the emotional induction each participant received. The nonverbal expression database could be used to develop an Artificial Intelligence (AI) model for recognizing the emotions exhibited by the Thai population in the provision of mental health services.
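As an illustration of how such a database might be organized, the following is a minimal Python sketch, not part of the original study: OpenCV, the record field names, the frame-sampling interval, and the file layout are all assumptions. It shows how a 15-25-minute session video could be decomposed into sequential facial-expression frames and stored alongside FACS action-unit and circumplex-model labels.

from dataclasses import dataclass, field
from pathlib import Path

import cv2  # assumed dependency: opencv-python


@dataclass
class EmotionRecord:
    """One annotated counseling session in a hypothetical database schema."""
    participant_id: str
    induction: str                      # circumplex quadrant, e.g. "high arousal / negative valence"
    facs_action_units: list[str] = field(default_factory=list)   # e.g. ["AU4", "AU15"]
    frame_paths: list[Path] = field(default_factory=list)        # sequential facial-expression frames
    vocal_clip_path: Path | None = None                          # non-linguistic vocal segment


def extract_frames(video_path: Path, out_dir: Path, every_n: int = 5) -> list[Path]:
    """Save every n-th frame of a session video as a JPEG image.

    At roughly 25 fps, a 15-25-minute clip sampled with every_n=5 yields about
    4,500-7,500 frames, on the order of the 3,000-8,000 frames reported above.
    """
    out_dir.mkdir(parents=True, exist_ok=True)
    capture = cv2.VideoCapture(str(video_path))
    saved: list[Path] = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n == 0:
            frame_file = out_dir / f"frame_{index:06d}.jpg"
            cv2.imwrite(str(frame_file), frame)
            saved.append(frame_file)
        index += 1
    capture.release()
    return saved

A record could then be built per participant, for example EmotionRecord(participant_id="P001", induction="high arousal / positive valence", frame_paths=extract_frames(Path("P001_session.mp4"), Path("frames/P001"))); the identifiers and paths here are purely hypothetical.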
References
นันทวัช สิทธิรักษ์, กมลเนตร วรรณเสวก, กมลพร วรรณฤทธิ์, ปเนต ผู้กฤตยาคามี, สุพร อภินันทเวช, & พนม เกตุมาน. (2558). จิตเวชศิริราช DSM-5 [Siriraj psychiatry DSM-5]. Department of Psychiatry, Faculty of Medicine Siriraj Hospital, Mahidol University.
วรางคณา โสมะนันทน์, คาลอส บุญสุภา, & พลอยไพลิน กมลนาวิน. (2564). การให้บริการการปรึกษาเชิงจิตวิทยาแบบออนไลน์: มิติใหม่ของการให้บริการปรึกษาเชิงจิตวิทยา [Online psychological counseling services: A new dimension of psychological counseling services]. วารสารบัณฑิตศึกษา มหาวิทยาลัยราชภัฏวไลยอลงกรณ์ ในพระบรมราชูปถัมภ์ [Journal of Graduate Studies Valaya Alongkorn Rajabhat University under the Royal Patronage], 15(1), 247-260. https://opac02.rbru.ac.th/cgi-bin/koha/opac-detail.pl?biblionumber=4465
Borges, V., Duarte, R. P., Cunha, C. A., & Mota, D. B. (2019). Are you lost? Using facial recognition to detect customer emotions in retail stores. In Advances in Human-oriented and Personalized Mechanisms, Technologies, and Services: CENTRIC 2019 (pp. 49-54). Valencia, Spain. https://www.researchgate.net/publication/338019900_Are_you_Lost_Using_Facial_Recognition_to_Detect_Customer_Emotions_in_Retail_Stores
Busso, C., Bulut, M., Lee, C. -C., Kazemzadeh, A., Mower, E., Kim, S., Chang, J. N., Lee, S., & Narayanan, S. S. (2008). IEMOCAP: Interactive emotional dyadic motion capture database. Language Resources and Evaluation, 42, 335-359. https://doi.org/10.1007/s10579-008-9076-6
Cordaro, D. T., Sun, R., Keltner, D., Kamble, S., Huddar, N., & McNeil, G. (2018). Universals and cultural variations in 22 emotional expressions across five cultures. Emotion, 18(1), 75-93. https://doi.org/10.1037/emo0000302
Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System (FACS) [Database record]. APA PsycTests. https://doi.org/10.1037/t27734-000
Gross, J. J. (Ed.). (2015). Handbook of emotion regulation (2nd ed.). Guilford Press.
Kasuriya, S., Theeramunkong, T., Wutiwiwatchai, C., & Sukhummek, P. (2019). Developing a Thai emotional speech corpus from Lakorn (EMOLA). Language Resources and Evaluation, 53, 17-55. https://doi.org/10.1007/s10579-018-9428-9
Keltner, D., Sauter, D., Tracy, J., & Cowen, A. (2019). Emotional expression: Advances in basic emotion theory. Journal of Nonverbal Behavior, 43(2), 133-160. https://doi.org/10.1007/s10919-019-00293-3
Lapakko, D. (2007). Communication is 93% nonverbal: An urban legend proliferates. Communication and Theater Association of Minnesota Journal, 34, 7-19. https://cornerstone.lib.mnsu.edu/ctamj/vol34/iss1/2/
Luna-Jiménez, C., Griol, D., Callejas, Z., Kleinlein, R., Montero, J. M., & Fernández-Martínez, F. (2021). Multimodal emotion recognition on RAVDESS dataset using transfer learning. Sensors, 21(22), 7665. https://doi.org/10.3390/s21227665
Matsumoto, D., & Willingham, B. (2009). Spontaneous facial expressions of emotion of congenitally and noncongenitally blind individuals. Journal of Personality and Social Psychology, 96(1), 1-10. https://doi.org/10.1037/a0014037
Mollahosseini, A., Hasani, B., & Mahoor, M. H. (2019). AffectNet: A database for facial expression, valence, and arousal computing in the wild. IEEE Transactions on Affective Computing, 10(1), 18-31. https://doi.org/10.1109/TAFFC.2017.2740923
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161-1178. https://doi.org/10.1037/h0077714
Siedlecka, E., & Denson, T. F. (2019). Experimental methods for inducing basic emotions: A qualitative review. Emotion Review, 11(1), 87-97. https://doi.org/10.1177/1754073917749016
Sobin, C., & Alpert, M. (1999). Emotion in speech: The acoustic attributes of fear, anger, sadness, and joy. Journal of Psycholinguistic Research, 28(4), 347-365. https://doi.org/10.1023/a:1023237014909
Zeren, S. G., Erus, S. M., Amanvermez, Y., Genc, A. B., Yilmaz, M. B., & Duy, B. (2020). The effectiveness of online counseling for university students in Turkey: A non-randomized controlled trial. European Journal of Educational Research, 9(2), 825-834. https://doi.org/10.12973/eu-jer.9.2.825
Zhang, L., Walter, S., Ma, X., Werner, P., Al-Hamadi, A., Traue, H. C., & Gruss, S. (2016). "BioVid Emo DB": A multimodal database for emotion analyses validated by subjective ratings. In 2016 IEEE Symposium Series on Computational Intelligence (SSCI) (pp. 1-6). IEEE. https://doi.org/10.1109/SSCI.2016.7849931