Poster · Open access
DOI: 10.1145/3411763.3451721

Hidden Emotion Detection using Multi-modal Signals

Published: 08 May 2021

Abstract

To better understand human emotion, we should not only recognize superficial emotions from facial images but also analyze so-called inner emotions by considering biological signals such as the electroencephalogram (EEG). Several recent studies have analyzed a person's inner state by using an image signal and an EEG signal together. However, no study has addressed the case where the emotions estimated from the image signal and the EEG signal differ, i.e., emotional mismatch. This paper defines a new task, hidden emotion detection, in which an emotion is present in the EEG signal while the image signal shows no corresponding activation, and proposes a method to detect such hidden emotions effectively. First, for subjects who intentionally hide an emotion, we analyze their internal and external emotional characteristics from the viewpoint of multimodal signals. Then, based on this analysis, we design a hidden-emotion detection method using convolutional neural networks (CNNs), which exhibit powerful recognition ability. As a result, this study advances the technology for deeply understanding inner emotions. In addition, the hidden emotion dataset and source code that we built will be officially released for future emotion-recognition research.
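The core decision rule implied by the task definition can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each modality has already been classified by its own (hypothetical) CNN branch into a discrete emotion label, and flags a hidden emotion when the EEG branch is activated (non-neutral) while the facial branch is not.

```python
# Sketch of the hidden-emotion decision rule from the task definition:
# a hidden emotion occurs when only the EEG signal is activated while the
# image (facial) signal is not. The labels below stand in for the outputs
# of two per-modality CNN classifiers, which are assumed, not specified here.

NEUTRAL = "neutral"

def detect_hidden_emotion(face_label: str, eeg_label: str) -> bool:
    """Return True when the EEG-estimated emotion is activated (non-neutral)
    but the face-estimated emotion is not, i.e., an emotional mismatch in
    which the subject hides the emotion externally."""
    return face_label == NEUTRAL and eeg_label != NEUTRAL

# Face looks neutral while EEG indicates sadness: a hidden emotion.
print(detect_hidden_emotion("neutral", "sad"))    # True
# Both modalities agree: no mismatch, nothing hidden.
print(detect_hidden_emotion("happy", "happy"))    # False
```

In practice the paper trains CNNs on the multimodal signals directly; the rule above only captures the label-level definition of the mismatch the method is built to detect.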



Published In

CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
May 2021
2965 pages
ISBN:9781450380959
DOI:10.1145/3411763

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Convolutional neural networks
  2. Hidden emotion detection
  3. Inner emotion

Qualifiers

  • Poster
  • Research
  • Refereed limited

Funding Sources

  • The Ministry of Trade, Industry, and Energy, Korea

Conference

CHI '21

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

