DOI: 10.1145/3197768.3201523

Improving Subject-independent Human Emotion Recognition Using Electrodermal Activity Sensors for Active and Assisted Living

Published: 26 June 2018

Abstract

In Active and Assisted Living (AAL) environments, one of the major tasks is to ensure that older adults and persons with disabilities feel well in their surroundings. Unfortunately, it remains difficult to build a machine learning model that is trained on physiological sensor data from one group of subjects and still performs well when tested on other, unseen subjects. This paper proposes a dynamic calibration algorithm that shows promising results for subject-independent human emotion recognition. The calibration module adapts to the features of a new subject by finding the most similar subject in the training data. To assess its overall performance, the approach is evaluated on the well-known MAHNOB dataset. The results show a promising improvement across several evaluation metrics, e.g., sensitivity and specificity.
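To make the calibration idea concrete, below is a minimal sketch in Python of how such a module could work, assuming each training subject is represented by a matrix of EDA feature vectors and a new subject by a short calibration recording. The similarity measure (Pearson correlation between mean feature vectors), the function and variable names, and the nearest-subject selection step are illustrative assumptions, not the authors' exact procedure, which is only described in the full paper.

```python
import numpy as np

def select_most_similar_subject(train_subjects, new_subject_features):
    """Return the training subject whose mean EDA feature vector
    correlates best (Pearson) with the new subject's features.

    train_subjects: dict mapping subject_id -> array (n_windows, n_features)
    new_subject_features: array (n_features,) from a short calibration recording
    """
    best_id, best_corr = None, -np.inf
    for subject_id, features in train_subjects.items():
        reference = features.mean(axis=0)  # subject's average feature profile
        corr = np.corrcoef(reference, new_subject_features)[0, 1]
        if corr > best_corr:
            best_id, best_corr = subject_id, corr
    return best_id, best_corr

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP) for binary labels."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return tp / (tp + fn), tn / (tn + fp)
```

Once the most similar subject is identified, one plausible use (again an assumption rather than the paper's confirmed method) is to train or re-weight the emotion classifier on that subject's labelled data before predicting the new subject's emotional states; sensitivity and specificity can then be computed per class as above.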

Published In

PETRA '18: Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference
June 2018
591 pages
ISBN:9781450363907
DOI:10.1145/3197768
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

In-Cooperation

  • NSF: National Science Foundation

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 June 2018

Author Tags

  1. Classification
  2. Dynamic Calibration
  3. Electrodermal Activity (EDA)
  4. Emotion Recognition

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

PETRA '18
