A Resource-Efficient Multi-Entropy Fusion Method and Its Application for EEG-Based Emotion Recognition
Abstract
1. Introduction
- We propose representing emotional EEG signals with highly interpretable BREM data obtained through a multi-entropy fusion approach, and we employ a similarity-based classification method to reduce the required sample size and the complexity of the training process.
- We identify a suitable similarity measure for classifying BREMs from various emotional states, enhancing the performance of the proposed method.
- We investigate the most suitable length (5 s, 10 s, 20 s, or 30 s) and the most resource-efficient data input mode (single-channel or single-region mode) to minimize data sources while maintaining an accuracy of over 80%.
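As a rough illustration of the first contribution, the sketch below builds a brain rhythm entropy matrix (BREM) for one single-channel EEG segment by fusing several entropy measures computed per rhythm band. The band edges, the pair of entropy measures (normalized spectral entropy and permutation entropy), and the filtering settings are assumptions for demonstration, not the paper's exact configuration.

```python
import math
import numpy as np
from scipy.signal import butter, filtfilt

# Assumed rhythm-band edges (Hz); the paper's exact bands may differ.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
FS = 128  # DEAP sampling rate (Hz)

def bandpass(x, lo, hi, fs=FS, order=4):
    """Zero-phase band-pass filter to isolate one rhythm band."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum, scaled to [0, 1]."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / np.log2(len(psd))

def permutation_entropy(x, m=3, tau=1):
    """Entropy of ordinal patterns of length m, scaled to [0, 1]."""
    patterns = np.array([np.argsort(x[i:i + m * tau:tau])
                         for i in range(len(x) - (m - 1) * tau)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p)) / np.log2(math.factorial(m))

def brem(signal):
    """Fuse entropies into a matrix: rows = rhythm bands, cols = entropy measures."""
    rows = []
    for lo, hi in BANDS.values():
        band = bandpass(signal, lo, hi)
        rows.append([spectral_entropy(band), permutation_entropy(band)])
    return np.array(rows)

rng = np.random.default_rng(0)
eeg = rng.standard_normal(5 * FS)  # stand-in for one 5 s single-channel segment
matrix = brem(eeg)
print(matrix.shape)  # (4, 2): 4 bands x 2 entropy measures
```

With more entropy measures as columns, the same layout yields a compact, interpretable representation per channel or region.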
2. Experimental Dataset
3. Proposed Method
3.1. Feature Extraction
3.2. Multi-Entropy Fusion
3.3. Classification Method
4. Results and Discussion
4.1. Statistical Analysis
4.2. Classification Results
4.3. Appropriate Time-Segment and Channel Results
4.4. Comparative Study
4.5. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Padfield, N.; Ren, J.; Qing, C.; Murray, P.; Zhao, H.; Zheng, J. Multi-segment majority voting decision fusion for MI EEG brain-computer interfacing. Cogn. Comput. 2021, 13, 1484–1495.
- Fang, J.; Li, G.; Xu, W.; Liu, W.; Chen, G.; Zhu, Y.; Luo, Y.; Luo, X.; Zhou, B. Exploring abnormal brain functional connectivity in healthy adults, depressive disorder, and generalized anxiety disorder through EEG signals: A machine learning approach for triple classification. Brain Sci. 2024, 14, 245.
- Larradet, F.; Niewiadomski, R.; Barresi, G.; Caldwell, D.G.; Mattos, L.S. Toward emotion recognition from physiological signals in the wild: Approaching the methodological issues in real-life data collection. Front. Psychol. 2020, 11, 1111.
- Ortony, A. Are all “basic emotions” emotions? A problem for the (basic) emotions construct. Perspect. Psychol. Sci. 2022, 17, 41–61.
- Fu, Y.; Yuan, S.; Zhang, C.; Cao, J. Emotion recognition in conversations: A survey focusing on context, speaker dependencies, and fusion methods. Electronics 2023, 12, 4714.
- Jafari, M.; Shoeibi, A.; Khodatars, M.; Bagherzadeh, S.; Shalbaf, A.; García, D.L.; Gorriz, J.M.; Acharya, U.R. Emotion recognition in EEG signals using deep learning methods: A review. Comput. Biol. Med. 2023, 165, 107450.
- Li, J.; Feng, G.; Lv, J.; Chen, Y.; Chen, R.; Chen, F.; Zhang, S.; Vai, M.-I.; Pun, S.-H.; Mak, P.-U. A lightweight multi-mental disorders detection method using entropy-based matrix from single-channel EEG signals. Brain Sci. 2024, 14, 987.
- Chen, K.; Jing, H.; Liu, Q.; Ai, Q.; Ma, L. A novel caps-EEGNet combined with channel selection for EEG-based emotion recognition. Biomed. Signal Process. Control 2023, 86, 105312.
- Wu, M.; Hu, S.; Wei, B.; Lv, Z. A novel deep learning model based on the ICA and Riemannian manifold for EEG-based emotion recognition. J. Neurosci. Methods 2022, 378, 109642.
- Akhand, M.A.H.; Maria, M.A.; Kamal, M.A.S.; Murase, K. Improved EEG-based emotion recognition through information enhancement in connectivity feature map. Sci. Rep. 2023, 13, 13804.
- Trujillo, L.; Hernandez, D.E.; Rodriguez, A.; Monroy, O.; Villanueva, O. Effects of feature reduction on emotion recognition using EEG signals and machine learning. Expert Syst. 2024, 41, e13577.
- Yu, X.; Li, Z.; Zang, Z.; Liu, Y. Real-time EEG-based emotion recognition. Sensors 2023, 23, 7853.
- Padfield, N.; Ren, J.; Murray, P.; Zhao, H. Sparse learning of band power features with genetic channel selection for effective classification of EEG signals. Neurocomputing 2021, 463, 566–579.
- Chen, Y.; Wang, S.; Guo, J.F. DCTNet: Hybrid deep neural network-based EEG signal for detecting depression. Multimed. Tools Appl. 2023, 82, 41307–41321.
- Ramzan, M.; Dawn, S. Fused CNN-LSTM deep learning emotion recognition model using electroencephalography signals. Int. J. Neurosci. 2023, 133, 587–597.
- Álvarez-Jiménez, M.; Calle-Jimenez, T.; Hernández-Álvarez, M. A comprehensive evaluation of features and simple machine learning algorithms for electroencephalographic-based emotion recognition. Appl. Sci. 2024, 14, 2228.
- Parui, S.; Kumar, A.; Bajiya, R.; Samanta, D.; Chakravorty, N. Emotion recognition from EEG signal using XGBoost algorithm. In Proceedings of the IEEE 16th India Council International Conference (INDICON), Rajkot, India, 13–15 December 2019; pp. 1–4.
- Jimenez, I.A.C.; Olivetti, E.C.; Vezzetti, E.; Moos, S.; Celeghin, A.; Marcolin, F. Effective affective EEG-based indicators in emotion-evoking VR environments: An evidence from machine learning. Neural Comput. Appl. 2024, 36, 22245–22263.
- Zong, J.; Xiong, X.; Zhou, J.; Ji, Y.; Zhou, D.; Zhang, Q. FCAN–XGBoost: A novel hybrid model for EEG emotion recognition. Sensors 2023, 23, 5680.
- Islam, M.R.; Moni, M.A.; Islam, M.M.; Rashed-Al-Mahfuz, M.; Islam, M.S.; Hasan, M.K.; Hossain, M.S.; Ahmad, M.; Uddin, S.; Azad, A.; et al. Emotion recognition from EEG signal focusing on deep learning and shallow learning techniques. IEEE Access 2021, 9, 94601–94624.
- Pereira, E.T.; Gomes, H.M.; Veloso, L.R.; Mota, M.R.A. Empirical evidence relating EEG signal duration to emotion classification performance. IEEE Trans. Affect. Comput. 2021, 12, 154–164.
- Fernandes, J.V.M.R.; Alexandria, A.R.d.; Marques, J.A.L.; Assis, D.F.d.; Motta, P.C.; Silva, B.R.d.S. Emotion detection from EEG signals using machine deep learning models. Bioengineering 2024, 11, 782.
- Song, T.; Liu, S.; Zheng, W.; Zong, Y.; Cui, Z.; Li, Y.; Zhou, X. Variational instance-adaptive graph for EEG emotion recognition. IEEE Trans. Affect. Comput. 2023, 14, 343–356.
- García-Hernández, R.A.; Celaya-Padilla, J.M.; Luna-García, H.; García-Hernández, A.; Galván-Tejada, C.E.; Galván-Tejada, J.I.; Gamboa-Rosales, H.; Rondon, D.; Villalba-Condori, K.O. Emotional state detection using electroencephalogram signals: A genetic algorithm approach. Appl. Sci. 2023, 13, 6394.
- Padhmashree, V.; Bhattacharyya, A. Human emotion recognition based on time-frequency analysis of multivariate EEG signal. Knowl. Based Syst. 2022, 238, 107867.
- Li, J.W.; Barma, S.; Mak, P.U.; Chen, F.; Li, C.; Li, M.T.; Vai, M.I.; Pun, S.H. Single-channel selection for EEG-based emotion recognition using brain rhythm sequencing. IEEE J. Biomed. Health Inform. 2022, 26, 2493–2503.
- Li, X.; Zhang, Y.; Tiwari, P.; Song, D.; Hu, B.; Yang, M.; Zhao, Z.; Kumar, N.; Marttinen, P. EEG based emotion recognition: A tutorial and review. ACM Comput. Surv. 2022, 55, 1–57.
- Mir, M.; Nasirzadeh, F.; Bereznicki, H.; Enticott, P.; Lee, S. Investigating the effects of different levels and types of construction noise on emotions using EEG data. Build. Environ. 2022, 225, 109619.
- Vempati, R.; Sharma, L.D. EEG rhythm based emotion recognition using multivariate decomposition and ensemble machine learning classifier. J. Neurosci. Methods 2023, 393, 109879.
- Miljevic, A.; Bailey, N.W.; Murphy, O.W.; Perera, M.P.N.; Fitzgerald, P.B. Alterations in EEG functional connectivity in individuals with depression: A systematic review. J. Affect. Disord. 2023, 328, 287–302.
- Patel, P.; Balasubramanian, S.; Annavarapu, R.N. Cross subject emotion identification from multichannel EEG sub-bands using Tsallis entropy feature and KNN classifier. Brain Inform. 2024, 11, 7.
- Li, J.; Qiu, S.; Du, C.; Wang, Y.; He, H. Domain adaptation for EEG emotion recognition based on latent representation similarity. IEEE Trans. Cogn. Dev. Syst. 2020, 12, 344–353.
- Ma, Y.; Zhao, W.; Meng, M.; Zhang, Q.; She, Q.; Zhang, J. Cross-subject emotion recognition based on domain similarity of EEG signal transfer learning. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 936–943.
- Chen, R.; Huang, H.; Yu, Y.; Ren, J.; Wang, P.; Zhao, H.; Lu, X. Rapid detection of multi-QR codes based on multistage stepwise discrimination and a compressed MobileNet. IEEE Internet Things J. 2023, 10, 15966–15979.
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 2011, 3, 18–31.
- Agarwal, R.; Andujar, M.; Canavan, S. Classification of emotions using EEG activity associated with different areas of the brain. Pattern Recognit. Lett. 2022, 162, 71–80.
- Abdel-Hamid, L. An efficient machine learning-based emotional valence recognition approach towards wearable EEG. Sensors 2023, 23, 1255.
- Abdumalikov, S.; Kim, J.; Yoon, Y. Performance analysis and improvement of machine learning with various feature selection methods for EEG-based emotion classification. Appl. Sci. 2024, 14, 10511.
- Xu, F.; Pan, D.; Zheng, H.; Ouyang, Y.; Jia, Z.; Zeng, H. EESCN: A novel spiking neural network method for EEG-based emotion recognition. Comput. Methods Programs Biomed. 2024, 243, 107927.
- Kannadasan, K.; Veerasingam, S.; Shameedha Begum, B.; Ramasubramanian, N. An EEG-based subject-independent emotion recognition model using a differential-evolution-based feature selection algorithm. Knowl. Inf. Syst. 2023, 65, 341–377.
- Li, D.D.; Xie, L.; Chai, B.; Wang, Z.; Yang, H. Spatial-frequency convolutional self-attention network for EEG emotion recognition. Appl. Soft Comput. 2022, 122, 108740.
- Wei, M.; Liao, Y.; Liu, J.; Li, L.; Huang, G.; Huang, J.; Li, D.; Xiao, L.; Zhang, Z. EEG beta-band spectral entropy can predict the effect of drug treatment on pain in patients with herpes zoster. J. Clin. Neurophysiol. 2022, 39, 166–173.
- Sharma, L.D.; Bhattacharyya, A. A computerized approach for automatic human emotion recognition using sliding mode singular spectrum analysis. IEEE Sens. J. 2021, 21, 26931–26940.
- Patel, P.; Annavarapu, R.N. EEG-based human emotion recognition using entropy as a feature extraction measure. Brain Inform. 2021, 8, 20.
- Chen, F.; Zhao, L.; Li, B.; Yang, L. Depression evaluation based on prefrontal EEG signals in resting state using fuzzy measure entropy. Physiol. Meas. 2020, 41, 095007.
- Chen, T.; Ju, S.; Yuan, X.; Elhoseny, M.; Ren, F.; Fan, M.; Chen, Z. Emotion recognition using empirical mode decomposition and approximation entropy. Comput. Electr. Eng. 2018, 72, 383–392.
- Wang, Z.; Zhang, J.; He, Y.; Zhang, J. EEG emotion recognition using multichannel weighted multiscale permutation entropy. Appl. Intell. 2022, 52, 12064–12076.
- Chen, Y.; Xu, X.; Bian, X.; Qin, X. EEG emotion recognition based on ordinary differential equation graph convolutional networks and dynamic time wrapping. Appl. Soft Comput. 2024, 152, 111181.
- Wang, Z.M.; Hu, S.Y.; Song, H. Channel selection method for EEG emotion recognition using normalized mutual information. IEEE Access 2019, 7, 143303–143311.
- Gao, Z.; Cui, X.; Wan, W.; Zheng, W.; Gu, Z. Long-range correlation analysis of high frequency prefrontal electroencephalogram oscillations for dynamic emotion recognition. Biomed. Signal Process. Control 2022, 72, 103291.
- Mammone, N.; Ieracitano, C.; Adeli, H.; Bramanti, A.; Morabito, F.C. Permutation Jaccard distance-based hierarchical clustering to estimate EEG network density modifications in MCI subjects. IEEE Trans. Neural Netw. Learn. Syst. 2018, 29, 5122–5135.
- Chen, J.; Ro, T.; Zhu, Z. Emotion recognition with audio, video, EEG, and EMG: A dataset and baseline approaches. IEEE Access 2022, 10, 13229–13242.
- Herman, A.M.; Critchley, H.D.; Duka, T. The role of emotions and physiological arousal in modulating impulsive behaviour. Biol. Psychol. 2018, 133, 30–43.
- Ullah, R.; Amblee, N.; Kim, W.; Lee, H. From valence to emotions: Exploring the distribution of emotions in online product reviews. Decis. Support Syst. 2016, 81, 41–53.
- Min, J.; Nashiro, K.; Yoo, H.J.; Cho, C.; Nasseri, P.; Bachman, S.L.; Porat, S.; Thayer, J.F.; Chang, C.; Lee, T.-H.; et al. Emotion downregulation targets interoceptive brain regions while emotion upregulation targets other affective brain regions. J. Neurosci. Off. J. Soc. Neurosci. 2022, 42, 2973–2985.
- Dhara, T.; Singh, P.K.; Mahmud, M. A fuzzy ensemble-based deep learning model for EEG-based emotion recognition. Cogn. Comput. 2024, 16, 1364–1378.
- Kumar, M.; Molinas, M. Human emotion recognition from EEG signals: Model evaluation in DEAP and SEED datasets. In Proceedings of the First Workshop on Artificial Intelligence for Human-Machine Interaction (AIxHMI 2022), Udine, Italy, 2 December 2022; pp. 1–14.
- Gaddanakeri, R.D.; Naik, M.M.; Kulkarni, S.; Patil, P. Analysis of EEG signals in the DEAP dataset for emotion recognition using deep learning algorithms. In Proceedings of the IEEE 9th International Conference for Convergence in Technology (I2CT), Pune, India, 5–7 April 2024; pp. 1–7.
- Singh, U.; Shaw, R.; Patra, B.K. A data augmentation and channel selection technique for grading human emotions on DEAP dataset. Biomed. Signal Process. Control 2023, 79, 104060.
- Jha, S.K.; Suvvari, S.; Kumar, M. EEG-based emotion recognition: An in-depth analysis using DEAP and SEED datasets. In Proceedings of the 11th International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, India, 28 February–1 March 2024; pp. 1816–1821.
- Pan, C.; Shi, C.; Mu, H.; Li, J.; Gao, X. EEG-based emotion recognition using logistic regression with Gaussian kernel and Laplacian prior and investigation of critical frequency bands. Appl. Sci. 2020, 10, 1619.
- Al-Mashhadani, Z.; Bayat, N.; Kadhim, I.F.; Choudhury, R.; Park, J.-H. The efficacy and utility of lower-dimensional Riemannian geometry for EEG-based emotion classification. Appl. Sci. 2023, 13, 8274.
Number of subjects | 32
---|---
Age | 27.19 ± 4.45 years
Gender | 15 females + 17 males
Number of experimental trials per subject | 40
Experimental stimuli | Music videos from YouTube
Time duration per trial | One minute
Number of EEG channels | 32
Sampling frequency (Hz) | 128
Brain Region | EEG Channels |
---|---|
Frontal | FP1, FP2, AF3, AF4, F3, FZ, F4 |
Central | FC5, FC1, FC2, FC6, C3, CZ, C4 |
Parietal | CP5, CP1, CP2, CP6, P3, PZ, P4 |
Temporal | F7, F8, T7, T8, P7, P8 |
Occipital | PO3, PO4, O1, OZ, O2 |
Similarity Measure Method | Time Window (s) | Single-Channel Arousal (Mean ± SD, %) | Single-Channel Valence (Mean ± SD, %) | Single-Region Arousal (Mean ± SD, %) | Single-Region Valence (Mean ± SD, %)
---|---|---|---|---|---
DTW | 30 | 80.92 ± 4.38 | 78.95 ± 3.27 | 78.29 ± 5.43 | 76.23 ± 3.81
DTW | 20 | 82.24 ± 4.28 | 79.85 ± 3.18 | 80.51 ± 5.08 | 78.54 ± 4.68
DTW | 10 | 83.06 ± 4.77 | 81.42 ± 3.40 | 80.92 ± 4.82 | 78.29 ± 3.41
DTW | 5 | 84.62 ± 4.39 | 82.48 ± 2.88 | 83.06 ± 5.13 | 79.69 ± 2.93
MI | 30 | 81.25 ± 5.62 | 78.78 ± 3.89 | 79.36 ± 6.27 | 77.06 ± 3.68
MI | 20 | 83.31 ± 4.80 | 79.36 ± 3.21 | 79.61 ± 5.71 | 76.97 ± 3.34
MI | 10 | 83.72 ± 4.25 | 80.51 ± 2.40 | 81.42 ± 4.86 | 78.62 ± 3.39
MI | 5 | 84.79 ± 5.08 | 81.99 ± 3.07 | 82.24 ± 4.87 | 79.61 ± 3.13
SCC | 30 | 66.37 ± 12.45 | 62.34 ± 9.56 | 70.56 ± 13.17 | 68.42 ± 8.21
SCC | 20 | 65.87 ± 13.31 | 62.17 ± 8.71 | 70.64 ± 12.81 | 67.93 ± 8.06
SCC | 10 | 65.46 ± 14.38 | 62.58 ± 9.16 | 70.39 ± 13.33 | 68.09 ± 8.71
SCC | 5 | 68.75 ± 13.97 | 65.54 ± 7.33 | 71.96 ± 11.88 | 68.91 ± 8.41
JSC | 30 | 59.38 ± 16.26 | 56.91 ± 9.70 | 59.38 ± 16.26 | 56.91 ± 9.70
JSC | 20 | 59.38 ± 16.26 | 56.91 ± 9.70 | 59.38 ± 16.26 | 56.91 ± 9.70
JSC | 10 | 59.38 ± 16.26 | 56.91 ± 9.70 | 59.38 ± 16.26 | 56.91 ± 9.70
JSC | 5 | 59.38 ± 16.26 | 56.91 ± 9.70 | 59.38 ± 16.26 | 56.91 ± 9.70
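The DTW results in the table above come from a similarity-based classifier: a query's class is that of its nearest labeled template under the chosen similarity measure. A minimal sketch of DTW-based nearest-neighbor classification follows; it operates on 1-D feature sequences and uses toy stand-in data, whereas the paper applies similarity measures to BREMs, so the sequences and labels here are illustrative assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(query, templates):
    """Nearest-neighbor label under DTW; templates is a list of (label, sequence)."""
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]

# Toy templates (hypothetical labels and values, not real BREM features).
templates = [("high_arousal", np.array([0.1, 0.5, 0.9, 0.5])),
             ("low_arousal", np.array([0.9, 0.8, 0.7, 0.6]))]

assert dtw_distance(templates[0][1], templates[0][1]) == 0.0  # identity check
print(classify(np.array([0.1, 0.4, 0.9, 0.5]), templates))  # prints "high_arousal"
```

Because classification reduces to distance comparisons against stored templates, no iterative model training is required, which is the resource-saving property the contributions emphasize.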
Subject | Arousal Channel | Arousal Time Segment (s) | Arousal Accuracy (%) | Valence Channel | Valence Time Segment (s) | Valence Accuracy (%)
---|---|---|---|---|---|---
S1 | T8 | 15–20 | 84.21 | CP5 | 5–10 | 84.21 |
S2 | FC5 | 30–35 | 81.58 | O2 | 5–10 | 81.58 |
S3 | PO4 | 10–15 | 89.47 | FC2 | 20–25 | 84.21 |
S4 | CP6 | 0–5 | 84.21 | AF4 | 30–35 | 84.21 |
S5 | PO3 | 5–10 | 81.58 | P3 | 35–40 | 81.58 |
S6 | CZ | 20–25 | 81.58 | F4 | 20–25 | 89.47 |
S7 | PO3 | 15–20 | 84.21 | FC6 | 5–10 | 86.84 |
S8 | F4 | 15–20 | 81.58 | P7 | 10–15 | 81.58 |
S9 | CP1 | 5–10 | 78.95 | F7 | 30–35 | 81.58 |
S10 | F8 | 0–5 | 78.95 | FC6 | 25–30 | 81.58 |
S11 | P4 | 25–30 | 81.58 | F3 | 5–10 | 81.58 |
S12 | FC1 | 20–25 | 92.11 | P3 | 10–15 | 78.95 |
S13 | F4 | 0–5 | 92.11 | PO3 | 0–5 | 81.58 |
S14 | F7 | 5–10 | 86.84 | AF3 | 0–5 | 81.58 |
S15 | CP5 | 5–10 | 78.95 | FC5 | 25–30 | 81.58 |
S16 | P8 | 30–35 | 84.21 | FC2 | 0–5 | 81.58 |
S17 | C4 | 45–50 | 81.58 | F8 | 50–55 | 81.58 |
S18 | CP1 | 40–45 | 84.21 | C4 | 0–5 | 81.58 |
S19 | F3 | 20–25 | 84.21 | T8 | 25–30 | 81.58 |
S20 | P8 | 35–40 | 89.47 | C3 | 10–15 | 81.58 |
S21 | PO3 | 0–5 | 94.74 | FZ | 20–25 | 78.95 |
S22 | FP1 | 15–20 | 84.21 | CP1 | 40–45 | 81.58 |
S23 | FZ | 35–40 | 86.84 | C3 | 30–35 | 89.47 |
S24 | O2 | 25–30 | 92.11 | PZ | 35–40 | 81.58 |
S25 | PO4 | 50–55 | 86.84 | CP5 | 0–5 | 78.95 |
S26 | FP1 | 50–55 | 84.21 | F7 | 10–15 | 81.58 |
S27 | CP1 | 0–5 | 81.58 | F7 | 5–10 | 86.84 |
S28 | F7 | 5–10 | 78.95 | F4 | 20–25 | 81.58 |
S29 | AF4 | 35–40 | 86.84 | FC1 | 15–20 | 81.58 |
S30 | PO3 | 35–40 | 78.95 | FZ | 50–55 | 86.84 |
S31 | FC2 | 20–25 | 81.58 | C4 | 20–25 | 84.21 |
S32 | O2 | 0–5 | 89.47 | F3 | 20–25 | 76.32 |
Study | Time Window (s) | Number of Channels | Main Methodology | Arousal Accuracy (%) | Valence Accuracy (%)
---|---|---|---|---|---
Akhand et al. [10] | 8 | 32 | Connectivity feature map with CNN | 91.66 | 91.29
Dhara et al. [56] | 2 | 14 | Fuzzy rank-based deep learning approach using Gompertz function | 91.65 | 90.84
Kumar and Molinas [57] | 1 | 32 | Differential entropy with MLP | 94.25 | 93.39
 | | | Differential entropy with CNN | 94.33 | 93.53
Gaddanakeri et al. [58] | 60 | 14 | Brain rhythms with LSTM (S1–S22) | 82.40 | 78.28
 | | | Brain rhythms with LSTM (S23–S32) | 63.15 | 62.06
Singh et al. [59] | 3 | 5 | Grey Wolf Optimization (GWO) and LSTM with data augmentation | 81.25 | 92.50
Jha et al. [60] | 60 | 32 | Brain rhythms with SVM | 70.88 | 76.00
Pan et al. [61] | 1 | 32 | Logistic regression with Gaussian kernel and Laplacian prior | 77.03 | 77.17
Al-Mashhadani et al. [62] | 60 | 32 | MDRM-PCA | 57.42 | 64.06
This work | 5 | 1 | BREM from multi-entropy fusion and similarity measure by DTW | 84.62 | 82.48
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Li, J.; Feng, G.; Ling, C.; Ren, X.; Liu, X.; Zhang, S.; Wang, L.; Chen, Y.; Zeng, X.; Chen, R. A Resource-Efficient Multi-Entropy Fusion Method and Its Application for EEG-Based Emotion Recognition. Entropy 2025, 27, 96. https://doi.org/10.3390/e27010096