Over- and under-sedation are common in the ICU and contribute to poor ICU outcomes, including delirium. Behavioral assessments, such as the Richmond Agitation-Sedation Scale (RASS) for monitoring level of sedation and the Confusion Assessment Method for the ICU (CAM-ICU) for detecting signs of delirium, are often used. As an alternative, brain monitoring with electroencephalography (EEG) has been proposed in the operating room, but it is challenging to implement in the ICU because of the differences between critical illness and elective surgery, as well as the duration of sedation. Here we present a deep learning model, based on a combination of convolutional and recurrent neural networks, that automatically tracks both the level of consciousness and delirium from frontal EEG signals in the ICU. For level of consciousness, the system achieves a median accuracy of 70% when predictions are allowed to be within one RASS level across all patients, which is comparable to or higher than the median techn...
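The abstract does not give the exact architecture, so the following is only a minimal sketch of how a convolutional front end feeding a recurrent layer could map frontal EEG windows to RASS levels; the channel count, window length, layer sizes, and 11-class RASS output are illustrative assumptions, not the authors' published configuration.

```python
# Hypothetical CNN + RNN sketch for tracking level of consciousness from frontal EEG.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn

class FrontalEEGSedationNet(nn.Module):
    def __init__(self, n_channels=4, n_rass_levels=11):
        super().__init__()
        # Convolutional layers extract local temporal/spectral features from a short EEG window.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=64, stride=8),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=16, stride=4),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(32),
        )
        # A recurrent layer models how those features evolve over time,
        # which matters for long ICU sedation episodes.
        self.rnn = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_rass_levels)

    def forward(self, x):
        # x: (batch, channels, samples) for one EEG window
        feats = self.cnn(x)             # (batch, 64, 32)
        feats = feats.permute(0, 2, 1)  # treat pooled outputs as a short sequence
        _, (h, _) = self.rnn(feats)
        return self.head(h[-1])         # logits over RASS levels

# Example: one 4-channel, 10-second window sampled at 256 Hz
logits = FrontalEEGSedationNet()(torch.randn(1, 4, 2560))
```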
In this paper, we present a multimodal emotion recognition framework called EmotionMeter that combines brain waves and eye movements. To increase the feasibility and wearability of EmotionMeter in real-world applications, we design a six-electrode placement above the ears to collect electroencephalography (EEG) signals. We combine EEG and eye movements to integrate the internal cognitive states and external subconscious behaviors of users and thereby improve the recognition accuracy of EmotionMeter. The experimental results demonstrate that modality fusion with multimodal deep neural networks significantly enhances performance compared with a single modality, and the best mean accuracy of 85.11% is achieved for four emotions (happy, sad, fear, and neutral). We explore the complementary characteristics of EEG and eye movements in terms of their representational capacities and find that EEG has the advantage in classifying happy emotion, whereas eye movements outperform EEG in recognizin...
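As a rough illustration of the modality-fusion idea, the sketch below concatenates separately encoded EEG and eye-movement feature vectors before classification; the feature dimensions, hidden sizes, and fusion strategy are assumptions for illustration, not EmotionMeter's published design.

```python
# Hypothetical feature-level fusion of EEG and eye-movement features for
# four-class emotion recognition. Dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class MultimodalEmotionNet(nn.Module):
    def __init__(self, eeg_dim=310, eye_dim=33, n_classes=4):
        super().__init__()
        # Separate encoders learn modality-specific representations.
        self.eeg_enc = nn.Sequential(nn.Linear(eeg_dim, 128), nn.ReLU())
        self.eye_enc = nn.Sequential(nn.Linear(eye_dim, 32), nn.ReLU())
        # The fused representation combines internal (EEG) and external (eye-movement) cues.
        self.classifier = nn.Sequential(
            nn.Linear(128 + 32, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, eeg_feats, eye_feats):
        fused = torch.cat([self.eeg_enc(eeg_feats), self.eye_enc(eye_feats)], dim=-1)
        return self.classifier(fused)  # logits for happy / sad / fear / neutral

logits = MultimodalEmotionNet()(torch.randn(8, 310), torch.randn(8, 33))
```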
IEEE Transactions on Autonomous Mental Development
To investigate critical frequency bands and channels, this paper introduces deep belief networks (DBNs) to construct EEG-based emotion recognition models for three emotions: positive, neutral, and negative. We develop an EEG dataset acquired from 15 subjects. Each subject performs the experiments twice, at an interval of a few days. DBNs are trained with differential entropy features extracted from multichannel EEG data. We examine the weights of the trained DBNs and investigate the critical frequency bands and channels. Four different profiles of 4, 6, 9, and 12 channels are selected. The recognition accuracies of these four profiles are relatively stable, with a best accuracy of 86.65%, which is even better than that of the original 62 channels. The critical frequency bands and channels determined from the weights of the trained DBNs are consistent with existing observations. In addition, our experimental results show that neural signatures associated with different emotions do exist and that they share commonality across sessions and individuals. We compare the performance of deep models with shallow models. The average accuracies of DBN, SVM, LR, and KNN are 86.08%, 83.99%, 82.70%, and 72.60%, respectively.
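For readers unfamiliar with the differential entropy (DE) feature used as the DBN input, the sketch below shows one common way to compute it: under a Gaussian assumption, the DE of a band-passed EEG segment reduces to 0.5 * log(2πe·σ²). The exact band edges, filter order, and sampling rate below are conventional choices assumed for illustration rather than taken from the paper.

```python
# Minimal sketch of differential-entropy (DE) feature extraction from multichannel EEG.
# Band definitions and filter settings are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def differential_entropy_features(eeg, fs=200):
    """eeg: (n_channels, n_samples) array; returns (n_channels, n_bands) DE features."""
    feats = []
    for low, high in BANDS.values():
        # Band-pass filter each channel, then use the segment variance as band power.
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        var = np.var(filtered, axis=-1)
        # DE of a Gaussian with variance var.
        feats.append(0.5 * np.log(2 * np.pi * np.e * var))
    return np.stack(feats, axis=-1)

# Example: a 62-channel, 1-second segment sampled at 200 Hz
de = differential_entropy_features(np.random.randn(62, 200))
```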