Deep affective computing with short-segment ECG
One of the challenging goals of affective computing is building robust models that can predict human affect in real time. Physiological signals are commonly used for this task, with ECG among the most promising due to the availability of low-cost, non-invasive sensors. Numerous publicly available datasets exist for predicting affective states from ECG, and in recent years deep learning has shown promising performance. This thesis evaluates two new datasets collected by the LifeHD lab at The University of Texas at Austin, with the author being a primary contributor to the design of the LifeHD-Video study. It details the implementation of a custom, modular pipeline for preprocessing, feature extraction, data visualization, model training, and performance validation for these datasets as well as the DREAMER dataset. Shallow learning models were trained on common heart rate variability features, and hyperparameters were tuned for two deep learning architectures: a convolutional neural network (CNN) and a long short-term memory (LSTM) network. The best performance was achieved on the DREAMER dataset using a CNN architecture, probabilistic binary classification, and only 1 second of ECG signal as model input. Accuracies and F1 scores for this framework were 66.7% / 0.664, 69.3% / 0.692, and 67.9% / 0.675 for valence, arousal, and dominance, respectively. However, further improvements in model performance are needed for ambulatory applications such as real-time affect monitoring. Additionally, a thorough analysis of the challenges facing affective computing was conducted, highlighting results from a previous study that could not be replicated. Suggested solutions to these issues are provided to aid future work, with a primary focus on developing transfer learning, self-supervised learning, and active learning paradigms for affective state prediction.
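As a concrete illustration of the heart rate variability features used by the shallow learning models, the sketch below computes two widely used time-domain measures (SDNN and RMSSD) from a sequence of R-peak times. The function name and example values are illustrative assumptions, not taken from the thesis pipeline.

```python
import numpy as np

def hrv_features(r_peak_times):
    """Compute two common time-domain HRV features from R-peak times (in seconds).

    SDNN is the standard deviation of RR intervals (overall variability);
    RMSSD is the root mean square of successive RR differences (short-term variability).
    """
    rr = np.diff(r_peak_times) * 1000.0          # RR intervals in milliseconds
    sdnn = np.std(rr, ddof=1)                    # sample standard deviation of RR
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # RMS of successive differences
    return {"sdnn": sdnn, "rmssd": rmssd}

# Example: a slightly irregular rhythm near 60 bpm (illustrative values)
peaks = np.array([0.0, 1.00, 1.98, 3.02, 4.00, 5.01])
feats = hrv_features(peaks)
```

Features like these would typically be extracted per ECG segment and fed to the shallow classifiers, while the CNN and LSTM operate on the raw (preprocessed) signal directly.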
Deep affective computing with ECG is a challenging field, but the contributions of this thesis aim to advance it towards a future where computers are able to understand our emotions.