Can Artificial Intelligence Ever Comprehend Human Emotions?
Wearable technology can now infer an individual's emotional state, thanks to the application of Artificial Intelligence (AI) in a field known as affective computing.
These devices, such as the Apple Watch or Fitbit, collect multiple physiological signals, including heart rate variability (HRV), blood volume pulse via photoplethysmography (PPG) sensors, skin conductance (electrodermal activity, or EDA), skin temperature, and motion/acceleration data. Some research even integrates electroencephalography (EEG) for more direct brain-activity monitoring, although consumer devices typically rely on PPG and other peripheral signals.
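To make the data concrete, here is a minimal Python sketch of how one fixed-length window of multimodal wearable signals might be represented. The field names, sampling rates, and window length are illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorWindow:
    """One fixed-length window of raw wearable signals (all names illustrative)."""
    ppg: np.ndarray        # blood volume pulse, e.g. sampled at 64 Hz
    eda: np.ndarray        # skin conductance in microsiemens, e.g. 4 Hz
    skin_temp: np.ndarray  # skin temperature in degrees Celsius, e.g. 4 Hz
    accel: np.ndarray      # (n, 3) accelerometer samples, e.g. 32 Hz
    label: str | None = None  # ground-truth emotion label, when available

# Example: a 60-second window filled with synthetic placeholder data.
window = SensorWindow(
    ppg=np.random.randn(64 * 60),
    eda=np.abs(np.random.randn(4 * 60)),
    skin_temp=33.0 + 0.1 * np.random.randn(4 * 60),
    accel=np.random.randn(32 * 60, 3),
    label="calm",
)
```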
Once these signals are collected, they are processed and analyzed using machine learning algorithms. The raw signals are preprocessed (e.g., filtered and resampled), and time- and frequency-domain features are extracted. These features serve as inputs to classification models such as optimized XGBoost, random forests, decision trees, and neural networks. Multimodal approaches that combine several physiological signals tend to improve accuracy, and personalized models tailored to individual users often outperform generalized models by accounting for individual physiological variability in emotional expression.
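As a rough illustration of that pipeline, the sketch below derives a few standard time- and frequency-domain HRV features from inter-beat (RR) intervals and trains a random forest on them. The feature set, band limits, and model settings are simplified assumptions for demonstration, not the configuration used in the cited studies.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.interpolate import interp1d
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

def hrv_features(rr_ms: np.ndarray) -> np.ndarray:
    """Time- and frequency-domain HRV features from RR intervals (milliseconds)."""
    # Time domain: SDNN and RMSSD are standard HRV summary statistics.
    sdnn = np.std(rr_ms, ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))

    # Frequency domain: resample the irregularly spaced RR series onto a
    # uniform 4 Hz grid, then estimate the power spectrum with Welch's method.
    t = np.cumsum(rr_ms) / 1000.0                      # beat times in seconds
    fs = 4.0
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr_ms, kind="cubic")(grid)
    f, pxx = welch(rr_even, fs=fs, nperseg=min(256, len(rr_even)))

    # LF/HF ratio: low-frequency (0.04-0.15 Hz) vs high-frequency (0.15-0.40 Hz) power.
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = trapezoid(pxx[lf_band], f[lf_band])
    hf = trapezoid(pxx[hf_band], f[hf_band])
    return np.array([sdnn, rmssd, lf / max(hf, 1e-9)])

# Train on feature vectors (X) and emotion labels (y). A multimodal model
# would also include EDA, temperature, and motion features here.
X = np.array([hrv_features(np.random.normal(800, 50, 120)) for _ in range(100)])
y = np.random.choice(["stress", "calm"], size=100)     # placeholder labels
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
```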
The algorithms then map the extracted physiological features to emotional states such as stress, fear, happiness, sadness, or calmness, with high accuracy in controlled studies; for instance, accuracies of up to roughly 97% have been reported with combined PPG and EEG signals.
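Continuing the sketch above (and reusing its hrv_features and clf), inference then reduces to a single prediction call per window. Note that the ~97% figure comes from controlled, multimodal PPG-plus-EEG setups; a simple unimodal toy model like this one would not be expected to approach it.

```python
# Classify a new window of synthetic RR intervals (milliseconds).
new_rr = np.random.normal(750, 30, 120)
features = hrv_features(new_rr).reshape(1, -1)
print(clf.predict(features))                               # e.g. ['stress']
print(dict(zip(clf.classes_, clf.predict_proba(features)[0])))
```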
This technology supports mental health monitoring, stress detection, and richer human-computer interaction by providing real-time feedback on the user's emotional state directly from wearable data. It has the potential to revolutionize the way we understand and respond to human emotions, particularly in professions like lorry driving or surgery, where detecting not just moods but mental states like fatigue and calmness can have life-saving implications.
The underlying research can be found in references [1] and [2] below. However, it's important to note that more research is needed to improve and validate these models, and to address ethical and privacy concerns.
References:
[1] HealthyOffice: Mood recognition at work using smartphones and wearable sensors. (2015). Retrieved from https://www.researchgate.net/publication/301583412_HealthyOffice_Mood_recognition_at_work_using_smartphones_and_wearable_sensors
[2] Perception Clusters: Automated Mood Recognition Using a Novel Cluster-Driven Modelling System. (2019). Retrieved from https://www.researchgate.net/publication/348109115_Perception_Clusters_Automated_Mood_Recognition_Using_a_Novel_Cluster-Driven_Modelling_System