Advanced EEG emotion recognition framework integrating fractal dimensions, connectivity metrics, and domain adaptive deep learning
Abstract
This study proposes an advanced framework for EEG-based emotion recognition that addresses the challenges posed by inter-subject variability and signal complexity, with the aim of enhancing mental health monitoring and human-computer interfaces. A comprehensive feature set was developed, integrating Fractal Dimensions (FD), Phase Locking Value (PLV), Pearson Correlation Coefficient (PCC), and Short-Time Fourier Transform (STFT) features. The framework employs both conventional classifiers (SVM, Linear Regression) and deep learning models (CNN, DA-RCNN), with particular emphasis on domain adaptation within the DA-RCNN to mitigate inter-subject variability. Evaluation involved 10-fold cross-validation and rigorous statistical testing. The DA-RCNN model achieved a balanced accuracy of 94.5%, performing competitively with or better than existing methods. Feature integration substantially improved classification: FD features raised accuracy to 94.5%, and connectivity measures contributed an additional 7.2%. The approach is computationally efficient and reduces reliance on extensive data augmentation. By integrating diverse features with domain adaptation, the proposed framework delivers robust EEG-based emotion recognition, marking a significant advance in affective computing and neuroscience. Its computational efficiency and real-time applicability offer substantial utility for mental health monitoring, adaptive interfaces, and human-computer interaction across diverse populations and operational scenarios.
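To make the feature-extraction stage concrete, the following Python sketch illustrates how FD, PLV, PCC, and STFT-based features could be computed per EEG epoch and concatenated into a single vector. It is not the paper's implementation: the use of Higuchi's method for FD, the k_max value, the sampling rate, the STFT summarization, and all function names are assumptions introduced for illustration.

```python
import numpy as np
from scipy.signal import hilbert, stft

def higuchi_fd(x, k_max=10):
    """Higuchi fractal dimension of a 1-D signal (illustrative choice of FD estimator)."""
    n = len(x)
    lk = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            norm = (n - 1) / ((len(idx) - 1) * k)
            lengths.append(np.sum(np.abs(np.diff(x[idx]))) * norm / k)
        lk.append(np.mean(lengths))
    # Slope of log L(k) versus log(1/k) estimates the fractal dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lk), 1)
    return slope

def plv(x, y):
    """Phase Locking Value between two equal-length channel signals."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

def extract_features(epoch, fs=128):
    """epoch: (n_channels, n_samples) EEG segment -> 1-D feature vector."""
    n_ch = epoch.shape[0]
    fd = [higuchi_fd(epoch[c]) for c in range(n_ch)]
    plv_feats, pcc_feats = [], []
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            plv_feats.append(plv(epoch[i], epoch[j]))
            pcc_feats.append(np.corrcoef(epoch[i], epoch[j])[0, 1])
    # Mean log-magnitude of the STFT per channel as a simple spectral summary.
    stft_feats = []
    for c in range(n_ch):
        _, _, Z = stft(epoch[c], fs=fs, nperseg=fs)
        stft_feats.append(np.mean(np.log1p(np.abs(Z))))
    return np.concatenate([fd, plv_feats, pcc_feats, stft_feats])
```

In this sketch, FD and STFT features are per-channel while PLV and PCC are per channel pair; the resulting vector would feed the conventional classifiers or the DA-RCNN described above.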
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.