Emotion Recognition Using Affective Touch: A Survey
Infant Cry Analysis: A Survey of Datasets, Features, and Machine Learning Techniques
Deep Learning Techniques for Text-Based Emotional Response Generation: A Systematic Review
Unveiling Neural Signatures: A Comprehensive Review of EEG Biomarkers in Stress, Anxiety, and Depression
Distinguishing Depression and Bipolar Disorder From Social Media Data Utilizing Intensity of Emotions and Interpretable Deep Learning Models
UCMIB-PNS: Balancing Sufficiency and Necessity With Probabilistic Causality and Cross-Modal Uncertainty in Multimodal Sentiment Analysis
Mitigating Symptom Heterogeneity in Multimodal Depression Estimation via Level Separation and Deviation Regression
Brain-Machine Enhanced Intelligence for Semi-Supervised Facial Emotion Recognition
Multimodal Affect Perception With Large Language Model Enhancement Network
MIND-EEG: Multi-Granularity Integration Network With Discrete Codebook for EEG-Based Emotion Recognition
MCGC-Net: Multi-Scale Controllable Graph Convolutional Network on Music Emotion Recognition
Temporal Group Constrained Transformer With Deformable Landmark Attention for Video Dimensional Emotion Recognition
Audio-Visual Feature Disentanglement and Fusion Network for Automatic Depression Severity Prediction
STRFLNet: Spatio-Temporal Representation Fusion Learning Network for EEG-Based Emotion Recognition
Building and Using State-Anxiety-Oriented Graph for Student State Anxiety Assessment in Online Classroom Scenarios
Phy-FusionNet: A Memory-Augmented Transformer for Multimodal Emotion Recognition With Periodicity and Contextual Attention
Robust Multimodal Sentiment Analysis Based on Adaptive Information Distillation and Adversarial Learning
EaNet: Enhanced Multimodal Awareness Alignment Network for Multimodal Aspect-Based Sentiment Analysis
CausalSymptom: Learning Causal Disentangled Representation for Depression Severity Estimation on Transcribed Clinical Interviews
Hierarchical Multi-Criteria Representation Fusion for Robust Incomplete Multimodal Sentiment Analysis
AL-HCL: Active Learning and Hierarchical Contrastive Learning for Multimodal Sentiment Analysis With Fusion Guidance
Quantifying Emotional Patterns for EEG-Based Emotion Recognition: An Interpretable Study on EEG Individual Differences
STREL: Naturalistic Dataset and Methods for Studying Mental Stress and Relaxation Patterns in Critical Leading Roles
MPFNet: A Multi-Prior Fusion Network With a Progressive Training Strategy for Micro-Expression Recognition
Adaptive Key Role Guided Hierarchical Relation Inference for Enhanced Group-Level Emotion Recognition
Hybrid-Supervised Hypergraph-Enhanced Transformer for Micro-Gesture Based Emotion Recognition
Automated Boredom Recognition Using Multimodal Physiological Signals
Lightweight Multimodal Emotion Recognition for Companion Robots: A Deep Learning Framework Integrating Facial and Speech Features
Preparing the Heart for Duty: Virtual Reality Biofeedback in an Arousing Action Game Improves In-Action Voluntary Heart Rate Variability Control in Experienced Police
Static for Dynamic: Towards a Deeper Understanding of Dynamic Facial Expressions Using Static Expression Data
Capturing Dynamic Fear Experiences in Naturalistic Contexts: An Ecologically Valid fMRI Signature Integrating Brain Activation and Connectivity
Interview-Based Depression Detection Using LLM-Based Text Restatement and Emotion Lexicon
Towards Identity-Independent Facial Action Unit Detection: Integrating Decoupled 3D Geometry With Textural Features
Modeling Multimodal Depression Diagnosis From the Perspective of Local Depressive Representation
FMEFF Mechanism: A FastDTW-Based Music-EEG Feature Fusion Approach for Identifying Enjoyment Levels in Music Therapy
Enhancing Emotional Congruence in Sensory Substitution
Appearance- and Relation-Aware Parallel Graph Attention Fusion Network for Facial Expression Recognition
Building Altruistic and Moral AI Agent With Brain-Inspired Emotional Empathy Mechanisms
Progressive Multi-Source Domain Adaptation for Personalized Facial Expression Recognition
Real-World Classification of Student Stress and Fatigue Using Wearable PPG Recordings
Lexicon-Based Graph Attention for Severity Estimation of Eating Disorders on Social Media
Facial AU Recognition With Feature-Based AU Localization and Confidence-Based Relation Mining
DSTC: A Multimodal Network for Depression Emotion Recognition and Sentiment Analysis
Assessing Flow State in Virtual Reality: A Multi-Channel Physiological Framework With Self-Supervised Pre-Training
Describe Where You Are: Improving Noise-Robustness for Speech Emotion Recognition With Text Description of the Environment
MoDE: Improving Mixture of Depression Experts With Mutual Information Estimator for Depression Detection
Multi-Level Interaction for Emotion Recognition From Unaligned Speech and Text
Step-Wise Prompting Meets Uncertainty-Aware Dynamic Fusion for Robust EEG-Visual Emotion Recognition
Affection-Guided Bottleneck Diffusion for Missing Modality Issue in Multimodal Affective Computing
AMUSED: A Multi-Modal Dataset for Usability Smell Identification