Bridging Emotional Understanding: A Multimodal Emotion Detection System for Neurodivergent Individuals
DOI:
https://doi.org/10.55016/pbgrc.v1i1.81420

Abstract
Human communication is inherently tied to emotions, which play a critical role in guiding and enhancing social interactions. For neurodivergent individuals, particularly children, challenges often arise in the expression and interpretation of emotions. Emotion detection technologies can therefore serve as powerful tools to aid communication and improve social interaction. However, emotional changes among neurodivergent individuals span a wider spectrum and exhibit more subtle differences. Existing emotion detection models have predominantly been trained on data from a single modality. Integrating data from multiple modalities provides a more comprehensive approach to understanding emotions, mirroring the way humans naturally perceive the world through all five senses. This study presents a Multimodal Emotion Detection System that leverages publicly available datasets to enhance recognition accuracy. By fusing diverse data sources, the proposed model captures subtle emotional cues more effectively than traditional single-modality methods. Experimental results confirm its robustness and suitability for real-world applications.
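The abstract does not specify the fusion architecture, so the following is only a minimal sketch of what feature-level fusion of two modalities could look like. The modality choices (visual and audio features), the feature dimensions, the number of emotion classes, and the simple concatenation-based fusion head are all assumptions for illustration, not the authors' design.

```python
# Minimal late-fusion sketch (illustrative only; architecture details are assumed,
# not taken from the paper).
import torch
import torch.nn as nn


class MultimodalEmotionClassifier(nn.Module):
    """Fuses a visual and an acoustic feature vector to predict an emotion class."""

    def __init__(self, visual_dim: int = 512, audio_dim: int = 128,
                 hidden_dim: int = 256, num_emotions: int = 7):
        super().__init__()
        # Per-modality encoders project each input into a shared hidden space.
        self.visual_encoder = nn.Sequential(nn.Linear(visual_dim, hidden_dim), nn.ReLU())
        self.audio_encoder = nn.Sequential(nn.Linear(audio_dim, hidden_dim), nn.ReLU())
        # Fusion head: concatenate the encoded modalities and classify.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_emotions),
        )

    def forward(self, visual_feats: torch.Tensor, audio_feats: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.visual_encoder(visual_feats),
                           self.audio_encoder(audio_feats)], dim=-1)
        return self.classifier(fused)  # logits over the emotion classes


if __name__ == "__main__":
    model = MultimodalEmotionClassifier()
    # Dummy batch of 4 samples with pre-extracted features for each modality.
    logits = model(torch.randn(4, 512), torch.randn(4, 128))
    print(logits.shape)  # torch.Size([4, 7])
```

The concatenation here is only one of several plausible fusion strategies (attention-based or decision-level fusion are common alternatives); the sketch is meant solely to make the idea of combining modality-specific features concrete.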