Bridging Emotional Understanding: A Multimodal Emotion Detection System for Neurodivergent Individuals

Authors

  • Wamika Jha, University of Calgary, Faculty of Science
  • Zoe Kirsman, University of Calgary, Faculty of Science
  • Mea Wang, University of Calgary, Faculty of Science
  • Usman Alim, University of Calgary, Faculty of Science

DOI:

https://doi.org/10.55016/pbgrc.v1i1.81420

Abstract

Human communication is inherently tied to emotions, which play a critical role in guiding and enhancing social interactions. For neurodivergent individuals, particularly children, challenges often arise in expressing and interpreting emotions. Emotion detection technologies can therefore serve as powerful tools to aid communication and improve social interaction. However, emotional expression among neurodivergent individuals spans a wider spectrum and exhibits subtler differences. Existing emotion detection models have been trained predominantly on data from a single modality. Integrating data from multiple modalities provides a more comprehensive approach to understanding emotions, mirroring the way humans naturally perceive the world through all five senses. This study presents a Multimodal Emotion Detection System that leverages publicly available datasets to enhance recognition accuracy. By fusing diverse data sources, the proposed model captures subtle emotional cues more effectively than traditional single-modality methods. Experimental results confirm its robustness and suitability for real-world applications.
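The abstract does not specify the fusion strategy used. As a purely illustrative sketch, one common approach is late fusion: each modality's model outputs a probability distribution over emotion classes, and the distributions are combined by weighted averaging. The emotion labels, modality names, and weights below are assumptions for illustration, not details from the paper.

```python
# Hypothetical late-fusion sketch: combine per-modality emotion
# probability vectors by weighted averaging. Labels, modalities,
# and weights are illustrative assumptions, not the paper's design.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(modality_probs, weights=None):
    """Weighted average of per-modality probability vectors,
    renormalized so the fused vector sums to 1."""
    streams = list(modality_probs.values())
    n = len(streams)
    if weights is None:
        weights = [1.0 / n] * n          # equal weight per modality
    total = sum(weights)
    weights = [w / total for w in weights]
    num_classes = len(streams[0])
    fused = [sum(w * p[i] for w, p in zip(weights, streams))
             for i in range(num_classes)]
    s = sum(fused)
    return [x / s for x in fused]

# Example: a facial model is confident in "happy"; a speech model less so.
face = [0.70, 0.10, 0.05, 0.15]
speech = [0.40, 0.30, 0.10, 0.20]
fused = fuse({"face": face, "speech": speech})
print(EMOTIONS[max(range(len(fused)), key=fused.__getitem__)])  # -> happy
```

Late fusion is only one option; early fusion (concatenating features before classification) or learned attention-based fusion are equally plausible designs for the system described.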


Published

2025-04-29

How to Cite

Jha, W., Kirsman, Z., Wang, M., & Alim, U. (2025). Bridging Emotional Understanding: A Multimodal Emotion Detection System for Neurodivergent Individuals. Peer Beyond Graduate Research Conference, 1(1), 70–73. https://doi.org/10.55016/pbgrc.v1i1.81420