AL TAMIMI, Abdel-Karim and BROCK, Matthew (2025). EMOLight: Immersive Visual Experience for the Audibly Impaired. In: 2024 12th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW 2024). IEEE, 198-201. [Book Section]
Documents
PDF
2024314234_CameraReady.pdf - Accepted Version
Available under License Creative Commons Attribution.
Download (529kB)
Abstract
In this paper, we introduce EMOLight, an innovative AI-driven ambient lighting solution that enhances viewer immersion, especially for the audibly impaired, by dynamically synchronising with the emotional content of audio cues and sounds. Our proposed solution leverages the YAMNet deep learning model and Plutchik's emotion-colour theory to provide real-time audio emotion recognition, user-specific customisation, and multi-label classification for a personalised and engaging experience. This synchronisation makes the viewing experience more engaging and inclusive. This research demonstrates the feasibility and potential of EMOLight, paving the way for a future where technology adapts to diverse sensory needs and preferences, revolutionising the way we experience and interact with media.
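The abstract gives no implementation details, but the pipeline it describes (tagging incoming audio with YAMNet, then translating the detected sound categories into Plutchik-inspired colours) can be sketched roughly as follows. This is an illustrative Python sketch, not the authors' EMOLight code: the class-to-colour table, the colour_for_waveform helper, and the choice of the public TensorFlow Hub YAMNet model are assumptions made for demonstration only.

```python
# Illustrative sketch: YAMNet audio tagging feeding a Plutchik-inspired
# class-to-colour lookup. Not the authors' EMOLight implementation.
import csv
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

# Load the pretrained YAMNet audio event classifier from TensorFlow Hub.
yamnet = hub.load('https://tfhub.dev/google/yamnet/1')

# YAMNet ships a CSV listing the 521 AudioSet class names it predicts.
class_map_path = yamnet.class_map_path().numpy().decode('utf-8')
with tf.io.gfile.GFile(class_map_path) as f:
    class_names = [row['display_name'] for row in csv.DictReader(f)]

# Hypothetical mapping from a few sound classes to emotions and RGB colours,
# loosely following Plutchik's wheel (joy-yellow, sadness-blue, fear-green,
# anger-red). Keys must match YAMNet display names; these are examples only.
CLASS_TO_COLOUR = {
    'Laughter':        ('joy',     (255, 255, 0)),
    'Crying, sobbing': ('sadness', (0, 0, 255)),
    'Screaming':       ('fear',    (0, 128, 0)),
    'Explosion':       ('anger',   (255, 0, 0)),
}
DEFAULT_COLOUR = ('neutral', (255, 255, 255))

def colour_for_waveform(waveform: np.ndarray):
    """Return an (emotion, RGB) pair for a mono 16 kHz float32 waveform in [-1, 1]."""
    scores, _, _ = yamnet(waveform)               # scores: [frames, 521]
    mean_scores = tf.reduce_mean(scores, axis=0)  # average scores over time frames
    top_class = class_names[int(tf.argmax(mean_scores))]
    return CLASS_TO_COLOUR.get(top_class, DEFAULT_COLOUR)

# Example: one second of silence falls back to the default colour.
print(colour_for_waveform(np.zeros(16000, dtype=np.float32)))
```

A real-time system in the spirit of the paper would instead stream short overlapping windows of the soundtrack through the model, keep several classes above a score threshold (multi-label classification), apply user-specific colour preferences, and drive ambient lights with the result; those parts are described in the paper rather than shown here.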