Multi-Modal Human-Machine Interaction in Recreational Vehicles (RVs): Enabling Immersive Home-Theatre Experiences through Next-Generation Automotive UI/UX
Presenter: Ms. Sarah Youssef
Date: Friday, February 13th, 2026
Time: 9:00 am and 2:00 pm
Location: 4th Floor (Workshop space) at 300 Ouellette Avenue (School of Computer Science Advanced Computing Hub)
LATECOMERS WILL NOT BE ADMITTED once the presentation has begun.
Abstract: The transition from conventional vehicles to autonomous and semi-autonomous platforms is fundamentally redefining the role of the automotive interior—from a driver-centric control space to a human-centered interactive living environment. This shift is particularly significant in Recreational Vehicles (RVs), where occupants are not merely passengers but residents engaging in diverse activities such as relaxation, entertainment, work, and mobility within a moving space. In such contexts, traditional dashboard-based interfaces and single-modality controls are insufficient to support safe, intuitive, and continuous interaction.
This presentation explores the role of Multi-Modal Human–Machine Interaction (MM-HMI) as a key enabler of next-generation automotive user experience in autonomous RV environments. By integrating heterogeneous sensing technologies—including cameras, microphones, motion and occupancy sensors, and environmental monitoring—vehicles achieve multimodal perception of both the external driving scene and the internal human context. Sensor fusion allows the system to infer occupant state, attention, and activity, enabling context-aware adaptation of interaction modalities. Multimodal communication channels such as voice interaction, gesture recognition, gaze awareness, ambient lighting, spatial audio, and haptic feedback then translate system status and intentions into intuitive human-understandable feedback.
The presentation further discusses how multimodal HMI supports immersive in-cabin experiences, particularly home-theatre-like entertainment environments, while maintaining safety and situational awareness in automated driving conditions. By dynamically adapting notifications and interaction methods to occupant posture, location, and cognitive load, the vehicle becomes an intelligent interactive partner rather than a passive transportation device. Ultimately, the work highlights that the success of future autonomous mobility—especially in mobile living spaces such as RVs—depends not only on perception and control algorithms but on effective human-vehicle communication that fosters transparency, comfort, and user trust.
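The context-aware adaptation described in the abstract, where fused sensor input drives the choice of interaction modality, can be sketched in a few lines of Python. This is a minimal illustrative sketch, not part of the presented system: all names, sensor fields, and rules here are hypothetical stand-ins for what a real MM-HMI stack would infer with learned models.

```python
# Illustrative sketch (all names hypothetical): fuse in-cabin sensor
# readings into an occupant context, then adapt the notification modality.
from dataclasses import dataclass

@dataclass
class OccupantContext:
    posture: str          # e.g. "seated", "reclined"
    activity: str         # e.g. "watching_movie", "sleeping", "idle"
    cognitive_load: float # 0.0 (idle) .. 1.0 (fully engaged)

def fuse_sensors(camera: dict, audio: dict, occupancy: dict) -> OccupantContext:
    """Naive rule-based fusion of heterogeneous sensor estimates."""
    if occupancy["seat"] == "reclined" and camera["eyes_closed"]:
        return OccupantContext("reclined", "sleeping", 0.0)
    if audio["media_playing"] and camera["gaze_on_screen"]:
        return OccupantContext("seated", "watching_movie", 0.8)
    return OccupantContext("seated", "idle", 0.2)

def choose_modality(ctx: OccupantContext, urgency: float) -> list[str]:
    """Adapt the notification channel to occupant state and urgency."""
    if urgency > 0.9:                      # safety-critical: use every channel
        return ["voice", "haptic", "ambient_light"]
    if ctx.activity == "sleeping":
        return ["haptic"]                  # least disruptive wake-up cue
    if ctx.activity == "watching_movie":
        return ["ambient_light"]           # preserve the home-theatre experience
    return ["voice"]

ctx = fuse_sensors(
    camera={"eyes_closed": False, "gaze_on_screen": True},
    audio={"media_playing": True},
    occupancy={"seat": "upright"},
)
print(choose_modality(ctx, urgency=0.3))  # -> ['ambient_light']
```

The point of the sketch is the separation of concerns the abstract argues for: perception (sensor fusion) produces an occupant context, and the interaction layer selects the least intrusive channel that still satisfies the urgency of the message.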
Workshop Outline:
- Introduction to autonomous driving
- Motivation
- From the driver–vehicle system to the human–AI system
- What is multi-modal human–machine interaction?
- Architecture of multi-modal human–machine interaction
Registration