Call for Papers
Background
Driver fatigue remains one of the leading contributors to reduced attention, delayed reaction times, and increased accident risk in both professional and everyday driving contexts. Recent advances in physiological monitoring have shown that signals such as electrocardiography (ECG), galvanic skin response (GSR), and facial behavior provide valuable information about cognitive load and alertness fluctuations. Simulated driving environments offer a safe, repeatable, and ethically suitable framework for studying these signals under controlled conditions. Affordable gaming platforms, particularly PlayStation-based driving simulations, now enable researchers to reproduce challenging or monotonous driving scenarios without requiring high-cost simulators. As a result, multimodal physiological data collection during simulated driving has become an emerging tool for human factors research, drowsiness detection studies, and machine-learning-based behavioral modeling.
Goal/Rationale
The goal of this symposium is to address the growing need for standardized, low-cost, and reproducible multimodal data acquisition protocols for fatigue and attention research. Although numerous studies highlight the importance of ECG-derived heart rate variability, sympathetic arousal indicators from GSR, and eye-related behavioral cues from camera recordings, practitioners often lack practical guidance on how to collect high-quality, time-synchronized data in real or simulated driving contexts. This symposium aims to fill that gap by demonstrating a fully accessible workflow that combines PlayStation driving tasks with real-time physiological monitoring. Participants will learn how to prepare sensors, place electrodes, monitor signal quality, synchronize multiple data streams, and annotate driving events. By the end of the session, attendees will understand the technical and experimental considerations needed to produce reliable datasets for fatigue detection, cognitive load estimation, and physiology-informed machine-learning approaches. The symposium ultimately empowers researchers and students to design and run their own multimodal experiments with minimal equipment and a robust methodological structure.
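As a brief illustration of the kind of analysis such data enables, the Python sketch below computes basic time-domain heart rate variability features from already-detected ECG R-peak times. It is a minimal example under our own assumptions (R-peak detection has been performed elsewhere, timestamps are in seconds, numpy is available) and is not part of the symposium materials.

```python
# Minimal HRV sketch (assumption: R-peak times already detected from the ECG, in seconds).
# Illustrative only; names and values here are ours, not prescribed workshop tooling.
import numpy as np

def hrv_features(r_peak_times_s):
    """Compute basic time-domain HRV features from R-peak timestamps (seconds)."""
    rr = np.diff(r_peak_times_s) * 1000.0        # RR intervals in milliseconds
    sdnn = np.std(rr, ddof=1)                    # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # short-term (beat-to-beat) variability
    mean_hr = 60000.0 / np.mean(rr)              # mean heart rate in beats per minute
    return {"SDNN_ms": sdnn, "RMSSD_ms": rmssd, "mean_HR_bpm": mean_hr}

# Example with simulated R peaks roughly 0.8 s apart with slight jitter
peaks = np.cumsum(0.8 + 0.02 * np.random.randn(300))
print(hrv_features(peaks))
```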
Scope and Information for Participants
This symposium focuses on practical multimodal physiological data acquisition using ECG, GSR, and camera-based facial monitoring during a PlayStation-based simulated driving task. Participants will explore themes including:
- Sensor placement and calibration for clean ECG and GSR signals
- Synchronizing physiological and video data streams (see the illustrative sketch after this list)
- Live monitoring and troubleshooting of signal quality
- Event annotation during driving (e.g., lane changes, speed variations, collisions)
- Exporting, structuring, and preparing datasets for further analysis
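To make the synchronization and annotation topics above concrete, the following minimal Python sketch shows one common approach: stamping every physiological sample and every driving event against the same monotonic clock and writing them to separate CSV files for offline alignment. The file names, column layout, and placeholder values are our own illustration, not the workshop's official tooling.

```python
# Minimal shared-clock logging sketch (our illustration, not official symposium tooling):
# every ECG/GSR sample and every driving event is stamped against the same monotonic
# clock, so the streams can be aligned offline with the video recording.
import csv, time

t0 = time.monotonic()              # common reference clock for all streams
def now():
    return time.monotonic() - t0

with open("physio.csv", "w", newline="") as f_phys, \
     open("events.csv", "w", newline="") as f_evt:
    phys = csv.writer(f_phys); phys.writerow(["t_s", "ecg_mV", "gsr_uS"])
    evts = csv.writer(f_evt);  evts.writerow(["t_s", "event"])

    # In a real session these values would come from the acquisition device and
    # from controller input; here they are placeholders.
    for _ in range(5):
        phys.writerow([f"{now():.4f}", 0.0, 0.0])    # one physiological sample
        time.sleep(0.01)
    evts.writerow([f"{now():.4f}", "lane_change"])    # one annotated driving event
```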
No previous experience with biomedical data collection is required. All equipment will be provided, and participants will work in small groups to collect and review their own datasets. The symposium is suitable for students, early-career researchers, and practitioners interested in fatigue detection, physiological computing, or human–machine interaction studies.
Expected Outcomes
Participants will leave the symposium with:
- Their own collected multimodal dataset (ECG + GSR + camera)
- Practical understanding of setup, calibration, and acquisition
- Awareness of potential pitfalls in physiological experiments
- A reproducible protocol they can use in their own theses or research projects
- Insights into how such data can be modeled for fatigue detection, cognitive load estimation, or behavioral research
This symposium supports the community by promoting hands-on, low-cost, scalable physiological data acquisition techniques, enabling wider participation in fatigue-related human factors research.
