PETMEI 2024 seeks to bring together researchers at all stages of the pipeline, from developing algorithms and models to building applications, and from across industry and academia, to discuss the opportunities and needs of future computational approaches for adaptive user interfaces. Submissions may reflect on past work, report on in-progress projects, present current challenges and approaches, identify opportunities, or offer critical opinions and arguments, covering topics including but not limited to the following:
Methods
- Eye movement detection, tracking, and analysis
- Gaze-supported multimodal interaction
- Methods for eye tracking on wearables and mobile devices
- Integration of pervasive eye tracking and context-aware computing
- Eye tracking for group interaction (beyond the individual)
- Robust eye tracking in the wild
- Pervasive eye tracking data analysis
- Pervasive eye tracking interfaces in VR/AR
Applications
- Eye interaction with robots and virtual characters
- Eye-based activity and context recognition
- Error-aware eye tracking systems
- Pervasive attentive user interfaces
- Pervasive eye-tracking-based study of human factors
- Eye tracking and gaze interaction in healthcare