All workshop papers are now available in the ACM Digital Library. Check out the workshop program.
This year's best paper award goes to Oleg Spakov and Päivi Majaranta for their paper “Enhanced Gaze Interaction Using Simple Head Gestures”. Many congratulations! The paper was selected by Geert and Andreas from the top three papers, as ranked by the independent committee of international expert reviewers, because of its high quality and expected impact.
After a great workshop with lots of interesting presentations and discussions, as well as a delicious dinner, we have now started to put the slides of the presentations online. Check out the workshop program – links to the papers in the ACM Digital Library will follow soon!
The joint workshop dinner will take place at the Sonoma Grille from 7pm.
Starting this year, we will discuss a specific topic of interest in more detail as part of the joint discussion after the individual paper presentations.
The special topic will be announced before the workshop on this website and on the mailing list. This will allow you to take your time, develop your opinion, and exchange ideas with your colleagues at your lab or via the mailing list. Well prepared, we will then hopefully have a particularly lively and fruitful discussion at the workshop.
Special topic this year: “Mobile Gaze-Based Interaction Challenge (MobiGaze)”
The first topic that we would like to discuss at PETMEI 2012 is the idea of running a “Mobile Gaze-Based Interaction Challenge (MobiGaze)”. The purpose of the challenge is, as in other fields such as computer vision, to encourage the community to work on a set of specific benchmark problems and research questions and to continuously improve on earlier results on these problems over the years. This will hopefully not only push the field as a whole and increase the impact of work published in it, but also contribute open-source hardware, methods, and gaze data back to the community.
A team for the challenge is already forming, so it is definitely coming. However, we need the community to help carve out the essentials of such a challenge, for example:
- What are typical application scenarios?
- What are the most common problems that are unique to our field?
- Is there a least common infrastructure (hardware and software) that could act as a reference platform?
If you would like to receive updates or discuss ideas on the challenge before the workshop, please subscribe to the MobiGaze mailing list.
You can now join the brand-new PETMEI mailing list.
We are very pleased to announce that Mary M. Hayhoe, Ph.D. will give a keynote at PETMEI 2012.
Recent developments in mobile eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that are pervasively usable in everyday life. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7. The potential applications enabled by the ability to track and analyze eye movements anywhere and anytime call for new research to further develop and understand visual behaviour and eye-based interaction in daily life settings.
PETMEI 2012 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.