2022-07-14: ACM Symposium on Eye Tracking Research and Applications (ETRA) 2022 Trip Report


The 14th ACM Symposium on Eye Tracking Research and Applications (ETRA 2022) was a fully hybrid conference, hosted both in person and virtually by Seattle Children’s Hospital, Washington, USA, from June 8 to 11, 2022. We (Bhanuka, Gavindya, Yasasi) attended paper sessions and keynotes in which scientists and practitioners from a range of disciplines presented their work via Zoom on the Socio platform. ETRA brings together eye-tracking researchers from Computer Science and Psychology/Cognitive Science.

For the first time, full papers accepted to ETRA were published in the Proceedings of the ACM. Reflecting the interdisciplinary nature of the community, authors were given a choice of having their work published in either the Proceedings of the ACM on Computer Graphics and Interactive Techniques (PACMCGIT) or the Proceedings of the ACM on Human-Computer Interaction (PACMHCI). The acceptance rate of ETRA 2022 was 38% for full papers and 42% for short papers.


Doctoral Symposium

On the first day of ETRA 2022, the Doctoral Symposium took place, immediately preceding the technical program. The Doctoral Symposium allows students to present their research and receive feedback and general advice from experts in eye tracking. This year, Dr. Sampath Jayarathna served as a Doctoral Symposium Co-Chair at ETRA 2022. Bhanuka (Multi-user Eye-tracking) and Gavindya (Introducing a Real-Time Advanced Eye Movements Analysis Pipeline) presented their doctoral research at the ETRA 2022 Doctoral Symposium. Gavindya's presentation won the Best Doctoral Symposium Presentation Award at ETRA 2022.


Dr. Vidhya Navalpakkam, a Principal Research Scientist at Google Research, delivered the keynote of the Doctoral Symposium, titled "Accelerating eye movement research via ML-based smartphone gaze technology". She presented recent work from Google showing that ML applied to smartphone selfie cameras can enable accurate gaze estimation, comparable to state-of-the-art hardware-based mobile eye trackers, at 1/100th the cost and without any additional hardware. She also discussed how smartphone gaze could serve as a potential digital biomarker for detecting mental fatigue.


Keynote I

Dr. Sophie Stellmach, Interaction Design & Science Lead at Microsoft Mixed Reality, delivered the first keynote of ETRA 2022, titled "More Than a Look: Multimodal Gaze-Supported Interaction in Mixed Reality". Dr. Stellmach explores entirely new ways to engage with and blend our virtual and physical realities in products such as Microsoft HoloLens. An avid eye-tracking researcher for over a decade, she was heavily involved in the development of gaze-based interaction techniques for HoloLens 2. In the keynote, she presented how mixed reality glasses let us blend our virtual and physical realities and engage in entirely new ways not just with digital content, but with each other and our environments. She described the ability to augment human capabilities using a combination of input modalities as an exciting yet challenging area of exploration.


Session 1 - AR / VR / MR / XR

This was the first full paper session at ETRA 2022. It began with Mathias N. Lystbæk presenting the first full paper of the session, "Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR". In this paper, the authors compared the S-Gaze & Finger and S-Gaze & Hand techniques against the baseline AirTap and Dwell-Typing techniques based on physical hand movements. Next, Maike Stoeve presented the full paper "Eye Tracking-Based Stress Classification of Athletes in Virtual Reality".


Session 2 - AR / VR / MR / XR & Improving Gaze Estimation

This session began with Marie Eckert presenting the first full paper of the session, "Pupillary Light Reflex Correction for Robust Pupillometry in Virtual Reality". This paper presented a method that reveals the subtle, cognitively driven pupil size changes in uncontrolled lighting conditions that are otherwise masked by the pupillary light reflex (PLR). This paper won the Best Paper Award at ETRA 2022.


Next, Riccardo Bovo presented the first short paper of the session, "Real-time head-based deep-learning model for gaze probability regions in collaborative VR". This paper won the Best Student Short Paper Award at ETRA 2022.


Then, Johannes Meyer presented their short paper "A Holographic Single-Pixel Stereo Camera Sensor for Calibration-free Eye-Tracking in Retinal Projection Augmented Reality Glasses". This paper won the Best Short Paper Award at ETRA 2022. 


Next, Koki Koshikawa presented their full paper "Model-based Gaze Estimation with Transparent Markers on Large Screens". Finally, Wolfgang Fuhl and Enkelejda Kasneci presented their short paper "HPCGen: Hierarchical K-Means Clustering and Level Based Principal Components for Scan Path Generation". 


Session 3 - Improving Gaze Estimation 2 

This session began with Harsimran Kaur presenting their full paper "Rethinking Model-Based Gaze Estimation". Then, Rakshit Sunil Kothari presented their full paper "EllSeg-Gen, towards Domain Generalization for Head-Mounted Eye-tracking".


Posters

Since the conference was hosted both in person and virtually this year, poster authors attending virtually were assigned their own space on the Socio platform, while in-person attendees were directed to the physical poster displays at the venue.


We presented our poster “Multidisciplinary Reading Patterns of Digital Documents” on day 3 of the conference. The study was a sequel to our earlier studies of reading patterns among researchers, part of our attempts to understand varying characteristics of attention. We presented the poster virtually on the Socio platform alongside the in-person attendees.



Keynote II

Dr. Katarzyna Chawarska delivered the second keynote of ETRA 2022. Dr. Chawarska is the E. Frazer Beede Professor of Child Study, Pediatrics, and Statistics and Data Science at the Yale School of Medicine, the Director of the Social and Affective Neuroscience of Autism (SANA) Program at the Child Study Center, and the Director of the Yale Autism Center of Excellence. The keynote was titled “Early Attentional Biomarkers in Autism”. During the presentation, she introduced the symptoms of the disorder in infants and toddlers. She then described the attentional characteristics observed in the early stages of the disorder, highlighting potential limitations of eye tracking.


Session 4 - Errors and Visualization

Session 4 was on gaze errors and visualizations. The first presentation, on the paper "Gaze as an Indicator of Input Recognition Errors", covered the potential application of gaze for error correction in augmented and virtual reality; the study highlighted the potential of using eye tracking to distinguish correctly recognized input from input recognition errors. In the next presentation, Maurice Koch from the University of Stuttgart presented a novel gaze visualization that combines spatial and temporal elements. Their paper was "A Spiral into the Mind – Gaze Spiral Visualization for Mobile Eye Tracking". In the visualization, point-of-view imagery from across the span of the experiment is combined to form a spiral; a toy sketch of the general layout idea follows below.
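
As a rough illustration only (not the authors' implementation, and using an entirely synthetic per-sample signal in place of real gaze or point-of-view imagery), the following Python snippet lays a time-ordered recording along an Archimedean spiral, which is the basic layout idea behind such a visualization.

```python
# Toy sketch: unroll a recording's timeline along an Archimedean spiral.
# The "attention" values below are synthetic stand-ins for whatever
# per-sample quantity (e.g. fixation density) one would actually plot.
import numpy as np
import matplotlib.pyplot as plt

n_samples = 2000                             # one value per recorded frame/sample
t = np.linspace(0, 1, n_samples)             # normalized recording time, 0..1
turns = 6                                    # number of windings of the spiral
theta = 2 * np.pi * turns * t                # angle grows linearly with time
radius = 0.2 + t                             # radius grows linearly too (Archimedean)
x, y = radius * np.cos(theta), radius * np.sin(theta)

attention = np.abs(np.sin(12 * np.pi * t))   # synthetic per-sample signal

plt.figure(figsize=(5, 5))
plt.scatter(x, y, c=attention, s=4, cmap="viridis")
plt.colorbar(label="per-sample value (synthetic)")
plt.axis("equal")
plt.axis("off")
plt.title("Recording time unrolled along a spiral")
plt.show()
```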


Session 5 - Building better eye trackers and understanding people through their gaze

The session started with the study "Feasibility of Longitudinal Eye-Gaze Tracking in the Workplace", which explores the potential of capturing gaze in real-world environments using commercial off-the-shelf (COTS) eye trackers rather than confined laboratory environments. The study highlights the potential of data collection in authentic environments. Next, Dr. Krzysztof Krejtz presented their study on children's cognitive effort while reading, measured using pupillary response. Their paper was "Measuring Cognitive Effort with Pupillary Activity and Fixational Eye Movements When Reading: Longitudinal Comparison of Children With and Without Primary Music Education". The results indicate the utility of the Low/High Index of Pupillary Activity (LHIPA) as a measure of cognitive effort. In the next presentation, Lisa Spitzer from the Leibniz Institute for Psychology presented their study comparing three remote, video-based eye trackers under laboratory conditions. Their paper was "Using a test battery to compare three remote, video-based eye-trackers". The final presentation was on "Fairness in Oculomotoric Biometric Identification", where the authors did not find any bias favoring particular genders or age groups in the dataset used in their experiments.


Session 6 - Gaze and AR

In the first presentation of Session 6, Johannes Meyer from Robert Bosch GmbH presented a novel eye-tracking sensor for retinal projection AR glasses. Their paper was "A Highly Integrated Ambient Light Robust Eye-Tracking Sensor for Retinal Projection AR Glasses Based on Laser Feedback Interferometry". Their study presents an algorithm for tracking the bright pupil that is computationally efficient compared to modern video-oculography (VOG) systems, with promising results toward eye tracking in consumer AR glasses. In the next presentation, Mathias Lystbæk from Aarhus University, Denmark, presented a novel approach combining hand gestures and eye tracking for interacting with menus in augmented reality, eliminating the need for a separate input device. Their paper was "Gaze-Hand Alignment: Combining Eye Gaze and Mid-Air Pointing for Interacting with Menus in Augmented Reality".


Session 7 - Understanding People through their Gaze 2

The final session of the conference started with a presentation on “Gaze-enhanced Crossmodal Embeddings for Emotion Recognition”, demonstrating the utility and effectiveness of gaze information for emotion recognition. Next, Johannes Meyer from Robert Bosch GmbH presented a novel approach to recognizing human activity through smart glasses. Their paper was "U-HAR: A Convolutional Approach to Human Activity Recognition Combining Head and Eye Movements for Context-Aware Smart Glasses". In the study, they apply a U-Net-like convolutional neural network to time-series data of head and eye movements (a rough sketch of this kind of model appears below). In the next presentation, Leslie Woehler presented their paper “Automatic Generation of Customized Areas of Interest and Evaluation of Observers’ Gaze in Portrait Videos”, which introduces a novel framework for generating customized AOIs based on facial landmarks. In the final presentation, Yao Rong from the University of Tübingen presented their study “Where and What: Driver Attention-based Object Detection”, a framework for pixel-level and object-level attention prediction.
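
As a minimal, illustrative sketch (not the authors' U-HAR architecture), the snippet below builds a small U-Net-style 1D convolutional network that maps a window of head/eye movement time-series features to an activity class. All layer sizes, the number of input channels, the window length, and the number of activity classes are assumptions chosen only for illustration.

```python
# Toy U-Net-style 1D CNN for time-series activity classification (illustrative only).
import torch
import torch.nn as nn

class TinyUNet1D(nn.Module):
    def __init__(self, in_channels=6, n_classes=5):
        super().__init__()
        # Encoder: two stages, with one downsampling step between them
        self.enc1 = nn.Sequential(nn.Conv1d(in_channels, 16, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv1d(16, 32, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool1d(2)
        # Decoder: upsample and fuse with the skip connection from enc1
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        self.dec1 = nn.Sequential(nn.Conv1d(32 + 16, 16, 3, padding=1), nn.ReLU())
        # Classification head: average over time, then a linear layer
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):                  # x: (batch, channels, time)
        e1 = self.enc1(x)                  # (B, 16, T)
        e2 = self.enc2(self.pool(e1))      # (B, 32, T/2)
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
        return self.head(d1.mean(dim=-1))  # class logits, shape (B, n_classes)

# Example: a batch of 2-second windows sampled at 64 Hz with 6 movement channels
model = TinyUNet1D()
logits = model(torch.randn(8, 6, 128))
print(logits.shape)  # torch.Size([8, 5])
```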


Townhall Meeting

After four successful days, ETRA 2022 ended with a town hall meeting with the general chairs, Dr. Frederick Shic (University of Washington) and Dr. Enkelejda Kasneci (University of Tübingen, Germany). During the meeting, the committee announced that ETRA 2023 will be held at the University of Tübingen, Germany, as a hybrid conference.

-- Gavindya (@Gavindya2), Bhanuka (@mahanama94), and Yasasi (@Yasasi_Abey)
