2021-06-04: ACM Symposium on Eye Tracking Research and Applications (ETRA) 2021 Trip Report
The 13th ACM Symposium on Eye Tracking Research and Applications (ETRA 2021) was hosted virtually (due to the COVID-19 pandemic) jointly by the University of Stuttgart and Ulm University, Germany, from May 24 to 27, 2021, on the MeetAnyway platform. This was my (Gavindya Jayawardena) first time attending ETRA. I attended paper sessions and keynotes where scientists and practitioners of all disciplines presented their work via Zoom on MeetAnyway. ETRA aims to bring together eye-tracking researchers from Computer Science and Psychology/Cognitive Science. The motto of this year's conference was "Bridging Communities". More than 300 participants attended ETRA 2021 virtually, making it the largest ETRA conference in history. The paper presentations included papers accepted for both ETRA 2021 and ETRA 2020, since ETRA 2020 was cancelled.
@ETRA_conference 2021 is in full effect! More than 300 participants are virtually coming together under the motto “Bridging Communities”. #ETRA2021 is chaired by #SimTechPR and @SfbTrr161 project leader @BullingAndreas from @Uni_Stuttgart and #AnkeHuckauf from @uni_ulm! pic.twitter.com/dQ4pXh2pkM
— SimTech_Stuttgart (@SimTechStuttga2) May 25, 2021
Day 1
#ETRA2021 - Day One - up and running at https://t.co/Pxg0DwBKas
— ETRA (@ETRA_conference) May 24, 2021
You can also follow the happenings and discussion at our discord channel https://t.co/hWcDD9YvmU
Day 2
Welcome
#ETRA2021 Welcome address by the general chairs Andreas Bulling & Anke Huckauf @BullingAndreas @Uni_Stuttgart @uni_ulm pic.twitter.com/Zj9NM1xutG
— ETRA (@ETRA_conference) May 25, 2021
@THirzle presenting the best paper award winners of short papers. Congratulations to the best paper award winners of short papers #ETRA2021 and #ETRA2020! pic.twitter.com/YtqY011xir
— Gavindya (@Gavindya2) May 25, 2021
Posters & Demos & Videos I
More interesting posters from the teaser video of posters #ETRA2021 pic.twitter.com/NXkMQ2N14j
— Gavindya (@Gavindya2) May 25, 2021
The poster titled "Repetition effects in task-driven eye movement analyses after longer time-spans" is being presented by Thomas Berger #ETRA2021 pic.twitter.com/MyQUcWMpy5
— Gavindya (@Gavindya2) May 25, 2021
Ryota Nishizono presenting their poster titled "Synchronization of Spontaneous Eyeblink during Formula Car Driving" at #ETRA2021 pic.twitter.com/d2ksdb2yA7
— Gavindya (@Gavindya2) May 25, 2021
The poster - Faces strongly attract early fixations in naturally sampled real-world stimulus materials pic.twitter.com/TdTbxhnNN7
— Gavindya (@Gavindya2) May 25, 2021
The poster - EyeLogin - Calibration-free Authentication Method For Public Displays Using Eye Gaze pic.twitter.com/dnML0zLHx1
— Gavindya (@Gavindya2) May 25, 2021
Mostafa Elshamy presenting their poster "Fixation: A universal framework for experimental eye movement research" #ETRA2021 pic.twitter.com/7zAb3VBpS6
— Gavindya (@Gavindya2) May 25, 2021
Full Papers I: Methods I
The first Full Papers session at #ETRA2021 starting off with "Toward Eye-Tracked Sideline Concussion Assessment in eXtended Reality" by Anderson Schrader pic.twitter.com/WdDps1KNpQ
— Gavindya (@Gavindya2) May 25, 2021
@atduchowski answering the question about technical challenges, mentions that HoloLens 2 requires different SDKs and toolkits, making it challenging to program the HoloLens 2 for eye-tracking compared to the VR headset HTC Vive #ETRA2021 pic.twitter.com/VSP0gb98xW
— Gavindya (@Gavindya2) May 25, 2021
The dataset used is available at https://t.co/lyIANkPhjv pic.twitter.com/GMdBtg9bur
— Gavindya (@Gavindya2) May 25, 2021
The authors used a subjective score as the indicator of cognitive load. Their results show that memorization and eye-typing tasks can be used to measure cognitive load. pic.twitter.com/b7Ap0mWo9p
— Gavindya (@Gavindya2) May 25, 2021
Raw data for the benchmark is available at https://t.co/4ANUFXbkTO pic.twitter.com/S9pvvk8xO3
— Gavindya (@Gavindya2) May 25, 2021
Anton Mølbjerg Eskildsen presenting the last full paper of the Methods I session, "Label Likelihood Maximisation: Adapting iris segmentation models using domain adaptation" at #ETRA2021
— Gavindya (@Gavindya2) May 25, 2021
This paper has been published in #ETRA2020 and is available at https://t.co/4vKmHTL7yT pic.twitter.com/abYp2zGJaS
Keynote I
Full Papers II: Gaze Analysis and Interaction
Starting the Gaze Analysis and Interaction session, Nora Castner is presenting their full paper titled "Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing" at #ETRA2021 pic.twitter.com/Xe0N180kvA
— Gavindya (@Gavindya2) May 25, 2021
Goals of the study:
- use insights gained while viewing medical images for the education of novices
- automated scanpath classification #ETRA2021 pic.twitter.com/HMTtb4ziT8
— Gavindya (@Gavindya2) May 25, 2021
@LeanneChukoskie presenting their full paper titled "Analyzing Gaze Behavior Using Object Detection and Unsupervised Clustering" at #ETRA2021
— Gavindya (@Gavindya2) May 25, 2021
The paper has been published in #ETRA2020 and available at https://t.co/zps1e7IFYb pic.twitter.com/El4LuheiMe
The authors are presenting a MinHash approach for efficient scanpath similarity calculation #ETRA2021 pic.twitter.com/ElJ7BjdoQK
— Gavindya (@Gavindya2) May 25, 2021
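MinHash makes pairwise scanpath comparison cheap by comparing short fixed-length signatures instead of full fixation sequences. As a rough illustration (not the authors' implementation; the shingling scheme and number of hash functions here are assumptions), a scanpath can be reduced to a set of consecutive AOI pairs whose MinHash signatures approximate Jaccard similarity:

```python
import hashlib

def shingles(scanpath):
    """Reduce a fixation sequence over AOIs to a set of consecutive AOI pairs."""
    return {(a, b) for a, b in zip(scanpath, scanpath[1:])}

def minhash_signature(items, num_hashes=64):
    """One minimum per salted hash function; SHA-1 stands in for a hash family."""
    sig = []
    for i in range(num_hashes):
        sig.append(min(
            int.from_bytes(hashlib.sha1(f"{i}|{a}|{b}".encode()).digest()[:8], "big")
            for a, b in items
        ))
    return sig

def estimate_similarity(sig1, sig2):
    """Fraction of matching signature slots approximates Jaccard similarity."""
    return sum(a == b for a, b in zip(sig1, sig2)) / len(sig1)

# Two scanpaths over AOIs A–E that differ only in their final transition.
s1 = minhash_signature(shingles(list("ABCABD")))
s2 = minhash_signature(shingles(list("ABCABE")))
print(round(estimate_similarity(s1, s2), 2))
```

The payoff is scale: comparing two signatures is O(num_hashes) regardless of scanpath length, so all-pairs similarity over large recordings stays tractable.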
@mihaibace presenting their full paper titled "Combining Gaze Estimation and Optical Flow for Pursuits Interaction" at #ETRA2021
— Gavindya (@Gavindya2) May 25, 2021
This paper has been published in #ETRA2020 and is available at https://t.co/z079r2i5de pic.twitter.com/ZfrAJRaJEL
Key takeaway:
- Gaze interaction techniques cause digital eye strain #ETRA2021 pic.twitter.com/kUzfR1rlCB
— Gavindya (@Gavindya2) May 25, 2021
Full Papers III: Applications
Starting the third full paper session, Applications, Beibin Li presented their paper titled "Selection of Eye-Tracking Stimuli for Prediction by Sparsely Grouped Input Variables for Neural Networks: towards Biomarker Refinement for Autism" #ETRA2021 https://t.co/Ry8CzWMxt9 pic.twitter.com/KwlS8D9PXA
— Gavindya (@Gavindya2) May 25, 2021
The authors compile several gaze features into signatures and compare them across real and fake videos, capturing geometric, visual, metric, temporal, and spectral variations. #ETRA2021 pic.twitter.com/XQDd9Zsjxa
— Gavindya (@Gavindya2) May 25, 2021
@Niveta_Ramkumar presenting their full paper titled "Eyes on URLs: Relating Visual Behavior to Safety Decisions" at #ETRA2021
— Gavindya (@Gavindya2) May 25, 2021
This paper has been published in #ETRA2020 and is available at https://t.co/G0NOEFZo4b pic.twitter.com/V9wNayOMb7
@WenxinFeng from @Google presenting their full paper titled "HGaze Typing: Head-Gesture Assisted Gaze Typing" at #ETRA2021 which won the Best Paper Award.
— Gavindya (@Gavindya2) May 25, 2021
It is available at https://t.co/VeBvyU3UEI pic.twitter.com/oZjNUIGsz1
Key takeaways:
- HGaze combines both eye movements and head gestures
- HGaze achieves a better text entry rate
- Head gestures provide natural command activations #ETRA2021 pic.twitter.com/GujQioYKqj
— Gavindya (@Gavindya2) May 25, 2021
Robert Bixler presenting their full paper titled "Crossed Eyes: Domain Adaptation for Gaze-Based Mind Wandering Models" at #ETRA2021
— Gavindya (@Gavindya2) May 25, 2021
The paper is available at https://t.co/APbSAvW8Du pic.twitter.com/wXrRjasuNB
Day 3
Posters & Demos & Videos III
Kara J. Emery presenting the best short paper of #ETRA2021 "The OpenNEEDS: A Dataset of Gaze, Head, Hand, and Scene Signals During Exploration in Open-Ended VR Environments"
— Gavindya (@Gavindya2) May 26, 2021
The paper is available at https://t.co/ZC2qz7IpMB pic.twitter.com/VlWLGzmQ10
Full Papers IV: Gaze Input
Marking the second day of #ETRA2021, the fourth full paper session, Gaze Input, Argenis Ramirez Gomez presenting their full paper titled "Gaze+Hold: Eyes-only Direct Manipulation with Continuous Gaze Modulated by Closure of One Eye". https://t.co/rgQUdOzRv1 pic.twitter.com/sHFs2SrvRC
— Gavindya (@Gavindya2) May 26, 2021
Key takeaways:
- eyes-only direct manipulation
- use eyes as separate input channels pic.twitter.com/oY4vCIMpsw
— Gavindya (@Gavindya2) May 26, 2021
They study users' understanding of password strength in security mechanisms by analyzing gaze behavior during password creation.
Key takeaway:
- Password strength is reflected in users' gaze behavior #ETRA2021 pic.twitter.com/6kSwOgzcH1
— Gavindya (@Gavindya2) May 26, 2021
@ludwigsidenmark presenting their full paper titled "BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement"#ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
This paper has been published in #ETRA2020 and available at https://t.co/PYVPhbq5WV pic.twitter.com/nt1ckmrszy
Myungguen Choi presenting their full paper "Bubble Gaze Cursor + Bubble Gaze Lens: Applying Area Cursor Technique to Eye-gaze Interface" at #ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
The paper has been published in #ETRA2020 and available at https://t.co/MdTEBeWsHB pic.twitter.com/eALhQdnXMJ
Key takeaways:
- The bubble gaze cursor is significantly faster than a standard point-cursor-based eye-gaze interface
- The bubble gaze lens is faster than the bubble gaze cursor #ETRA2021 pic.twitter.com/OxKnNWeys8
— Gavindya (@Gavindya2) May 26, 2021
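The area-cursor idea behind the bubble gaze cursor can be sketched simply: rather than requiring the (noisy) gaze point to land exactly inside a small target, selection snaps to the target nearest the gaze point, as if the cursor's radius expands to capture it. The sketch below is an illustration under assumed circular targets; the names and parameters are hypothetical, not taken from the paper:

```python
import math

def select_target(gaze, targets):
    """Return the target nearest the gaze point (area-cursor selection).

    gaze: (x, y) estimated gaze position in screen coordinates
    targets: list of dicts with 'id', 'center' (x, y), and 'radius'
    """
    def edge_distance(t):
        dx = gaze[0] - t["center"][0]
        dy = gaze[1] - t["center"][1]
        # distance from the gaze point to the target's edge (0 if inside)
        return max(math.hypot(dx, dy) - t["radius"], 0.0)
    # the "bubble" implicitly grows until it contains the closest target
    return min(targets, key=edge_distance)

targets = [
    {"id": "save",   "center": (100, 100), "radius": 20},
    {"id": "cancel", "center": (300, 120), "radius": 20},
]
# gaze lands outside both buttons but closest to "save"
print(select_target((140, 105), targets)["id"])  # → save
```

This is why an area cursor tolerates eye-tracker jitter better than a point cursor: small targets effectively gain a larger activation region whenever there is empty space around them.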
They used a webcam-based eye-tracker built on OpenFace. #ETRA2021 pic.twitter.com/iINcnMF3zh
— Gavindya (@Gavindya2) May 26, 2021
Keynote II
Full Papers V: Visualization and Annotation
Their paper discusses the design space for gaze-adaptive lenses. The authors have presented an approach that automatically displays additional details with respect to visual focus. #ETRA2021 pic.twitter.com/vKVOp0iV1c
— Gavindya (@Gavindya2) May 26, 2021
They propose a visual analytics approach to annotate pervasive eye tracking video.
Their approach enables:
1. efficient annotation
2. direct interpretation of the results #ETRA2021 pic.twitter.com/lZzqFw1NGD
— Gavindya (@Gavindya2) May 26, 2021
@wallnergue presenting their full paper titled "The Power of Linked Eye Movement Data Visualizations" at #ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
The paper is available at https://t.co/D64MPyjFcu pic.twitter.com/vr9ZoL1sZv
The tool allows users to select data in one visualization, and the selection is highlighted in all active visualizations for comparison tasks. The user study showed the tool is understandable and that providing linked, customizable views is beneficial for analyzing eye-movement data #ETRA2021 pic.twitter.com/UOMKgQTaQC
— Gavindya (@Gavindya2) May 26, 2021
Kuno Kurzhals presenting his full paper titled "Image-Based Projection Labeling for Mobile Eye Tracking" at #ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
The paper is available at https://t.co/pOgQOkpTaZ pic.twitter.com/Y1R1NHG0Ap
Key takeaway:
- image-based projection labeling enables an efficient way to make eye-movement data from mobile eye-trackers such as the HoloLens comparable #ETRA2021 pic.twitter.com/jpQXzvxdza
— Gavindya (@Gavindya2) May 26, 2021
Lena Stubbemann from University of Kassel, Germany, presenting their full paper titled "Neural Networks for Semantic Gaze Analysis in XR Settings" at #ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
The paper is available at https://t.co/fJSlLYLw08 pic.twitter.com/QS2OU35qsZ
Key takeaways:
- Automated process to build suitable training data sets for the annotation of volumes of interest
- Annotation at a feature level using CNNs #ETRA2021 pic.twitter.com/YDYDKBG8Jr
— Gavindya (@Gavindya2) May 26, 2021
Full Papers VI: Methods II
@atduchowski hosting the last full papers session at #ETRA2021, which is Methods II @ETRA_conference pic.twitter.com/t8I5p2tkuY
— Gavindya (@Gavindya2) May 26, 2021
Starting the Full Papers VI session, Methods II, Maryam Keyvanara presenting their full paper titled "Effect of a Constant Camera Rotation on the Visibility of Transsaccadic Camera Shifts" at #ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
This #ETRA2020 paper is available at https://t.co/TD1A2OZLND pic.twitter.com/OZPCEP8SbE
Key takeaways:
- camera motion had a stronger effect than saccade direction on transsaccadic detectability
- a velocity-based saccade detection algorithm was used to detect saccades in real time #ETRA2021 pic.twitter.com/cOdf1PIZ8Y
— Gavindya (@Gavindya2) May 26, 2021
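Velocity-threshold (I-VT) saccade detection is one of the simplest approaches suited to real-time use: compute gaze velocity between consecutive samples and label samples exceeding a threshold as saccadic. A minimal sketch follows, with an illustrative threshold and sampling rate (the paper's actual parameters are not given here):

```python
SAMPLING_RATE_HZ = 1000      # assumed samples per second
VELOCITY_THRESHOLD = 30.0    # assumed threshold, degrees of visual angle per second

def classify_samples(angles):
    """Label each gaze sample as 'saccade' or 'fixation' (basic I-VT).

    angles: 1-D gaze positions in degrees of visual angle, one per sample
    """
    labels = ["fixation"]  # the first sample has no velocity estimate
    for prev, curr in zip(angles, angles[1:]):
        # point-to-point velocity in degrees per second
        velocity = abs(curr - prev) * SAMPLING_RATE_HZ
        labels.append("saccade" if velocity > VELOCITY_THRESHOLD else "fixation")
    return labels

# Gaze holds near 0 deg, jumps rapidly to ~5 deg, then holds there.
trace = [0.0, 0.01, 0.02, 1.5, 3.5, 5.0, 5.01, 5.02]
print(classify_samples(trace))
```

Real implementations typically smooth the velocity signal and enforce minimum saccade/fixation durations to suppress spurious labels from tracker noise.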
They present a large dataset of eye images captured using a VR headset with 2 synchronized eye-facing cameras at a frame rate of 200 Hz. This dataset is compiled from video captures of the eye region collected from 152 participants. #ETRA2021 pic.twitter.com/oDtQj8jhzN
— Gavindya (@Gavindya2) May 26, 2021
Kenneth Holmqvist from Universität Regensburg presenting their full paper titled "Validation of a prototype hybrid eye-tracker against the DPI and the Tobii Spectrum" at #ETRA2021
— Gavindya (@Gavindya2) May 26, 2021
This paper has been published in #ETRA2020 and is available at https://t.co/rtkY9duOtR pic.twitter.com/sFTPx2nemK
Key takeaways:
- the hybrid eye-tracker (EWET1) consists of an optoelectronic CR tracker (4000 Hz) & a camera-based translational TR movement tracker (120 Hz)
- EWET1 detects microsaccades
- gaze direction data is unaffected by variation in pupil dilation caused by luminance changes pic.twitter.com/Rnbj2lV1vj
— Gavindya (@Gavindya2) May 26, 2021
They present a framework to model and evaluate obfuscation methods for removing sensitive information from eye-tracking data, focusing on preventing iris-pattern identification. #ETRA2021 pic.twitter.com/xaIpzGRl7u
— Gavindya (@Gavindya2) May 26, 2021