2024-05-14: A-DisETrac: Advanced Analytic Dashboard for Distributed Eye Tracking
Distributed Eye Tracking
Distributed eye tracking refers to a setup in which eye-tracking technology is used across multiple locations. Instead of relying on a single eye-tracking device, a distributed eye-tracking network uses multiple eye trackers to capture the eye movements of users spread across different sites. This expands the possibilities for studying visual attention and focus during collaborative tasks.
Real-time visualization of data in a distributed eye-tracking system allows us to monitor trends and patterns in eye-tracking measures. These measures provide informative cues for understanding how individuals allocate visual attention and mental effort during collaborative tasks.
In this blog, I present A-DisETrac, an advanced analytic dashboard for distributed eye tracking. A-DisETrac is an extension of our previous work, DisETrac and the Gaze Analytics Dashboard. A-DisETrac uses off-the-shelf eye trackers to monitor multiple users in parallel, compute both traditional and advanced gaze measures in real time, and display them on an interactive dashboard.
Advanced Analytic Dashboard for Distributed Eye Tracking (A-DisETrac)
The A-DisETrac system supports real-time computation of both traditional gaze measures (fixation duration, saccade duration, and saccade amplitude) and advanced gaze measures, namely Ambient/Focal Attention with Coefficient K and the Real-time Index of Pupillary Activity (RIPA). Both of these advanced measures have been widely used to study human visual attention and cognitive load. Our system also allows users to re-stream eye-tracking data and thereby mimic real-time data acquisition. A-DisETrac displays the advanced gaze measures alongside the traditional gaze measures on an interactive dashboard in real time.
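As a concrete illustration of one of the advanced measures, the sketch below computes Coefficient K as the mean difference between each standardized fixation duration and the standardized amplitude of the saccade that follows it, following the Krejtz et al. definition. The function name and input format are my own simplifications, not part of RAEMAP.

```python
import numpy as np

def coefficient_k(fixation_durations, saccade_amplitudes):
    """Ambient/Focal Coefficient K over paired fixation/following-saccade samples.

    fixation_durations[i] is the duration of fixation i, and
    saccade_amplitudes[i] is the amplitude of the saccade that follows it.
    """
    d = np.asarray(fixation_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    # standardize each series, then average the per-pair differences
    z_d = (d - d.mean()) / d.std()
    z_a = (a - a.mean()) / a.std()
    return float(np.mean(z_d - z_a))
```

A positive K indicates focal attention (long fixations followed by short saccades), while a negative K indicates ambient attention (short fixations followed by long saccades).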
Figure 1: Architecture of A-DisETrac, a distributed eye-tracking system for visual attention and cognitive load. In the real-time distributed eye-tracking setup, common off-the-shelf eye trackers are used to collect data from multiple users. In the data re-streaming setup, StreamingHub re-streams data from existing datasets/experiments and transmits it to the MQTT broker. The real-time traditional and advanced gaze measures are calculated using the Real-Time Advanced Eye Movements Analysis Pipeline (RAEMAP).
As illustrated in Figure 1, the architecture of A-DisETrac contains the following components.
Real-Time Eye-Tracking Setup
We use the distributed eye-tracking setup proposed by DisETrac, comprising two main components: (1) data acquisition and transmission, and (2) aggregation and visualization. We sample data from common off-the-shelf eye trackers using the vendor API/SDK and then transmit the data to an MQTT broker over a public network. An MQTT broker is a message server that facilitates communication between publisher and subscriber clients.
In this setup, we acquire each user's on-screen gaze position (x, y) and pupil dilation, along with confidence estimates as determined by the vendor software. Prior to transmission, we append an originating timestamp and a sequence number to each sample so that the temporal order can be recovered at the processing end. Moreover, we periodically synchronize clocks using the Network Time Protocol (NTP).
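A minimal sketch of such a publisher is shown below, assuming a hypothetical broker address, a per-user topic scheme (gaze/user-01), and stubbed sample values; in the real setup the values come from the vendor API/SDK, and the exact field names here are my own.

```python
import json
import random
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.org", 1883)   # hypothetical public broker
client.loop_start()

seq = 0
while True:
    payload = {
        "user_id": "user-01",
        "x": random.random(),                # on-screen gaze position (stub values)
        "y": random.random(),
        "pupil_diameter": 3.0 + random.random(),
        "confidence": 0.95,                  # vendor-reported confidence estimate
        "timestamp": time.time_ns(),         # originating timestamp (NTP-synced clock)
        "seq": seq,                          # sequence number to recover temporal order
    }
    client.publish("gaze/user-01", json.dumps(payload))
    seq += 1
    time.sleep(1 / 60)                       # assume a 60 Hz sampling rate
```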
Data Re-streaming Setup
We integrated StreamingHub to re-stream data from earlier experiments. Internally, StreamingHub handles file-system access and data loading, and provides a storage-agnostic Python API to re-stream data. In the A-DisETrac setup, we call this API to re-stream data from earlier experiments and direct it to an MQTT broker over a public network.
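The sketch below illustrates the re-streaming idea with a plain CSV replay, assuming timestamps in seconds plus a hypothetical broker and file name; A-DisETrac itself delegates storage access and replay to StreamingHub's API rather than this ad-hoc reader.

```python
import json
import time

import pandas as pd
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.org", 1883)        # hypothetical broker

recording = pd.read_csv("session_user01.csv")     # hypothetical recorded session
prev_t = None
for _, row in recording.iterrows():
    if prev_t is not None:
        # preserve the original inter-sample gaps to mimic real-time acquisition
        time.sleep(max(0.0, row["timestamp"] - prev_t))
    prev_t = row["timestamp"]
    client.publish("gaze/user-01", json.dumps(row.to_dict()))
```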
A-DisETrac Dashboard
Real-time Gaze Measures
In both the real-time and data re-streaming setups, at the processing end we subscribe to the eye-tracking data streams on the MQTT broker and use them to compute eye-tracking measures. For our computations, we use the Real-Time Advanced Eye Movements Analysis Pipeline (RAEMAP), an eye-movement processing library, to compute real-time gaze measures in two steps: (1) traditional positional gaze measures for each user, and (2) advanced gaze measures for each user and the group. We use user identifier information to distinguish users and compute eye-tracking measures for each of them, which we then aggregate into group-level measures. Finally, we present the data to a proctor through an interactive dashboard.
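A simplified subscriber sketch is shown below: it groups incoming samples by user ID into per-user rolling windows. In A-DisETrac the actual measure computation is performed by RAEMAP; the window size, topic scheme, and broker address here are assumptions for illustration only.

```python
import json
from collections import defaultdict, deque

import paho.mqtt.client as mqtt

WINDOW = 120                                  # samples kept per user (assumed window size)
buffers = defaultdict(lambda: deque(maxlen=WINDOW))

def on_message(client, userdata, msg):
    sample = json.loads(msg.payload)
    user = sample["user_id"]                  # distinguishes users on the shared broker
    buffers[user].append(sample)
    # Hand the per-user window to the measure pipeline (RAEMAP in A-DisETrac):
    # detect fixations/saccades, compute Coefficient K and RIPA per user,
    # then aggregate the per-user results into group-level measures.

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.org", 1883)    # hypothetical broker
client.subscribe("gaze/#")                    # one topic per user, e.g. gaze/user-01
client.loop_forever()
```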
Interactive Dashboard
The A-DisETrac dashboard provides a detailed real-time visualization of (1) advanced gaze measures for each user and the group, and (2) traditional positional gaze measures for each user, for the ongoing experiment. Further, the dashboard provides interactive functionality to monitor, analyze, and control the gaze measure visualizations. The A-DisETrac dashboard has four key components, as illustrated in Figure 2.
- Tabs: Tabs allow the proctor to switch between views of the different gaze measure types; the dashboard provides separate views for advanced gaze measures and traditional positional gaze measures.
- Play/Pause Control: Because the gaze measures are visualized in real-time (data streaming) charts, they automatically update every n seconds. The play/pause control allows the proctor to pause the real-time charts and replay them as necessary.
- Gaze Measures: Real-time visualization of gaze measures calculated during the user experiment.
- Controls: The control widgets include box zoom, wheel zoom, save, and reset.
In my previous blog, I presented how we use MQTT for messaging between the computers and the dashboard, the HoloViz ecosystem to implement the interactive dashboard, and Streamz to feed data from the MQTT broker into the dashboard in real time.
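A minimal sketch of that wiring with these libraries is shown below; the column names, chart choice, and emit mechanism are my own assumptions, not the dashboard's actual code.

```python
import pandas as pd
import panel as pn
import hvplot.streamz  # noqa: F401 — registers .hvplot on streamz DataFrames
from streamz import Stream
from streamz.dataframe import DataFrame as StreamingDataFrame

pn.extension()

# Streaming DataFrame that the dashboard charts watch for new batches
source = Stream()
example = pd.DataFrame({
    "time": pd.Series(dtype="datetime64[ns]"),
    "coefficient_k": pd.Series(dtype=float),
})
sdf = StreamingDataFrame(source, example=example)

# Live line chart that refreshes whenever a new batch is emitted
chart = sdf.hvplot.line(x="time", y="coefficient_k",
                        title="Ambient/Focal Attention (Coefficient K)")
dashboard = pn.Column("## Advanced gaze measures", chart)
dashboard.servable()

# On every batch of measures produced from the MQTT subscriber, e.g.:
# source.emit(pd.DataFrame([{"time": pd.Timestamp.utcnow(), "coefficient_k": k}]))
```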
Demo
A demo of our A-DisETrac system is given below.
Read more about eye-tracking and pupillometric measures: Eye Movement and Pupil Measures: A Review
-- Yasasi Abeysinghe (@Yasasi_Abey)