2022-03-01: Eye Tracking Streaming Analytics Project funded by the NSF CAREER 2021

Eye Tracking with PupilLabs Core Tracker

“Eyes Are Windows to the Soul,” especially when machines can track your eyes!
Eye tracking is a technology that can track where you are looking and how your eyes react to various stimuli in your environment. Eye tracking measures can glean an extraordinary amount of information about your behavior and cognition. Eye movement data analytics is often computationally expensive, and most vendor software does not support automated real-time analysis of eye movements. If we can process these eye movements in real time, we can infer quantifiable information about covert cognitive processes such as human attention and working memory.
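As a small illustration of the kind of real-time processing involved, here is a minimal dispersion-threshold (I-DT) fixation detector in Python. The thresholds, the normalized (x, y) sample format, and the function names are illustrative assumptions for this sketch, not our project's actual pipeline.

```python
def dispersion(points):
    """Spread of a gaze window: x-range plus y-range."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(samples, max_dispersion=0.05, min_samples=6):
    """Classic I-DT: return (start, end) index pairs of fixations.

    samples: list of (x, y) normalized gaze coordinates in arrival order.
    """
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples          # candidate window: samples[i:j]
        if dispersion(samples[i:j]) <= max_dispersion:
            # grow the window while it stays spatially compact
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j                    # continue after the fixation ends
        else:
            i += 1                   # slide the window start forward
    return fixations
```

Running this over a synthetic stream of two stable clusters separated by a saccade yields one fixation span per cluster; in a streaming deployment the same windowing logic would run incrementally over the live sample buffer.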

We are excited to receive the 2021 National Science Foundation (NSF) CAREER award for this research toward “Eye Tracking Streaming Analytics.”
 
The CAREER award is the NSF's most prestigious award in support of early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization. CAREER grants are awarded to fewer than 400 early-career engineers and scientists each year who are expected to make a significant impact in their disciplines.

The five-year, $550,000 award will allow our research lab to fundamentally advance research in real-time eye tracking data processing and analytics. If we can find a correlation between attention and workload using real-time eye tracking, the theoretical impact on the eye tracking research community and the societal contributions can be immense.

Sampath giving a talk at NASA Langley Data Science Group about his Eye Tracking technology



We are collaborating with the NASA Langley Crew Systems and Aviation Operations Branch to apply these real-time advanced eye movement measures for safe and effective advanced air mobility (drone) operations. We are currently developing eye movement measures for a simulation environment to extract actionable information and flag physiological reactions during unpiloted training sessions. Our recent work on metadata-driven streaming analytics is geared toward this architectural design for real-time analytics:
Yasith Jayawardana, Gavindya Jayawardena, Andrew Duchowski, and Sampath Jayarathna, "Metadata-Driven Eye Tracking for Real-Time Applications", 21st ACM Symposium on Document Engineering, 2021.
In the healthcare domain, we are focusing on “joint attention,” one of the central impairments of early, nonverbal social communication in Autism. Children with Autism have difficulty with joint attention, the ability to share focus on an object or area with another person. Our goal is to develop real-time multi-user eye tracking measures to detect joint attention, and to build intervention and behavior modification using game-based training of joint attention.

We recently designed a gaze estimation methodology (https://github.com/nirdslab/multigazeinterations) for multi-user gaze tracking using commodity webcam platforms. We also evaluated joint attention tracking tasks during dynamic interaction with displays and demonstrated implications for developing low-cost eye trackers using commodity webcams. Our results show the potential for applications that require approximate gaze positions in multi-user gaze tracking.
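As a rough sketch of how joint attention could be flagged from two synchronized gaze streams, the Python below assumes equal-length lists of normalized (x, y) gaze points on a shared display; the distance and duration thresholds, and the function name, are illustrative assumptions, not values from our study.

```python
from math import dist

def joint_attention_spans(gaze_a, gaze_b, max_distance=0.1, min_samples=5):
    """Find index spans where two users' gaze points stay close together.

    gaze_a, gaze_b: equal-length lists of (x, y) points on a shared display.
    Returns (start, end) index pairs where the Euclidean distance stays
    within max_distance for at least min_samples consecutive samples.
    """
    spans = []
    start = None
    for i, (pa, pb) in enumerate(zip(gaze_a, gaze_b)):
        if dist(pa, pb) <= max_distance:
            if start is None:
                start = i            # a candidate episode begins
        else:
            if start is not None and i - start >= min_samples:
                spans.append((start, i - 1))
            start = None             # gazes diverged; reset
    # close out an episode that runs to the end of the streams
    if start is not None and len(gaze_a) - start >= min_samples:
        spans.append((start, len(gaze_a) - 1))
    return spans
```

In a live multi-user setup the same thresholding would run over the incoming sample pairs, emitting an alert when an episode exceeds the minimum duration.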

Joint Attention Eye Tracking using a single web camera
Bhanuka Mahanama, Yasith Jayawardana, and Sampath Jayarathna, "Gaze-Net: Appearance-Based Gaze Estimation using Capsule Networks", 11th Augmented Human International Conference, pp. 1-4, 2020. (Also available as a Technical Report arXiv:2004.07777)

Education and outreach are a big part of the CAREER award, and NSF encourages all awardees to think creatively about how their research will impact their education goals. I have partnered with local high schools to offer an eye tracking game development summer camp. With funding support from the Virginia Space Grant Consortium (VSGC), we organized a two-week research camp for 25 high school students during summer 2020 to teach computer coding and data science skills. We have also received funding support from PRA Group to continue this effort with a Data Science camp in summer 2021, along with another round of funding from VSGC to offer a three-day professional development workshop for local high school teachers. With these education activities, I hope we can attract a steady stream of students over the years from neighboring school districts, including underrepresented minority and female high school students, to STEM fields, especially Computer Science.

Sampath talking to Grade 6-7 students at Old Donation School about Eye Tracking technology

We are passionate about science outreach to marginalized populations. I have given computer coding lectures to inmates at the Norfolk City Jail and taught multiple summer and academic-semester coding lessons for detained youth at the Juvenile Detention Center in Norfolk. One of our long-term goals is to improve educational accessibility and availability for detained students, especially those with learning disabilities such as ADHD and other visual perceptual handicaps. We recently received another NSF grant to support this idea toward developing STEM educational delivery for the Norfolk Juvenile Detention Center.

We are sincerely thankful for the support from the WS-DL research group, Department of Computer Science, College of Science, and the Office of Research Intramural funding for the success in securing the CAREER award. 

NIRDS Research Lab Team. From left, Ibrokhim Iskandarov, Bhanuka @mahanama94, Yasith @yasithmilinda, Gavindya @Gavindya2, Yasasi @Yasasi_Abey, Bathsheba @sheissheba, Patricia Ile-Mendoza, Maryam Salehi, Sampath @OpenMaze, Kayla @kaypineda3, James @15Jowens. Not in the picture Brian @bdhansonjr

Earning the CAREER award is very exciting; it truly broadens the scope of what we've been working on and opens the door to impactful research in the eye tracking domain.

--Sampath Jayarathna (@OpenMaze)
