Efficient Event-based Eye-Tracking (3ET) Challenge
CVPR Event-based Vision Workshop 2025

1Leiden University, 2DVSense, 3Prophesee, 4University of Würzburg, 5TU Delft
Presentation at CVPR Event-based Vision Workshop 2025
Note: Links below are temporarily from 2024. 2025 links coming soon.
Event-based eye-tracking video game demo

Let's play some video games with event-based eye-tracking!

This video was filmed at the 2023 Telluride Neuromorphic Cognition Engineering Workshop in Telluride, CO, USA, by Qinyu Chen and Chang Gao.

About the Challenge

Developing an event-based eye-tracking system presents significant opportunities in diverse fields, notably consumer electronics and neuroscience. Human eyes exhibit rapid movements, with saccades occasionally exceeding 300°/s, which calls for sensors such as event cameras that are capable of high-speed sampling and tracking.

In consumer electronics, particularly augmented and virtual reality (AR/VR) applications, the benefits of event-based systems extend beyond their high speed. Their highly sparse data streams can be exploited to reduce power consumption, a pivotal advantage for building lighter, more efficient wearable headsets that offer prolonged usage and greater comfort, thereby enhancing the immersive AR/VR experience and expanding the capabilities of portable devices. In neuroscience and cognitive studies, such technology is crucial for deciphering the complexities of eye movement: it enables a deeper understanding of visual attention processes and aids in diagnosing and understanding neurological disorders.

This challenge aims to develop an efficient event-based eye-tracking system that precisely tracks rapid eye movements, enabling lighter and more comfortable devices and a better user experience. The evaluation will be based on accuracy and efficiency. The top-ranking team will receive a Meta Quest 3 as the prize (sponsored by DVSense). Top-ranked teams will be invited to contribute to a challenge report documenting the challenge and to submit their work as workshop papers.
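The exact 2025 evaluation protocol will be announced by the organizers. Purely as an illustration, the sketch below shows one common way to score pupil-center predictions: mean Euclidean pixel error and a within-threshold accuracy (e.g., within 10 pixels). The function names, the 10-pixel threshold, and the example numbers are assumptions for this sketch, not the official metric.

    # Hypothetical sketch of an accuracy metric for event-based eye tracking.
    # Assumption: models output one (x, y) pupil-center estimate per labeled
    # timestamp; the official 2025 metric may differ.
    import numpy as np

    def within_threshold_accuracy(pred_xy, gt_xy, threshold_px=10.0):
        """Fraction of predictions whose Euclidean error is <= threshold_px pixels."""
        pred_xy = np.asarray(pred_xy, dtype=np.float64)   # shape (N, 2)
        gt_xy = np.asarray(gt_xy, dtype=np.float64)       # shape (N, 2)
        errors = np.linalg.norm(pred_xy - gt_xy, axis=1)  # per-sample pixel error
        return float(np.mean(errors <= threshold_px))

    def mean_euclidean_error(pred_xy, gt_xy):
        """Average pixel distance between predicted and ground-truth pupil centers."""
        errors = np.linalg.norm(np.asarray(pred_xy, dtype=np.float64)
                                - np.asarray(gt_xy, dtype=np.float64), axis=1)
        return float(np.mean(errors))

    if __name__ == "__main__":
        # Made-up example numbers, for illustration only.
        preds = [(30.0, 22.0), (41.5, 25.0), (38.0, 28.0)]
        labels = [(31.0, 21.0), (40.0, 26.0), (50.0, 30.0)]
        print("within-10px accuracy:", within_threshold_accuracy(preds, labels, 10.0))
        print("mean error (px):", mean_euclidean_error(preds, labels))

Efficiency is typically assessed separately, e.g., via model size or compute cost; refer to the official challenge rules once released.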


The challenge will be held with the 5th Event-based Vision Workshop, in conjunction with CVPR 2025.

News

[Jan 9, 2025] Website released!

Important Dates

  • Challenge Start: February 10, 2025
  • Challenge End: March 15, 2025
  • Top-ranking teams will be invited to submit a factsheet, code, and paper after the competition ends; submission deadline: March 25, 2025
  • Top-ranking teams will be invited to co-author the challenge report; deadline: April 5, 2025
  • Paper review deadline: April 5, 2025

Contact

If you have technical questions about the challenge, please contact us at:
- Qinyu Chen (q.chen [at] liacs [dot] leidenuniv [dot] nl)
- Chang Gao (chang.gao [at] tudelft [dot] nl)
For more details, please contact the workshop organizers.

Program Committee Members (TBU)

Qinyu Chen, Leiden University
Chang Gao, TU Delft
Min Liu, DVSense
Junyuan Ding, DVSense
Ziteng Wang, DVSense
Zongwei Wu, University of Würzburg

Previous challenge

Event-based Eye Tracking Challenge, AI for Streaming (AIS) Workshop, in conjunction with CVPR 2024.

  • 26 teams participated in the challenge.
  • 8 teams were invited to co-author the challenge report, and 4 teams' submissions were accepted as workshop papers.
  • We thank Zuowen Wang, Chang Gao, Zongwei Wu, Marcos V. Conde, Radu Timofte, Shih-Chii Liu, Qinyu Chen, and all the participants!

BibTeX


    @inproceedings{chen20233et,
      title={{3ET}: Efficient Event-based Eye Tracking using a Change-based {ConvLSTM} Network},
      author={Chen, Qinyu and Wang, Zuowen and Liu, Shih-Chii and Gao, Chang},
      booktitle={2023 IEEE Biomedical Circuits and Systems Conference (BioCAS)},
      pages={1--5},
      year={2023},
      organization={IEEE}
    }

    @inproceedings{wang2024ais_event, 
      title={{E}vent-{B}ased {E}ye {T}racking. {AIS} 2024 {C}hallenge {S}urvey}, 
      author={Zuowen Wang and Chang Gao and Zongwei Wu and Marcos V. Conde and Radu Timofte and Shih-Chii Liu and Qinyu Chen and others}, 
      booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops}, 
      year={2024}
    }

    @article{zhao2024ev,
      title={{EV-Eye}: Rethinking high-frequency eye tracking through the lenses of event cameras},
      author={Zhao, Guangrong and Yang, Yurun and Liu, Jingwei and Chen, Ning and Shen, Yiran and Wen, Hongkai and Lan, Guohao},
      journal={Advances in Neural Information Processing Systems},
      volume={36},
      year={2024}
    }
    

This website template is borrowed from Nerfies.