OpenEyes Workshop @ ECCV 2020: Call for Papers, Challenge & Extended Abstract


It is our pleasure to announce to you that we have joined hands with
Facebook Reality Labs to organize a joint full-day workshop at ECCV
2020 in Glasgow, UK. Specifically, this is an event combining the
former OpenEDS workshop (organized by Facebook at ICCV 2019) and our
GAZE workshop. In this way, we will be able to cover a broader range
of topics involving eye-tracking, including the context of AR/VR as
well as in natural settings.

The workshop will host two tracks: the first focuses on gaze
estimation and prediction methods, emphasizing accuracy and
robustness in natural (in-the-wild) settings; the second addresses
the scale and generalization problems of eye-tracking systems
operating on AR and VR platforms, and includes the OpenEDS 2020
eye-tracking challenge. More information on the OpenEDS 2020
challenge can be found at:

The webpage for the new workshop can be found at:
The following topics are of particular interest to us this year:

    - Novel eye detection and gaze estimation pipelines using deep neural networks that incorporate one or more of the following:
        - Geometric/anatomical constraints built into the network in a differentiable manner.
        - Demonstrated robustness to conditions where current methods fail (illumination, appearance, low resolution, etc.).
        - Robust estimation from different data modalities such as RGB, depth, and near-IR.
        - Use of additional cues, such as task context, temporal data, and eye movement classification.
    - Design of new, accurate metrics that account for rapid eye movements in the real world.
    - Semi-, un-, and self-supervised learning, meta-learning, domain adaptation, attention mechanisms, and other related machine learning methods for gaze estimation.
    - Methods for temporal gaze estimation and prediction, including Bayesian methods.
    - Unsupervised semantic segmentation of eye regions.
    - Active learning frameworks for semantic segmentation of eye images.
    - Generative models for eye image synthesis and gaze estimation.
    - Transfer learning for eye tracking from simulation data to real data.
    - Domain transfer applications for eye tracking.

This workshop will accept submissions of both published and
unpublished work. We will also solicit high-quality
eye-tracking-related papers rejected from ECCV 2020, accompanied by
the original reviews and a letter of changes that clearly states how
the previous reviewers' comments were addressed. Accepted papers may
be featured as spotlight talks and posters. In addition to regular
workshop papers, we also invite extended abstracts of ongoing or
published work (e.g., related papers in the ECCV main conference). We
see this as an opportunity for authors to promote their work to an interested
audience. Extended abstracts are limited to six pages (excluding references).

++ Paper ++
Submission deadline: June 5, 2020
Notification of acceptance: July 3, 2020
Camera-ready deadline: July 17, 2020
Workshop: August 23, 2020 (Full-day)

* Extended abstract deadline: TBD (in July)

++ Challenge ++
Challenge participation deadline: July 31, 2020
Notifications to winners: August 10, 2020
Winner announcement and prize distribution: TBD

++ Organizers ++

Hyung Jin Chang, University of Birmingham, UK
Seonwook Park, ETH Zürich, Switzerland
Xucong Zhang, ETH Zürich, Switzerland
Otmar Hilliges, ETH Zürich, Switzerland
Aleš Leonardis, University of Birmingham, UK
Robert Cavin, Facebook Reality Labs, USA
Cristina Palmero, Universitat de Barcelona (UB), Spain
Jixu Chen, Facebook, USA
Alexander Fix, Facebook Reality Labs, USA
Elias Guestrin, Facebook Reality Labs, USA
Oleg Komogortsev, Texas State University, USA
Kapil Krishnakumar, Facebook, USA
Abhishek Sharma, Facebook Reality Labs, USA
Yiru Shen, Facebook Reality Labs, USA
Tarek Hefny, Facebook Reality Labs, USA
Karsten Behrendt, Facebook, USA
Sachin S. Talathi, Facebook Reality Labs, USA

We will update you soon with more specific details, and we keenly
look forward to your ideas and contributions. If you have any
questions, please reply to this message, or contact Hyung Jin Chang
(regarding paper submissions) or Sachin S. Talathi (regarding the
challenge).

Best wishes,
OpenEyes 2020 Workshop Organizers